Monday, December 26, 2011

Big Blue Button Web Conferencing

A few days ago, I mentioned that I had found an awesome piece of open source web conferencing software (à la WebEx) called Big Blue Button.  I am now looking at alternatives to BBB. 

Don't get me wrong, I really, really like Big Blue Button.  It just seems to me to be a bit more of a developer's tool.  It doesn't integrate well with things that I would see a business needing, such as calendaring systems for scheduling meetings in advance.  Granted, they do say it is intended for educational institutions and not for the corporate world.  There are some integrations already built for several Content Management Systems, but those are no good if you don't have a need for a CMS.  I even tried one of those integrations and found that it still lacked certain features that I wanted.  They do offer an API so you can perform such integrations with your existing applications, but if you're not a developer, you might just be wasting your time. 
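For what it's worth, the API itself isn't complicated: every call is an HTTP GET with a SHA-1 checksum of the call name, the query string, and your server's shared secret tacked on.  Here's a rough sketch of building a signed "create" call from a shell script; the server URL, secret, and meeting details are all placeholders:

```shell
# Build a signed BigBlueButton "create" API call.  The checksum is
# sha1(callName + queryString + sharedSecret).  BBB_URL and SECRET are
# placeholders -- substitute your own server's values.
BBB_URL="http://bbb.example.com/bigbluebutton/api"
SECRET="changeme"
QUERY="name=Staff+Meeting&meetingID=staff1"

CHECKSUM=$(printf '%s' "create${QUERY}${SECRET}" | sha1sum | cut -d' ' -f1)
echo "${BBB_URL}/create?${QUERY}&checksum=${CHECKSUM}"
```

Fetching that URL (with curl or a browser) should return an XML response from the server; a similar "join" call is what gets attendees into the meeting.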

Right now my focus is on a project called OpenMeetings.  Honestly, I don't like it as much as Big Blue Button, but it seems to fit all my needs.  I can even tell OpenMeetings to allow my Active Directory users to log in and schedule meetings instead of having to maintain a separate database of user accounts.  There are some things that bother me about OpenMeetings, though.  One is the lack of a "raise hand" button.  You can achieve the same result by clicking the button used to request "Moderator" permissions.  It will show up on the screen as "<user> has a question", with a check mark and an "X" next to it.  If the moderator clicks the check mark, that user is granted "Moderator" permissions for the meeting.  This can be misleading, since it never says "so and so is requesting new permissions", just "so and so has a question".  Little stuff like that always finds a way to irk me.

Overall, OpenMeetings does seem to put me much closer to an out-of-the-box experience than Big Blue Button.  And that's the reason I'm now putting my focus into using OpenMeetings.



Migrating Exchange Mailboxes from Exchange 2003 to Exchange 2010

After adding an Exchange 2010 server to my lab, I wanted to shut down my Exchange 2003 server.  Before I could do that, though, I needed to migrate the Exchange 2003 mailboxes to the new 2010 database.  Fortunately, this can be done from the Exchange Management Console (EMC) on the 2010 server.

Unfortunately, after I tried to migrate one of the mailboxes, I noticed that the process was failing after a couple of minutes.  I fired up the handy dandy Exchange 2010 PowerShell prompt and used the command "Get-MoveRequestStatistics -Identity <mailbox name>".  It showed me that it was failing at 95% completion due to a problem with msExchVersion.

I fired up ADSI Edit and located the msExchVersion attribute on that particular user object.  Sure enough, it was null.  I guess the migration process was expecting some value there and failing because the field was empty.  I looked at that attribute on a user I had created an Exchange 2010 mailbox for and just copied the value of msExchVersion to each of my old 2003 mailbox users.  After doing that, the migration worked without a hitch.

I'm sure there must be some way to correct this in one fell swoop, but since I didn't have that many accounts, I chose to just add the value to each user manually.  This would have been a pain in a large organization, though.  




Exchange 2010: Changes to New User Setup Process

Before, with Exchange 2003, adding new users with email addresses could be done completely from within Active Directory Users and Computers (ADUC).  You could even do this from, let's say, an XP workstation.  You just needed to have the adminpak installed for ADUC and the Exchange System Management tools from the Exchange 2003 installation media.  That process has now changed with Exchange 2010.

Now you can use ADUC to add a new user, but you can't add the mailbox for the user.  You have to use the Exchange Management Console (EMC) to do that.  Of course, since Exchange 2010 only comes in a 64-bit flavor, there is no EMC for 32-bit platforms.  So if your usual workstation isn't running a 64-bit version of Windows newer than XP (the Exchange 2010 EMC won't run on XP), you are out of luck.  You'll have to remote into your Exchange server to manage the mailboxes for your organization.

You could start by using EMC to add a new user instead of using ADUC.  It will create the user and the mailbox.  However, it won't let you do things like modify group memberships, so you'll end up having to use ADUC anyway.

Maybe it's just me, but this seems like a step in the wrong direction. 



GParted: Extend Virtual Windows Server Disk

I just set up a Windows Server 2008 machine in my lab to host Exchange 2010.  I use VirtualBox for my lab, by the way.  I had added two disks to this particular virtual machine, one for the OS and one for data.  The OS disk was only 20 GB, which filled up pretty quickly.  So I added yet another (much larger) virtual disk and booted using a GParted Live CD.  Then I used the dd command to copy everything from the smaller disk onto the larger disk.
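The dd step itself is a one-liner.  The device names below are assumptions from my setup (original disk sda, new disk sdb); check yours with fdisk -l first, because dd will cheerfully overwrite the wrong disk without asking:

```shell
# Clone the whole 20 GB OS disk onto the new, larger disk from the
# GParted live environment.  SRC and DST are assumptions -- verify the
# device names with `fdisk -l` before running.
SRC="${SRC:-/dev/sda}"   # original (smaller) disk
DST="${DST:-/dev/sdb}"   # new (larger) disk

dd if="$SRC" of="$DST" bs=4M conv=noerror,sync
sync
```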

After the copy, the new disk showed a 20 GB partition with the rest of the space unpartitioned.  Still in GParted, I extended the partition to include the extra space.  All in all, this gave me 40 GB total.

Everything looked good, so I rebooted and was greeted with an error when trying to boot into Windows 2008.  No problem.  I booted the machine from the Windows 2008 install disc and chose the repair option.  In some other versions of Windows, choosing the repair option will cause Windows to attempt to discover and repair any errors it finds.  However, Windows 2008 is a little different.  It will give you some options including restoring from a recent backup or launching a command prompt.

I chose to launch the command prompt.  Once it opened, I changed to the "recovery" directory and ran "StartRep.exe".  It ran for a moment, and after I restarted I was able to boot the machine normally.  When it came back up, my OS disk showed the full 40 GB in Windows.


Thursday, December 22, 2011

Open Source Web Conferencing

I was recently asked to set up Skype at work for some clients so they could video conference with each other.  Being the way that I am, I immediately started thinking of alternatives to Skype.  I wanted to see if there was anything out there that offered more features and was free and/or open source.  I managed to find Big Blue Button today. 

With Big Blue Button, you can create a meeting and it will give you a link you can send to anyone you want to participate.  It looks like it was designed primarily for use with distance learning programs, but it could definitely be used by anyone looking for good web conferencing software.  It can be used in conjunction with VOIP so callers can join a bridge and be part of the conference.  Otherwise, you can just surf over to the web link to join a meeting. 

Once in a meeting, you can share a presentation with the meeting attendees (PDF and most MS Office file formats appear to work) and share your desktop, webcam, or microphone.  It supports a whiteboarding feature that lets you mark up your presentation.  And of course, it has chat functionality that lets you either chat with everyone or have private conversations with attendees. 

We have always used WebEx where I work for business meetings and presentations, but I haven't been able to find anything that we use it for that we can't do with Big Blue Button.

I'm going to continue playing with it in my test lab.  I'll post here with any cool new features that I uncover while messing around with it. 




Fog Follow Up

I just wanted to take a moment to follow up on my last post about using the Fog Project to image workstations.  That post centered on the use of Fog on a private LAN.  However, you can also use Fog on your existing network.  You would just need to install it and tell it not to use a local DHCP server.  You would instead configure your existing DHCP server with the appropriate options to PXE boot the workstations on your network. 

You can also include the Fog agent on your image, so workstations can check in with the Fog server periodically for tasks that should be performed.  This is not limited to imaging tasks, either.  You can schedule software installations such as Microsoft Office using Fog "snap-ins".  You can even instruct the workstations to perform virus scans using ClamAV.  It is a full-featured imaging solution that rivals any that I've ever used before, including pricier options that companies tend to choose because they might not know about Fog.  It even has a mobile interface that you can access from your smart phone.

However, I wouldn't rely on Fog alone for your inventory purposes.  For that, I'd have to recommend OCS Inventory NG.  It's also a free product that reminds me a lot of Microsoft's SMS server.  It also uses an agent/server model.  It can be used to deploy software and to maintain a single place for storing hardware and software inventory.  If you need to know which workstations have a certain version of Office installed, for instance, you would only need to check in OCS Inventory to get that information.  Very nice when it comes time to evaluate all your software licenses.

I'm thinking about creating some how-to's and maybe some videos on how to get up and running with Fog and OCS Inventory NG.  I'll see what I can come up with and when I have something worthwhile I'll post it on this blog. 


Sunday, December 11, 2011

Imaging Workstations with the Fog Project

I needed a plan for imaging a bunch of workstations for a new office that we're setting up.  That's when I remembered Fog.  Fog is a free and open source imaging solution that runs on Linux.  Its intended purpose is to aid with imaging Windows operating systems.  Since I'm dealing with Windows XP, that's perfect for my situation.

I ended up installing Fog 0.32 onto Ubuntu 10.04 LTS because it installs cleanly (read: no dependency hunting).  That machine acts as the imaging server and I can control it using the Fog web interface from any machine on the same network.  From there, I can create new images and deploy existing images.

Most of the image creation process is done outside of Fog, though.  I simply follow the guidelines for using Microsoft's Sysprep tool.  Then once I have done that, I choose to upload the image to Fog.  Once it's there, it is available to be deployed to other workstations.

My requirement so far has been that the solution will need to be completely off of our network, meaning it has to be on its own private LAN.  On top of that, I'm talking about imaging 100+ workstations.

Since it's on a private LAN, Fog supports using a local DHCP server on the machine you've installed it on.  This is the way I've got it set up now.  Other machines on the same private network will be able to PXE boot to a Fog menu.  The default option in the menu is to just boot the local hard drive.  Before you can image a machine, you have to choose the option in the menu to quick register the host.  After that, the machine shows up in the web interface and you can now send an image task to it.

However, I don't want to have to go around to over 100 PCs and manually choose to register them.  That's where Capone comes into play.  Capone is a Fog plugin that is actually included with Fog, but just needs to be enabled through the web interface.  Capone lets you associate an image with a particular PC type.  This can be a specific system identifier, BIOS vendor, etc.  Then anytime a matching PC is plugged into the imaging network and PXE boots, there will be a Capone option in the Fog menu that will deploy the image to the PC.  So far so good!

The new dilemma:  I don't want to go around to over 100 PCs and manually choose the Capone option from the menu, either.  Fortunately, you can change the default option by editing a file on the Fog server itself.  You can't, as far as I can tell, change this using the web interface.  But it can be done, and that's what is important.  So I just set Capone to be the default option.  Now all I have to do is connect a bunch of machines to the same switch the Fog server is on and power them up.  As long as they are all set to PXE boot, they'll all start to receive the image.
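On my Fog 0.32 server, the file in question was the pxelinux config.  Something like the following should do it; the path and the label name are from my install, so check your own file for the exact label Capone registers:

```shell
# Make the Capone entry the default PXE menu option.  The path and the
# "fog.capone" label are assumptions from a Fog 0.32 install -- confirm
# both in your own /tftpboot/pxelinux.cfg/default before editing.
PXE="${PXE:-/tftpboot/pxelinux.cfg/default}"

cp "$PXE" "$PXE.bak"                               # keep a backup
sed -i 's/^DEFAULT .*/DEFAULT fog.capone/' "$PXE"  # swap the default entry
grep '^DEFAULT' "$PXE"                             # confirm the change
```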

Even though there are no plans to move this onto our network at this time, there are some cool features that you can use with Fog to manage your already imaged workstations.  Plus you would have the ability to image PCs while they are on someone's desk without needing to move them to your private LAN.  Of course, to do any of this, the machines will need to be registered with Fog.  Capone doesn't do any type of registration.  After the machines are imaged and deployed, my thought is that I could use a script to collect each machine name and MAC address into a CSV file.  Then, through the Fog web interface, I can upload the CSV to add the hosts.  I would probably also build an additional Fog server to go on the public network.  I think it would be a good idea to keep the private server for large imaging tasks (e.g. large equipment purchases).
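The collection script could be as simple as something like this, run on each machine (or from a login script) so every host appends one line to a shared CSV.  The "name,mac" column layout is just my guess at what Fog's host import expects, so check the upload format in your own web interface first:

```shell
#!/bin/sh
# Append this machine's name and primary MAC address to a CSV for later
# import into Fog.  The "name,mac" column layout is an assumption --
# check the expected format on Fog's host-import page.
OUT="${1:-hosts.csv}"

[ -f "$OUT" ] || echo "name,mac" > "$OUT"   # write the header once

NAME=$(hostname 2>/dev/null || uname -n)
# first non-loopback MAC address, if any
MAC=$(cat /sys/class/net/*/address 2>/dev/null \
      | grep -v '^00:00:00:00:00:00$' | head -n 1)

echo "${NAME},${MAC:-unknown}" >> "$OUT"
```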

All in all, Fog is a fantastic solution.  It, in my opinion, beats the pants off Symantec Ghost and CloneZilla.  If you work for an organization that does a lot of imaging, I would highly suggest you at least try it out.  And kudos to the Fog developers.     


Sunday, November 27, 2011

Ubuntu and Mint "Open with other Application"

I just realized something that irks me a little bit about Ubuntu and Mint.  Now, if you right-click a file and choose "Open with other application", you can choose from a list of applications to use, but you can't browse your filesystem if the app you want isn't listed.  It's a real shame that they removed that functionality.  Even Windows still lets you do that.

Linux Mint 12 Released -- Still with Ubuntu

In case you weren't aware, it looks like Linux Mint 12 has been released.  But before I get into why I won't be switching to it, I'd like to give an update on my experience with Ubuntu 11.10.

I'm actually still enjoying it on both my desktop and laptop.  The only thing that continuously annoys me is that sometimes when I have an application maximized and I go to close it, the Unity launcher pops up in the way.  Granted, this is my fault for moving the mouse to the left of the screen before moving it up to where the close button is located.  However, this could be solved simply by moving the close button back to the right side of the screen.  By the way, Linux Mint does this, giving us a more familiar interface for closing, maximizing, and minimizing windows. 

As far as the Unity dash goes, I wish it were a little more configurable.  Especially the primary dash screen.  I'd really like to be able to configure which apps show up there.  My only other dash related gripe is the Find Files section.  It only appears to be searching recently used files.  I would think that it would at least search all files on my hard disk and possibly any other mounted drives.  But, alas, it does not appear to do so.  Of course, I'm not really searching for applications that often because I have moved the few apps I use regularly to the launcher.  I'm sure this is the intended effect of Unity.  I've also found over time that users who have hundreds of applications installed probably only use a handful of them on a regular basis.  I guess I am one of those users, too.

Even though I don't really have any major complaints for the direction Ubuntu has chosen, it's hard not to be tempted by Linux Mint.  Version 12 gives us a Gnome 3 environment that looks and handles much like a Gnome 2 system.  In fact, it looks like the Mint devs have listened to what people hate about Gnome 3 and tried to make their system do the opposite of what the Gnome 3 developers had in mind.  And while I'm the first to admit that familiarity is extremely tempting, there is one major reason and one minor reason why I won't be switching to Linux Mint.

The minor reason is related to how often Linux Mint asks me to input my password.  I know this is to protect users from doing something stupid with their systems, but come on!  I bet if I looked at Mint sideways it would ask me to type my password.  It kind of reminds me of Windows Vista's UAC.  Ubuntu still asks for it, but it doesn't seem to do so quite as often.  An example is if you install Mint using the CD that doesn't have the extra codecs.  On the startup menu, there is an option to install the additional codecs.  When you click it, you'll be asked for your password.  A moment later, right before the codecs are installed, you'll have to type your password again.  It really should only ask once, right before anything new is installed.

The major reason is full disk encryption.  My personal laptop and my work laptop are one and the same.  Sure, my company would give me a laptop if I asked for it, but I prefer using my own and I have that option where I work.  Since I could have work related files on my laptop at any given time, the disk must be encrypted.  It's not a matter of how secure full disk encryption is, it's a matter of adhering to company policy.  Full disk encryption is something that is done during the installation of the operating system.  Everything except the /boot partition gets encrypted and protected by a passphrase.  Linux Mint does not give you this option.  The funny thing is that Ubuntu does give you that option on their alternate installation CD.  Since Mint uses Ubuntu as its base, you'd think they would offer a similar option or at least their own alternate CD.  However, they do not seem to offer any such thing, at least at this time.

At the end of the day, I'm satisfied with Ubuntu, but it would still be nice to have an alternative "easy to use" distro as a backup.  The fact that Linux Mint seems to be going the opposite direction of Ubuntu makes it an even more appealing candidate.  But the lack of full disk encryption as an installer option is what is keeping me away from it.  Other than that, I would suggest that you give it a try if you absolutely can't stand the current direction of Ubuntu or other Gnome 3 using distros.     







Saturday, November 19, 2011

Distro Hopping

I have been a Slackware user for many years.  I absolutely love it.  That has not changed.  However, I feel that I have changed.  Realizing that I'm spending more and more of my time experimenting and tweaking my system rather than using it, I decided that something needed to happen.  So I have installed Ubuntu 11.10 on my desktop.

Does this mean that I'm not an advanced Linux user?  No, it does not.  It just means that I'm tired of spending all day doing complicated things and solving complicated system problems as a systems admin, only to come home and do the same exact thing with my home computers.  Just because I'm a power user at work, doesn't mean I can't be afforded some simplicity in my home life.  I evaluated several different distros and arrived at Ubuntu because it is known for making things simple.  Tasks like installing video drivers that would require some command line effort in most other distros can be done in Ubuntu with a few clicks of the mouse.  I honestly don't have to think much, and I appreciate being able not to.  It lets me get back to what I should be doing...playing the role of an end-user.

I'm using the default Unity interface on Ubuntu.  At first I didn't like it, but it has grown on me.  I guess I had better get used to it because it looks like even Windows is going down the road of having a single look for desktop computers, tablets, and phones.  That said, I like the Unity interface much more than the Metro interface (at least as it appears in the Windows 8 Developer Preview).  Of course, I don't own a tablet and my phone runs Android.

The Ubuntu Software Center has also made installing the applications that I commonly use a breeze.  I can use it to search for new applications and easily click to install them.

I did make some small tweaks to the LightDM login manager.  I wanted it to prompt me for both username and password instead of showing me a list of users to click on.  Then I wanted it to enable numlock automatically.  I use the number keypad a lot and didn't want the extra step of having to hit the numlock key.  Yep, I'm just that lazy.
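For anyone wanting the same tweaks, both came down to a few lines of LightDM config.  On 11.10 the file was /etc/lightdm/lightdm.conf, and the numlock line assumes you've installed the numlockx package:

```
# /etc/lightdm/lightdm.conf (excerpt)
[SeatDefaults]
# prompt for both username and password instead of listing users
greeter-hide-users=true
# turn numlock on at the greeter (requires the numlockx package)
greeter-setup-script=/usr/bin/numlockx on
```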

As far as Unity goes, I haven't made a lot of changes other than adding/removing programs from the launcher.  I am a little annoyed, however, that Unity doesn't feature a slideshow-type wallpaper selection.  I know there are 3rd party applications that can automatically change my wallpaper for me, but they all have to run in the background in order to function.  I used a Python script that I found with a little Google-Fu, which uses the Contest wallpaper and background-1.xml to change the wallpaper (using the wallpaper image path I provided) at a specified interval.  However, the script doesn't support scaling the images, so some of my larger wallpaper images don't look right on the screen.  I'd love a script or command that will change the wallpaper randomly when given a path with images and for it to also support scaling.  Something I could just stick in the cron would be preferred.  Maybe something already exists like that and I just haven't found it yet.

All in all, it's been a good experience so far.  I was worried, especially with so many people jumping ship from Ubuntu because of disdain for Unity.  It is very different from what most of us are used to, but if you spend enough time with it, you'll see that it really isn't all that bad.  But I can only imagine that if Linux users, who are typically thought to be more advanced computer users, are this upset over a change like Unity then Windows users are going to hit the roof with anger when Windows 8 gets released with its Metro interface.   

Update:  I found out how to get the contest wallpaper slideshow to scale the images.  I just deleted the "<option>zoom</option>" line from the contest section of /usr/share/gnome-background-properties/ubuntu-wallpapers.xml.  After doing that, the option to choose tile, zoom, scale, etc., became available when selecting "Contest" in the wallpaper selection area of the Appearance configuration window. 

Sunday, October 30, 2011

Linux Kernel 3.1

I just finished making some upgrades to my desktop system at home. I added a Netgear WNDR4500 dual-band router to my network. This increases my LAN to gigabit network speeds and wireless N (at least for those devices I have that support those things). I transfer a lot of data over my network, so a 100Mbps network just wasn't cutting it anymore.

I also added two more sticks of DDR3 1600 MHz RAM to my machine, bringing the total up to 12 GB. And finally, I replaced my quad-core AMD Phenom II processor with a six-core Phenom II X6 1100T.

I'd really like to switch out my SATA hard disk for some SSDs, but I think I'm going to wait a while longer. It's a matter of capacity vs price point, and the price just isn't where I think it should be for SSDs.

After making all those hardware upgrades, I decided that I'd try updating some software. I remembered that kernel 3.1 had been released and decided that I'd give it a shot on my Slackware64 13.37 desktop.

It wasn't listed on the main page of kernel.org, but I was able to find it at http://www.kernel.org/pub/. After compiling the new kernel I rebooted, re-installed my NVIDIA driver, and everything seemed okay. After the next reboot or two I noticed that each time the system started, the clock was off by a few hours. I checked the BIOS on the machine and the time was set correctly. Under the new kernel, running hwclock also gave an error. I ended up recompiling the kernel because I had not selected RTC (Real Time Clock) support in my kernel options last time. After that, the system time is fine under the new kernel.
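For reference, these are the RTC-related options I had missed, as they'd appear in the .config; the exact symbols are from memory of the 3.1 menuconfig, so treat them as a starting point rather than gospel:

```
# Device Drivers -> Real Time Clock (.config excerpt)
CONFIG_RTC_CLASS=y
CONFIG_RTC_HCTOSYS=y
CONFIG_RTC_HCTOSYS_DEVICE="rtc0"
CONFIG_RTC_DRV_CMOS=y
```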

I use NetworkManager with Slackware and have an entry to start it and then mount my NFS shares in /etc/rc.d/rc.M. This worked fine with the stock Slackware kernel, but now my NFS shares were not being mounted at boot. If I manually ran mount -a, they would mount. I ended up putting a 5 second sleep in rc.M in between starting NetworkManager and attempting to run the mount command. This worked like a charm.
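The relevant chunk of my rc.M now looks roughly like this (paths per stock Slackware; the 5-second figure is just what worked on my network, not a magic number):

```
# /etc/rc.d/rc.M (excerpt)
# Start NetworkManager, give it a moment to bring the link up,
# then mount everything in fstab (including the NFS shares).
if [ -x /etc/rc.d/rc.networkmanager ]; then
  /etc/rc.d/rc.networkmanager start
fi
sleep 5
/sbin/mount -a
```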

Friday, July 1, 2011

Installing cups-bjnp in Slackware

I have a Canon multifunction printer/scanner/fax in my home office. After I recently re-installed Slackware64 13.37, I realized I still needed to set up the printer. I downloaded the latest release of cups-bjnp from Sourceforge and attempted to install it. Taking the lazy approach with a plain ./configure did not work, as it gave an error about not being able to find the CUPS backend directory. This is easily remedied by using the "--with-cupsbackenddir=" option. The full configure command looks like this:
./configure --with-cupsbackenddir=/usr/lib64/cups/backend
Then just fire up the CUPS daemon and a web browser and visit the local machine on port 631 to add the printer and manage your CUPS settings. You'll notice that you can now select a Canon bjnp connection type in addition to several others such as http and ipp.