GNU/Linux: Server Upgrade Problem Solving

Notice: This article is not specifically about GNU/Linux. It is under our GNU/Linux category because the server in the article was and is a GNU/Linux based server. Some portions of the article deal with solving upgrade problems for applications that run on the underlying GNU/Linux distribution. In summary, this is a hardware and software article.

Recently my company had the opportunity to upgrade a server, running an old version of the Mandriva GNU/Linux distribution, to Mandriva 2010. The system had been in place running nicely for a few years and had not been upgraded to a new release in all that time except for some security patches. Then it started hanging mysteriously whenever under load from users opening Squirrelmail with large amounts of mail in the INBOX. Looking at logs and checking settings and system files revealed nothing. However, once the system was taken off-line, brought in-house to ERACC and the cover removed, we discovered several popped capacitors on the old motherboard. This was determined to be the source of the hangs:

Gigabyte Motherboard Blown Capacitors

This old Gigabyte motherboard was from near the beginning of the AMD dual-CPU era when one could first put together a system with two AMD Athlon MP CPUs in it. It had a pair of these installed (AMD Athlon MP 2400+) and 512 MB of RAM. The Gigabyte board also had two PCI 64-bit slots, one of which was in use with an Adaptec 29160 SCSI controller that controls two SCSI drives. These were in a Linux MD RAID1 configuration except for the “/boot” partition. The small business owner of the server did not want to buy an entirely new server due to the current poor economy (Thanks to our current USA presidential administration and a complicit Congress. The bums.) and cash-flow being so tight. A new server could easily end up costing well over a thousand dollars. So my company was given the task of replacing this old motherboard with another from the same time frame and then doing an upgrade on the installed OS. Searching the web turned up some “recovered” (a.k.a. used.) Tyan S2469GN dual-CPU boards. These were not new but they were the best we were able to find for this system.

Luckily this particular server only handles SMTP send/receive, some webmail and serves a few HTML pages for a small off-shoot business of the parent business. It would not be catastrophic for it to be down for a while. So, we could take the time to get things right while trying to keep things as inexpensive as possible. The client ordered one of the S2469GN boards and called us to come get it when it came in. Once we had the S2469GN here we discovered it was just slightly too large for the existing case. The S2469GN is a full sized Extended ATX, full CEB specification motherboard (12″ x 13″). We also discovered that the Gigabyte motherboard used a 20-pin power plug where the S2469GN uses a 24-pin power plug and requires an 8-pin power plug as well. So, we informed the client he would need a different case and an ATX EPS12V power supply. A search of old cases and power supplies at our offices and at the client site did not turn up anything we could use.

A search of new cases turned up the Antec P193 to handle the full sized EATX S2469GN motherboard. A search of power supplies came up with the BFG GX-550 ATX12V 2.2 550 Watt modular power supply. Both were found at online retail shops at a decent price that would not break the budget for this job. (ERACC does not sell individual components, only complete systems and some software licenses.) The client ordered these and once again called us to come get them when they arrived. For the record, the Antec P193 is a beautiful, roomy, well designed case.

Assembly of the system went smoothly due to the excellent design of the P193 case. The Adaptec controller and RAID1 configured drives were installed. The floppy drive (It is beige! Ack!) and CD drive (Also beige! Good Grief!) were installed. Then all power and data cables were connected, tied off and routed for air flow. The S2469GN has on-board ATI video. Powering up the system went well except for the floppy drive, which failed to be recognized. It was replaced with a new (Still beige though!) floppy drive. Then the system passed POST and the old Mandriva distribution booted without a hitch. Now it was time to upgrade the system to the latest Mandriva 2010 release.

The Linux MD configured RAID1 was accessed using a Mandriva 2010 Live CD. The Mandriva 2010 KDE4 Live CD was found to be too “fat” for the old system with 512 MB of RAM, so we used the Mandriva 2010 Gnome Live CD, which was not quite so bad. We mounted one of our NFS server shares and backed up the critical data and configuration from the system by copying the relevant files and directories to a subdirectory on the NFS share. Then the system was rebooted to the installed OS. Because a big upgrade jump from the old version to the new 2010 release would probably fail, we selected the next version up from the installed Mandriva version from online repositories using http://easyurpmi.zarb.org/old/. Then began the process of running updates with urpmi --auto-update -v to get the new version, then getting the new kernel, rebooting, and doing it all over again for the next release in line.
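One pass of that cycle looked roughly like this (a sketch only; the mirror URL is a placeholder for media generated at the easyurpmi page above, and the kernel package name varies by release, e.g. kernel-server or kernel-desktop):

```
# Swap the urpmi media over to the next release in line
# (the mirror URL here is a placeholder from easyurpmi):
urpmi.removemedia -a
urpmi.addmedia --distrib http://mirror.example/mandriva/next-release/
# Update everything to that release:
urpmi --auto-update -v
# Install that release's kernel package (name varies by release):
urpmi kernel-server
reboot
```

Repeat for each intermediate release until reaching 2010.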

After going through several of these upgrade/reboot cycles, one of the reboots was done before the newer Linux kernel was installed. This usually would not be a problem, but for some reason the system completely lost access to the RAID as a result. After talking it over, it was determined the best way forward would be a fresh install. Sure, we could have managed to get the new kernel on there and continued the upgrade cycles, but a decision was made to recommend the fresh install to make sure old cruft was gone from the system. After all, we had a backup of the important data and configuration files. The client was contacted and gave the “go ahead”. We decided to get rid of the RAID and just use both disks as discrete drives. The primary disk would hold /boot and the root filesystem (/etc, /opt and so on). The second drive would hold /home and /var/www.

After replacing the old CD drive with a used DVDRW drive (At least this one is silver and black.) a fresh install of Mandriva 2010 was done and the needed applications were installed, including Apache, Postfix, Squirrelmail, Courier (authdaemon, pop and imap) and so on. The backup of /home was copied back over. Settings for daemons were copied and edited as needed. Then, once all was in place, we began testing the system. During testing it was discovered that the old IMAP setup with Squirrelmail had created mbox style mail boxes under /home/(username)/Mail/* while the new setup needs maildir style mail directories and files under /home/(username)/Maildir/*. This was a conundrum as we did not want the end-users to lose access to their archived mail in the /home/(username)/Mail/* mbox style files.

After a bit of research three tools were used to solve this problem. One was built in-house and calls the other two to do the work:

  • maildirmake – a tool included with the Courier-IMAP package.
  • mbox2mdir – a mailbox to maildir converter by Sergey A. Galin.
  • convertmbox – a bash script built in-house to use the other tools and get the job done.

The maildirmake tool will create a maildir structure that can be used with modern IMAP maildir servers. Here is how a basic maildir will appear:

/home/user/Maildir/
/home/user/Maildir/cur/
/home/user/Maildir/new/
/home/user/Maildir/tmp/

When instructed to create new “folders” the Courier-IMAP server will create subdirectories off this structure like so:

/home/user/Maildir/.Saved/
/home/user/Maildir/.Saved/cur/
/home/user/Maildir/.Saved/new/
/home/user/Maildir/.Saved/tmp/
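For illustration, the same directory layout can be sketched with plain mkdir (maildirmake does additional housekeeping, so this only shows the structure, and the “Saved” folder name is just an example):

```shell
# Emulating what "maildirmake $HOME/Maildir" creates, using plain mkdir:
mkdir -p "$HOME/Maildir/cur" "$HOME/Maildir/new" "$HOME/Maildir/tmp"
# And the layout for an IMAP folder named "Saved" (a dot-directory under Maildir):
mkdir -p "$HOME/Maildir/.Saved/cur" "$HOME/Maildir/.Saved/new" "$HOME/Maildir/.Saved/tmp"
```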

Here are the contents of the convertmbox script:

#!/bin/bash
# Convert each mbox file in the current directory to a
# Courier-IMAP maildir folder, skipping the Trash file.
for i in *; do
    case "$i" in
        Trash) echo "Skipping Trash file." ;;
        *) maildirmake "$HOME/Maildir/.$i" && mbox2mdir "./$i" "$HOME/Maildir/.$i/cur" ;;
    esac
done

The convertmbox script is gzipped at the URL above. Use gzip -d convertmbox.gz to extract it.

To use this, one logs in as root, uses “su - username” to switch to a user, changes to the directory containing the mbox style mail files and types /path/to/convertmbox to convert the files. After verifying a successful conversion one may then use “rm -rf mbox_mail_directory” to get rid of the old mbox style files and their containing directory.
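The sequence just described, as a terminal sketch (the username and the mbox directory name are illustrative):

```
# Logged in as root, become the target user:
su - username
# Change to the directory holding the mbox style mail files:
cd ~/Mail
# Run the in-house conversion script:
/path/to/convertmbox
# Only after verifying the conversion, remove the old mbox files:
cd ~ && rm -rf Mail
```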

Testing the system with Squirrelmail following conversion of the user’s mail files showed that the conversion was successful. The old Mail directories were then removed. Then the system was ready to be delivered and placed back in operation following on-site testing to make sure nothing was amiss.


Notice: All comments here are approved by a moderator before they will show up. Depending on the time of day this can take several hours. Please be patient and only post comments once. Thank you.

GNU/Linux: Replacing a Dead Router with a Linux System

Earlier today I decided to upgrade the firmware on my SOHO’s Linksys WRT54G v5 router. I usually do such things on the weekends in case something breaks. It is a good thing I waited until the weekend this time. My Linksys WRT54G is now “bricked”. For some unknown reason the firmware update never finished although I waited for over an hour for it to complete. Of course I had no internet access during this time and could not get to any web sites to try to discover how I might fix the router.

Enter an old Dell Dimension XPS R400 PC that has been gathering dust in the closet. It has an 80GB Western Digital IDE drive, a Startech 10/100 NIC and 192MB of RAM in it. I received this old PC from a client that bought a new, custom built system from my company in October 2007. He no longer needed the Dell and was just going to trash it. Instead I convinced him to let me wipe the drive, install Mandriva 2008 on it and try to sell it on eBay. It did not sell when I listed it. The client did not want it back, so I just stuck it in the IT junk closet with several other old systems and flaky monitors. I decided to make this old Dell PC into my “new” router. Since it already has Mandriva 2008 on it I figured I could use that to get routing going and then upgrade the Mandriva later.

I also have an even older custom built PC that has been running a very old Mandriva for years as a file share and a Hylafax send / receive server. It has been giving drive errors so I knew it was going to need to be repaired soon. Once I decided to use the Dell I figured I would scavenge this old PC for its 3Com 10/100 NIC and its hard drive so I could easily copy the Hylafax settings to the “new” router PC. I began shutting things down and taking apart PC systems. After a bit of dust cleaning, parts rearranging and cable connecting I had the Dell ready to boot up with the 3Com NIC installed as a second NIC and the hard drive from the old Hylafax server in place. A small 5-port Linksys switch is taking the place of the built-in switch on the Linksys WRT54G.

I booted up the Dell and tried to login as root at a CLI “login:” prompt. However, I had forgotten the password. Luckily it has the LILO boot loader on it and I knew I could reboot with “linux single” on the boot line to get to a root prompt and reset the password for root. This was done and a few minutes later I was in the command line version of Mandriva Control Center (MCC) setting up the network. Then I went to set up “Internet connection sharing” and it kept failing with an error stating it could not find a network adapter when I chose the NIC connected to the internet in preparation for choosing the NIC connected to the LAN.

After scratching my head and thinking about this a bit I had an epiphany. The second NIC was on the internet but was probably configured in the firewall settings as the local network. Sure enough, when I checked the settings in /etc/shorewall/interfaces (Shorewall is a set of scripts included in Mandriva to manage the Linux iptables firewall.) the second NIC, eth1, was set as loc, meaning it was set to be the local interface for the LAN instead of the WAN interface, called net, for the internet. Swapping these around took a matter of seconds in ‘vim’. I then restarted Shorewall with ‘service shorewall restart’ to reconfigure the iptables settings in memory. Then I could finish configuring “Internet connection sharing”. Once that was done I tested sharing from my SOHO desktop PC and found I was back online. Total time from completely down to back online with a Linux system based router: about 3 hours.
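For illustration, the corrected entries in /etc/shorewall/interfaces would look something like this (a sketch from memory of Shorewall’s zone/interface column layout; the options column on a real system may carry more settings):

```
#ZONE    INTERFACE    BROADCAST    OPTIONS
net      eth1         detect
loc      eth0         detect
```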

Now that I am back online with a “new” Linux / iptables based router my next task will be to set up my port forwards and maybe some QoS (Quality of Service) settings for the company VoIP phone. I know how to do the port forwards but I have no clue how to set up QoS for a service. Time to do some web searching for that QoS stuff.

Edit Sat Jan 23 18:47:20 CST 2010: Fix some typographical errors.


The GNU/Linux “Chicken Little” Syndrome

You know the type. The technical reporter that tries to do something on GNU/Linux, cannot figure it out and thus states to the planet the equivalent of Chicken Little saying, “The sky is falling!”, regarding GNU/Linux. We see them over and over coming back to the same point, “Until ‘Linux’ solves [insert the technical reporter’s failure to do something here], it won’t be ready for prime time.” What a crock of compost.

In this case the technical reporter in question is Preston Gralla over at Computerworld Blogs. Specifically, his recent article I just finished reading titled, Installing Firefox 3.6: One more reason Linux isn’t ready for the prime-time mass market. The problem here is that Mr. Gralla and those like him seem to think it is absolutely necessary to have the latest release of [insert software here] on [insert Linux distribution here], when that is absolutely not the case in the majority of situations.

I run Mandriva 2010 at the moment on my desktop system here at the ERACC Intergalactic Spaceport and Karaoke Bar, otherwise known as my home office. I have been running releases of Mandriva for several years now. At first I too wanted to always have the latest, cutting edge release of every package out there. After a while I came to understand that if Mandriva package maintainers saw that a patch was necessary for an application I run then they would patch the version in the distribution and release the patched version in the update repository. If there were a new version of a software application that had security implications for a desktop user, then after testing the new version it would be included as an update for the life of that desktop release, usually 12 to 18 months. Long term desktop releases would get these updates if needed for their lifetime as well, usually 3 years. Then the next time I install updates I get the patched or new version.

I have come to appreciate and accept this. After all, it is highly unlikely that a zero day exploit would be found that could crack my Mandriva system from a user-space application, like we see happen so often on Microsoft systems. The default security in a GNU/Linux system makes creating a zero day exploit that can “pwn” a GNU/Linux desktop system slightly less difficult than a single person being the first to find the next Mersenne Prime[1][2] with pencil, paper and an abacus. Is it possible? Maybe, by a long shot. Is it likely? Not really. As a result, I can just be patient and wait for the new or patched software to appear in my update list. If I really want to be on the cutting edge, along with all the problems that may imply, I can install Mandriva’s Cooker version. This is the untested, it may break, it may slap you around with a large trout, developer version of Mandriva. Not recommended for the faint of heart and those who like their system to “just work”. Or I can go with a distribution like Gentoo Linux.

Honestly, I do not really want to be on the cutting edge. I want stable, known to be working with my distribution, software packages. For that I can wait for the updates or the next major Mandriva release. Regarding Firefox versions, I just updated to Firefox 3.5.7 a week or two ago using Mandriva’s updates. I do not see a pressing need to get Firefox 3.6 Right Now. I can wait for it. Mr. Gralla and his ilk can too, once they figure out how this GNU/Linux thing really works. Of course they can also stick with Microsoft and keep getting “pwned” with web based drive-by exploits that take advantage of Microsoft’s poor design decisions.


GNU/Linux Software I Use Regularly

I recently received an e-mail from a friend who has started using Ubuntu. He is rather new when it comes to running a GNU/Linux desktop and has asked me several questions. One of the questions was basically what software I use and recommend. This is a serious question that a lot of new users will probably want answered.

Those of us who have been GNU/Linux desktop users for a long time take for granted the packages we install and use. As we have paid our dues to learn the ropes, one way we can help new users is to tell them what we use and make recommendations. It helps to have a base of software from which to start because there are so many choices under GNU/Linux a new user can easily become overwhelmed.

So, for my friend, and for all of you other new users out there, here is the software I use regularly.

My distribution of choice is Mandriva. Mandriva is an RPM based distribution and has several very well written tools to help one manage one’s desktop system. Since RPM is a requirement for the Linux Standard Base (LSB) I prefer to stick with RPM based distributions. Mandriva was one of the first, if not the first, RPM based distribution to solve the “RPM dependency Hell” that so many encountered in the early days of RPM distributions.

My “desktop” runs a light window manager named fluxbox. I am not fond of Gnome or KDE as they are too bloated to start with. Sure, one can strip them down, but I would prefer to start light and add only what I want or need. Plus, some of my friends who run Gnome and KDE do occasionally have broken desktops from trying to update them with the latest and greatest. Due to the complexity of Desktop Environments (DE) like Gnome and KDE they can be a bear to try to upgrade, especially for my friends who have jumped from an older primary version to a newer one, like from KDE3 to KDE4. Just search the web and one can find story after story of upgrade PAIN going from KDE3 to KDE4. Due to upgrade problems under KDE one of my friends now says she has a new swear word, “KDE4”. With fluxbox I have never had such a problem and do not expect to ever have a broken “desktop” because of a fluxbox upgrade.

I monitor my system temperatures and fans with lm_sensors and the sensors krell in Gkrellm. Gkrellm also lets me see at a glance how much space is left on certain partitions I want to monitor, as well as showing me free RAM and other niceties like uptime and process usage.

I always have several xterm windows open to a bash command line. From these I can use dictd and the dict client to look up words and phrases from dictionaries I installed. Here is a little script I run from ‘root’ to install the dictionaries I want when I do a fresh install on new hardware:

#!/bin/bash
urpmi dictd-server dictd-utils dictd-client dictd-dicts-devils dictd-dicts-easton dictd-dicts-eng-fra dictd-dicts-foldoc dictd-dicts-fra-eng dictd-dicts-gazetteer dictd-dicts-gcide dictd-dicts-jargon dictd-dicts-vera dictd-dicts-web1913 dictd-dicts-wn dictd-dicts-world95

The urpmi command is one of those nice tools written for Mandriva that I mention. There are several urpm* commands one may use to manage software from the command line. Mandriva also has a nice GUI called ‘rpmdrake’ that one may run instead of command line versions. Both package systems allow one to search for packages. However, the command line urpm* tools do have a more robust search which can be combined with other command line tools to parse the output.
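As a hedged example of that command line search (I believe urpmq’s fuzzy name search flag is -y, but check urpmq --help on your release), output can be piped through other tools like so:

```
# List packages whose names fuzzily match "dict", sorted, one page at a time:
urpmq -y dict | sort | less
```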

I use aiksaurus from the command line in one of the xterm windows for my Thesaurus. Here is some example output from aiksaurus:

aiksaurus newcomer
=== immigrant ================
arrival, arriviste, comer, emigrant, entrant, fledgling, greenhorn, immigrant, intruder, newcomer, outsider, parvenu, recruit, rookie, settler, squatter, tenderfoot, upstart, visitor

I believe there are GUI front ends available for both dictd and aiksaurus. But as I have never used them I will let others share about those in the comments.

I always have GNU Midnight Commander, mc, file manager running in one of the xterm windows. I prefer mc for most of my file management duties. It is lightweight and can run from a command line when one’s GUI has taken a nose dive. It is installed by default with Mandriva.

My web browsers, yes I use two regularly, are Firefox and Opera. I use Firefox primarily, with Opera as my backup for rendering some broken sites that do not play well with Firefox. With Firefox I have NoScript as well as several other add-ons to block certain web annoyances. For example, I want to see Flash content only when I choose to see it. One of the Firefox add-ons is Flashblock. Flashblock will block Flash content but gives one a button to click to allow the content to run. This, along with NoScript, can really speed up access to certain sites that are rife with advertising screaming for one’s attention.

I use Kontact (yes, it is a KDE application), a personal information manager that combines Kmail (e-mail), Knode (USENET news reader), a calendar, a contact manager, a notes widget, a ToDo list, a Journal, and Akregator (RSS feed reader).

For instant messaging I use Kopete, another KDE application. It allows me to contact friends, family and acquaintances on several instant messaging services including AIM, Jabber and Windows Live Messenger.

Xchat 2 is my IRC application of choice. I use it to connect to Freenode and a couple of other IRC networks to keep in touch with official project channels and support, such as the #mandriva channel on Freenode for the times I need to ask a silly question instead of searching the web for the answer on my own.

My office suite is OpenOffice.org. I was pleasantly surprised recently to discover that OpenOffice.org Writer will now open WordPerfect 12 documents. With the contributions from IBM (I presume) it also will open my old Lotus WordPro documents. Naturally OpenOffice.org will open and edit Microsoft Word and Excel format files. When using Microsoft proprietary files I recommend saving them as Open Document Format (ODF) files whenever possible.

My financial management software is GNUcash. GNUcash does what I need to keep up with my personal finances and my small business finances. GNUcash does not have a “payroll” feature, yet. Since I do not need a payroll feature for my small business the ability to track accounts payable, accounts receivable and print professional looking invoices is enough for me.

I occasionally need to crop a picture or tweak a graphic for my web sites. My choice for that is The GNU Image Manipulation Program, a.k.a. The GIMP. I could not care less if The GIMP does not work like Adobe Photoshop. The GIMP does what I need it to do. All the graphics professionals who whine that they cannot use GNU/Linux because it lacks Photoshop miss the point of FOSS. They should get involved with The GIMP project and help add the features they desire. If they cannot program they can at least test and provide feedback. In the end everyone wins with a better GIMP for all.

Those are the software packages I use most to Get Things Done. What about play time? I do have a few games I like when I need a break from reality. The games I play regularly are Wolfenstein Enemy Territory (3D FPS), Unreal Tournament 2004 (3D FPS), and Quake IV (3D FPS). These are three dimensional (3D) first person shooter (FPS), shoot ’em and blow ’em up games. I bought Unreal Tournament and Quake, but Wolfenstein Enemy Territory is “free”. When I feel less aggrieved with life I play around with Flight Gear (3D flight simulator) and TORCS (a 3D car racing game). All of these games run natively on GNU/Linux. I will only run games that run natively on GNU/Linux. I will even buy games that run natively on GNU/Linux. If a game does not run natively on GNU/Linux and requires WINE I won’t buy it nor will I “pirate” it to run it.

That is the list of software I use the most on my GNU/Linux PC. Feel free to share your own list of software in a comment.


Edit Fri Sep 11 11:16:54 CDT 2009: Clarify the line about FPS.

Opera on GNU/Linux – Moving an Account Reveals a Problem

I recently purchased the final part to build myself a new AMD Phenom Quad-core PC system to run Mandriva desktop GNU/Linux. I have been getting the parts a piece at a time over the past 12 months. I may go into the specifications of the new PC in a later article. For now I want to cover resolving a problem I had with Opera after my move to the new PC.

I create several accounts for myself on my desktop GNU/Linux system. Each account is used for a different purpose. Over the years I have ended up with two “personal” accounts. When moving to my new PC one of my goals was to consolidate these two separate “personal” user accounts into one. This meant copying settings and saved data from one of the accounts over to the other so as to not lose the settings. The applications in question were used regularly in only one account, so I did not have to worry about trying to merge settings. Since I had been using the account being copied for several years, it would have been a problem to recreate all the settings for all the applications I wanted to continue using. One of those applications for which I wanted to preserve the settings is Opera.

On GNU/Linux Opera stores its data in a hidden directory named .opera in each user account. Desktop GNU/Linux user accounts are typically under /home/username where username is the login name of each user. To preserve the settings I copied the .opera directory from the old account on the old PC using the shell file transfer capability in mc (the GNU Midnight Commander file management utility) as root. I had to enable root ssh access on the old PC to do this. Giving remote ssh access to root is not recommended for regular day to day operations but is useful in situations like this. Then I ran chown -R newuser.newuser .opera on the copied directory to give it and the files below it the owner and group of the new user.

The first time I ran Opera under the new account I received this popup error message:

Opera error - Store init failed

I do not use Opera Mail but obviously something “broke” when I copied the .opera directory. This error does not keep Opera from working, but it is an irritant and I decided to find the problem and solve it. A quick web search turned up information about this error but nothing that applied to my situation. Granted, I only looked at three of the returned results because I had an inspiration. I suddenly suspected that Opera uses hard coded paths in some of its files. I opened an xterm and ran grep -r "/home/oldusername" .opera which revealed my suspicions were correct.

The files that grep showed with hard coded paths are global.dat, opera6.adr, opera6.ini and pluginpath.ini. The fix is as simple as making sure Opera is closed, then opening each file in vim and using :%s/\/home\/oldusername/\/home\/newusername/g followed by :x to save and exit. Yes, there are ways to do this with awk, sed and the like, but I don’t know those tools as well as I should (Sad, isn’t it?). I do know vim, so that is what I used. For the command line fearing user a GUI text editor like gedit or nedit would work as well. Some of the hard coded paths in pluginpath.ini pointed to directories that were only on the old PC so I just removed those entries. After editing these files I opened Opera and the error was gone.
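For those comfortable with sed, the same fix can be sketched in one pass (run inside ~/.opera with Opera closed; the usernames are illustrative and the file list comes from the grep results above):

```shell
# Replace the old home path in each Opera config file that exists.
# Using '|' as the sed delimiter avoids escaping all the slashes:
for f in global.dat opera6.adr opera6.ini pluginpath.ini; do
    if [ -f "$f" ]; then
        sed -i 's|/home/oldusername|/home/newusername|g' "$f"
    fi
done
```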

Okay. I know some of you are itching to show your elite command line skills so feel free to provide your own solution to this problem with sed, awk or whatever favorite command line tools you use for these problems.


Edit Sun Aug 16 12:07:39 CDT 2009: Repair a poorly worded sentence.

Linux? There Are Simply Too Many Versions!

Do you run one or more GNU/Linux desktops? Please participate in our poll. (Sorry, this poll is now gone.)

I have once again run across a "too many versions!" comment on another site. In this one a person going by the moniker "matt_chsi" states there are too many versions of Linux and that is why adoption of Linux is so poor. Further, this person states that he has "tried 6 or 7 different versions of Linux and there is no (sic) standards common between them except when it finally does install there is alot (sic) of software already there on install." If he truly thinks there are no standards common "between" (I would have said "among", but that's just me.) 6 or 7 "versions" (I would say "distributions".) of GNU/Linux then he did not examine them closely enough.

I will grant that I have only had experience with a handful of GNU/Linux distributions over the roughly 10 to 11 years of my exposure to GNU/Linux. However, they all have had a similar root directory structure to the Mandriva 2009.0 I am using today: bin/, boot/, dev/, etc/, home/, initrd/, lib/, media/, mnt/, opt/, proc/, root/, sbin/, srv/, sys/, tmp/, usr/ and var/. I know that configuration files are in etc/, programs are in usr/ or under opt/, and libraries are under lib/, with logs and other run-time "stuff" under var/. That is a logical standard carried forward from long Unix tradition.

All the GNU/Linux desktop systems I have tried have had the desktop environments KDE and/or Gnome as well as alternatives like fluxbox, Enlightenment, and WindowMaker, to name a few. Again, this is "standard", especially among the top GNU/Linux distributions like Mandriva, the *buntus, Fedora, openSuSE and so on. The main menus may be different due to these being different distributions with different goals. But that the menus are not "standard" is not really a problem.

Actually, if all these distributions were truly radically different it would not be a problem for personal adoption of Free Open Source Software (FOSS) GNU/Linux distributions. How can I say such a thing? I can say this due to the simple fact that one can pick through these "free" distributions with impunity. It costs no money to do so. Find a distribution that one likes. Use that distribution. Once one does this why care that other distributions are different? One has the distribution one prefers at this point. All we in the FOSS community need do is adopt our acquaintances, friends, family and business associates that are interested in FOSS and help them along with the decision process.

No, the lack of "standards" for GNU/Linux is not the problem. The natural human resistance to change, corporate inertia and illegal (or at least unethical) business practices by certain large companies [1][2] are the problems. We cannot overcome resistance to change in people. Where we need to concentrate is on our children, especially children in public education systems that are locked into proprietary operating systems [3][4]. Teach our children to use GNU/Linux and we change the future of the computing landscape.

Too many "versions" of GNU/Linux? No. Too little education about GNU/Linux? Definitely.


Edit Sun Aug 2 11:48:22 CDT 2009: Remove the word "older" in regard to resistance to change.

Two Reasons the Command Line Trumps the Graphical User Interface

My inspiration for this article came from reading Akkana Peck’s Intro to Shell Programming: Writing a Simple Web Gallery at LinuxPlanet today.

Before I get into this I will state for the record I am not a text mode Luddite. I use a graphical user interface (GUI) every day. In fact I am using the fluxbox window manager GUI as I write this article with a WordPress GUI and Firefox GUI. I like my GUI chewy goodness as much as any visually stimulated human. However, for certain tasks a GUI is just not the best choice.

The first reason is twofold: quickness and convenience. To illustrate the point I will use GNU/Linux distribution software installation and removal. If one has one’s distribution repositories set up, knows the application one wants to install and knows the command line string to use for installation on one’s GNU/Linux distribution of choice, then installation is much faster at the command line. For example, say I want to install K9Copy, a DVD duplication application not included or installed by default on my Mandriva Linux system but available in the Penguin Liberation Front (PLF) third party repositories for Mandriva. From the GUI installer under KDE I have to use the following steps.

  • Click the Menu button.
  • Click “Install & Remove Software”.
  • Provide the administrator (root) password.
  • Wait for the user interface to load …
  • Wait for the user interface to load …
  • Wait for the user interface to load … finally!
  • Click File > Update media
    Because I want to make sure I have the latest repository updates.
  • Wait for the repository database to be updated.
  • Type k9copy in the search bar.
  • Click the check box beside K9Copy.
  • Click the Apply button.
  • Wait for the application installation confirmation dialog.
    Dangit! I already said to do this once, now I have to say do it once more.
  • Click the Yes button (It is okay to continue, stupid GUI).
  • Finally get the application to install.
  • Wait for the GUI to reset after the install.
  • Close the GUI.

Doing this set of actions can take several minutes. On the other hand I can switch from my GUI to a console login with Ctrl+Alt+F1, login as the administrator (root) and type this at the command line prompt:

urpmi.update -a && urpmi k9copy

Then switch back to my GUI with Ctrl+Alt+F7 and conveniently continue typing this article while the program installs. The urpmi.update -a command tells my installer to update its sources. The && tells the shell to do the next thing only after the first one completes. The urpmi k9copy tells my installer to install that application. The Mandriva urpm* tools are smart enough to know that k9copy is k9copy-1.2.3-1plf2008.1.i586.rpm. All this will run in the background while I get stuff done. Now that I have finished this paragraph I can switch back to the console with Ctrl+Alt+F1 and exit from the administrator session.

The second reason the command line trumps the GUI is repetitive tasks. I could illustrate this here with a clever shell script. However, I think I will refer to Akkana Peck’s article mentioned at the beginning of this piece. Go read it if you have not. In summary, Akkana shows how to use a shell script loop to modify a directory full of JPEG files with two ImageMagick command line tools. While one could do this with a GUI like The GIMP, I would only recommend doing so for a very few files. If one needs to modify a few hundred graphics to be a standard size for a web site gallery, then the command line tools Akkana shows how to use are going to save the day.
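To give a flavor of the idea, here is a minimal sketch of that kind of loop, not Akkana’s exact script: make a 200-pixel-wide thumbnail of every JPEG in a directory using ImageMagick’s convert tool. The function name and the "-thumb" naming scheme are my own illustrative choices.

```shell
#!/bin/sh
# Sketch: resize every .jpg in a directory to a 200-pixel-wide
# thumbnail with ImageMagick's "convert". Assumes ImageMagick is
# installed; the "-thumb" suffix is just an example naming scheme.
make_thumbs() {
    dir=$1
    for f in "$dir"/*.jpg; do
        [ -e "$f" ] || continue                        # no .jpg files: do nothing
        convert "$f" -resize 200x "${f%.jpg}-thumb.jpg"
    done
}

# Example: make_thumbs /home/joe/photos
```

Run that once over a directory of a few hundred photos and the GUI click-resize-save cycle starts to look very slow indeed.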

I have seen all the arguments that Joe Sixpack could not care less about a command line. That is absolutely fine since Mr. Sixpack is more than likely only wanting to browse the web, play a few games, send and receive e-mail and work on his genealogy. All these can be done in GNU/Linux just fine without ever needing to see a command line. However, should Mr. Sixpack ever need to create a family web gallery for the Sixpack family using a few hundred digital photographs from a few dozen different cameras he will have a big task on his hands. Then maybe, just maybe he will see Akkana Peck’s article and find out an easy way to get all those pictures the right size for his gallery using the much maligned command line. I am certain our friend Mr. Sixpack will be very happy to see that command line example from Akkana if he ever needs it.

Please feel free to comment and provide some of your favorite time saving or repetitive command line tasks.


Why I Think Open Source Will “Win” In The End

I am going to start this with some questions for you. Think about them and give honest answers to yourself. Have you ever called or e-mailed Microsoft or some other software manufacturer’s technical support about a problem as a user? What was your result? Did the technical support personnel begin with the assumption that you were the problem, not their software? In “the industry” this is known as Problem Exists Between Keyboard And Chair or PEBKAC (We “geeks” do love our acronyms.). Have you ever found a problem with some software that you knew was not due to “PEBKAC” and tried to get a response from the people that could fix it? Did you get a satisfactory conclusion?

I have done all the above both as a technical support person and as a plain old end user. I can tell you my results with Open Source people are much more satisfactory than my results with typical Closed Source companies like Microsoft.

Here is an example of getting results for a problem from Open Source folk as a plain old user. Recently my wife started an upgrade for one of our SOHO (Small Office / Home Office) computers from Mandriva Linux 2008.1 to Mandriva Linux 2009.0 over the internet. She was using the Mandriva GUI from her KDE desktop to do this. During the upgrade she ran into a problem with lack of space on one of the partitions that is needed for the upgrade. This caused the upgrade to fail while trying to download upgrade packages, although the upgrade software kept trying to get the new packages.

This had happened to me before, so I had a work-around I have used in the past. Of course it meant stopping the upgrade, doing some “arcane stuff” logged in as root (known as “system administrator” to Microsoft users) with symbolic linking at the dreaded command line interface, then restarting the upgrade. This time though, since my wife was affected by it, I had had enough of the problem and decided to do something about it.
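For the curious, here is a hedged sketch of the general technique behind that “arcane stuff”: move a too-full directory to a partition with more room and leave a symbolic link at the old path. The paths are illustrative only (Mandriva normally caches downloaded packages under /var/cache/urpmi, but verify on your own system), and this is not necessarily my exact fix.

```shell
#!/bin/sh
# Sketch: relocate a directory to a roomier partition and symlink
# the old path to the new location. Run as root for system
# directories. Paths used here are illustrative examples only.
relocate_dir() {
    old=$1      # e.g. /var/cache/urpmi (the too-full location)
    new=$2      # e.g. /home/urpmi-cache (a partition with free space)
    mv "$old" "$new" && ln -s "$new" "$old"
}

# Example (as root): relocate_dir /var/cache/urpmi /home/urpmi-cache
```

Anything that later writes to the old path lands transparently on the roomier partition, which is exactly what an upgrade that is starving for space needs.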

After setting up the symbolic links to drive partitions with more space and restarting the upgrade I went to my PC where I have an IRC chat program always running and started asking polite questions on the Freenode IRC network in the #mandriva channel. Being polite is always my policy when dealing with technical support problems. One of the first responses was “file a bug report”. I took that advice, went to https://qa.mandriva.com/ and did just that. Within 24 hours of my bug report I received an e-mail notice that the bug report was accepted:

This report is considered to be a valid and complete bug report according to
the Mandriva Bug Policy. It is accepted on behalf of the maintainer.

Mandriva Triage Team

What followed was a satisfying “conversation” with the Mandriva Open Source team about the bug. They did not attempt to point fingers at me, blame my hardware, grind their teeth at the way I have my computers configured, or expound upon the phases of the moon to explain their lack of culpability for this problem. All things I have experienced with “professional” Closed Source support personnel. Well not the moon phases part, that is just my attempt at humor. No, the Mandriva Open Source folk handled my bug report quickly and professionally. I have no doubt that it will be resolved by the time the next Mandriva release is ready.

I will not go into my half remembered “horror stories” of my sessions with Closed Source technical support. I did not document those nor do I really want to recall them. I just remember a great deal of frustration dealing with Closed Source technical support when I knew the problem was their software. Sure, sometimes PEBKAC is true. But many times PEBKAC is used in “the industry” to explain away real problems with Closed Source software by support personnel. Getting an acknowledgment that a Closed Source software program has a real problem can be problematic to impossible for an end user. Certainly this could happen with Open Source projects as well. But to date I have never experienced a “brush off” from Open Source folk and I will be surprised when or if it does happen.

If you want to see my bug report and the responses from the Mandriva people it is at: https://qa.mandriva.com/show_bug.cgi?id=46520

If you want to talk about your good or bad experiences with Open Source or Closed Source support feel free to post a comment here. But, do provide proof of your experience as I have done with this article. If you have no proof, then people tend to think it didn’t happen.

Mandriva Linux and your Blackberry

Hi all. I know – long time, no article. Business has picked up around here so I have less time for web articles. However, I wanted to follow up on something and thought a web article would be a good result from my research.

I was recently asked if I knew about applications on Mandriva Linux to synchronize one’s Blackberry with PIM contacts and calendar. While I do not have a Blackberry myself, I knew I had read about this somewhere. After a short time of searching I ran across the article Syncing your BlackBerry on Linux by Joe Barr, from December 21, 2007. This was the article I remembered reading. Joe’s article does not get distribution specific so I did a bit of checking with my Mandriva 2008.1 system and my wife’s Mandriva 2009.0 system. Here are the applications one might want to check out:

For Mandriva with Gnome based PIM under Evolution install multisync-gui: “MultiSync is a program to synchronize calendars, addressbooks and other PIM data between programs on your computer and other computers, mobile devices, PDAs or cell phones. It relies on the OpenSync framework to do the actual synchronisation.”

For Mandriva with KDE based PIM under Kontact install kdepim-kitchensync: “kitchensync is a multiple backend sync program”.

For backing up data from one’s Blackberry install barry-gui: “This package contains a graphical application to backup and restore data from a BlackBerry device.”

Also take a look at barry-opensync: “Barry is a desktop toolset for managing your BlackBerry(tm) device. (BlackBerry is a registered trademark of Research in Motion Limited.) This package contains the opensync plugin to synchronize a BlackBerry with other devices and applications.”
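For convenience, the four packages above can be pulled in with a single urpmi command (urpmi being Mandriva’s command line installer, used elsewhere on this site). This assumes one’s repositories carry these packages; the function wrapper is only there to make the command easy to reuse or trim down.

```shell
#!/bin/sh
# Sketch: install all four Blackberry sync packages mentioned above
# in one shot. Run as root on a Mandriva system with the appropriate
# repository media configured.
install_bb_sync() {
    urpmi multisync-gui kdepim-kitchensync barry-gui barry-opensync
}

# As root: install_bb_sync
```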

While this is not a step-by-step article it should point one in the direction one needs to begin getting one’s Blackberry synchronized with one’s desktop personal information manager. Please, if you try any of these out based on reading this article share your experience with others by coming back and commenting here.


Mandriva Linux Used to Save a XP Professional PC

I will admit it, I like tabloid-like headlines. While these types of headlines are trite and irritating to many, they do get attention. Thus the title of this article is intended to grab attention. If you are reading this because the headline got your attention, you can see it worked.

I recently had to recover data from a very sick Dell Dimension computer running Microsoft XP Professional at one of our client sites. As usually happens in these cases Windows had chewed its hind legs off and was not working. What had occurred was one of the people at this site had plugged in a USB thumb-drive, a regular occurrence at this location, and the system went to a black screen. No BSOD, no error message, just a dead PC that had to be hard reset. Upon reboot the operating system reported it had crashed due to a “thermal event”. Then it loaded v-e-r-y s-l-o-w-l-y, taking over an hour to show the desktop and never drawing the icons.

Since this problem was beyond the scope of the users to repair, my company was called to handle the problem. I am the technical support guy so I packed up my laptop (Loaded with Mandriva Linux of course.), grabbed my briefcase, picked up my organizer and headed out with coffee in hand. Once on site I spoke with the people about the system and opened it up to have a look inside. Since a “thermal event” was reported I was looking for excess dust (There was almost no dust!) or dead cooling fans (All were working.). This indicated to me there was probably not a “thermal event”. I took out my freshly burned Mandriva One 2009.0 Linux live CD with KDE and booted the troubled PC. After going through the selections of keyboard, locale, time and desktop (I chose Compiz Fusion.) the PC booted to Linux and I was able to play around spinning the Compiz Fusion cube for people in the office. I checked the thermal reporting in Linux and all the thermal readings were within normal ranges. In other words, the hardware was fine.

After testing the hardware with Mandriva One 2009.0, I recommended that the data be backed up and that XP Professional be restored to factory condition from the Dell recovery partition. Then all updates be applied, programs reloaded and data restored. This was agreed upon and I took the sick PC to our office to begin the repair.

At the office I connected the PC to a KVM switch we use for working on sick computers. I booted Mandriva One 2009.0 again and immediately noticed the mouse was not working. This is a standard KVM with PS/2 type connectors for mouse and keyboard. I pulled a USB mouse from the shelf and connected it to one of the USB ports, moved it a bit and had a working mouse. Then I finished booting into a standard KDE desktop. I did not choose Compiz Fusion this time because, well, let us all admit, that is just eye-candy and not really necessary to get work done.

After booting I had to force mount the NTFS partition for XP Professional because it was showing an unclean shutdown. The Linux ntfs-3g driver will not normally mount an unclean NTFS partition for safety reasons. Safety for the NTFS partition, not for Linux. After mounting the partition I set up a NFS mount at the Mandriva One CLI for one of our server partitions used for backups of data from sick computers. Then I examined the XP Professional partition and using tar zcvf I created an archive of all the data from the XP Professional partition. This included the entire “Documents and Settings” tree as well as a few directories and files that had been created outside that directory structure. Yes, I know this will not respect certain file attributes, like hidden and system, that Microsoft operating systems expect to use. But the backup works, restores work and the data is preserved. That is good enough in almost all cases.
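The rescue steps just described can be sketched as shell commands. The device name, server name and paths below are illustrative assumptions, not the actual values from this job, so the mounts are shown as comments; they depend entirely on the machine at hand.

```shell
#!/bin/sh
# Sketch of the backup step. First, as root, mount the unclean NTFS
# partition and the NFS backup share (example names only):
#   mount -t ntfs-3g -o force /dev/sda1 /mnt/windows
#   mount -t nfs server:/srv/backups /mnt/backup
# Then archive the Windows profile tree from the mounted partition:
backup_tree() {
    src=$1        # e.g. /mnt/windows
    archive=$2    # e.g. /mnt/backup/sickpc-data.tar.gz
    ( cd "$src" && tar zcf "$archive" "Documents and Settings" )
}

# Example: backup_tree /mnt/windows /mnt/backup/sickpc-data.tar.gz
```

The subshell around the cd keeps the script’s own working directory unchanged, and quoting "Documents and Settings" handles the space in that Microsoft directory name.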

Then the Dell PC was rebooted and recovery was started using Ctrl+F11 at the Dell boot splash screen. After recovery of XP Professional to factory condition the PC data was restored. Since many of the applications that had been installed no longer existed I spent a few minutes cleaning up errors when loading the XP Professional desktop. I also removed items from the menus by hand and reset the “DESKTOP” files found in several of the menus to hidden. Then began the long process of getting Microsoft updates, updates to the updates, updates to the updates to the updates … you get the picture.

After one set of updates the system began starting with very low graphics settings and giving BSOD at random when trying to reset the graphics to higher resolution. I suspected that an update from Microsoft had caused the original Intel graphics driver from the Dell recovery partition to begin having problems. A download and install of the latest Intel driver for the Intel graphic chip-set on the Dell fixed this problem.

To make a long story just a wee bit longer, the system was delivered. Programs were reinstalled and the PC is once again being used productively at the client’s office. Could I have used other tools to back up and restore the data? Sure, but my point here is that one can do this using a Linux distribution live CD with the tools included in almost all Linux distributions. Cost to you? Only your time learning what to do and how to do it.

Edit Fri Oct 24 21:19:04 UTC 2008: Fix a typo.

Edit Fri Oct 24 22:13:33 UTC 2008: Fix a poorly worded sentence.