Open Source: When Updates are NOT the Problem

I recently had a fun experience. My Mageia 1 Linux system seemed to be experiencing hard lockups that required a push of the reset button to “resolve”. By “hard”, I mean no keyboard input, no screen updates in X and sometimes no ping response from another PC on the LAN. I had run some updates, including a new kernel, and these lockups appeared after running the updates. Cause and effect, yes? Well, no. It turned out I was having a hardware problem. Here is how I figured that out.

All was well with the world and my PC on Friday, 11 November 2011. Okay, maybe not with the world, but my PC was humming along just fine. Then Saturday came along and it was Update Day. Since my PC is my business system as well as my personal system, I usually try to run my updates on the weekends to avoid down-time during the week. The updates completed successfully and these packages were updated:

flash-player-plugin- Sat 12 Nov 2011 04:16:02 PM CST
libmsn0.3-4.1-5.1.mga1                        Sat 12 Nov 2011 04:15:52 PM CST
kernel-source-latest-          Sat 12 Nov 2011 04:15:51 PM CST
kernel-desktop-latest-         Sat 12 Nov 2011 04:15:51 PM CST
kernel-desktop-devel-latest-   Sat 12 Nov 2011 04:15:50 PM CST
kernel-desktop-devel-  Sat 12 Nov 2011 04:15:40 PM CST
kernel-desktop-        Sat 12 Nov 2011 04:14:47 PM CST
kernel-source-         Sat 12 Nov 2011 04:14:31 PM CST

(Output from ‘rpm -qa --last | less’)

Of course, after a kernel update a reboot is required to load the new kernel. So I rebooted the system and everything seemed to be working fine. Sunday evening came along and I decided to play a bit of Unreal Tournament 2004, a.k.a. UT2004, and frag some bots for relaxation. Yeah, I am a “violent guy”, NOT. I was in the middle of a Capture the Flag run when my game froze hard. What was my first thought? Yup, you guessed it, “That kernel update has messed up my box!” It is sad how we humans jump to the wrong conclusions so quickly, is it not? How many folks would be happier if we all took a step back from our assumptions and reconsidered before acting? In my case, jumping to the wrong conclusion turned into a week of unnecessary frustration and angst.

I spent several wasted hours every day looking for a solution to my “kernel problem”. You may wonder why I started hunting for kernel problems. I was using an nVidia based graphics adapter running the non-free nVidia driver supplied with my distribution. During forensics after my first hang I saw this in /var/log/messages:

kernel: NVRM: os_schedule: Attempted to yield the CPU while in atomic or interrupt context

I started thinking the new kernel and nVidia driver Had A Problem and based on that poor assumption I forged ahead to find The Solution To The Problem. I will spare you the details of all the wrong turns and dead-ends I found during my week of agony. Let us just say, it was not fun trying to fix a problem that did not exist anywhere else but in my fevered imagination.

What happened to get me on the correct track? Yesterday, 22 November 2011, my system suddenly started hanging when not running a 3D game. I was just looking up parts to order for a new PC build for one of our Linux clients. This was the “Ah ha!” moment for me. I had not had this happen before under any Linux system except when I had a hardware problem. So, I shut down the PC, pulled my CD case out of my brief-case and loaded the Live Parted Magic CD to run hardware tests. The RAM was tested first – no errors. Then I rebooted to the PM GUI and ran GSmartControl disk tests simultaneously on my four SATA drives.

The first drive finished with no errors. While waiting for the other drives to finish, Parted Magic had a hard hang. My very next thought was, “It is the graphics card!”, because the only time I have had Parted Magic hang like that was when I encountered a bad graphics card on a client’s PC I was trying to diagnose. Yes, this is another bit of assumption. But this time it was correct.

However, I wanted to be sure I was following the correct path this time. So I did some more forensic investigation before marching ahead. What I did was reboot my PC off the boot drive into Mageia 1, log in to the console on my Linux router and ‘ssh’ in to my PC, then ‘su’ to root and run ‘tail -F /var/log/messages’ to watch what was happening while I used the PC. Within a few minutes of use the PC “froze” and these were the last lines displayed in the log:

Nov 22 21:41:33 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000000
Nov 22 21:41:35 era4 kernel: NVRM: os_schedule: Attempted to yield the CPU while in atomic or interrupt context
Nov 22 21:41:37 era4 kernel: NVRM: os_schedule: Attempted to yield the CPU while in atomic or interrupt context
Nov 22 21:41:42 era4 kernel: NVRM: Xid (0000:01:00): 16, Head 00000000 Count 00017c0c
Nov 22 21:42:09 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000020
Nov 22 21:42:26 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000020
Nov 22 21:42:41 era4 kernel: NVRM: Xid (0000:01:00): 16, Head 00000000 Count 00017c0d
Nov 22 21:42:50 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000020
Nov 22 21:42:59 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000020
Nov 22 21:42:59 era4 kernel: NVRM: Xid (0000:01:00): 16, Head 00000000 Count 00017c0e
Nov 22 21:43:13 era4 kernel: NVRM: GPU at 0000:01:00.0 has fallen off the bus.

See that last line? It means the driver could no longer “see” my video card. I think the hard lockups happened because the non-free, proprietary nVidia 3D driver has hooks into the kernel to do its “3D magic”, so a video failure can hang the entire system. If there is a way to do fast 3D processing under X without hooking into the Linux kernel, well, I vote for that. Why? So that an ‘ssh’ into a desktop Linux box with a dead video card has a chance of succeeding and a savvy troubleshooter has a chance to do forensics on the running system. In any case, a switch to a new video card solved the problem.
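For the curious, here is how one might pull those NVRM complaints out of a log with ‘grep’. This is just a sketch against a small sample file, not my real log; on a live system you would point it at /var/log/messages instead:

```shell
# Count NVRM (nVidia driver) lines in a kernel log.
# The sample below mimics the /var/log/messages excerpt above.
cat > /tmp/messages.sample <<'EOF'
Nov 22 21:41:33 era4 kernel: NVRM: Xid (0000:01:00): 8, Channel 00000000
Nov 22 21:41:40 era4 kernel: usb 1-2: new full-speed USB device
Nov 22 21:43:13 era4 kernel: NVRM: GPU at 0000:01:00.0 has fallen off the bus.
EOF
grep -c 'NVRM:' /tmp/messages.sample   # prints 2
# on a real system: grep 'NVRM:' /var/log/messages | tail
```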

My new video card? It is an “old”, unused ATI Radeon X1650 Pro I had sitting on a shelf here. It is using the “free” ATI driver supplied with my distribution. Oh, and my 3D game, Unreal Tournament 2004, works just fine with that. However, I have not gotten Quake 4 to run yet with the “new” setup. I expect I will be able to get Quake 4 working if I decide to take the time to look into that. But for now, I am happy with what I have. At least I can get my work done, which is much more important than any game.

Now, if you will excuse me, I need to get back to building Linux based systems for our clients. Thanks for stopping by though.

Custom PC from ERACC   Custom Notebook from ERACC

Open Source: Why Military Forces Should Use Linux

Why? Because the level of skill required to crack a Unix-like OS is much higher than that needed for a Microsoft OS. Further, properly configured Unix-like systems are much more robust than Microsoft systems. If Military forces were using properly configured and properly secured Unix or Linux systems, we would not see items like those below being reported.

I just had a, “What were they thinking?!”, moment while reading this article at ars technica: Computer virus hits US Predator and Reaper drone fleet. First, it is not a “computer virus”, it is a Microsoft operating system virus. Second, using Microsoft operating systems for any critical Military computer systems is just wrong. I know the US Military has specifications for rugged computer systems that must be made in the USA. That makes sense. What does not make sense is that the US Military will accept Microsoft operating systems on its critical, sensitive hardware at this point in time. That is like specifying a bank vault that can withstand a nearby nuclear blast, but allowing the builder to install a screen door for access to the vault. It is just a Bad Idea!

This was a deja vu moment as well. I was following news about Military systems back in the 1990’s and had a similar experience when I read about the US Navy “smart ship” running Microsoft Windows NT … and having a ship killing system failure: Software glitches leave Navy Smart Ship dead in the water. I completely agreed with Ron Redman, deputy technical director of the Fleet Introduction Division of the Aegis Program Executive Office, at the time when he stated:

“Unix is a better system for control of equipment and machinery, whereas NT is a better system for the transfer of information and data. NT has never been fully refined and there are times when we have had shutdowns that resulted from NT.” … and … “Because of politics, some things are being forced on us that without political pressure we might not do, like Windows NT,” Redman said. “If it were up to me I probably would not have used Windows NT in this particular application. If we used Unix, we would have a system that has less of a tendency to go down.”

Actually, after re-reading that, I disagree that NT, or any Microsoft OS, was or is “a better system for the transfer of information and data” when compared to a Unix-like OS. I would use Linux for that too, especially in a critical Military system like a “smart ship” or a drone control center. Frankly, I do use Linux for operational security and the secure transfer of information and data in my own small business. I thank God that I do not have to succumb to political pressure forcing me to use a Microsoft OS for my business. It seems to me that if I can figure out how to implement Linux for my personal and business use, surely the US Military can do the same for its critical systems infrastructure. Obviously some people in the Military “get it” when it comes down to what system is best for critical control systems. Now if only the Microsoft lobbyists could be kept from affecting the decisions about which systems are best for the US Military.

Microsoft still makes a decent gaming operating system. But that is about the sum total for which I would agree a Microsoft system should be used. Even there I am agreeing reluctantly only because the majority of current PC game development targets the Microsoft OS.

Hey, US Military folk and US Senators with military oversight, if it has to be from the USA, ever hear of Red Hat Linux? How about the US NSA’s own Security-Enhanced Linux? Perhaps it is time for you folk to rethink the requirements for Military computing systems and make one of these Linux operating systems part of the requirement. Or take the Linux kernel source code and use your own internal Military IT staff and programmers to collaborate and build a custom system just for Military use. Any of these would be a better option than relying on a “known to be owned” OS like any of those from Microsoft. I will be glad to introduce you to Linux if you want to pay me for a Linux consultation. Just sayin’ …



Edit Sat Oct  8 20:57:30 CDT 2011: Due to a salient observation elsewhere, change “pwn” to crack in the first paragraph.

Open Source: Niche Markets, Linux and Microsoft

If you are a Linux protagonist who has been around as long as, or longer than, I have, you have seen responses like these over and over as to why Linux distributions will never go mainstream on the PC desktop:

  • “Linux will always remain a niche platform because it does not have a native release of Adobe (Photoshop / Creative Suite / etcetera)!”
  • “Linux does not have Microsoft Office and Microsoft Office power users require Microsoft Office!”
  • “The web portal at (insert portal here) needs Internet Explorer. There is no native release of Internet Explorer for Linux, so no one will want to use Linux!”
  • “Program X does not have a Linux version or equivalent!”
  • Or other claims along the same lines …

Yes, these comments usually do have exclamation points to show how emphatic the claimant feels about the statement. I think these claimants have the equation backwards. All of these cases are what is known as a “niche market”. How many people using PC systems need to use Adobe Photoshop? How many Microsoft Office users are “Microsoft Office power users”? How many end-users of a PC system need to go to a web portal that requires Microsoft Internet Explorer? (I will ignore the fact that many of these “IE only” web portals usually work just fine if one fakes the browser string with Firefox or Opera.) How many people need to use Program X on their PC? I am thinking, “Not that many”, for all of the above.

To me this suggests that the Microsoft platform is the niche platform:

  • Do you “need” Adobe (Photoshop / Creative Suite / etcetera) for your job? Then you are a niche user.
  • Do you “need” Microsoft Office because you are a “power user”? Then you are a niche user.
  • Do you “need” access to an IE only web portal? Then you are a niche user.
  • Do you “need” to run Program X on your PC? Then you are a niche user.

The vast majority of PC users do not need, or want, any of the programs that are often claimed to be the problem holding back adoption of Linux on the PC desktop in the mainstream. In my experience with the few end-users I have switched from Microsoft to Linux, some of them did have special needs that precluded using Linux on their desktop PC at this time. The others have zero problems using a Linux desktop PC.

These latter are people that do not try to solve PC problems themselves. They call a “computer guy” when they have problems. They would call a “computer guy” even if they ran Microsoft systems and had a problem. They have no “need” for any of the niche usage scenarios above. They are perfectly content that they can send and receive e-mail, access FaceBook, play Flash games, browse web sites, use personal finance software and make a simple spreadsheet with LibreOffice. All from their Linux based desktop PC.

One of these Linux desktop users is also a Skype user, and there are “millions” of Skype users “out there”. Skype usage is less of a niche market than it used to be. Now that Microsoft owns Skype, that is going to be problematic if Microsoft kills Skype development for other platforms in favor of its own software. The “embrace, extend and extinguish” paradigm is still Microsoft’s bread and butter. If Microsoft does what I suspect, Skype will end up being merged into some Microsoft based software. At that point our smart FOSS developers will likely figure out a way to inter-operate with the Microsoft software from FOSS programs. However, this “problem” would be non-existent if end-users were aware of and used FOSS communication projects like Ekiga.

So, that said, how do we get from where we are to the mainstream desktop?

The “problem” with adoption of Linux on the end-user desktop is not these niche usage scenarios. As I see it, Linux adoption is a fourfold problem: apathy, education, marketing and pre-loading agreements.

  • Apathy – Okay, there is not much we can do about this one. If an end-user is apathetic about what operating system is on his or her PC just let it go.
  • Education – There are still many people who have no idea what Linux is or can do for them. I still meet people who have not even heard the term Linux. When I can, I give them a brief overview of what Linux is and then give them a Live CD distribution to play with. Those of us who are Linux professionals can take the opportunity to present Linux systems at local Chamber of Commerce gatherings and local technology shows.
  • Marketing – There is no one company marketing Linux to the masses on a large scale. We will see no advertisement on television or in print from an “Apple” that offers an alternative to Microsoft. Most of the “Linux Big Boys” are only marketing to businesses. Actually I think this should be one of the jobs of The Linux Foundation. But until that organization takes on major advertising, we can use local media and continue to use positive “word of mouth advertising” to “market” Linux.
  • Pre-loading Agreements – Microsoft has pretty much sewn up the pre-load venue with major PC manufacturers. Sure, some of these manufacturers give a slight nod to Linux and offer a few systems with Linux pre-loaded. But I am not content with the puny offerings from these major manufacturers. (Of course since my company builds custom systems with Linux pre-loaded this should come as no surprise to our regular readers.) I do not expect this to change any time soon. So, no consumers are likely to see a Linux based PC from HP, Dell, etcetera on the shelves at Best Buy. The only way I see to overcome this at this point is with education and marketing. If we can create a demand for Linux systems like Apple has done for Apple systems, the end result will be Linux systems on the shelves at major retail outlets.

There are people who should stick with Microsoft or Apple systems for their niche usage. For the rest of the PC user base, Linux on the desktop is ready to go.



Open Source: Live Migration of Mandriva to Mageia

Are you in the market for a new laptop, desktop or server PC with Linux installed? Please give us the opportunity to quote a preloaded Linux laptop, desktop or server system for you.

I took the plunge to migrate my personal / business desktop PC from Mandriva 2010.2 to Mageia 1 today (Sunday, 4 September 2011). I used the instructions from this page: Migrate from Mandriva Linux, specifically the section titled “b) Upgrading inline, using urpmi (CLI)”. The migration is roughly three quarters done as I type this. I decided to try to use the PC while I ran the migration from console 1 (Ctrl+Alt+F1). In preparation for this I closed the programs I suspected would be most affected, such as:

  • Firefox 3.x – which will be replaced with Firefox 4.x
  • OpenOffice.org – which should be replaced with LibreOffice
  • Gnucash – which has my accounting data I do not want to risk
  • Kopete – which is being upgraded

To access our company site and begin this article I kept Opera open. I did try to print a page from Opera and crashed Opera once while running this migration. I forgot about the migration running, or I would not have tried that. I am also able to use light applications such as gedit, but still cannot print from those. I do still have access to the LAN and the internet, so the system is usable. But the system is less useful while printing is unavailable during the migration. Of course, problems like that were not unexpected.

The system has not gotten to the point that X is unstable or anything like that, which is pleasantly surprising to me. I had a 50/50 expectation that X would crash while this migration ran. I am only continuing to use the system so I can report to our readers about the experience. Otherwise I would close X, switch to runlevel 3 – which can still be done as I am not yet forced to use systemd – then run the migration at the console without a GUI running at the same time.

I am about to close X since I see X stuff being migrated. I will reboot following the migration. I am interested to see if everything “just works” or if I will have to fix something before I can get back to using the PC. I will be back to report more …

It is about 1.5 hours later and I’m back. Here are some interesting items about this migration:

  • Migration began at 11:30 AM CDT and finished installing all the packages at 6:00 PM CDT.
  • In total there were over 2600 packages migrated.
  • The average download speed from my chosen mirror over my broadband connection was around 400k.
  • The 16 GB /usr partition got to 94% full due to having several old kernel-source packages installed. These were all removed following the migration.
  • There were several hundred “orphaned” packages after the migration. These were removed with the command ‘urpme --auto-orphans’.

My use of the proprietary nVidia driver was picked up and followed through to the new system because I enabled the ‘tainted’ repository (see Edit below) prior to migration. I did notice several old game packages being migrated that I have been running from source builds. So, I do not need those packages. These took up time and space and had to be removed following the migration. In hindsight, I should have gone through and removed unneeded packages before migration.

I did have to restart the migration with a specific mirror at the beginning. The mirror chosen for me by the command –

# urpmi.addmedia --distrib --mirrorlist

– was a mirror that was across the Atlantic from me and very slow. So I instead used the command –

# urpmi.addmedia --distrib (mirror_url)

– to choose a faster mirror closer to me, where (mirror_url) is replaced with the HTTP address of the mirror I chose. In all, the migration went very smoothly following the directions given by the Mageia people.
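For readers who would rather script the mirror choice than pick one by hand, here is a hypothetical sketch. The pick_mirror helper and the mirror URLs are made up for illustration; only the ‘urpmi.addmedia --distrib’ step at the end is the real command from above:

```shell
# Hypothetical helper: pick the first mirror for a given country code
# from a "CC url" list, then feed it to urpmi.addmedia --distrib.
# The mirror URLs here are invented for illustration only.
pick_mirror() {
  # $1 = country code; stdin = "CC url" pairs, one per line
  awk -v cc="$1" '$1 == cc { print $2; exit }'
}
mirrors='FR http://mirror.example.fr/mageia/distrib/1
US http://mirror.example.us/mageia/distrib/1'
url=$(printf '%s\n' "$mirrors" | pick_mirror US)
echo "$url"
# next step on a real system (as root):
#   urpmi.addmedia --distrib "$url"
```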

Discuss this article on:

Edit Mon Oct 10 21:28:08 CDT 2011: I discovered later this was actually due to using the ‘nonfree’ repository, although I did have ‘tainted’ enabled for the migration.

Open Source: Mandriva 2011 vs Mageia 1


By the way, if you did not read my previous article, Open Source Horror Story – A Linux Recovery Tale, you do not know what you missed. Basically the article is about recovering from a failing hard drive after an attempted upgrade of Mandriva to the 2011 release. The article is written in third person from a storyteller’s point of view. It has some good information in it for those of you who may find yourselves in a similar situation. Go have a look, and make a comment if you wish. Okay, enough about that, on with the new article.

As of today I find myself in the position of deciding whether or not to stick with my previously preferred distribution, Mandriva Linux. This is a bittersweet realization for me. I found Mandrake Linux several years ago in the early 2000’s, about the time they were working on coming out of bankruptcy. When I saw and understood the command-line urpm* package management tools for the first time I immediately “fell in love” with them. In my mind those tools were, and still are, one of the best package management implementations in all of Linux. At that point, Mandrake Linux became my distribution of choice. When Mandrake merged with Conectiva and reorganized to become Mandriva, I stuck with Mandriva. When Mandriva narrowly avoided another bankruptcy, I stuck with Mandriva. When Mandriva development seemed to be imploding and many developers left or were fired, I stuck with Mandriva. Now Mandriva 2011 is out, and Mandriva seems not to be “sticking with me”.

My preferred “desktop environments” for X on Linux are, in this order: fluxbox, XFCE4, WindowMaker. Notice something? You got it! Those are all “light” window manager / desktop environments, a category that does not include KDE or Gnome. I have never been a fan of desktop environments that are more resource hungry than most of the applications I want to run. I am even less fond of the direction both projects, meaning KDE and Gnome, are taking with their current DE implementations. I stick with minimalist GUI implementations such as those mentioned in the first sentence of this paragraph. Now with the release of Mandriva 2011 I see this disturbing, to me, tidbit in the Mandriva Linux 2011 Release Notes:


GNOME, Xfce and other Desktop Environments (DE) and Window Managers (WM) are no longer included in the official Mandriva packages. Contribution packages from the Mandriva community are available for these desktop environments however. Starting from Mandriva Desktop 2011 only KDE Plasma Desktop is officially supported. If you need Mandriva with another DE or WM you can use unofficial packages or distributions prepared by community members (which are described below).

Wow. Does that suck or what? I have seen the new ROSA interface for KDE on Mandriva 2011. All I can say about it nicely is, “That is not for me.” The new community driven Linux distribution called Mageia, which is based on Mandriva 2010.2, has my beloved urpm* tools and will still “officially” supply / support fluxbox, XFCE4 and WindowMaker. Not only that, but after having had to do one fresh Mandriva 2011 install after a problem with a failing hard drive, I found out I have a strong dislike for the new Mandriva GUI installer. I really prefer the older Mandriva installers that work like the one in Mageia 1:

Installing Mageia 1 (Ogg Theora video; best viewed in Firefox)

Finally, Mandriva 2011 is to the point of switching from sysvinit to systemd for bootup. Yes, one can still run sysvinit with Mandriva 2011. But since sysvinit in Mandriva 2011 is deprecated I suspect it may become broken with subsequent updates. My suspicion may turn out to be wrong, but why should I take the chance? While I understand systemd on Linux is probably the future for us all, I am not yet ready to switch. Mageia 1 still uses sysvinit for bootup at this point with systemd possibly arriving with Mageia 2. This gives me a bit more “wiggle room” to learn about systemd before I take the plunge into using it on my systems.

Due to all of the above, but specifically the DE part, I am now seriously considering a move to Mageia. In fact, while writing this article I have convinced myself it is time. I am researching my needs in anticipation of switching to Mageia this very weekend in fact. By the time you read this article I may already be in the middle of a distribution switch or finished with same. Once I do switch and have a chance to become more familiar with Mageia I will begin writing about that distribution here on The ERACC Web Log.

Obviously, my choices here will not be the choices that others will make. Regardless, I am hopeful the information I give here may help someone else with his or her own decision about a distribution to choose.


Open Source Horror Story – A Linux Recovery Tale


Hi children! I know it is a bit early for scary tales. We usually get to those in October. But I have one for you that you just might want to hear now. So. Get your hot cocoa, your S’mores and your sleeping bag and come over here by the fire. I have a tale of chills and thrills to tell you young’uns. There now. Are you all snuggled in and ready for a scary tale? Good. Here goes …

It was late on an August evening. August 30th to be exact. A brave independent consultant and Linux administrator was finishing up a long, slow upgrade from Mandriva 2010.2 to Mandriva 2011 for a client. He had noticed the upgrade was taking an excessively long time, but as this was only his second upgrade to the new release of Mandriva, he chalked it up to the new release. Little did he suspect the slow upgrade was due to … due to … oh, I can hardly say it to you sweet, innocent young’uns. But to tell the tale properly I must say it … A FAILING HARD DRIVE! (Look! I have goose bumps!)

When he rebooted following the last stage of the upgrade, he saw a … a … a … KERNEL PANIC! The system could not find the root / boot partition. So, he booted a PartedMagic Live CD to access the drive and see what was wrong. But PartedMagic refused to mount the partitions too. When he checked with GParted he saw that the /home partition, which he knew to be an XFS file system, was being “reported” as a “damaged” EXT4 file system. This looked bad. Very bad. So, he ran GSmartControl and tested the drive. Oh no! The drive was giving errors by the megabyte! Oh the horror! The angst! The tearing out of the hair … Okay, so he’s 50ish and mostly bald on top with a ponytail. He really avoids pulling out what hair he has left. But you get the picture.

Okay, not to worry. He had sold the client a new, spare hard drive just the right size to replace the failing drive. He also “knew” the client had backups, because he had set up the backups for them and told them how to run them. Plus they had periodic automatic backups as well and had been told how to check that the backups were running and completing successfully. But when he checked for the most recent backup … it was in May! No one had been running the manual backups and the automated backups were returning error logs that NO ONE WAS READING! (Yeah, he should have run an “extra” backup himself, but time was pressing because he had a time limit from the client to get the upgrade done. The time limit left no time for a backup.)

Now things were starting to look grim. He knew that losing three months of financial data stored in QuickBooks in the XP Professional virtual machine on the /home partition of the client’s drive could be a disaster for this small business client. Thinking it over, he decided the only solution was to run xfs_repair on the /home partition. So he did. Lo and behold, it worked! Well, somewhat. There were hundreds of megabytes in lost+found but the user directories showed up and most of the files were there, including what appeared to be the XP Professional virtual machine directory named .VirtualBox in the user account that ran the VM. Unless you have been in this position, my children, you have no idea the sense of relief this brave Linux denizen felt. But it was a premature relief, as you shall see.

He immediately shutdown the system and installed the spare hard drive. Then our brave lad rebooted with the PartedMagic Live CD and ran GParted again to create a new partition layout. Then he ran Clonezilla to clone the recovered /home partition to the new drive. Keeping his fingers, toes, arms, legs and eyes crossed for luck. (Did I mention he is a contortionist? No? Well, he’s not. That sentence is just for “color”.) The clone completed successfully and our intrepid Linux fellow shut down the system, removed the naughty hard drive, and gave it proper rites before smashing it with a sledge hammer. (Yeah, you guessed it, more “color”.)

Then he reran the “upgrade”, which was now morphed into a fresh install of Mandriva 2011 on the new hard drive. It was 4:00 AM on August 31st at this point. He was now into his 14th hour of an “upgrade” that had been supposed to take less than six hours by prearranged agreement with the client. By 7:30 AM, when the client’s staff began arriving, he had the system “finished”. The printer was printing. The scanner was scanning. The VM was booting. The rooster was crowing … just checking to see if you are paying attention. All appeared well and the client was understanding about hardware failures happening. After going over backup procedures with the client, again, our weary Linux consultant headed home for a short nap before starting his new business day.

Later that day he received a call. Yes, children, it was the client. The QuickBooks data was showing nothing past April 2010. Since this was August 2011, that was a Very Bad Thing. So, our fine Linux fellow headed back to the client and the “problem” system as he was now calling it. Upon review he discovered the restored virtual disk was one that had been a backup made in April of 2010 prior to an upgrade of VirtualBox at the time. Where was the most recent virtual disk with the client’s data? Gone. Vanished. Eaten by an evil hard drive. But, a light appeared above our hero’s head! Due to having had some sleep and some caffeine, he remembered that QuickBooks had been reinstalled with a new release in late June of 2011. He Had A Backup Of The System On A USB Drive From That Day! Yes, it would still mean losing two months of data. But that was much more acceptable in the client’s view than losing a year and a half of financial data. Which would mean near certain doom for almost any small business.

So, our Linux protagonist retrieved the USB hard drive, attached it to the system and ran a restore to get the virtual machine back from June 2011. This worked successfully and the VM booted. A check of the VM showed the data from June was there and intact. Our nice Linux guy packed up his gear and went over backup procedures with the client, again. (See a trend here?) Then he headed home for supper and a good night’s rest. The End …

Well, not yet. You see, losing data really irritates our Linux Paladin. His mind would not let go of the problem. He kept thinking there was something he missed. Something he could have done to get all the data back. Something … something … some* … Ah HA! He recalled that lost+found directory with the hundreds of megabytes in it! He quickly called the client and arranged to go on-site after hours on that 1st day of September 2011. He combed through the lost+found directory with the ‘find’ command, searching for files around the correct size of our missing, most recent, virtual machine file. There was one hit, just one. But it was enough. He had found the latest copy of the virtual machine. After making a backup(!) he copied this file to the correct directory, set the virtual machine back up using the found file, and all the financial data was recovered. Everyone rejoiced and there was much feasting. (Yep, “color”.) The Real End.
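For the curious, that lost+found search can be sketched with ‘find’ and its -size tests. Everything below is illustrative, not the actual commands from that night: a scratch directory stands in for /home/lost+found, the numeric file names mimic what fsck leaves behind, and the 1–3 MiB size bracket is scaled down from the multi-gigabyte bracket one would use around a real virtual disk image.

```shell
# Illustrative sketch only: a scratch directory stands in for
# /home/lost+found, and the sizes are scaled down for the demo.
scratch=$(mktemp -d)

# Simulate one large recovered file (~2 MiB) among small fragments,
# using the numeric-style names fsck gives recovered inodes.
dd if=/dev/zero of="$scratch/#48213" bs=1024 count=2048 2>/dev/null
printf 'fragment' > "$scratch/#48214"

# List files whose size falls between 1 MiB and 3 MiB. On a real
# recovery one would bracket the known size of the missing .vdi
# file instead, e.g. '-size +7G -size -9G'.
hit=$(find "$scratch" -type f -size +1M -size -3M)
echo "Candidate: $hit"

rm -rf -- "$scratch"
```

Each hit would then be inspected, for instance with the ‘file’ command (a VirtualBox disk image usually identifies itself), before being copied back into place.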

What is the moral of our story, young Tuxes? It is this: Never rely on someone else to do a backup. Back up, back up, back up, back up, back up for yourself. Then when you think you have enough backups, make another backup. You can be sure our Linux star has learned that lesson … again.


Linux Hardware Support Better Than Windows 7

Are you in the market for a new laptop, desktop or server PC with Linux installed? Please give us the opportunity to quote a preloaded Linux laptop, desktop or server system for you.

I will start this off by adding, “… with the exception of some wireless chip sets and high end graphics cards.” to appease those of you who will act like Arnold Horshack if that is not mentioned. If there are other devices unsupported on Linux that are supported in Windows 7, feel free to scratch your itch and tell me in a comment.

The concept of “better” is subjective. What is better to me is possibly, even probably, not better to someone else. In my case, and in the case of some of my clients, Linux hardware support is “better”. I do not buy cutting edge hardware and tend to keep systems and peripherals until they stop working and can no longer be repaired at a reasonable cost. When a new release of my favorite Linux distribution comes out I can be 100% certain that my hardware that works with my current release will still work with the new release. That is something I just take for granted. This is not so in the Microsoft camp.

For those people who hold on to working hardware through new Microsoft versions, their hardware may or may not be supported in a new release of a Microsoft OS. Take the example of a recent conversation I had with the manager at one of my client offices. I will call her “Mrs. B” here. Mrs. B is a Microsoft fanatic and will not even consider switching to Apple, much less Linux. When I mentioned switching to Linux for her office desktop during our conversation she laughingly said, “Gene, you know better than that,” because we have had that discussion before. This came up in our recent conversation about her HP Photosmart 1115 printer.

Mrs. B recently had to purchase a new PC for her office use because her old Microsoft XP Professional based PC died. She bought a cheap, commodity PC with Windows 7 Home Premium installed from an on-line discount store. She did not check whether or not her existing peripherals were supported. Why should she? They worked before, so they should still work. Correct? Not so correct. You see, HP has, for whatever reason, decided not to make drivers for the Photosmart 1115 for Vista, much less Windows 7.

Mrs. B had asked me to see if I could help her get her printer working on Windows 7 because she could not find the driver CD. So, I went to HP’s web site and did a search for drivers for her. I already suspected that HP had not created drivers for that model, and I was correct. I informed Mrs. B and mentioned that the printer does have support under Apple OS X and Linux. So maybe we could switch her to Linux so she would not have to get rid of her still working printer just to buy one that has Windows 7 drivers. That is when I got her response above. So, Mrs. B will be buying a new printer and either throwing away or giving away the still functional Photosmart 1115.

While at HP’s web site, just for curiosity’s sake, I looked at the list of unsupported products in Windows 7. That is quite a list. Then I took items from the list at random and checked to see if HP reports they are supported under Linux. Oddly, some of the items in that list do have Windows 7 drivers. It seems even HP is not sure which of their products are not supported. Some of the products are not supported under Linux according to the HP driver search for them. Those also only have drivers for Microsoft Windows 3.1, Microsoft Windows 95 and Microsoft Windows 2000. It is possible these very old models are “win-printer” types that are gutted of any stand-alone capability and require a driver to function at all. But the other models I looked up all had support under Linux listed, but no support under Windows Vista or Windows 7.

One problem here is that Microsoft drivers are so closely tied to the system kernel that a new release of the operating system breaks old drivers. Under Apple OS X and Linux this is not a problem because most drivers, including those for printing, are separate from and not tied to the kernel. On Linux any driver that does require a specific kernel can be, and usually will be, easily recompiled by a distribution’s maintainers and released along with the new kernel. If the driver works with DKMS, even better. Printing runs as a separate subsystem, usually using CUPS. So, if one’s printer worked with Fedora 9 it still works with Fedora 15 and will probably still work with Fedora 25, or whatever Fedora releases may be called by then. So, one’s beloved Photosmart 1115 printer can still be used under Linux while it cannot be used with Windows 7. In my book, that is better hardware support with Linux.

These days I will only purchase new peripherals for my SOHO that specifically state they have Linux support or are known to be supported by open source drivers. If the package says “Linux” on it, I also try to take the time to send an e-mail to the manufacturer letting them know I chose their product because they took the effort to state on the packaging that they support my preferred OS. This is my small effort to keep these manufacturers interested in supporting Linux. Perhaps you can do the same.

Do you have your own “peripheral horror story” with a Microsoft OS? Feel free to post a comment about it.


Open Source: Multitasking with X and Linux


I am talking about human multitasking here, not operating system multitasking. Every time I have to work on repairing a Microsoft Windows based system for a client I feel as if I am restricted to typing with one finger while blindfolded. Okay, I admit that is hyperbole. But after using X with fluxbox and ten workspaces, as well as the Linux console (1, 2) with screen, for several years now, a Microsoft OS experience seems very restrictive to me. There are no multiple workspaces by default on a Microsoft desktop, and adding anything to give a Microsoft system multiple workspaces would not be the Right Thing To Do when working on someone else’s PC system.

Another restriction is with multi-user on Microsoft. With my Linux desktop PC I have a user for work related tasks and a user for relaxation and gaming tasks. I can keep the work user logged in, switch to a Linux console using Ctrl Alt (F1-F6), log in the game user, start a second X GUI session with ‘startx startxfce4 -- :1’ and play a short game while “stuff” keeps running under the work user in the first X session. If I am playing a buggy 3D game that may crash X, I have no worries about my tasks in the other X session, as they would be unaffected if a poorly designed 3D game took down the second session. I can do this “out of the box” on a typical Linux distribution installation. If you are from the limited Microsoft universe you have no concept to compare to this on a standard, out of the box, Microsoft desktop PC. Yes, you can switch users. But if you switch users as I do to play a game that subsequently crashes the Microsoft GUI called “Windows”, it all crashes, not just the session where the faulty program broke the GUI. It is also truly simple to switch between or among multiple sessions of X on a Linux PC. Just use Ctrl Alt F# to switch back and forth, where # is the virtual terminal number for an X session. For example, my first session is on virtual terminal 8 and my second session is on virtual terminal 9. To switch between them I use Ctrl Alt F9 and Ctrl Alt F8.

I am a “computer guy” and am one of the people that other folk call when they have a computer problem. Several of my clients are set up with VNC software so I can connect to their systems over the internet. For the clients that are running a Linux server or desktop PC from my company, I can connect with SSH to provide support. Some of these clients have contracted with my company to handle their Microsoft updates each month. So, for a fee, I do the boring work of making sure their Microsoft updates apply each month and resolve any problems that may arise for them due to a Microsoft update failing or conflicting with installed third party software. I also check third party software to see if any of it needs updates, as there is no central repository for updating everything as there is for a typical Linux distribution. I use UltraVNC at the client sites, because it has a Win64 version, and Vinagre here at the SOHO. With Vinagre I can connect to multiple systems, using SSH and VNC, and work on each of them simultaneously. My limitation is usually the “speed” of the internet connection at each end. Usually more than two client systems updating at the same location while trying to use VNC will cause remote sessions to time out and drop. But I can easily connect to two, three or four PC systems at two, three or four different client sites simultaneously and work on their systems remotely at night without a noticeable problem on my end. Being able to do this all with “free” software, both at the client and on my side, means I can multiply my ability with multitasking to support my clients without breaking the bank for my small business.

My Linux “desktop” is fluxbox, as I stated above. I also stated I have ten workspaces. In my experience workspaces allow more efficient human multitasking. I do not suffer GUI clutter as one does on a single workspace system such as Microsoft’s “Windows”. I keep several programs running on my system all the time. Each workspace is designated for a specific task. So I know if I need to create a Labor Sheet for a client and make an invoice to send them, that is workspace 8 where I have Writer and GnuCash running. If I want to chat with a friend on Instant Messaging, that is workspace 5 where I have Kopete running. If I want to browse the WWW or write an article like this one, that is workspace 3 where Firefox and Opera are always running. Should I want to listen to music while I work, that is workspace 6, which I designated for multimedia tasks, meaning music and videos. With these multiple workspaces I can keep programs running with their windows showing. So, when I want to switch to a running program, I just switch to the workspace where the program is running. I do not have to hunt through a list of minimized programs on my task-bar, which saves time and keeps me more productive.
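For the curious, this kind of per-task workspace layout can be wired to hotkeys in fluxbox’s ~/.fluxbox/keys file. The bindings below are illustrative examples, not a copy of my actual configuration:

```
# ~/.fluxbox/keys -- example workspace bindings (Mod1 is usually Alt)
Mod1 F3 :Workspace 3   # WWW browsing and article writing: Firefox, Opera
Mod1 F5 :Workspace 5   # instant messaging: Kopete
Mod1 F6 :Workspace 6   # multimedia: music and videos
Mod1 F8 :Workspace 8   # client paperwork: Writer, GnuCash
```

With bindings like these, a single keystroke jumps straight to the workspace for a given task, and fluxbox rereads the file when you restart it from its menu, so changes take effect without logging out.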

What does this mean to you if you are a Microsoft user? Well, until you can experience multitasking on a Unix or Linux PC this will mean little to you. I regularly get comments like, “Oh, that would just confuse me.”, or, “I see no need for that.” from my Microsoft using friends. But my Linux using friends will likely all nod sagely and smile when I talk about work efficiency and human multitasking on Linux. If you come to the Linux city from the Microsoft camp and give multitasking a serious look, and I mean long enough to get familiar with it, you might just get a clue about what I mean here.

  1. Meaning the text mode interface for Linux where no GUI or window manager is running. (back)
  2. Where is the Microsoft console? You may say there is one, but not really. Not compared to what one has on Linux and Unix. (back)


Open Source: Pondering the Linux GUI


First we had the KDE 3.5 to KDE 4.0 debacle. A release numbering scheme by the KDE folk that differed from what is considered the norm ended up with alpha level code being pushed out in most major desktop distributions of Linux. Many people were so upset about radical and broken changes to KDE during this period they left KDE, swearing never to return. It does not matter if KDE is “okay” now. Some of these people will probably not return to KDE.

Some of the disenchanted former KDE using folk moved to Gnome and liked what Gnome was at the time. These people got comfortable with Gnome 2.x and enjoyed the features it has. Now we have the strangeness that is Gnome 3. Once again, many people are not happy with the changes in Gnome 3. Especially upsetting to some is the loss of functionality they took for granted and an extreme change in the look and feel of Gnome.

What are these KDE and Gnome desktop FOSS developers thinking? “We want to be more like Microsoft!”? In other words, change the Graphical User Interface (GUI) just for the sake of change? (That is a rhetorical question, I already know the reasons given by these major desktop developers. I just disagree with much of their reasoning for such radical changes in major release versions.)

The change of the Microsoft GUI from Microsoft Windows XP to Microsoft Vista and Microsoft Windows 7 was radical. Yet it was basically just glitz, bells and whistles on top of the same broken OS, with the added insult of removing and obfuscating much of what users had come to expect from their Microsoft GUI. There was no real need for such a radical change of the GUI. Sure, the new Microsoft GUI “looks pretty” if one has the hardware to support it. But the changes to add the “glitz, bells and whistles” also made older hardware obsolete faster, since the older hardware did not have the “muscle” to run such a resource intensive GUI as was released with Vista and later Windows 7. At least the underlying Linux system is still robust, secure and works like it should. At least on Linux we still have sane window managers and GUI systems we can fall back to when the major desktop GUI developers go off the cliff with radical changes we do not like.

Are these GUI changes in KDE and Gnome “better” than what we had previously? The idea of “better” is completely subjective. What a Geek Code Jockey (developer) thinks is “better” is possibly, even probably, not going to be what an end-user thinks is “better”. Actually, most average PC end-users I know personally would prefer that their GUI not change so radically from one major release to the next. Incremental changes are good, especially if the changes do not break, obfuscate, move or remove something the end-user likes about said GUI. On the other hand radical changes are Bad, especially if said changes do break, obfuscate, move or remove something the end-user likes about said GUI.

For example, one of the reasons I like and stick with fluxbox is the fact that it does not change. The add-ons and keyboard shortcuts I put in place for my fluxbox “desktop” will still work as I expect when I get the next release of Mandriva Linux installed. That is what *I* want. Not some radical change such as we see with Gnome 3. Another optional GUI I use on some systems is Window Maker. Again, when the Linux distribution on which I am running Window Maker is upgraded to the next release, Window Maker will still work as I expect it to. No radical changes just for the sake of change or to scratch some itches of FOSS GUI developers. In some cases change can be good. Such as, I would really enjoy a change to a higher income bracket. But when it comes to getting my work done on my PC, I would prefer my GUI stay pretty much how it is now. After all, that is why I picked the GUI I use in the first place.

Do you have a GUI you like on your Linux system because the GUI does not change radically? Feel free to post a comment about it.


Are You Smart? Then You Probably Do Not Use IE!

Note: The AptiQuant article referred to for this story is probably a hoax. See this article at BBC News. Then compare the AptiQuant “team” with the team at Central Test based in France. A look at the domain record for the AptiQuant site shows an address that does not appear to exist. In my opinion, this was a pretty good hoax. But it does mean my conclusions below are now based only on my own observations and suspicions, since we have to toss out the hoax. One comment to my article here did point out a different site with IQ results. Thanks for that!

This is priceless. A recent article at Fox News (Internet Explorer Users Are Dumber, Study Shows) points out that a study by AptiQuant shows users of the web browsers Camino, Chrome, Chrome Frame, Firefox, Opera and Safari scored higher on IQ testing than most users of versions of Microsoft’s Internet Explorer. A follow-up article at AptiQuant states that some Microsoft Internet Explorer users are threatening to sue AptiQuant. That latter fact just proved AptiQuant’s point, I think.

I read the PDF from the AptiQuant site (you can get your own copy here) and noticed that the respondents to their study were self-selecting. People found the AptiQuant test while searching for IQ tests on the WWW and chose to take the test themselves. This of course leaves out the millions of people who were not looking for web sites to test their IQ. It is possible that those looking for an IQ test are on average more intelligent than those who are not. But that is just my personal suspicion; I have nothing I can use to back it. In any case, a sample of 100,000+ people is a decent sample.

Part of the conclusion of this study states:

The study showed a substantial relationship between an individual’s cognitive ability and their choice of web browser. From the test results, it is a clear indication that individuals on the lower side of the IQ scale tend to resist a change/upgrade of their browsers. This hypothesis can be extended to any software in general, however more research is needed for that, which is a potential future work as an extension to this report.

This suggests to me that Microsoft users who refuse to move from a Microsoft operating system to something else may be in the less intelligent group. I think that is more than likely. My anecdotal evidence for this is that I know several people who moved away from Microsoft to Linux in the past few years. As I know these people personally, I can state with confidence that all of them are rather intelligent. I did not test their IQ, but I have had extended conversations with them on substantial subjects. Without exception, these people have sharp minds and can “hold their own” in discussions on a number of subjects. Many of the people I know who refuse to leave Microsoft for something better tend to be those who also keep getting malware infections that my company is called upon to clean up. Does this scientifically prove that Microsoft users are statistically “dumber” than Linux users? Nope. But it does show me that those people I know personally who have moved away from Microsoft to Linux are smarter in some respects than those I know who insist on using Microsoft systems and software.

That said, I do show three examples below where the users moved from Linux back to Microsoft. None of these people are “dumb”. Everyone else I know who switched to Linux as a user has been relatively content and, based on my anecdotal evidence, quite smart.

One of the people who moved from Linux back to Microsoft is a smart fellow who worked for NASA during the 1960’s and helped plan the first moon landing. He is not a “computer guy” and had long been a user of Microsoft based systems starting with IBM PC-DOS in the 1980’s. But he can do calculus in his head, which I cannot do, so he is not dumb by any stretch of the imagination. He just did not enjoy having to find and learn new software to do what he wanted on Linux. He does have a Linux VM running on top of his Microsoft OS so he still dabbles with Linux. But he is generally a Microsoft user again.

Another fellow that switched from Linux back to Microsoft has failing health and is on medication that impairs his ability to think and communicate. The change from Microsoft to Linux was therefore quite frustrating for him with the differences he encountered. He really needed familiarity to be able to do what he wanted with his PC, so he reverted back to Microsoft which was the right move for him.

The other people I know who moved from Linux to Microsoft did so because they run a small business that does much of its selling through eBay. We all know eBay is in the back pocket of Microsoft for some reason and has made it difficult to easily use some features of eBay with anything other than a Microsoft based system. Being on Linux made using these eBay features they needed either very difficult or impossible. So they moved back to Microsoft. However, these folk disliked having to use Microsoft so much they recently switched again and are now using Apple based systems for their business. I hope they are not similarly frustrated by eBay in their move to Apple. Time will tell.

In conclusion, I will be very interested to see a similar study done that takes into account the operating systems used. The quote above from the PDF document by AptiQuant states this may be done in the future. If that is done and released to the public, I suspect that we will find a similar pattern of IQ results based on the operating system used as well.
