What Happened To Apple's Software Quality?
Apple is a funny beast. We say "beast" because it's hard to deny that this company is ripping through the tech world like no other company we've seen in the past decade. Our television programs are laden with Apple ads poking fun at Windows and seducing you into buying yet another iPod, and every quarterly earnings report is filled with optimism. Even in the midst of the worst recession since the Great Depression, Apple was breaking profit records left and right. The company has continued to sell millions of iPhones, millions of iPods and even millions of Macs.
And remember, it wasn't that long ago that the industry at large considered Apple a sitting duck. The outfit's share price was abysmal, its management in disarray and its product line nothing short of woeful. Of course, we all know that Steve Jobs returned to the CEO's corner office to turn things around, and things have gradually improved ever since. Few remember that the iPod revolution actually started in 2001 with a bulky, FireWire-bound device that held but 5GB of music at a $399 price tag, roughly $80 per gigabyte.
That introduction, however, paved the way for people to begin paying attention not only to Apple's hardware, but to its software too. The user interfaces on the iPod lineup were remarkably simple. They could be controlled with a simple click wheel and carried few graphical enhancements, but one thing was for sure: they "just worked."
Not surprisingly, the whole "it just works" mantra has become closely associated with Apple as a company, and not just when referring to its iPod family. Apple has been quick to pan Microsoft, and Windows specifically, for producing systems that were susceptible to viruses and that froze or broke down often. Macs, on the other hand, were supposed to work beautifully at all times, free of those awful spyware removal sessions and Blue Screens of Death. And whether you want to admit it or not, Apple was right. For a while, at least.