Apple has wormed its way into the broad population, creating new expectations and a model for IT. For fanboys, it's vindication. For old-school IT, it's a nightmare. For those in neither camp, it's a further sign of the fundamental shift known as the consumerization of IT.
The Apple effect isn't merely a consumerization effect
Users are shifting to mobile devices, and the implications for computing are indeed profound. But we already know that, and we can see it manifested in everything from Microsoft's attempt to reinvent Windows to the notion that we're entering a post-PC era.
Apple rides this trend, as does Google's Android. But Apple lit the fuse with the iPhone, which redefined mobile computing in particular and computing in general. The iPad lit the second fuse, breaking down the separation between mobile and desktop computing; in some cases, the iPad is already the primary computer. Apple is very much defining what the new computing means, as well as training users on what to expect computing to be. As the notions of business technology and personal technology continue to blend, Apple's ideas are reshaping the expectations and requirements of corporate IT as well.
The entrancing Apple ecosystem
Many in IT don't get it. They'll say that iPods are irrelevant to computing technology, and that because iPods make up the majority of Apple products in use, they distort any alleged Apple effect. The facts say otherwise: The survey shows that the 51 percent of households owning an Apple product have three Apple products each on average, and a quarter of them plan to buy another this year. That's the effect of the Apple ecosystem at work. It's cliché to say that Apple products are easier to use than their rivals, but they almost always are, and you see the effect in the real world: The iPod or iPhone is a gateway drug to more Apple products, and iTunes and now iCloud encourage adding still more to share your digital goodies and, more important, your user experience. There's truth to the joke that once you go Mac, you never go back.
Dying technology is euthanized
When Apple decides something needs to die, it kills it. That's what happened to the floppy drive, then to its proprietary ports, then to CDs, and most recently to Adobe Flash. PC users whine and point fingers, but their vendors eventually follow suit; Apple users simply deal and move on, perhaps after a brief complaint. That's something else IT should learn: Stop mollycoddling old technology that slows the company and complicates maintenance. The short-term cost of change is lower than the long-term cost of avoidance.
Understand what Apple is doing right to serve and engage its customers, and replicate what is possible within IT. If you do, you won't worry about shadow IT, disrespect, irrelevance, or consumerization; you'll be co-captaining a better company.
Two points:
I question whether you are confusing three Apples in every garage with an IT infrastructure. In 1980, the functional equivalent was three Walkmans in a household. No one confused (or compared) those devices with corporate computing, and there's no reason to do so today.
Second point: The dichotomy is that IT provides benefit but slows extemporaneous invention; it provides stability but hampers independent action. Should each employee in an organization be given free rein (as Apple and other collaborative technologies suggest), or should employees be managed (as the old IT paradigm suggests)? Should business decisions be made based on Google web searches or on professionally managed, professional research? Are we better off, in business and in society at large, with "democratic" computing, or do we need the managed environment that IT and business leaders espouse? So the question becomes, "Who runs the show?" And that question is larger than Apple.
Apple is more for entertainment. Microsoft is the way for the corporate IT ecosystem.