The idea that something is amiss with IT investments dates back at least 15 years to when Nobel Prize-winning economist Robert Solow quipped that computers are everywhere except in the productivity statistics. Hence, the productivity paradox: the inability to convincingly demonstrate that our investments in technology have resulted in measurable productivity improvements.

Much has been written about the phenomenon, but nobody has yet offered a number that would help executives determine whether their own company's productivity paradox is small or large. I'd like to introduce one. I call it the IT paradox number: the difference between a firm's actual computer hardware spending and what Moore's Law would predict it to be.

Moore's Law states that the number of transistors that can be crammed onto a semiconductor chip doubles every 24 months without increasing the cost of making it. Halving unit costs every two years corresponds to a compound annual decline of about 29%. The U.S. Department of Commerce settled on a more conservative figure, an average decrease in IT costs at a compound annual rate of 17.5%, and applies that number to economic calculations such as the gross domestic product and cost-of-living indicators.

Hence, the formula for calculating your own IT paradox number: compare your company's IT budget today with what it could have been if you had bought your computing capacity for 17.5% less each year.

If you're alarmed by your company's paradox number, I can offer some consolation: you have company. I have been tracking corporate IT costs for a long time. U.S. corporate computer hardware costs increased from $67 billion in 1990 to $153 billion last year. Had Moore's Law delivered its full potential, the 1998 IT budgets would have bought $558 billion worth of computing power. The U.S. corporate IT paradox number therefore stands at $405 billion: the difference between $558 billion and $153 billion.

The objections to such a calculation will come from those who claim that computers now do much more, and therefore the additional spending is justified. I'm not so sure.
If CIOs can explain to their executive committees that they have delivered more than eight times the computer-induced business capabilities since 1990 ($558 billion divided by $67 billion), I will gladly acknowledge that Moore's Law applies not only to semiconductor manufacturing but also to IT budgets. What, then, explains the difference between the potential gains of Moore's Law and the harsh realities of IT spending?
The IT paradox number can be calculated by anyone, for any corporate computer budget. A CIO would do well to know this number before some smart chief financial officer brings it up the next time budget cuts are on the agenda.
Strassmann (paul@strassmann.com) is still looking for the one organization that can explain its IT spending by applying Moore's Law.