Productivity declines
Strassmann goes on to examine the effect of computerization on noninterest expenses: buildings, payrolls, the sort of things that should be reduced when branch offices are closed and ATMs are brought in. The results? Strassmann concludes: "In 1989 one dollar's worth of non-interest expense supported $3.65 of revenue, whereas seven years later--after a period of sustained computerization--the same dollar supports only $2.80 of revenue." If banks aren't more productive because of the miracle of computers, then what about the computer industry itself? Have computers made for better programmers? Have they reduced the need for support personnel? Or is it a case of physician, heal thyself: an industry that needs to rethink its own personnel policies?
An interesting dichotomy exists between the hardware and software sectors. From 1960 to 1984, employment in the manufacturing of computers and computer equipment rose 259%, compared with an increase of 74% in nonfarm employment. From 1984 to 1995, however, computer manufacturing lost 32% of its workforce, according to the Bureau of Labor Statistics. In a 1996 report published in the Monthly Labor Review, Jacqueline Warnke, an economist at the BLS, wrote: "As the technologies of manufacturing computers become more routine and cheaply available, jobs are eliminated from computer manufacturing. On the other hand, industries that support computers have shown a remarkable increase in employment."

"It's a myth that computers have measurably increased overall U.S. productivity of information. Whatever productivity gains may have happened to increase profits took place in the factories and the warehouses," writes Strassmann. And that's where automating technologies such as robotics and machine vision have had the most impact: not in the executive offices, where solitaire and extracurricular web surfing are the order of the day, but on the factory floor.

The most white-collar of the productivity-enhancing tools is a technology known as Case-Based Reasoning. CBR collects information, say complaints from irate owners of home PCs that won't work, and retains it in a database so that future operators can type in a few strategic keywords, such as model, software title and installed peripherals, and find an answer from a past case (a toy sketch of the idea appears below). This is the 1990s version of the expert system, which was the 1980s incarnation of artificial intelligence: distill the wisdom of the chief cookie baker, code it into a program, and you no longer need the baker.

Now, with unemployment at its lowest level since 1970, with birth rates declining and with Silicon Valley companies so desperate that they're recruiting programming talent out of high schools, the big threat confronting the technology industry is not the next product cycle, software patents or the breakup of the Microsoft/Intel oligopoly, but the lack of qualified personnel. Next in line in the threat department: the growing movement that challenges the notion that computers are making us more productive.

Not since Sputnik kicked off a round of national hand-wringing over the strategic importance of engineering and science has there been so much attention paid to the issue of where America's brains are. Ask any CEO what the company's biggest problem is, and the answer isn't the government but stupid, untrained, illiterate employees. No wonder the private sector spends more money on computer-based training than the public school system does. But, as David Gelernter has so eloquently opined, perhaps what we need more of are the three fundamentals of reading, writing and arithmetic, not a PC for every pupil.
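For the curious, the mechanics behind that kind of case lookup are simple enough to sketch in a few lines of Python. The example below is a toy illustration only, with invented case data and field names rather than anything from an actual help-desk product: past cases are stored as keyword sets, and a new complaint is matched to the stored case whose keywords overlap it most.

# A toy sketch of case-based retrieval (illustrative data, not a real product):
# past support cases are stored as keyword sets, and a new complaint is matched
# to the stored case whose keywords overlap it the most.

CASES = [
    {"keywords": {"modem", "dial-up", "no dial tone"},
     "fix": "Check that the phone cord is plugged into the LINE jack."},
    {"keywords": {"printer", "parallel port", "garbage output"},
     "fix": "Reinstall the printer driver and check the cable."},
    {"keywords": {"monitor", "flicker", "video card"},
     "fix": "Raise the refresh rate in the display settings."},
]

def retrieve(query_keywords):
    """Return the past case sharing the most keywords with the query, or None."""
    query = set(query_keywords)
    best = max(CASES, key=lambda case: len(case["keywords"] & query))
    return best if best["keywords"] & query else None

# An operator types a few strategic keywords from a new complaint.
match = retrieve({"modem", "no dial tone"})
print(match["fix"] if match else "No similar case on file; escalate.")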
See also: "Productivity by the numbers" | "Learning by example" (Forbes, June 8, 1992)