Cyber inequality is the notion that computers are helping splinter America economically. Computers offer one apparent explanation for the sharp rise in wage inequality. Consider this oft-cited comparison: in 1979 a recent male college graduate earned 30 percent more than his high-school counterpart; by 1993 the gap was 70 percent. The prevailing wisdom among economists is that, as computers have spread, the demand for workers who can use them has increased. Wages for those workers rise, while wages for the unskilled (who can’t use them) fall.
What could be simpler: if the problem is a lack of computer skills, then provide the skills. Give the poor a tax credit for laptops, suggested Newt Gingrich. Although he retracted that proposal, he didn’t disavow the impulse. “[T]here has to be a missionary spirit,” he says, “that says to the poorest child in America, ‘Internet’s for you.’” Vice President Al Gore gushes similarly about the Information Superhighway. Technology, if denied, is doom; if supplied, it is salvation.
Bunk. Cyber inequality is mostly a myth. Computers do not really explain widening wage inequalities. The more important cause is a profound change in the business climate, beginning in the 1970s. Economic optimism waned and competitive pressures intensified. Companies gradually overhauled the ways they pay, promote, hire and fire. Put simply, they began treating their most valuable workers better and their least valuable workers worse.
Everyone knows the pressures that prompted this upheaval: harsh recessions, tougher competition and more corporate debt. Shrinking profit margins reflect the intensity of those pressures. In the 1950s and 1960s, profits before taxes averaged about 16 percent of corporate revenues; they dropped to 11 percent in the 1970s and 9 percent in the 1980s.
When profit margins were fat, companies routinely promoted workers and provided across-the-board pay increases. The idea was to offset inflation and give something extra. Many companies adopted elaborate “job evaluation” systems that graded jobs across different departments in an attempt to ensure “fairness”; jobs with similar responsibilities would be paid similarly. “The focus was on internal equity,” says Paul Platten, a compensation expert with the Hay Group. Companies hired optimistically, because they expected higher profits from higher sales. Young and less skilled workers (the last hired) benefited.
Dropping profit margins devastated these practices. Across-the-board pay increases diminished. In a recent survey of 317 big companies, Buck Consultants found that 90 percent paid only merit increases. “Job evaluation” systems have been de-emphasized. More money is funneled into pay arrangements aimed at rewarding good performance. In the Buck survey, 30 percent of companies used lump-sum merit increases and 15 percent used “gainsharing.” New workers are hired more reluctantly, because companies fear bloating labor costs; entry-level salaries are suppressed. Now the last hired often suffer.
Computers are only incidental to this process. This does not mean they haven’t changed the nature of work or eliminated some low-skilled jobs. They have. For example, the need for secretaries has dropped as more managers use computers. Between 1983 and 1993, the number of secretaries decreased by 14 percent. But this is an age-old process. New technologies have always destroyed and created jobs.
The issue is not whether some people have better computer skills and are rewarded for them; clearly, that happens. So what? Skill differences have always existed. The real question is whether computers have escalated those differences. By and large, the answer is no. Indeed, computers have spread, both at home and at work, precisely because more powerful microprocessors, memory chips and software have made them easier to use. Fewer generic computer skills are needed. As this happens, computers merely complement people’s other skills.
Almost accidentally, economists are now confirming that a broader process is fostering inequality. Men laid off between 1990 and 1992 lost an average of 20 percent of their pay when they got new jobs, says a Census Bureau study. They didn’t become less skilled, but they were thrust into a harsh labor market. Two thirds of the increase in inequality does not reflect growing gaps between more and less educated workers (say, college and high-school graduates), reports Gary Burtless of the Brookings Institution. Rather, it reflects bigger gaps among workers with similar educations (say, among college graduates). Companies are tougher at all levels. Finally, Peter Gottschalk of Boston College and Robert Moffitt of Brown University find that people’s earnings now fluctuate more from year to year than they used to; that’s consistent with harsher hiring, firing and pay practices.
Contrary to Gingrich and Gore, the Internet is not the promised land. Sure, our economic and social well-being would improve if some of our worst workers had better skills; but the skills they need most are basic literacy and good work habits. With those, computer competence will come if needed. The infatuation with computers as a cause or cure of social distress is misplaced. Mostly, computers mirror who we are: a people of vast vitality, great ingenuity and manifest imperfections.