Technology Outpaced Education Creating Wage Gap

The college graduate working at Starbucks has become such a cultural cliché that we barely notice the irony anymore. Meanwhile, software engineers fresh out of school command six-figure salaries before they've written their first line of production code. This growing divide didn't happen by accident. It's the result of a decades-long transformation in how technology rewards different types of work.

The Race Nobody Talks About

In 1974, Dutch economist Jan Tinbergen proposed a simple but powerful idea. He called it "the race between education and technology." On one side, technological advances keep demanding workers with more sophisticated skills. On the other, education systems scramble to produce those workers. Whoever's winning the race at any given moment determines whether inequality grows or shrinks.

For most of the 20th century, education was winning. American high school graduation rates soared. College enrollment exploded. Workers gained skills faster than technology could outpace them. The result? Wage inequality actually declined for decades.

Then something changed in the early 1980s. The college wage premium—how much more college graduates earn than high school graduates—had been falling throughout the 1970s as colleges churned out graduates. Suddenly it reversed course and began climbing steeply. A college degree started being worth dramatically more in the job market.

The timing seemed obvious. Microcomputers had just arrived.

When Computers Entered the Office

By the late 1990s, economists had reached what one researcher called "virtually unanimous agreement" about what was happening. Technology—especially computers—was increasing demand for highly skilled workers while making less-skilled workers less valuable. They gave this phenomenon an ungainly name: skill-biased technological change, or SBTC for short.

The evidence seemed compelling. Workers with more education were much more likely to use computers on the job. Industries that adopted computers more heavily saw bigger increases in wage inequality. The pattern held across different sectors, suggesting this wasn't just about manufacturing jobs moving overseas or specific industries declining.

Economists built elaborate models using something called a CES (constant elasticity of substitution) production function. Don't worry about the math; the core insight was simple. The economy needs both skilled and unskilled workers, but they're not perfect substitutes for each other. When technology makes skilled workers more productive, companies want more of them. Basic supply and demand takes over. If the supply of college graduates doesn't keep pace, their wages rise.
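
For readers who do want a peek under the hood, a minimal sketch of that canonical setup follows. The notation is illustrative rather than quoted from any particular paper: H and L stand for college and non-college labor, A_H and A_L for technology that augments each group, and σ for the elasticity of substitution between them.

```latex
% Two-factor CES production function (illustrative notation):
Y = \left[\alpha\,(A_H H)^{\frac{\sigma-1}{\sigma}}
    + (1-\alpha)\,(A_L L)^{\frac{\sigma-1}{\sigma}}\right]^{\frac{\sigma}{\sigma-1}}

% With competitive wages equal to marginal products, the log skill premium is:
\ln\frac{w_H}{w_L} = \ln\frac{\alpha}{1-\alpha}
    + \frac{\sigma-1}{\sigma}\ln\frac{A_H}{A_L}
    - \frac{1}{\sigma}\ln\frac{H}{L}
```

The last term is the race in miniature: skill-biased technology (the A_H/A_L term) pushes the premium up, while growth in the relative supply of graduates (the H/L term) drags it back down at rate 1/σ.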

The numbers backed this up. Researchers estimated that if the relative supply of college-educated workers rose by 10%, their relative wage would fall by about 6.6%. This responsiveness of wages to supply shifts, governed by the elasticity of substitution between the two groups, became a key building block of the theory.
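
As a sanity check, here is a tiny Python sketch of that relationship. The function name is ours, and σ = 1.5 is an assumed value chosen because it reproduces the roughly 6.6% figure; published estimates of this elasticity vary.

```python
def skill_premium_change(supply_growth_pct: float, sigma: float) -> float:
    """Approximate % change in the college wage premium when the relative
    supply of college workers grows by supply_growth_pct, holding relative
    technology fixed: d ln(w_H / w_L) = -(1 / sigma) * d ln(H / L)."""
    return -supply_growth_pct / sigma

# A 10% rise in relative college supply with sigma = 1.5 (assumed):
print(skill_premium_change(10.0, sigma=1.5))  # about -6.7%
```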

The Cracks Start Showing

Then the 1990s happened, and things got weird.

Computer technology didn't slow down—it accelerated. The internet arrived. Software ate the world. By every measure, technological change was speeding up. But wage inequality stopped growing as fast. In some dimensions, it plateaued entirely.

In 2002, economists David Card and John DiNardo published a devastating critique. They pointed out that SBTC couldn't explain several glaring patterns. The gender wage gap had been narrowing throughout the 1980s and 1990s, even as overall inequality rose. If technology favored skills and men had more of them, why were women catching up? Similar puzzles emerged for racial wage gaps and for how the returns to education varied by age.

Card and DiNardo's conclusion was blunt: SBTC "falls short as a unicausal explanation" for what was happening to wages. Technology mattered, sure. But so did globalization, declining union membership, changing labor market institutions, and a host of other factors.

The field had been too quick to crown a single explanation.

The Hollowing Out of the Middle

The next generation of research got more sophisticated. Instead of just dividing workers into "skilled" and "unskilled," economists started looking at specific tasks and occupations.

What they found was more nuanced and more troubling. Technology wasn't simply favoring the skilled over the unskilled. It was "hollowing out" the middle of the labor market.

Think about bank tellers, travel agents, and factory workers doing routine assembly. These were solidly middle-class jobs requiring moderate skills. Computers and robots could handle many of their tasks. Meanwhile, jobs at the top—requiring creativity, complex problem-solving, or management skills—remained hard to automate. Jobs at the bottom requiring physical presence and adaptability, like home health aides or restaurant workers, also proved resistant to automation.

The result was "job polarization." Employment grew at the top and bottom while middle-skill jobs disappeared. This explained patterns that simple SBTC couldn't. It wasn't just about education anymore. It was about which tasks machines could do.

Automation Versus Reinstatement

More recently, economists have drawn an even sharper distinction. Not all technological change affects workers the same way.

"Automation" displaces labor from existing tasks. Self-checkout machines replace cashiers. Industrial robots replace assembly line workers. This directly reduces demand for certain types of work.

But there's another side: "reinstatement." This is when technology creates entirely new tasks that didn't exist before. Someone needs to maintain those robots. Someone needs to manage the e-commerce platform that the physical store never needed. Someone needs to analyze the data that's suddenly available.
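
To make the distinction concrete, here is a toy bookkeeping sketch in Python. The task counts are invented for illustration, and the one-worker-per-task assumption is a deliberate simplification; the underlying framework comes from work by Daron Acemoglu and Pascual Restrepo.

```python
# Toy decomposition: net labor demand = reinstatement minus displacement.
# All numbers are invented; each task is assumed to employ one worker.
tasks_automated = 12     # existing labor tasks taken over by machines
new_tasks_created = 9    # brand-new tasks that, so far, only workers can do

displacement_effect = -tasks_automated      # automation removes labor demand
reinstatement_effect = +new_tasks_created   # new tasks add labor demand

print(displacement_effect + reinstatement_effect)  # -3: automation wins here
```

Whether that sum lands above or below zero in the real economy, and for how long, is precisely what the historical record speaks to.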

Throughout economic history, reinstatement has largely balanced automation. Yes, tractors replaced farmhands, but the resulting productivity gains created wealth that fueled demand for new products and services, which required new workers. Ninety percent of Americans once worked in agriculture. Now it's less than 2%. Yet we don't have 88% unemployment.

The question haunting economists today is whether this balance will hold.

The AI Uncertainty

Which brings us to artificial intelligence.

In 2017, pollsters asked leading U.S. economists whether robots and AI would substantially increase long-term unemployment. About 35-40% said yes. For a profession that spent two centuries being relentlessly optimistic about technology's effects on labor markets, that's a remarkable shift.

MIT economist David Autor, one of the field's leading voices, argues we've entered a fourth paradigm in how we think about technology and inequality. The first was the education race. The second was task polarization. The third was the automation-reinstatement framework.

The fourth is uncertainty.

Previous waves of automation primarily affected routine manual and cognitive tasks. AI is different. It's starting to handle non-routine cognitive work—exactly the type that's been sheltered until now. Radiologists reading X-rays, lawyers reviewing contracts, programmers writing code—these high-skill, high-wage jobs suddenly look vulnerable.

We genuinely don't know what happens next. Will AI create enough new tasks to offset the ones it automates? Will it enhance human workers or replace them? The honest answer is that the economics profession is still figuring it out.

What the Numbers Tell Us

Despite all the complications and debates, some facts remain solid.

Educational inequality—the wage gap between those with more and less education—explains about 60% of overall earnings inequality growth between 1980 and 2017. Even in recent years, it accounts for about 40%. The education race hasn't become irrelevant. It's just not the whole story.

The college wage premium remains historically high. A college degree is still one of the best investments most people can make. But the variance has increased dramatically. Some college degrees lead to lucrative careers. Others leave graduates with debt and dim prospects. The average masks enormous differences.

Technology continues advancing at breakneck speed. But its effects on inequality aren't automatic or predetermined. They depend on policy choices, institutional structures, and how we organize work.

Where This Leaves Us

The simple story—computers favor skilled workers, so inequality rises—was never quite right. Reality is messier.

Technology does tend to complement higher skills and substitute for lower ones. But it also creates new opportunities, eliminates some jobs while creating others, and interacts with dozens of other economic forces. Globalization, changing labor market institutions, educational policy, and even cultural shifts all matter.

Perhaps most importantly, the future isn't fixed. Autor puts it well: we should treat it "as an expedition to be undertaken" rather than "a fate to be divined."

How AI affects inequality will depend partly on the technology itself. But it will depend just as much on how we choose to deploy it, what skills we invest in developing, and what policies we put in place to shape its impact.

The race between education and technology continues. We're just starting to realize it's a race we can influence, not just observe. Whether inequality continues rising or begins to fall won't be determined by algorithms alone. It will be determined by choices—about education, about labor markets, about how we structure work in an age of intelligent machines.

The college graduate at Starbucks and the software engineer with the six-figure salary are both part of the same story. Understanding that story is the first step toward writing a better ending.