It’s no secret that the years since the Great Recession have been hard on American workers. Though unemployment has finally dipped below six per cent, real wages for most have barely budged since 2007. Indeed, the whole century so far has been tough: wages haven’t grown much since 2000. So it was big news when, last month, Aetna’s C.E.O., Mark Bertolini, announced that the company’s lowest-paid workers would get a substantial raise—from twelve to sixteen dollars an hour, in some cases—as well as improved medical coverage. Bertolini didn’t stop there. He said that it was not “fair” for employees of a Fortune 50 company to be struggling to make ends meet. He explicitly linked the decision to the broader debate about inequality, mentioning that he had given copies of Thomas Piketty’s “Capital in the Twenty-first Century” to all his top executives. “Companies are not just money-making machines,” he told me last week. “For the good of the social order, these are the kinds of investments we should be willing to make.”
Such rhetoric harks back to an earlier era in U.S. labor relations. These days, most of the benefits of economic growth go to people at the top of the income ladder. But in the postwar era, in particular, the wage-setting process was shaped by norms of fairness and internal equity. These norms were bolstered by the strength of the U.S. labor movement, which emphasized the idea of the “living” or “family” wage—that someone doing a full day’s work should be paid enough to live on. But they were embraced by many in the business class, too. Economists are typically skeptical that these kinds of norms play any role in setting wages. If you want to know why wages grew fast in the nineteen-fifties, they would say, look to the economic boom and an American workforce that didn’t have to compete with foreign workers. But this is too narrow a view: the fact that the benefits of economic growth in the postwar era were widely shared had a lot to do with the assumption that companies were responsible not only to their shareholders but also to their workers. That’s why someone like Peter Drucker, the dean of management theorists, could argue that no company’s C.E.O. should be paid more than twenty times what its average employee earned.
That’s not to imply that there aren’t solid business reasons for paying workers more. A substantial body of research suggests that it can make sense to pay above-market wages—economists call them “efficiency wages.” If you pay people better, they are more likely to stay, which saves money; job turnover was costing Aetna a hundred and twenty million dollars a year. Better-paid employees tend to work harder, too. The most famous example in business history is Henry Ford’s decision, in 1914, to start paying his workers the then-handsome sum of five dollars a day. Working on the Model T assembly line was an unpleasant job. Workers had been quitting in huge numbers or simply not showing up for work. Once Ford started paying better, job turnover and absenteeism plummeted, and productivity and profits rose.
Subsequent research has borne out the wisdom of Ford’s approach. As the authors of a just-published study of pay and performance in a hotel chain wrote, “Increases in wages do, in fact, pay for themselves.” Zeynep Ton, a business-school professor at M.I.T., shows in her recent book, “The Good Jobs Strategy,” that one of the reasons retailers like Trader Joe’s and Costco have flourished is that, instead of relentlessly cutting costs, they pay their employees relatively well, invest heavily in training them, and design their operations to encourage employee initiative. Their upfront labor costs may be higher, but, as Ton told me, “these companies end up with motivated, capable workers, better service, and increased sales.” Bertolini—who, as it happens, once worked on a Ford rear-axle assembly line—makes a similar argument. “It’s hard for people to be fully engaged with customers when they’re worrying about how to put food on the table,” he told me. “So I don’t buy the idea that paying people well means sacrificing short-term earnings.”
That hardly seems like a radical position. But it certainly makes Bertolini an outlier in today’s corporate America. Since the nineteen-seventies, a combination of market forces, declining union strength, and ideological changes has led to what the economist Alan Krueger has described as a steady “erosion of the norms, institutions and practices that maintain fairness in the U.S. job market.” As a result, while companies these days tend to pay lavishly for talent on the high end—Bertolini made eight million dollars in 2013—they tend to treat frontline workers as disposable commodities.
This isn’t because companies are having trouble making money: corporate America, if not the rest of the economy, has done just fine over the past five years. It’s that the rewards have gone into profits and executive salaries, rather than into wages. That arrangement is the result not of some inevitable market logic but of a corporate ethos that says companies should pay workers as little as they can, and no more. This is what Bertolini seems to be challenging. His move may well turn out to be merely a one-off, rather than a harbinger of bigger change. But inequality and the shrinking middle class have become abiding preoccupations on Main Street and in Washington. It’s only fair that these concerns have finally reached the executive suite. ♦
James Surowiecki is the author of “The Wisdom of Crowds” and writes about economics, business, and finance for the magazine.