On CNN earlier this week, Democrat Rep. Ro Khanna said:
“If you look at the minimum wage, it increased with worker productivity until 1968 and that relationship was severed. If workers were actually getting paid for the value they were creating, it would be up to $23.”
This claim is frequently made in minimum wage debates. It implies that over the past 52 years, the failure of politicians to raise the federal minimum wage in line with aggregate productivity (output per worker hour) has caused a disconnect between workers’ production and their pay. And this is taken as evidence of companies using their power to exploit workers.
Let’s leave aside, for a moment, the powerful idea that you have an inherent right to contract your labor as you see fit. There are two obvious problems with Khanna’s analysis.
First, state and local governments across the country already often set minimum wages significantly higher than the federal level, particularly in higher‐productivity regions. Even if minimum wages are considered a worthy policy tool and productivity a good guide to setting their level, setting minimum wages in line with the productivity conditions within a locality makes more sense than imposing a uniform, higher wage floor on the whole country. Countrywide productivity statistics mask vast productivity discrepancies across regions.
Second, and more importantly, comparing productivity gains among all workers as if they reflect what should have happened to hourly wage rates for minimum wage workers in particular industries is obvious nonsense. After all, different industries experience different productivity growth rates over time, as do different types of workers within industries.
Sadly, a productivity series solely for minimum wage workers is not available. But back in 2019 I wrote a paper that tried to look at how the federal minimum might have evolved between 1987 and 2017 if it had tracked various different productivity trends:
The federal minimum wage in 1987 was $3.35, which is $7.32 in 2017 dollars. Since then, private nonfarm labor productivity has increased by an average of just under 2 percent per year. If the federal minimum wage had increased in line with trend productivity over that period, it would have increased to $13.22 by 2017 (see Figure 1).
Yet labor productivity in the restaurant sector (often regarded as a better proxy for a typical minimum wage industry) rose by an average of just 0.4 percent per year between 1987 and 2017 (with unit labor costs increasing by 3.3 percent per year). If pegged instead to this productivity measure, the minimum wage would have increased by just 13 percent in real terms over three decades, rising to $8.25 by 2017.
Given that the actual federal minimum wage was $7.25 in 2017 and that 22 states had minimum wages higher than $8.25, this productivity series and start date imply that minimum wages were higher in 2017 than justified by restaurant productivity trends since 1987 in much of the country.
Some subsectors have had even worse productivity performances. Labor productivity in “drinking places for alcoholic beverages” (i.e., bars and pubs) actually fell, on average, by 0.2 percent per year since 1987. If pegged to this trend productivity rate, the federal minimum wage would have fallen, too, to $6.89 in 2017 (see Figure 1)…
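The arithmetic behind these counterfactuals is simple compounding: take the 1987 minimum wage in 2017 dollars ($7.32) and grow it at a constant annual rate for 30 years. As a rough sketch (the function name is mine, and the 1.99 percent rate is my own back‐solved approximation of the paper’s “just under 2 percent” economy‐wide trend), the quoted figures follow directly:

```python
def projected_minimum_wage(base: float, annual_growth: float, years: int) -> float:
    """Compound a base wage at a constant annual growth rate."""
    return base * (1 + annual_growth) ** years

# 1987 federal minimum ($3.35) expressed in 2017 dollars
base_1987 = 7.32

# Economy-wide trend (~1.99%/yr, an approximation of "just under 2 percent")
print(round(projected_minimum_wage(base_1987, 0.0199, 30), 2))   # ~13.22

# Restaurant-sector productivity (0.4%/yr)
print(round(projected_minimum_wage(base_1987, 0.004, 30), 2))    # ~8.25

# Drinking places (-0.2%/yr)
print(round(projected_minimum_wage(base_1987, -0.002, 30), 2))   # ~6.89
```

The point the sketch makes visible is how sensitive a 30‐year compounded wage floor is to the chosen growth rate: less than two percentage points of annual difference separates a $13.22 mandate from a $6.89 one.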
As I concluded then:
What this…analysis does show…is the danger of spurious comparisons between economy‐wide productivity and the level of the federal minimum wage. Making the link between the two explicit might lead us to deliver much higher wage floors than are justified by the productivity of workers in certain sectors or regions, causing significant localized job losses or other economic adjustments.