The resurgence of inflation has created a strange situation. Inflation is bad for ordinary people: their money buys less of what they need. But the prevailing approach to containing inflation is also bad for ordinary people: higher interest rates depress economic activity and keep wages down. What is this strange bind we’re in?

In the nineteenth century, political economists like David Ricardo tried to reconcile the notion that economic value was produced by workers with the idea that rewarding labor over a certain level would be a drain on growth. They claimed that there existed a “wage fund,” a specific amount of money that could be spent on wages without upsetting the laws of economics. That idea seemed perfectly appropriate for the times: nineteenth-century industrializing England, the world of grinding urban poverty portrayed in Dickens novels.

A politics founded on capital gains still structures the way leading economists and policymakers think about the economy.

Twentieth-century developments appeared to refute this idea, showing how economic growth and wage increases could coexist. The idea of employment as the ticket to a middle-class lifestyle became hegemonic—just plain common sense. This meant an increase not just in the formal wage (payment for individual work) but in the social wage, the benefits funded by an expanding fiscal state that flowed primarily to white male wage workers. The reforms of the Progressive Era and the New Deal thus transformed wage-labor—for many centuries something to be avoided, the plight of the socially marginalized—into the foundation of full economic citizenship. They made it possible to imagine a more or less coherent lifecycle on the basis of secure employment: buy a house, send your kids to college, and save for retirement. But they simultaneously introduced a cycle of inflationary pressures: anticipating price increases, unions demanded wage increases to preserve purchasing power, which in turn helped push prices up. The wage-price spiral that developed in the 1970s represented a major headache for anyone invested in the monetary stability of the postwar order—the middle classes very much included.

Book after book over the last few years has documented how this New Deal consensus has been steadily but ruthlessly undone. The turning point came in the 1970s, when neoliberal policy began to decimate unions, deregulate markets, and reduce social expenditure. But even as employment ceased to be a ticket to the middle class, rapid financial expansion laid the basis for a new middle-class politics. Far from creating a world of “neutral” money, neoliberalism shifted inflationary pressures from wages to assets. Between 1982 and 2022 the total dollar value of U.S. corporate stock ballooned by a factor of 61; it had little more than doubled between 1962 and 1982. The growth of home values may seem less spectacular—an 18-fold increase over the last half century—but set inflation-adjusted home prices against real wages over the same period and the contrast is stark: wages have remained essentially flat.

The result was a new middle-class politics that revolved around capital gains. It was on this foundation that the Democratic Party began to reconstruct progressive thought and policymaking in the 1980s, creating a distinctive form of “progressive neoliberalism,” as Nancy Fraser has called it. And it is this middle-class politics that still structures the way leading economists and policymakers think about the economy.

Rumors of the death of neoliberalism are indeed greatly exaggerated. At a moment when economic authorities are calling for higher unemployment to combat supply-side inflation, it is essential to recall how U.S. political economy has been remade to serve the interests of asset owners at the expense of wage-earners. Only by reckoning with this asset economy head on can we envision a future that breaks free of it.


The Federal Reserve was the institutional lynchpin of the postwar arrangement. It has become customary to imagine the 1950s and ’60s as an age of civilized capitalism that worked for everyone. But the Fed always tried to impose limits on the social ends toward which the state could direct its growing fiscal powers. In its view, overly generous entitlement programs were bound to draw people out of the labor market, bringing the economy too close to full employment and putting upward pressure on wages and prices. The Fed used restrictive monetary policy to rein in these feared excesses. The consequences were always unevenly distributed, disproportionately harming Black and other marginalized workers who often did not enjoy the protection of powerful unions.

Far from creating a world of “neutral” money, neoliberalism shifted inflationary pressures from wages to assets.

This arrangement was premised on the Federal Reserve having a reasonable degree of control over the amount of credit created by the banking system. From the second half of the 1960s, such control grew increasingly tenuous. Banks saw too many opportunities for profitable lending in a growing economy, and they found too many ways to get around regulatory restrictions. The lack of control over credit growth fanned inflation, and unions started pricing anticipated inflation into their wage demands.

Benchmarking wage claims in this way made perfect sense for unions. The consumer price index was invented in the early twentieth century by progressively minded institutional economists as a way to measure what money wages can buy. But systematically measuring purchasing power institutionalizes a strange feedback loop. Since wages are one of the key factors shaping prices, there is always the possibility that a wage increase will partly undo itself.
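To make the mechanism concrete, consider a deliberately stylized sketch of the feedback (the notation and the pass-through parameter below are illustrative assumptions, not a model taken from this essay): if firms set prices partly as a markup over labor costs, some share of any nominal wage gain comes back to workers as higher consumer prices.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Stylized wage-price feedback. All symbols are illustrative assumptions:
%   \Delta w = a nominal wage increase (in percent) won by workers
%   \alpha   = the share of that increase passed through to consumer prices
%              via firms' markup pricing, with 0 < \alpha < 1
\[
  \Delta p \approx \alpha\,\Delta w
  \quad\Longrightarrow\quad
  \text{real wage gain} \approx \Delta w - \Delta p \approx (1-\alpha)\,\Delta w .
\]
% With partial pass-through, a nominal raise is only partly eroded by the price
% increases it helps generate; if the next round of wage demands is then indexed
% to the resulting inflation, the loop repeats and can become self-sustaining.
\end{document}
```

The closer pass-through comes to complete, the more a wage gain undoes itself; the consumer price index is what makes that erosion visible, and indexing the next round of wage demands to it is what closes the loop.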

Recalibrating their thought around the specific challenges of the 1970s, neoliberal thinkers took this possibility to be a near certainty. Monetarism and rational expectations theory, in particular, argued that any gains originating outside the competitive logic of the market would be quickly eaten up by growing inflation. The political message of Milton Friedman’s long-run Phillips curve (a straight vertical line) was clear: any attempts to “artificially” inflate the employment level were ultimately futile, but they could do tremendous damage in the short run. The classical theory of the wage fund had been updated for the financial era.

This was the spirit behind the monetary policy turn initiated by Paul Volcker, the financial hawk appointed as Federal Reserve chairman in 1979 by President Jimmy Carter, who was increasingly desperate to get inflation under control. Volcker was convinced that the inflationary growth of wages was doing a great deal of harm to the U.S. economy, and he saw only one solution: “The standard of living of the average American has to decline.”

Volcker recognized that monetary policy had become the critical enabler of the inflationary spiral. But by the same token, it could be a powerful lever of change. Volcker saw his job as one of resetting expectations, and he was under no illusion that the Fed would by itself be able to make the necessary adjustments. Unlike many less honest or astute observers, he understood that the Reagan administration’s war on organized labor was key to the eventual success of anti-inflationism.

What’s become known as the Volcker shock—a sudden clampdown on credit creation that pushed interest rates to spectacular levels and induced a recession—became the turning point it sought to be. But it was never able to slow the creation of credit for very long, and inflationary pressures were not suppressed but rerouted. The 1980s reversed the economic dynamic of the ’70s: a combination of wage-price pressures and stagnating asset values was replaced with a combination of zero wage growth and rapid increases in asset values. Whereas the former situation could only be managed by subjecting the economy at large to ever-worsening inflation, the instability caused by asset-driven financial growth could be addressed through more targeted interventions—mainly, bailouts for firms and markets considered “too big to fail.” This shift meant that new expectations were created for asset holders while wage-price expectations were wound down.

Traditionally, regulators and politicians had considered bailouts a major source of moral hazard, a way of rewarding irresponsible behavior that was impossible to explain to the tax-paying public. Had U.S. authorities continued in this spirit and decided to let failing firms fail, we would now be living in a very different world. Instead, the asset economy was born: a new economic logic that revolved around the promise of capital gains and asset inflation. Much of what we think of as the sophisticated, fast-paced world of financial innovation is underpinned by the willingness of the U.S. state to put a floor under the value of asset classes.


This was never quite a story of the 1 percent enriching themselves at the expense of the masses. The savings and loan bailout of the late 1980s was only one among many examples of the too-big-to-fail principle applied in a way that reflected the interests of a broad segment of the U.S. population. But when it came to reframing the destruction of industry and the disappearance of full employment as opportunities, the 1980s neoliberalism of Reagan ultimately lacked conviction. As historian Lily Geismer documents in her recent book Left Behind (2022), it was the Third Way philosophy of the Clinton era that transformed financial expansion into a more compelling middle-class politics.

Hatched largely in the Democratic Leadership Council (DLC) and similar fora, Clinton’s Third Way was a left-of-center political philosophy that accepted key tenets of neoliberal thought. As a reading of DLC founder Al From’s account of that era makes clear, the New Democrats saw inflation as the main threat to public interest in a strong economy, and they viewed wage pressures and deficit spending as the key drivers of inflation. In the wake of this reformulation of neoliberal market thought, economic policymakers viewed with growing skepticism any schemes designed to improve the lives of Americans as wage-earners, people who have nothing to sell but their labor power. In the new way of thinking, the only acceptable earnings increase was the result of human capital formation. It was the era of the knowledge economy, and the power of training and education featured prominently in the progressive-neoliberal fantasy of the transubstantiation of labor into capital. As Bill Clinton and Al Gore put it, “what you earn depends on what you learn.”

Much of finance is underpinned by the willingness of the U.S. state to put a floor under the value of asset classes.

With the ideological appeal of neoliberalism now more firmly joined to a practical program centered on democratizing access to capital gains, the middle-class credentials of capitalism received a new lease on life. The nostalgia that the 1990s now evokes for many committed Democrats of the Baby Boomer generation should be seen in that light: it represented the high point of this reconfigured asset-focused middle-class politics, when rising home and stock prices delivered benefits widely enough to give credence to promises of inclusive wealth and comfortable retirement, and meaningful returns on education for some gave the human capital dream sufficient traction to forestall political revolt against general wage stagnation. The Clinton administration’s proactive embrace of fiscal discipline and balanced budgets, not least through major cuts to welfare spending, largely relieved the Fed of having to police the government on that score. In combination with the steady weakening of organized labor, this arrangement meant that Alan Greenspan could focus on backstopping financial markets and promoting asset inflation without fear of price inflation—which he did.

By the start of the twenty-first century, the need to keep asset values growing had become thoroughly embedded in the muscle memory of central bankers. The steady lowering of interest rates allowed them to do just that. For much of the ’90s, the federal funds rate had hovered around 6 percent. But in response to the bursting of the dot-com bubble, the Greenspan Fed started aggressively lowering the rate, all the way down to 1 percent by 2004—a negative real interest rate that subsidized rather than charged for borrowing. The Bush administration observed none of the fiscal austerity that the Clinton administration had, but its spending was firmly targeted at corporations and the wealthy, so it did not pose much of a threat to the management of price inflation.

This pattern of financial expansion set the scene for the financial crisis of 2007–08. At that point the U.S. public had come to associate financial crises primarily with the speculative gyrations of high finance and the spectacular failure of overleveraged hedge funds. But now the liquidity bottlenecks originated not on Wall Street but on Main Street, where a growing number of Americans were struggling to keep up with their mortgage payments. The fact that liquidity problems originated at the bottom of the system constituted a fundamental challenge to the idea that the benefits of asset inflation and capital gains could be extended indefinitely. The core problem—the disconnect between wages and asset prices—was not something that the Obama administration ever made any serious plans to tackle. Its management of the crisis and the recovery was fully driven by the belief that any attempt to improve the lives of ordinary people needed to go through, and obtain authorization from, the financial markets.

Indeed, as Obama put it himself, his administration was populated with “folks who understand the financial markets”—economists such as Larry Summers and Timothy Geithner whose careers had taken flight during the Clinton years. Their version of Keynesianism was built on an acceptance of the neoliberal critique of traditional Keynesianism and its inflationism. The experience of the roaring ’90s had reconciled them to the end of wage growth, and they now emerged as the all-too-familiar public figures who think of themselves as politically progressive but are constitutionally incapable of identifying with those who depend on a monthly paycheck.

With interest rates at zero, an economic system that wouldn’t restart of its own accord, and an administration that was committed to fiscal austerity, the only way for the Federal Reserve to prevent an economic depression on the scale of the 1930s was to adopt the asset-purchasing (“quantitative easing”) policies that we have become acquainted with over the past decade. This amounted to a full normalization of the bailout system—indeed, its proactive implementation. As Gerald Epstein and Robert Pollin have demonstrated in these pages, bailouts are not exceptions to the core logic of neoliberalism; they are its modus operandi.

The upshot is that the most powerful actor in the global political economy became locked into a policy pattern whose contradictions were increasingly visible. Katharina Pistor’s definition of the Fed’s monetary sovereignty captures this paradoxical state of affairs: the capacity to backstop a financial system over which it was unable to exercise substantial control.

The desire to drive down wages has become untethered from any actual or plausible reason for doing so.

While this escalation of the logic of the asset economy served to keep asset values afloat—preventing the Great Recession from turning into a second Great Depression—it was unable to recapture the spirit of financial democratization of the 1990s and early 2000s. From the mid-2010s, as each round of asset purchases did less and less to help ordinary people and minted more and more new millionaires, the Fed grew more aware of the contradictions of its own position. Cautiously, it sought to highlight the inherent difficulty of managing an economic system that had effectively ruled out social spending and wage increases. Janet Yellen’s willingness to issue occasional reminders of the need for fiscal stimulus led Republicans to portray her as a tax-and-spend liberal, destroying any remaining chance of reappointment after her first term. But the concerns of Yellen’s successor, current Fed chair Jerome Powell, were for some time broadly similar. Although he took much greater care to maintain support for his policies across the political spectrum (not always successfully), he nonetheless felt compelled to point out that monetary policy was not able to solve all economic problems.

The pandemic blew such critical concerns out of the water. The public health emergency created an unusual situation for managers: workers suddenly became less dispensable. The resulting mild upward pressure on wages never became sustained or systemic, but mainstream media outlets nonetheless worked hard to conjure the threat of a wage-price spiral. (A Bloomberg article is emblematic of the genre: it acknowledges the structural context of wage stagnation but quickly goes on to argue that any wage increase could potentially set off the “dreaded spiral” and to wonder if the Federal Reserve could be underestimating the risk.) The prominence of these concerns reflected the extent to which the 1970s has shaped the contemporary political imagination.


In the end, as we all know now, inflation did return, and it has proven stickier than many expected. The Biden administration’s pandemic stimulus was intended to forestall a deflationary movement, and the combination of falling unemployment and a union organizing drive has pushed up wages in select sectors. But the key ingredient that would justify talk of a wage-price spiral is missing: workers’ ability to command wage increases that factor in anticipated price increases.

This is the key difference between the 1970s and the present. The assault on organized labor over the past decades has weakened the bargaining position of workers, leaving them unable to demand compensatory wage rises en masse. The ability and willingness of corporations to increase their mark-ups and profits have been far more significant in turning the transitory inflation associated with pandemic supply chain disruptions into sustained upward pressure on consumer prices.

These facts are not in dispute. And yet, the mainstream conversation has continued to revolve around the danger of a wage-price spiral and the need for workers to tighten their belts. On Twitter, Keynesian economist Olivier Blanchard—not a particularly reactionary character—worried about the difficulty of “convincing workers that unemployment has to increase to control inflation” in a situation where they are not responsible for inflation. This is not economic analysis: it’s a neural cramp, induced by several decades of rationalizing the interest of an increasingly narrow and inaccessible middle class in the value of its asset holdings.

This logic requires not simply that workers bear the brunt of the problems that other people create but also that they acknowledge that doing so is in their own interest. Jane Elliott has argued that neoliberalism is not primarily interested in denying people agency, but in ensuring that they actively use their freedom to make their own situation worse. The strange bind that inflation discourse has imposed on workers—wage cuts will hurt you and your family, but all the other options are even worse—is emblematic of that state of affairs. The abject failure of contemporary Keynesianism consists in its acceptance of this logic, as if it reflected not a particular (neoliberal) way of seeing and thinking but an ontological condition ensuring that any attempt by ordinary people to improve their lives will always backfire.

The real threat is that the asset economy will keep figuring out new ways to endure.

Powell, for example, has been upfront with the American people: the purpose of interest rate rises is to “get wages down.” Whereas Ricardo and his generation of classical political economists at least had specific ideas about what a correct wage level was, no one at the Federal Reserve today can specify a standard by which we could say that wages are too high. This is how we’ve ended up in a situation where Summers’s oracle-like number-spouting—“We need five years of unemployment above 5% to contain inflation”—is broadcast as serious commentary. It is tempting to dismiss these as arbitrary numbers, pulled out of thin air or perhaps from somewhere else. The real issue is not that they are wrong so much as beside the point, given that wages are not responsible for inflation in the first place. Should corporations see additional opportunities for raising prices to boost profits, Summers will just revise his numbers upward, demanding yet more sacrifice. The desire to drive down wages has become untethered from any actual or plausible reason for doing so.

This all feels very archaic, even feudal. The idea that neoliberal market thought is now rapidly giving way to a political movement simply reasserting traditional hierarchies is shared by reactionary conservatives like Josh Hawley and by critical theorists like Jodi Dean and Robert Brenner, who are interested in the notion that we are entering an era of neo-feudalism.

It’s certainly not a bad moment to remind ourselves that the possibility of attaining a decent standard of living on the basis of wage-labor is a fairly recent invention, historically speaking, and that a capitalist economic order is perfectly consistent with the ownership of other people’s human capital and various forms of unfree labor. But it’s not particularly helpful to think of these trends as signaling a break with the economic model that has emerged over the past two decades. They are, more plausibly, simply the next stops on a road we’ve been traveling down for a while.

The real threat is not so much that the asset economy has definitively run out of steam; it’s that it will figure out new ways to keep going. Of course, in the current moment we can see interest rate increases hurting asset values as well as wages. But to imagine that these variables normally move in tandem would be to ignore the lessons of almost half a century of neoliberalism. A new phase of asset-driven growth is, in any case, what the wealthy are counting on: they are currently busy buying up assets because the market lull is a good moment to add to their portfolios before things take off again. If the Federal Reserve is successful in what it euphemistically calls “stabilizing inflation,” it will be facing a more extreme version of the problem it navigated in past years, having to push even more liquidity into the system through asset purchases simply to keep it going.

It’s theoretically possible that current developments will create a situation in which clearly delineated, easily comprehensible alternatives to the hegemony of neoliberalism will present themselves. But capitalism has a way of confusing our options, and the intensifying contradictions of progressive neoliberalism do not by themselves portend any particular future. What we should be asking ourselves is why we keep walking Blanchard’s Möbius strip. If there was a brief moment in the twentieth century when it really seemed that workers had figured out how to live in a system that was premised on their exploitation, we have now lived through several decades in which the inherent impossibility of that project has been impressed on us at every turn. People like Blanchard are now spelling it out for us in the most literal way possible.