Barbara Fried’s article concludes with a hopeful note that the future of blame might be better than the past. But there are two movements pulling in opposite directions that should worry us. On the one hand, entrenched inequality and a weak economic recovery help to personalize economic failure, making ample room for blame. And it could get worse. On the other hand, statistical techniques may contextualize individual actions and de-emphasize blame, but in ways that do not further Fried’s egalitarian vision.

Five and a half years into the biggest downturn since the Great Depression, the economy remains fundamentally weak. We are roughly 10 million jobs away from full employment, with anemic job and wage growth. These are now familiar facts, yet the crisis has changed remarkably little in our public narrative of how the economy works.

The right reads the economic crisis through the language of blame and individual choices. Homeowners are blamed for getting in over their heads with bad mortgages, and there are no current legislative proposals that would redistribute this burden more fairly. Conservative economists claim that high unemployment is primarily driven by individuals who lack marketable skills and by employable people who choose unemployment in order to take advantage of the social safety net. The available data do not bear this out.

Meanwhile, even some liberal arguments downplay the need to bring the economy to full employment. Instead, like liberal economic managers in the 1990s, today’s liberals focus on reducing the deficit and investing in long-term projects.


There is a reason for this oversight: the explosion in inequality over the past 30 years. Inequality isn’t just a matter of income stratification. It’s also about social distance. Those in the top 1 percent have recovered economically, so it is difficult for them to identify with people who can’t find work, or to understand how foreclosures devastate neighborhoods and how mass unemployment undermines self-worth and the cohesion of communities.

As a recent report from the think tank Demos noted, the rich are more likely than the non-rich to say that deficit reduction is their highest priority, that the earned income tax credit shouldn’t be increased, and that the government has no role in providing a decent standard of living or jobs for the unemployed. They regard their own successes entirely as a reflection of individual merit. The entrenched narrative that individual choices alone lead to success or failure bolsters a culture of blame and the public policies that result.

But even if the narrative of blame were to change, if it were to include a holistic view of individual actions, it is not clear that such change would lead to more egalitarian policies. A future in which people’s actions are situated more fully in their environments could be more authoritarian. Such a future is possible thanks to revolutionary statistical methods and newly available massive data sets that contextualize individual actions and reveal patterns in human behavior. Right now these algorithms pick movies and books for us, but soon they will structure how we carry out the government’s responsibilities. We need to proceed with caution.

Recall the role that environments played in the intellectual justification for rising incarceration in the 1980s. James Q. Wilson, whose passage opens Fried’s essay, argued that “disorder” and “outsiders” converged to create the conditions of mass crime. But rather than call for broader equality of opportunity or rehabilitation, he and others called for more systematic arrests and policing to restore order.

Today policing also reacts to background conditions instead of individuals, now by using statistical techniques. New York’s COMPSTAT, for example, is used to predict crimes and deploy police resources accordingly. The ACLU argues that this approach encourages police to fish for crimes in order to fill reporting quotas, in turn leading to the incredibly low success rates of the city’s infamous stop-and-frisk program and to skyrocketing misdemeanor arrests.

As the legal scholar Bernard Harcourt argues in Against Prediction (2006), the application of statistical techniques to crime fighting reflects the desire to situate criminals in a more completely understood environment, or to better understand criminals themselves. But in effect the statistical approach leads to “emphasizing efficiency over crime minimization” and to an explosion in incarceration.

Perhaps we are at a moment when attitudes toward blame are changing. But this change could be for the worse. It could mean statistical methods are used to justify abuses in the deployment of government force, while doing little to push back against the reactionary, libertarian view of how the economy works.