To state the obvious, humans are the creators of new technology and can shape the path it takes (at least for now). Yet one common characteristic of basic income advocates, and indeed of progressives more generally, is a near-fatalistic acceptance of the current path of technological development. Either the topic is avoided altogether or automation is seen as inevitable. For a progressive movement that routinely challenges the market discipline of capitalism, this constitutes a striking retreat.


A truly progressive agenda around the future of work should therefore add control over technology into the mix: control of which technologies are developed and to what ends, and how they are incorporated into the organization of work and production. Moreover, this agenda needs to expand beyond the current fixation on automation. Technology has many other equally important effects on work: for example, deskilling or upskilling existing tasks; shifting consumer demand toward new industries and new jobs; enabling outsourcing and the integration of a global virtual labor force; changing the job-matching process; or disaggregating or aggregating workforces.

To be clear, this is a friendly amendment. In the future we will likely need some form of income replacement, as well as the stronger public sector and more robust collective bargaining laws that Brishen Rogers rightly calls for. But we should demand more.

What would it look like to claim our right as a society to govern technological development and its effects on workers and the labor market? Here are several possible strategies that move from less to more interventionist.

Mitigation

It is time for progressives to develop a robust and well-funded mitigation agenda. Basic income is one form of mitigation, but fleets of omniscient robots are decades away. There are plenty of near- and medium-term technologies whose effects we can anticipate or already see. Immediate forms of restitution could include industry- or occupation-specific funding pools and the technology equivalent of Trade Adjustment Assistance (education, training, and job placement for workers whose jobs are affected by new technologies). Any number of business-side taxes could be leveraged for funding, including the robot tax endorsed by Bill Gates or a requirement that Uber pay into a fund for every self-driving car it puts on the road. And again, mitigation is not just about responding to automation. We might devise a deskilling tax, or mandatory retention and retraining laws for when skill-changing technologies are introduced in the workplace.

Whatever specific tools we decide to use, a robust cost–benefit analysis of new technologies will be needed. Imagine that we include as metrics the number of workers displaced, the loss in their lifetime earnings, and the impact on their health and their children's earnings. How would self-driving trucks fare under such an analysis? Even if the benefits still ended up outweighing the full societal costs, we would at least have a metric by which to assess restitution. Or perhaps a model of truck automation would emerge that preserves some percentage of the workforce to guide and manage the fleet.
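To make the arithmetic concrete, here is a minimal, purely illustrative sketch in Python of how such a societal-cost tally might work. The field names and figures are hypothetical assumptions introduced for illustration, not estimates drawn from any actual analysis of trucking or any other industry.

```python
# Illustrative sketch only: a hypothetical societal cost-benefit tally for a
# technology deployment (e.g., self-driving trucks), using the metrics named
# above. All figures and field names are assumptions, not real estimates.

from dataclasses import dataclass

@dataclass
class DisplacementImpact:
    workers_displaced: int          # number of workers whose jobs are eliminated
    lifetime_earnings_loss: float   # average lost lifetime earnings per worker ($)
    health_cost: float              # estimated health-related cost per worker ($)
    child_earnings_loss: float      # estimated earnings loss for a worker's children ($)

def societal_cost(impact: DisplacementImpact) -> float:
    """Sum the per-worker costs across the displaced workforce."""
    per_worker = (impact.lifetime_earnings_loss
                  + impact.health_cost
                  + impact.child_earnings_loss)
    return impact.workers_displaced * per_worker

def net_benefit(private_benefit: float, impact: DisplacementImpact) -> float:
    """Benefits minus full societal costs; even when positive, the societal
    cost itself gives a baseline figure for restitution (funding pools,
    retraining, and the like)."""
    return private_benefit - societal_cost(impact)

# Hypothetical example with invented numbers.
example = DisplacementImpact(workers_displaced=100_000,
                             lifetime_earnings_loss=250_000.0,
                             health_cost=20_000.0,
                             child_earnings_loss=30_000.0)
print(societal_cost(example))          # total cost to be weighed and restituted
print(net_benefit(4.0e10, example))    # benefits net of full societal costs
```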

Collective bargaining

Because unions are currently fighting for their lives, the first instinct within labor can be to obstruct technology. It is likely that important opportunities are missed as a result. In workplaces where unions still have enough density, the deployment of new technologies should become a topic of bargaining. In the 1960s and ’70s, the longshoremen’s union (ILWU) bargained over the adoption of shipping containers, thereby ensuring job security for incumbent workers, incentives for early retirement, and guaranteed pensions. But a lot of technological change is incremental and affects individual tasks rather than entire jobs. As a result there are many small decisions that cumulatively affect skill requirements, task mix, worker discretion, and promotion opportunities. Management consultants should not be the only voices guiding unionized employers when those decisions are made.

Technological change within one industry can also open up opportunities in another. For example, meal delivery apps are disrupting the food supply chain by delivering prepared meals and meal kits directly to consumers. But beneath the high-tech gloss lie surprisingly traditional work structures: scores of workers in large food processing facilities, many of them direct employees. Investigative reporting on Blue Apron's plants last year uncovered low wages and serious health and safety violations. If this new industry segment grows and thrives, it could offer fertile organizing ground.

A more ambitious approach is to figure out how to harness new technology for organizing. For example, alt-labor groups are exploring whether the aggregation provided by on-demand platforms can help organize workers who have long been isolated in disaggregated workplaces, such as domestic work. One barrier is that these platforms often do not allow worker-to-worker communication (which is no accident). Why not regulate labor platforms as a condition of receiving a business license, so that they would have to enable secure communication between workers and agree to remain neutral if workers organize?

Governance

While mitigation and bargaining over impacts are important, ultimately the progressive goal should be governance: a seat at the table when decisions are made over which technologies are developed in the first place and in pursuit of which goals.

Governance can take several forms. One option is to control technology via direct regulation. Consider, for example, computer algorithms that produce discriminatory outcomes in lending, hiring, or sentencing. Legal scholars are actively debating what type of anti-discrimination legal regime is needed to address these cases, which could potentially lead to regulating machine learning itself (since it depends on classification schemes). Product market regulation is another key arena. For example, how different would ride-sharing look if legislators had resisted Uber's lobbyists and classified Uber as a taxi company? Taxi apps would still have been developed, but likely with different effects on drivers. The question of who can access the big data generated by private-sector firms is also receiving attention; often it is customers and workers who contribute that data in the first place. Greater access by government could reveal underlying business models (such as predatory pricing) that might then be subject to regulation.


It would also be possible to take an active role in shaping technology via a multi-stakeholder model. The key insight is that there are multiple paths of technological development. Optimizing efficiency by reducing or eliminating human input is not the only path; within any given occupation or industry there are alternatives where technology works with humans to improve productivity. But how to shape what engineers call the design choice—augmentation vs. automation—is not yet clear.

Ideally we would establish mandated oversight structures that allow for multi-stakeholder decision making over what is developed. We would greatly expand the goals of innovation—to eliminating poverty, saving the planet, realizing the full potential of every human being, and ending dangerous and back-breaking work—and maybe even insist that some amount of work has intrinsic value to humans. And we would harness the powerful fact that public dollars fund a lot of technological development, often in universities. As the saying goes, venture capital only funds the last mile.

Opportunities for this ambitious form of governance might be found in industries where there is a clear public interest. In the health care sector, for example, the path of technological development is not set in stone. Current attempts to introduce new technologies such as electronic patient records, automatic medication dispensers, and computer-assisted diagnosis have run into myriad challenges, some due to lack of federal standards, some due to competing goals, some due to unintended effects. A social bargaining model backed up by regulation could help pave the way to a health care system that uses technology to free up workers to deliver high-quality, patient-centered care.

• • •

Is any of this feasible? Regulating and shaping technological change will require an enormous amount of power—over the private sector, over government, and over universities. But that is equally true of the basic income model and Rogers’s extensions of it, which are predicated on there being sufficient political will to generate the needed revenue and change U.S. labor law. If we are willing to challenge capital to fund a basic income response to automation, then why not also try to govern technology directly?

The progressive case is that alternatives to shareholder capitalism exist and can thrive in the United States. By extension it should be possible to design and implement technology in a way that complements and values human work and is economically viable.

A final word: the tech sector is forging ahead without us. Last year Google, Facebook, Amazon, IBM, and Microsoft formed the Partnership on Artificial Intelligence to Benefit People and Society (Apple joined soon after), with the goal of establishing an ethics of artificial intelligence that ensures it is developed with fairness, inclusivity, transparency, and an eye toward privacy. Reportedly, stakeholders from civil society will be invited to participate, but in the end this is self-regulation. Who will be at the table to represent the voices of affected communities and workers, and how much power will they bring?