“Would the world be a better, or even a different, place if the public understood more of the scope and the limitations, the findings and the methods of science?” This question was taken up in 1985 by the UK’s Royal Society, one of the world’s oldest and most distinguished scientific bodies. A committee chaired by geneticist Sir Walter Bodmer answered in the affirmative: yes, a scientifically literate public would make the world a better place, facilitating public decision-making and increasing national prosperity.

It is commonly assumed that anti-science sentiment stems primarily from ignorance.

Nearly four decades later, this view remains very popular—both within expert communities and without. The public, it is assumed, knows little about science: they are ignorant not just of scientific facts but of scientific methodology, the distinctive way scientific research is conducted. Moreover, this ignorance is supposed to be the primary source of widespread anti-science attitudes, generating fear and suspicion of scientists, scientific innovations, and public policy that is said to “follow the science.” The consequences are on wide display, from opposition to genetically modified foods to the anti-vax movement.

This influential conception of the relations between science and society helped underwrite what has become known as the “knowledge deficit model” of science communication. The model posits an asymmetric relation between scientists and the public: non-scientists are seen as passive recipients of scientific knowledge, which they should accept more or less uncritically according to the dispensations of scientific experts. As Steve Miller notes, “this model adopted a one-way, top-down communication process, in which scientists—with all the required information—filled the knowledge vacuum in the scientifically illiterate general public as they saw fit.”

Viewed this way, the problem of public support for science has a clear enough solution: we need better science education, broadly construed. Educational and intellectual institutions must do a better job of teaching science, while scientists need to better communicate their findings—and the way they arrived at them—to the general public. The challenge, in short, is to improve the knowledge and reasoning of the masses. Once citizens know more and reason better about science, robust public support will follow, and scientific progress—backed by popular consensus—will be freed to deliver its benefits to society without irrational or ignorant pushback.

These are ultimately empirical claims about the social world; as a result, they can be tested. And in fact, the deficit model has not fared well in the face of evidence over the last two decades. For starters, the approach simply does not reliably deliver the expected results of wider support and acceptance of science among the public. Despite concentrated efforts in science education and dissemination, periodic surveys on the public’s understanding of and attitudes toward science both in the United States and in the UK indicate little to no change in scientific literacy over the years. With respect to vaccines, in particular, interventions based on providing scientific evidence refuting vaccination myths have largely proven ineffective.

With its top-down, technocratic view of scientific communication, the deficit model has also been criticized for being insufficiently aligned with democratic values such as equality, autonomy, and participation. Some have argued that since citizens fund research and innovation as taxpayers, they should have a say in how these resources are administered and spent. Moreover, since decisions taken in scientific contexts can have deep, disruptive, and differential impacts on society, citizens should have the opportunity to express their opinions and preferences on whether and how research and innovation will disrupt their lives.

The “knowledge deficit model” has not fared well in the face of evidence over the last two decades.

In light of the apparent failure of the deficit model, alternative approaches to science communication have flourished over the last two decades, inspired by the idea that addressing anti-science sentiment requires transforming our scientific and political institutions. According to the deficit model, scientific institutions did not need to be dramatically transformed; the point was to win support from an incompetent public through education and communication. By contrast, these alternative approaches contend that scientific research and innovation should be brought closer to society—not as a matter of one-way instruction but as a matter of two-way responsiveness, participation, and accountability.

In place of the narrow goal of fostering scientific literacy, these accounts contend that we should look to the broader goal of facilitating cooperation between scientists and citizens (even while we also work to improve science education). They call for a shift from encouraging public understanding of science to promoting public engagement with science. And they view the public not as a fount of ignorance or a passive recipient of scientific enlightenment but as a reservoir of “local knowledge”—rooted in the expertise that arises from personal experience—whose insight can and should inform the practice of science.


Several lines of contemporary scholarship emphasize what might be gained by moving from a focus on science education to a focus on reciprocal power-sharing, cooperation, and exchange between researchers and citizens. Some, such as philosopher Heather Douglas, have argued for the democratization of science. Since many scientific decisions have a significant impact on society, they should not be made solely by a minority elite, however well-trained or knowledgeable; they should instead be the result of processes in which all those affected have the opportunity to participate (albeit in different ways). Moreover, as philosophers Pierluigi Barrotta and Eleonora Montuschi have argued, science should itself be responsive to society: adopting a synergistic approach that allows different people to contribute their diverse experiences and bodies of local knowledge would make it possible to raise and address significant new research questions, gather relevant data, and attain new knowledge. In a similar vein, science and technology studies scholar Sheila Jasanoff recommends the adoption of “technologies of humility,” whereby stronger citizen participation would make science governance more accountable.

Others have emphasized that community engagement is of pivotal practical importance, for example when citizens express fears regarding vaccines. Engagement with the public may even be viewed as a moral imperative that scientists cannot escape. As historian of science Naomi Oreskes puts it in reference to climate change, scientists have a “sentinel responsibility to alert society to threats about which ordinary people have no other way of knowing.” Engaging and involving different segments of society is fundamental to understanding the challenges communities face and to developing research that is sensitive to those challenges and thus able to serve societal needs.

In reality, public distrust is often animated by concerns over spurious interests—above all, monetary or political incentives.

One example of a framework for putting these ideas into practice is Horizon 2020, the European Union’s research and innovation funding program from 2014 to 2020, which managed a budget of nearly €80 billion. Horizon 2020 adopted the Responsible Research and Innovation (RRI) policy framework, which “requires all societal actors (researchers, citizens, policy makers, business, third sector organisations etc.) to work together during the whole research and innovation process.” In this scheme, science should be done with and for society; research and innovation should be the product of the joint efforts of scientists and citizens and should serve societal interests. To advance this goal, Horizon 2020 encouraged the adoption of dialogical engagement practices: those that establish two-way communication between experts and citizens at various stages of the scientific process (including in the design of scientific projects and planning of research priorities).

This may sound like progress, but there are reasons to doubt that such efforts represent a meaningful shift in the relationship between science and society. For one thing, Horizon 2020’s successor, Horizon Europe, was initially criticized for sidelining RRI initiatives. Furthermore, a wealth of evidence suggests the deficit model remains deeply entrenched in scientific and policymaking communities. In a recent paper on “The Lure of Rationality,” for instance, Molly Simis and colleagues argue that significant portions of scientific communities still tend to view the public as non-scientific, and moreover “as an ‘other’ entity that they are not part of.” Jack Stilgoe and colleagues likewise lament how the paradigm of public engagement has come to function as a “procedural” strategy to “gain trust for a predetermined approach,” leaving existing power structures intact. Taken together, this work suggests that the public engagement narrative has come to function more as “rhetoric” than reality.

Many factors may help to explain why the deficit model persists. One is that there is indeed a deficit in knowledge, but it is to be found on the scientists’ side. Scientists rarely learn about science communication and usually receive little to no formal training in how to be good communicators, especially for popular audiences. As a result, they know too little about how individuals form opinions on scientific matters to support, design, plan, and implement science communication strategies that go beyond science education. (The Bodmer report, for its part, did emphasize the importance of providing such training for scientists themselves, not just journalists or specially designated science communicators.)

Another reason for the persistence of the deficit model may be that it is particularly attractive from a policymaking perspective. It offers a relatively simple origin story for anti-science attitudes and points to a relatively easy solution—at least, one that requires relatively little of scientific and political institutions themselves. A recent example of a well-meaning but arguably limited approach to the problem along the lines of the deficit model is the Stanford report “Science Education in an Age of Misinformation,” which emphasizes the importance of promoting a better understanding of how science works as a way to counter scientific misinformation.


Whatever the reason, the persistence of the deficit model is alarming, since it over-promises what science education and one-way science communication can achieve. Confronting two of the most urgent crises we currently face—the COVID-19 pandemic and the catastrophic effects of climate change—requires massive social coordination and widespread popular buy-in. We need understanding of and compliance with public health measures such as vaccination and mask-wearing, where appropriate, and we need more sustainable individual choices along with political pressure on governmental bodies to drastically reduce carbon emissions.

The deficit model over-promises what science education and one-way science communication can achieve.

It is therefore imperative that viable alternatives to the deficit model of science communication be found and effectively implemented. Rather than focusing on the epistemic dimension—what the public knows about science and how it works—such alternatives must grapple more directly with the nature and sources of trust in scientific and governmental institutions. It is notable, as we have argued elsewhere, that this “trust deficit” is not primarily fueled by an epistemic concern—the perceived incompetence of scientists, say. Rather, public distrust is often animated by concerns over spurious interests—above all, monetary or political incentives that are perceived to compromise the reliability or legitimacy of scientific knowledge claims.

Philosopher Maya Goldenberg’s recent book, Vaccine Hesitancy: Public Trust, Expertise, and the War on Science (2021), explores these issues in the context of opposition to vaccination. In some cases, vaccine skepticism may indeed be fueled by basic misunderstanding. But as Goldenberg compellingly argues, vaccine hesitancy can also be a sign of reasonable distrust of medical and scientific institutions rather than a result of misunderstanding or a war on scientific knowledge and expertise. She identifies two main factors contributing to reasonable hesitancy: legacies of scientific or medical racism and the commercialization of biomedical science. Indeed, multiple studies have shown how historical patterns of mistreatment—including medical experimentation without consent and the exclusion of certain groups from clinical trials—help to explain why some communities remain deeply suspicious of scientific interventions. In short, while segments of the public may concede the competence of scientific experts—their possession of the relevant knowledge and skills in a given scientific area—they may simultaneously doubt their benevolence.

In situations like this, one-way communication from scientific authorities may not be sufficient to restore trust—even when it seeks to demonstrate why a particular scientific claim is not a product of illegitimate or discriminatory practices. The obstacle in such cases isn’t an irrational reluctance to consider new information. It’s that the information itself is deemed unreliable, in part because of a negative moral assessment of the person or institution conveying it. Addressing these perceived moral failures of scientific or medical institutions may require much more than the mere conveyance of technical information.

Ultimately, trust entails vulnerability. If I trust you enough to let your input influence important decisions that I make about my life, I make myself vulnerable to you; I give you a certain amount of power over me. Health care decisions are especially risky in this regard, and the risk is only compounded for marginalized communities. As philosopher Katherine Hawley has noted, “those who are more comfortably situated can afford to be more trusting, since they can more easily bounce back if they get things wrong.” Of course, distrust can be risky too; failing to get a COVID-19 vaccine significantly increases one’s risk of serious disease. But framing the issue in terms of trust and vulnerability clarifies why it’s misleading to think of these issues solely in terms of knowledge deficits. As physicians Michelle Morse and Bram Wispelwey wrote in these pages last year, “Rather than ask what response to past harm might make our institutions worthy of trust, the effect is to lay the blame on marginalized communities and to distract from the underlying source of mistrust.”

Social trust in science must be earned and cultivated, and this process depends as much on power as on knowledge.

Of course, distrust of scientific institutions has also been fueled by misinformation campaigns meant to discredit traditional authoritative sources of knowledge, particularly against the background of the recent rise of populism. These efforts create a hostile, unwelcoming environment for the production and dissemination of scientific information. The politicization of COVID-19 vaccines is a case in point. In this context, countering the spread of misinformation and the political weaponization of anti-science discourse requires much more than well-designed science communication initiatives or even robust fact-checking. We must be mindful of the ways scientific and political discourses are intertwined—and of the limits of what science communication and popularization can achieve on their own amid political power struggles.

In the end, the knowledge deficit model fails because it views public trust and acceptance of science primarily as an epistemic problem—a matter of too little knowledge. What we need instead are approaches that give equal weight to the moral and political factors shaping the relationship between science and society. Social trust in science must be earned and cultivated, and this process depends as much on power as on knowledge. Even in an ideal case, no amount of consensus about scientific facts or the mechanisms of knowledge production will eliminate disagreement about the policies we should pursue as a democratic society—a political and moral question that inevitably turns on values.

Exactly what institutional changes are necessary to better cultivate trust is an empirical matter, open to debate and the lessons of the social sciences. But recognizing the true nature of the problem—that it is as much moral and political as epistemic—is a necessary first step toward finding solutions.