In the throes of a pandemic that won't quit, many Americans are anxious, and not only about COVID-19; they're also fearful about vaccines, chemicals, and even (non-existent) "chemtrails," to name just a few. Inexplicably, even after more than a million U.S. deaths from COVID-19, the U.S. population remains under-vaccinated and under-boosted. While California has gotten more than 70% of its population fully vaccinated, a large number of states – including Missouri, Georgia, Arkansas, Alabama, Wyoming, and Indiana – have barely reached 50%, in spite of exhortations by political leaders and medical professionals.
According to Naval War College professor Tom Nichols, we're witnessing the "death of expertise": "a Google-driven, Wikipedia-based, blog-sodden collapse of any division between professionals and laymen, students and teachers, knowers and wonderers – in other words, between those of any achievement in an area and those with none at all."
The pandemic has brought armchair epidemiologists and self-styled infectious disease experts out in droves, and especially with policies in flux, this is not a trivial problem. It confounds policymakers and regulators, who feel compelled to seek non-expert input on decisions, wasting time and taxpayers' money, and who become increasingly reluctant to contravene even uninformed, misguided vox populi.
Science is not democratic. The citizenry does not get to vote on whether a whale is a mammal or a fish, or on the boiling point of water; legislatures cannot repeal the laws of nature, although legislators in Indiana once tried to redefine the mathematical constant pi.
While it is certainly important to improve the public's appreciation of the methodology of science and to enable non-experts to understand the rationale for government policy, it is less useful, and generally counterproductive, to ask them to help formulate policy.
A frequently cited model for direct citizen involvement is Denmark's "consensus conferences," where non-experts are invited to bring the basic "common sense" they have derived from their "worries, visions, general view and actual everyday experience."
This approach, which the Danes have applied to any number of highly technical scientific issues – including food irradiation, molecular genetic engineering, environmental chemicals, and human-genome mapping – has consistently led to hyper-precautionary regulation and innovation-destroying legislation.
The experience of many other nations hasn't been any more promising. In 2003, the British government, local authorities and other organizations spent a half-million pounds to hold hundreds of public discussions and focus groups in an attempt to gain insight into the public's views on genetic engineering.
The result? As Mark Henderson, the science correspondent for The Times, put it:
The exercise has been farce from start to finish ... One of the six meetings ... spent much of its time discussing whether the SARS virus might have come from [genetically modified] cotton in China. It's more likely to have come from outer space.
Henderson noted that the meetings were dominated by anti-technology zealots, the only faction that was organized and impassioned enough about the issue to attend. We see that as well in the scripted, troll factory-generated comments in response to U.S. government requests for public comment on many proposed regulations.
Other examples come from the U.S. National Science Foundation, whose primary mission is to support laboratory research across many disciplines. Two decades ago, NSF started funding a series of "citizens technology forums," at which ordinary, previously uninformed Americans were brought together to solve thorny questions of technology policy.
For NSF's citizens' forum on nanotechnology (science conducted at the nanoscale, which is about 1 to 100 nanometers), organizers selected "from a broad pool of applicants a diverse and roughly representative group of 74 citizens to participate at six geographically distinct sites across the country." Participants were informed by "a 61-page background document – vetted by experts – to read prior to deliberating."
The result was an incoherent and – despite the 61 pages of background – uninformed set of conclusions and recommendations. Two general themes emerged, however: The public was suspicious, worried, and generally hostile to the technologies, but if they resulted in breakthroughs, the participants also wanted to be sure the government would "guarantee access to them if they prove too expensive for the average American."
Despite this dismal record, the European Food Safety Authority (EFSA) still thinks public consultations on highly technical, arcane issues are a good idea, and two years ago announced three more of them, on these subjects: (1) gene drives, which "comprise genetic elements that can pass traits among sexually reproducing organisms at a frequency greater than the rate expected by simple Mendelian inheritance"; (2) synthetic biology, which "employs an engineering‐based approach to build novel biological systems which have potential food and feed applications"; and (3) guidelines concerning certain techniques for gene editing, or genome editing – specifically, "the biosafety of plants developed through type 1 and type 2 Site‐Directed Nucleases (SDN‐1 and SDN‐2) and oligonucleotide directed mutagenesis."
These exercises will undoubtedly replicate the failures of their predecessors.
Politicians like to pay lip service to public engagement on regulatory issues, even when those issues require an understanding of sophisticated and complex science. Former Secretary of Agriculture Dan Glickman once said that there must be public trust "in the regulatory process that ensures thorough review [of genetically engineered plants] – including complete and open public involvement."
The question is, how does one secure that trust? Should we conduct a referendum on the approval for marketing of a second-generation COVID-19 vaccine or a new pesticide, or for the design of a nuclear power plant?
The bottom line is that decades of soliciting public engagement on overregulated entities and products such as nuclear power, pesticides, and nanotechnologies have failed to gain public trust and acceptance. Nor has the subordination of evidence-based policymaking to emotional and political considerations increased public acceptance or encouraged innovation. Let's stop doing it.
Henry I. Miller, a physician and molecular biologist, was a research associate at the NIH and the founding director of the FDA's Office of Biotechnology.