fluoride fallacies: consensus
Like all informal logical fallacies, the argument from authority is often less a cheap rhetorical tactic than a misemployed heuristic. Because (by definition) a significant burden of evidence or reasoning has been shifted to an authority, whatever scrutiny might have been aimed at that evidence or reasoning now bears upon the source of authority.
In fact, authority is, at least in today’s world, an indispensable source of belief. For instance, when getting dressed we implicitly rely on several authorities: To believe that our clothing is appropriate for the day’s weather, we rely on the authority of weather forecasters; that it is appropriate for our occupation, that of tradition or of seniority; that it is appropriate to our appearance, that of fashion; that it has not been negligently laced with toxins, those of moral standards and of government regulatory agencies; and so on.
Obviously there are authorities generally, or at least widely, thought to be reliable that nonetheless are not. We can know this not just because authorities disagree, but because we have more reliable sources of evidence regarding many things in life than authorities. For example, we don’t (or oughtn’t) rely on weather forecasters to know whether it is currently raining in our own neighborhood, or on political commentators to know whether wealth is fairly distributed, or (solely) on the generic advice of any of our confidants to know what course of action to take with respect to a thorny social situation.
An especially common authority cited with regard to an issue is the expert—someone trained, knowledgeable, and experienced in the issue itself or in some discipline(s) relevant thereto, who can convey and contextualize current best knowledge on the subject and its implications for people’s lives. We rely on the experts who prescribe our meds and tune up our bikes as much as we rely on those who inform our healthcare professionals and policymakers. When an expert speaks about their area of expertise, we tend to listen. (While it’s another story in itself, we even tend to listen carefully to experts when we’re predisposed to—even intent upon—doubting them. When we fall into denial, we crave their validation even as we reject their credibility.)
There is, however, a categorically stronger authority than any one expert, or even any team of experts, whose testimony may bear directly upon an issue: an expert consensus. While there are a variety of important reasons for this, especially in the sciences—familiarity with the nuances of the issue, the statistical training and critical thinking skills required to understand current knowledge and make reasonable (and financially attractive) predictions, the (increasingly) collaborative nature of research, the formal peer-review process, the lure of novelty and especially of debunkery, etc.—a simple probabilistic analogy illustrates how, in the absence of other intelligible evidence, an expert consensus is far and away a more reliable source than any individual expert. (I’ll come back to this in a moment.)
The fluoride wars exhibit a great deal of argument from authority, and rightly so: The simply-put question of whether to raise (or lower) the fluoride content of municipal water supplies to a common level requires evaluating benefits, risks, and costs; taken together, the relevant areas of expertise range far beyond dentistry, from organic chemistry to toxicology to ecology to geology to biostatistics, and on and on. No single person is qualified to review the raw data from every domain and synthesize it into a definitive answer. There is no shame in relying on experts.
We rely on imaginative, skeptical, and well-trained researchers to produce, incrementally, a reliable body of evidence addressing each point: correlations between fluoridation levels and caries (negative), fluorosis (positive), and other physiological conditions (none); biological mechanisms; dose responses; net savings; environmental impacts; and so on. We rely on the broader expert communities to scrutinize this work. We rely on public representatives and their consultants to review the evidence and make policy recommendations. And, at the local level, we rely on the local water authority to manage our water in a way consistent with those recommendations.
From the anti-fluoridationist literature, however, one gets the impression that the expert community has been consistently, drastically, culpably wrong. Well, experts get things wrong all the time, but this has much to do with the sheer number of experts and of areas of expertise. Most experts will be right about their area of expertise most of the time, and a lot more often than anyone else on average. Experts also sometimes collude with nefarious individuals or organizations to become shills for some product or ideology (or both), but, as with other crimes and disasters (of the legal or humanitarian variety), we are disproportionately exposed to such instances, relative to instances of basic (and typical) scientific integrity.
Returning to simple probabilities: Conservatively taking a consensus to consist of 90% of experts and (ultraconservatively) only 100 experts to exist, on the assumption of independent assessments of the existing evidence we would expect, when any given expert might be wrong with probability p, a consensus of experts to simultaneously be wrong with probability

$$ q \;=\; \sum_{k=90}^{100} \binom{100}{k}\, p^k (1-p)^{100-k}, $$
which could only ever be 5% larger than p (q/p is maximized at around p=0.95) and is profoundly less than p whenever p is not close to 1 (p<0.8 already implies that q<p/140)—and in almost any setting it is absurd to expect that any expert would incorrectly assess a body of evidence with probability greater than 0.5 (since it would be absurd to consider anyone an expert in the first place who could not provide a more reliable assessment than pure chance). While the assumption of independent assessments is itself incredibly naïve, in cases (such as this) involving a community of experts tens of thousands strong, it is entirely reasonable to expect that at least 100 independent assessments of the evidence have been conducted, and therefore that a few orders of magnitude constitute a charitable bound on the relative reasonableness of an expectation that some experts have it wrong and one that a consensus has it wrong.* (The historico-mathematically literate will have recognized the Marquis de Condorcet’s jury theorem.)
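The figures above are easy to check numerically. Here is a quick sketch in Python, using the same assumptions as the model (100 experts, a consensus of at least 90, each erring independently with probability p):

```python
from math import comb

def consensus_wrong_prob(p, n=100, k=90):
    """Probability that at least k of n independent experts are
    simultaneously wrong, when each errs with probability p."""
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

# q/p peaks near p = 0.95 at roughly 1.05, and q collapses
# once p drops much below 0.9.
for p in (0.5, 0.8, 0.95):
    q = consensus_wrong_prob(p)
    print(f"p = {p:.2f}  ->  q = {q:.2e}  (q/p = {q:.2e})".replace("q = ", "q = ", 1))
```

Running this bears out the claims in the text: at p = 0.95 the ratio q/p is only a few percent above 1, while at p = 0.8 it has already fallen by more than two orders of magnitude, and at p = 0.5 a false consensus is for all practical purposes impossible.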
This realization may motivate the conspiracymongering that invariably accompanies objections to water fluoridation (or any other government program). But this move confuses an improvement of the likelihood for an improvement of the posterior: It is incredibly unlikely for modern experts to reach a false consensus on so sharp (quantitative) an issue as the efficacy, safety, and cost-effectiveness of water fluoridation on the assumption that they are generally acting with integrity and diligence. Such a state of affairs is incredibly likely, however, on the assumption that said expert community is engaged with various government agencies in an extensive conspiracy to deceive the public. Fine. But the plausibility of such a broad, collaborative, and persistent conspiracy in the first place is arguably far smaller than that of a false consensus.
So, in lieu of conspiracy theories, how does one leverage expert authority to build a case against an expert consensus? The short answer is that one cannot. Other evidence must be taken into account, to refute the consensus opinion or to demonstrate that it is compromised (and probably a whole lot of both). The former requires something approaching an expert understanding of the issue itself—much greater, for instance, than is exhibited by an opportunistic splattering of scatterplots—while the latter requires strong evidence of widespread negligence or conflict of interest—much stronger, for instance, than the lure of an inconsequential profit margin for professional dental organizations to collude with Big Shit.
The long answer is that there are several ways to spin a preponderance of expert support for a position into a case for far more caution towards it than is actually due. Each provides an illuminating example of the fallacious argument from authority. In coming post(s), i’ll outline three such maneuvers, cite some examples of each, and discuss a theme they exhibit that reveals a perhaps more essential (and unsettling) authority in the minds of fluoridation opponents.
* I don’t intend this simplistic model to serve as a justification for trusting expert consensus (which would take a lot more differences into account than sheer number); rather, i mean with it to illustrate how sheer numbers can produce very substantial consequences from very basic and reasonable assumptions.