Of course some conspiracy theories, under our definition, have turned out to be true. The Watergate hotel room used by the Democratic National Committee was, in fact, bugged by Republican officials, operating at the behest of the White House. In the 1950s, the Central Intelligence Agency did, in fact, administer LSD and related drugs under Project MKULTRA, in an effort to investigate the possibility of “mind control.” Operation Northwoods, a rumored plan by the Department of Defense to simulate acts of terrorism and to blame them on Cuba, really was proposed by high-level officials (though the plan never went into effect).
For our purposes, the most useful way to understand the pervasiveness of conspiracy theories is to examine how people acquire information. For most of what they believe that they know, human beings lack personal or direct information; they must rely on what other people think. In some domains, people suffer from a “crippled epistemology,” in the sense that they know very few things, and what they know is wrong. Many extremists fall in this category; their extremism stems not from irrationality, but from the fact that they have little (relevant) information, and their extremist views are supported by what little they know. Conspiracy theorizing often has the same feature. Those who believe that Israel was responsible for the attacks of 9/11, or that the Central Intelligence Agency killed President Kennedy, may well be responding quite rationally to the informational signals that they receive. Consider here the suggestive fact that terrorism is more likely to arise in nations that lack civil rights and civil liberties. An evident reason for the connection is that terrorism is an extreme form of political protest, and when people lack the usual outlets for registering their protest, they might resort to violence. But consider another possibility: When civil rights and civil liberties are restricted, little information is available, and what comes from government cannot be trusted. If the trustworthy information justifies conspiracy theories and extremism, and (therefore?) violence, then terrorism is more likely to arise.
Rather than taking the continued existence of the hard core as a constraint, and addressing itself solely to the third-party mass audience, government might undertake (legal) tactics for breaking up the tight cognitive clusters of extremist theories, arguments and rhetoric that are produced by the hard core and reinforce it in turn. One promising tactic is cognitive infiltration of extremist groups. By this we do not mean 1960s-style infiltration with a view to surveillance and collecting information, possibly for use in future prosecutions. Rather, we mean that government efforts might succeed in weakening or even breaking up the ideological and epistemological complexes that constitute these networks and groups.
How might this tactic work? Recall that extremist networks and groups, including the groups that purvey conspiracy theories, typically suffer from a kind of crippled epistemology. Hearing only conspiratorial accounts of government behavior, their members become ever more prone to believe and generate such accounts. Informational and reputational cascades, group polarization, and selection effects suggest that the generation of ever-more-extreme views within these groups can be dampened or reversed by the introduction of cognitive diversity. We suggest a role for government efforts, and agents, in introducing such diversity. Government agents (and their allies) might enter chat rooms, online social networks, or even real-space groups and attempt to undermine percolating conspiracy theories by raising doubts about their factual premises, causal logic or implications for political action.
In one variant, government agents would openly proclaim, or at least make no effort to conceal, their institutional affiliations. A recent newspaper story recounts that Arabic-speaking Muslim officials from the State Department have participated in dialogues at radical Islamist chat rooms and websites in order to ventilate arguments not usually heard among the groups that cluster around those sites, with some success. In another variant, government officials would participate anonymously or even with false identities. Each approach has distinct costs and benefits; the second is riskier but potentially brings higher returns. In the former case, where government officials participate openly as such, hard-core members of the relevant networks, communities and conspiracy-minded organizations may entirely discount what the officials say, right from the beginning. The risk with tactics of anonymous participation, conversely, is that if the tactic becomes known, any true member of the relevant groups who raises doubts may be suspected of government connections. Despite these difficulties, the two forms of cognitive infiltration offer different risk-reward mixes and are both potentially useful instruments.
There is a similar tradeoff along another dimension: whether the infiltration should occur in the real world, through physical penetration of conspiracist groups by undercover agents, or instead should occur strictly in cyberspace. The latter is safer, but potentially less productive. The former will sometimes be indispensable, where the groups that purvey conspiracy theories (and perhaps hatch conspiracies themselves) formulate their views through real-space informational networks rather than virtual networks. Infiltration of any kind poses well-known risks: perhaps agents will be asked to perform criminal acts to prove their bona fides, or (less plausibly) will themselves become persuaded by the conspiratorial views they are supposed to be undermining; perhaps agents will be unmasked and harmed by the infiltrated group. But the risks are generally greater for real-world infiltration, where the agent is exposed to more serious harms.