“Just remember, what you’re seeing and what you’re reading is not what’s happening” – Donald Trump
If we accept the premise that American society has intentionally damaged its ability to make decisions, we can return to John Boyd’s OODA framework to see exactly how various political, cultural, and technological forces have been used to subvert each stage of the decision-making cycle.
I’ve created my own parallel framework to reflect this “counter-OODA” approach to misinformation, disinformation, and propaganda. I call it the OCDN (Obscure + Confuse + Dither + Nullify) doom loop:
It’s worth noting that Boyd himself anticipated the dangers of using the OODA Loop outside the field of battle. Boyd biographer Robert Coram states:
“If someone truly understands how to create menace and uncertainty and mistrust, then how to exploit and magnify the presence of these disconcerting elements, the Loop can be vicious, a terribly destructive force, virtually unstoppable in causing panic and confusion … This is true whether the Loop is applied in combat, in competitive business practices, in sports, or in personal relationships.”
He later adds:
“The key thing to understand [about the Loop] is not the mechanical cycle itself, but rather the need to execute the cycle in such a fashion as to get inside the mind and the decision cycle of the adversary. This means the adversary is dealing with outdated or irrelevant information and thus becomes confused and disoriented and can’t function.”
As a citizen living in a confused, disoriented, and dysfunctional society, my hope is that by placing our current climate of disinformation into the context of the OODA Loop, we can begin to develop a plan that allows us to tackle each of these components in a coordinated fashion.
A few notes before we get started:
- Ever since Tim Berners-Lee published the first website in August of 1991 – the same month a failed coup in Moscow set the USSR on its path to dissolution – more powerful browsers, search engines, and social media applications have fueled explosive growth in both the content and usage of the internet. While there is no doubt this technology has had a net-positive effect on our society, its disruptive nature has also created an extraordinary crisis in our information ecosystem. Much of what follows reflects the growing pains of this revolutionary change in communication.
- In cataloging these forces, I recognize that there may be gaps in my coverage and that my summaries may be over-simplified or misplaced. I’m sure that many books will be (and have been) written on each and every topic … I just wanted to keep it simple.
- I tried to focus on the techniques themselves instead of pointing too many fingers, but many of the current examples stem from the information vortex that is the Trump administration.
OBSERVE vs. OBSCURE
The first step in any decision-making process involves the observation of outside stimuli. Because we are constantly bombarded with such a high level of information, we naturally depend on selective filtering mechanisms to help us focus on what’s most important. These mechanisms can be co-opted by external forces in several ways:
- Addition of false information
- Reduction of true information
- Creation of distractions that make it hard to absorb any information
Let’s take a look at these techniques individually:
Addition of False Information
Studies have shown that false information distorts people’s recollection of past events, interferes with their judgement, and reduces their trust in reliable sources. Repeated exposure to false statements also increases the likelihood that someone will accept them as true (something called the “illusory truth effect”). These characteristics make disinformation – the intentional spread of false information – an ideal tool for degrading a society’s decision-making capabilities.
While foreign actors like Russia have been doing this for years, new digital media tools have allowed them to turbocharge their disinformation campaigns using something called the “firehose of falsehood” approach. Also known as “censorship through noise” or “flooding the zone with shit”, this propaganda model is built on a high-volume, multichannel environment that force-feeds people so much information that they are required to take shortcuts in their evaluation of new information. This technique dilutes and controls the messages people receive, making them susceptible to false narratives and alternative facts. (The full RAND report on this technique is worth reading.)
The sheer volume of false information is designed to divert attention from authentic content, create doubt about reliable sources, and launder false narratives so they “feel” true.
“What we’re facing is a new form of propaganda that wasn’t really possible until the digital age. And it works not by creating a consensus around any particular narrative but by muddying the waters so that consensus isn’t achievable.”
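The arithmetic behind this dilution is blunt. As a toy illustration (the item counts below are hypothetical, not drawn from the RAND research), the share of true content a reader encounters collapses as false items flood the mix:

```python
# Toy model of "censorship through noise": a reader samples posts in
# proportion to their volume, so flooding the mix with false items
# shrinks the share of true ones they ever see.

def true_share(true_items, false_items):
    """Fraction of sampled content that is true."""
    return true_items / (true_items + false_items)

print(true_share(100, 100))   # balanced feed -> 0.5
print(true_share(100, 1900))  # same truths, drowned in noise -> 0.05
```

Reversing the ratio is expensive: to get the mix back to half-true, those hundred true items would need nineteen hundred more like them. That asymmetry of effort is exactly what the firehose exploits.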
Domestic organizations seeking to influence people (or make money off them) have adopted this method and built a loosely affiliated network of websites, radio stations, television stations, Potemkin news sites, and social media applications to manipulate the information environment. These echo chambers draw people in by serving up misleading content that is designed to provoke a reaction rather than inform. (Great summary here.)
In some cases, these networks have achieved almost complete epistemic closure, a level of closed-mindedness and indoctrination in which no outside information gets past the filters. Fox News, originally created in 1996 as an alternative to a perceived liberal bias in mainstream media, has spawned a completely separate conservative information ecosystem trafficking in outrage, pseudoscience, and conspiracy theories (e.g. OAN, Infowars, QAnon).
Reduction of True Information
The introduction of the internet (and the subsequent disruption of the traditional media industry) has reduced or eliminated many sources of reliable information and accountable journalism. This is particularly true for local news outlets, many of which have either been absorbed by larger competitors, forced to make staffing reductions, or disappeared altogether. This decline means less coverage of school board meetings, routine government activities, and other topics of local concern.
The profession of journalism itself has also taken a hit. Say what you will about mainstream media bias, but at least the journalistic code of ethics paid lip service to the principles of truthfulness and accuracy. The loss of nearly 50% of the country’s news analysts, reporters, and journalists in the past twelve years has reduced focus on fact checking, objectivity, fairness, and accountability. (Bloggers, social media influencers, talk radio personalities, and political pundits are not held to these same standards.)
Finally, there are several ways to restrict what people see and hear through direct and indirect suppression of information. Journalists who have survived the implosion of the news industry may be faced with partisan attacks designed to silence them and further weaken the pillars of fact-based journalism. These attacks can include physical threats, doxing, and other acts of intimidation that are intended to punish – and limit – unfavorable coverage. Similarly, cancel culture (or call-out culture) uses public shaming and the threat of sanctions to neutralize the opinions of controversial public figures.
Since Constitutional protections against censorship do not apply to private corporations, social media companies are free to block users, take down posts, and restrict access to content. Businesses can also strategically sue critics, withhold research, or pressure workers to stay silent. Government self-censorship includes the removal or alteration of websites, the firing of whistleblowers, the scrubbing of federally funded research, and the reduction of access to public information.
Creation of Distractions

Our brain generally tries to filter out extraneous information in order to make more important items stand out. However, magicians, politicians, and criminals have always understood the value of controlling an audience’s attention through distraction and misdirection. Outrageous claims, threats, gaslighting, and anonymous rumors generate chaos that draws focus away from more substantial concerns. These feints manipulate people’s perception of an event and can even affect how the event is remembered.
Google, Facebook, and other platforms have aided and abetted this confusion by developing search and recommendation algorithms that favor profit over truth and consensus. These algorithms are designed to keep you continuously engaged in the service of someone else’s business model. Every click on a cat video or tiny home tour gets fed back into an equation that is optimized for generating ad revenue instead of benefiting your own – or society’s – health and welfare.
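A caricature of that feedback loop – with made-up items and click rates, not any platform’s actual system – fits in a few lines: every click nudges an item’s score upward, and the feed greedily serves whatever currently scores highest.

```python
import random

def update_score(score, clicked, lr=0.1):
    """Nudge an item's score toward its observed engagement."""
    return score + lr * ((1.0 if clicked else 0.0) - score)

def pick_next(scores):
    """Serve whichever item has drawn the most engagement so far."""
    return max(scores, key=scores.get)

# Hypothetical catalog and click-through rates for a simulated user.
scores = {"cat_video": 0.5, "tiny_home_tour": 0.5, "local_news": 0.5}
click_rate = {"cat_video": 0.7, "tiny_home_tour": 0.6, "local_news": 0.3}

random.seed(0)
for _ in range(200):
    item = pick_next(scores)                      # what the feed shows
    clicked = random.random() < click_rate[item]  # simulated reaction
    scores[item] = update_score(scores[item], clicked)

# Note what's missing: nothing in the loop measures truth, usefulness,
# or the viewer's well-being -- only clicks.
```

The objective function is engagement, full stop; any property of the content that correlates with clicks – outrage included – gets amplified as a side effect.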
ORIENT vs. CONFUSE
The orientation stage of Boyd’s OODA Loop is powered by a mix of experience, training, innate abilities, cultural traditions, and other influences. It is the mind in action … analyzing, synthesizing, judging, and recombining what is observed in an attempt to make sense of it.
Your mindset plays a critical role in how you view the world. A poor mindset can cause people to see patterns that aren’t there or ignore things that are. Mindset also has a direct impact on what you see and how you act (the “implicit guidance and control” in the Loop), and it is unique to each individual. The inherent complexity of this phase makes it the most difficult to understand – and perhaps the easiest to disrupt.
Factors that can influence the delicate interplay of thought include the following:
- Cognitive biases
- Intellectual vices
- Deskilling of the population
Cognitive Biases

When people make decisions, they often use simple heuristic rules – mental shortcuts – to reduce the strain on their working memory. Unfortunately, while these patterns of reflexive thinking can produce faster outcomes, they can also lead to errors in judgement and sub-optimal decisions known as “cognitive biases”.
Examples of common cognitive biases include confirmation bias (the tendency to look for evidence that confirms one’s existing beliefs), the availability heuristic (the tendency to give more credence to something that’s easy to remember), the illusory truth effect (the tendency to believe false information to be correct after repeated exposure), framing (the tendency to draw conclusions based on how information is presented), and the Dunning-Kruger Effect (the tendency for unskilled individuals to overestimate their own abilities). (All definitions are taken – more or less – from Wikipedia.)
Disinformation techniques like the “firehose of falsehood” take advantage of these innate human biases by “laundering” false statements to make them seem more believable. The sheer volume of information can also trick the human brain into seeing patterns and connections where none exist, leading to support for conspiracy theories and other bizarre notions.
Closed information ecosystems, meanwhile, cause people to discount evidence that comes from unfamiliar sources and glom onto information that comes from familiar ones. Over time, this can destroy any sense of shared reality.
Tribalism is a specific type of bias in which people separate themselves into social groups with similar interests and/or cultures. Most of the time, the differences between these groups are minimized in order to maintain social order. During times of stress, however, fractures may develop that cause these groups to drift apart and compete with one another for power.
“[H]uman beings are hardwired for tribalism. We compulsively (and unconsciously) divide the social landscape into ingroups and outgroups; selectively process information that affirms the virtues of the former and the vices of the latter; and allow our self-esteem to rise and fall with the status of our team.”
Individuals are heavily influenced by the behavior of others in their tribe. If they are struggling with incomplete information, they will look to others in their in-group to see how they respond. Social pressures within the tribe can also cause people to suppress individual thoughts in a desire to fit in. This results in a dysfunctional decision-making process driven by groupthink, virtue signaling, and preference falsification. Decisions end up being based on tribal narratives rather than objective reality.
Writing about the development of coalitions, evolutionary psychologist John Tooby notes:
“[T]o earn membership in a group you must send signals that clearly indicate that you differentially support it, compared to rival groups. Hence, optimal weighting of beliefs and communications in the individual mind will make it feel good to think and express content conforming to and flattering to one’s group’s shared beliefs and to attack and misrepresent rival groups. The more biased away from neutral truth, the better the communication functions to affirm coalitional identity, generating polarization in excess of actual policy disagreements.”
Biblical literalists dispute the findings of evolutionary biologists because it threatens their worldview. Multiculturalists downgrade the value of the Western canon because it ignores women and minorities. Energy companies question the motives of climate scientists because it jeopardizes their profits. Anti-vaxxers ignore the advice of pediatricians because they fear harmful side-effects. The list goes on and on, with each subsequent disagreement serving to reinforce the divisions between groups.
Intellectual Vices
Intellectual vices are character traits or attitudes that interfere with the acquisition and evaluation of knowledge. They differ from biases in that they are not innate human qualities but rather poor thinking styles that can only be changed through careful development of the mind.
Examples of intellectual vices include wishful thinking, prejudice, closed-mindedness, contempt for truth, dogmatism, overconfidence, selective attention, mistrust of experts, negligence, conformity, carelessness, rigidity, insensitivity to detail, obtuseness, lack of thoroughness, and epistemic insouciance (i.e. indifference to whether one’s claims are based on fact … in other words, the hallmark of a bullshit artist).
People with a high level of intellectual “viciousness” are more likely to endorse conspiracy theories, believe fake news stories, and support other questionable beliefs. Intellectual vices are also associated with poor judgement, the dismissal of contrary evidence, and a generally unreliable approach to inquiry.
Deskilling of the Population
Skills are abilities or proficiencies acquired through deliberate effort over time. They represent practical knowledge that can be acquired through training instead of habits of mind that must be developed. The loss of skills such as media literacy, critical thinking, and metacognition can have a negative impact on decision-making ability by disrupting the feedback loops people use to improve their thought processes.
A decline in media literacy – the ability to identify credible sources of information – contaminates the mental filters that people use to screen out unreliable information. A steady diet of social media, reality TV, fake documentaries, and staged reenactments, for example, appears to have eroded the American public’s sense of what’s real and what’s entertainment (e.g. mermaids, Bigfoot).
People are regularly tricked by deep fakes, photoshopped images, and other manipulated media. The reputations (and messages) of authentic institutions are undercut by their false portrayal as villains in online conspiracies or mockumentaries. The character (and expertise) of public figures is called into question through slander or doctored recordings of them doing or saying something they didn’t. Meanwhile, the trustworthiness of other sources is elevated through these same methods, making criminals look like heroes and fools look like brainiacs.
The loss of critical thinking and metacognition skills (both of which can be thought of as “thinking about thinking”) compounds these challenges by trapping people in a state of arrested mental development. New ideas and new ways of thinking are snubbed. Complex problems are ignored by people with little stomach for nuanced debate. Past decisions are left unanalyzed, leading to poor outcomes and no chance of improvement.
DECIDE vs. DITHER
Decision-making is the process of choosing a course of action from several different options. Ideally, there is an obvious, rational choice available, but there may be times – during a military engagement or a business deal – when a surprising or unexpected decision proves advantageous. Each alternative in a range of choices is shaped and evaluated through the analysis and synthesis of information gathered in previous stages. These alternatives are also influenced by a mix of motivating/demotivating and supportive/hindering factors.
Emotional biases can influence decision-making by clouding judgment, sapping one’s motivation, or altering perceptions of a given situation. Anxiety about a particular outcome can lead someone to choose a less risky option. A happy person might make a different decision than they would if they were sad. A person experiencing jealousy or pride might opt for something else.
The primary emotions at play in the current political climate are anger, fear, anxiety, and disgust. Angry and fearful people tend to make pessimistic judgements about the future and are more likely to have a heightened sense of perceived risk. Anxiety disrupts the decision-making region of the brain’s prefrontal cortex while disgust can cause people to retreat from the unfamiliar and make them less cooperative. All of these negative emotions can be triggered by frightening images, disquieting narratives, and divisive language.
Politicians, pundits, and other media figures are aware of these tendencies and frequently use fearmongering and sensationalism to heighten the sense of uncertainty and ambiguity felt by their audience. This tension makes it easier for them to steer people toward specific conclusions (and keep them coming back for more).
Writing about the use of fear in media, Dr. Deborah Serani states:
“The success of fear-based news relies on presenting dramatic anecdotes in place of scientific evidence, promoting isolated events as trends, depicting categories of people as dangerous and replacing optimism with fatalistic thinking.”
(Interesting side note: my wife pointed out to me that people often use humor to defuse anger and suggested that this can sometimes reduce the motivation for taking action. Research shows that laughter can indeed override other emotions while inhibiting regions of the brain involved in decision-making. Perhaps humor introduces a natural dampening effect that can be used to prevent rash decisions.)
Like any other human activity, decision-making requires energy and stamina. People who are hungry, thirsty, or drowsy are less able to concentrate and more likely to make poor choices or engage in impulsive behavior. Studies show, for example, that people make better choices in the morning when they are fresh and alert than they do later in the day.
Mental fatigue can also wear people down and make it harder to choose. The stress of having to make too many decisions (decision fatigue) or of being presented with too many options for a single decision (analysis paralysis or choice overload) can make people choose unwisely or cause them to avoid making a decision altogether. The concept of present bias – the tendency of people to value short-term rewards over better long-term outcomes – can also lead people to discount decisions that might benefit their future selves.
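Present bias is commonly formalized as quasi-hyperbolic (“beta-delta”) discounting, in which any reward that isn’t immediate takes an extra one-time penalty. A minimal sketch, with illustrative parameter values rather than empirically fitted ones:

```python
def perceived_value(reward, delay, beta=0.7, delta=0.98):
    """Subjective value of a reward `delay` periods in the future.

    delta models ordinary impatience; beta < 1 adds the extra,
    present-biased penalty on anything not available right now.
    """
    if delay == 0:
        return float(reward)
    return beta * (delta ** delay) * reward

# $100 today vs. $150 a year (12 periods) out: the later, larger
# reward is discounted below the immediate one, so "now" wins.
take_now = perceived_value(100, 0)    # 100.0
wait = perceived_value(150, 12)       # ~82.4

# Push both options a year further out and the one-time beta penalty
# hits them equally, so the larger reward regains its edge:
later_small = perceived_value(100, 12)  # ~54.9
later_big = perceived_value(150, 24)    # ~64.7
```

The reversal in the last two lines is the signature of present bias: people discount their future selves’ options consistently but make an exception for “right now”.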
ACT vs. NULLIFY
The last step in any decision-making process is the action step. This step involves both executing the choice made in the decision phase and evaluating that same choice through observation of the outcomes. In that sense, the action step can also be viewed as the start of another decision cycle where any results are also inputs to the next decision (hence the “loop” of the OODA Loop).
Following through on a decision isn’t always easy. Whether external circumstances block your efforts or your own lack of commitment holds you back, taking action means devoting time and effort to completing a task. Anything that interferes with this step reduces the chance that something will get done.
Organizations frequently take advantage of this struggle by throwing up obstacles to action. Companies introduce paperwork, long waits, and other bureaucratic delays to discourage people from cashing in rebates, registering complaints, or making claims. Politicians and their enablers do the same by adding “friction” to the election process … making it harder to vote, harder to count votes, and harder to distribute votes fairly (e.g. gerrymandering, manipulation of the Census).
This deliberate disenfranchisement not only prevents action, but it also eliminates vital feedback used to improve future decisions. Mistakes go uncorrected, engineering problems fester, and political policies are allowed to atrophy. The inevitable result is a surprise … often in the form of abrupt, unpleasant change.
IN CLOSING …
A fully functioning democracy requires its citizens to both recognize a shared reality and make good decisions based on that reality. Understanding the ins and outs of the human decision-making process allows people to recognize when they may be drifting away from this consensus and make course corrections. Only through continuous improvement of the individual citizen’s decision-making capabilities can we hope to rebuild our society over the next generation.
In his book The Demon-Haunted World, astrophysicist Carl Sagan states:
“One of the saddest lessons of history is this: If we’ve been bamboozled long enough, we tend to reject any evidence of the bamboozle. We’re no longer interested in finding out the truth. The bamboozle has captured us. It’s simply too painful to acknowledge, even to ourselves, that we’ve been taken. Once you give a charlatan power over you, you almost never get it back.”
To counter the bamboozle, he advocates for the development of a personal “baloney detection kit,” a set of tools for thinking skeptically. Here’s hoping that the OCDN Doom Loop is something you can add to your own kit to fight the good fight.
Happy Election Eve, everyone!
Image credit: Anna Kinde
Series: The Short-Circuiting of the American Mind
11/3/2020 – “But it’s not just American politics that’s in disarray. It’s our whole information ecosystem. Trump’s presidency fortified the alternate realities that Americans live in, the contradictory sets of facts that they accept and the competing truths that they tell.” (https://www.nytimes.com/2020/11/03/opinion/joe-biden-2020.html)
11/13/2020 – How America was primed for disinformation about the 2020 election: https://fivethirtyeight.com/features/americans-were-primed-to-believe-the-current-onslaught-of-disinformation/
11/16/2020 – “Responding to scientific evidence and insight is now a source of national strength and therefore power; those [who] ignore it will not be able to build resilience to contain shocks … It is hard to mobilize the relevant actors, ideas, and resources to face a threat that political leaders do not believe in.” (https://www.theatlantic.com/ideas/archive/2020/11/pandemic-revealing-new-form-national-power/616944/)
12/14/2020 – Intellectual virtues, baby! (https://www.nytimes.com/2020/12/14/opinion/trump-voter-fraud-education.html)
6/25/2021 – “[The] #StopTheSteal campaign … is a massive and devastatingly effective deployment of Russian-style information warfare against American democracy — by Americans themselves — with an eye toward the future. We should think of it not as a momentary partisan outburst but a kind of epistemic 9/11: a moment when a menace that has been developing for years reaches maturity and displays its full prowess.” (https://www.washingtonpost.com/opinions/2021/06/25/war-truth-is-raging-not-everyone-recognizes-were-it/)