Disinformation Handbook: A Concise Guide to Countering Disinformation (1)
Part One: Concepts and disinformation tactics
Introduction
We frequently hear about disinformation, misinformation, and “fake news,” but sometimes these concepts and terms are muddled. Natto Thoughts readers might find it useful to have a concise overview of the disinformation topic in one place, with how-to tips on protecting oneself from being deceived.
The present report is Part One of our disinformation handbook. It provides some concepts associated with disinformation as well as tactics that information operators use. Part Two suggests ways you can detect disinformation and avoid being harmed or manipulated by it. It lists organizations and people who work to detect and counter disinformation and the techniques they use, including developments related to artificial intelligence. Part Three provides links to handbooks on disinformation and how to counter it, as well as to the Natto Team’s own postings.
The work of corrosion
You may have seen the award-winning 2006 movie “The Lives of Others.” In that film, officers of East Germany’s dreaded Ministry for State Security, commonly known as the Stasi, seek to silence a dissident playwright. They set out to destroy the playwright’s beauty- and love-filled life, eventually driving his girlfriend to betrayal and suicide. As enshrined in a 1976 directive, the Stasi had a system for psychologically breaking their subjects through methods that included spreading false rumors about the person, that is, disinformation. The Stasi’s system was known as Zersetzungsarbeit, a word German-born security expert Thomas Rid translates as “the work of corrosion.” As Rid explains in his seminal book Active Measures, “Zersetzung is a morbid German word for disintegration by malicious external forces–it applies to the disintegration of a body politic, of a community, or an individual’s mental stability, or even of human tissue.” Disinformation was a key part of the Stasi’s Zersetzungsarbeit, which Rid shows was aimed at perceived enemies not only in East Germany but also worldwide.
Disinformation – referring to the deliberate spread of false or misleading information – can undermine trust in human relationships, social and political institutions, and even the idea of truth itself. False information spread online can ruin a business or a person’s reputation and can spark harassment or violence, sometimes leading to psychological trauma, injury, or death. Misleading publicity can dissuade people from taking lifesaving health measures, cause bank runs, and enable stock market manipulation. Disinformation sharpens partisan conflicts that can undermine and dehumanize society; as researchers have shown, when test subjects are told that an opposing group is violent or corrupt, the test subjects feel they have license to act violently or corruptly themselves. Disinformation can undermine trust in elections and other democratic institutions and suppress voter turnout, thus affecting election outcomes. Indeed, according to the RAND Corporation, a US think tank, Russian military theorists have likened information weapons to weapons of mass destruction in their strategic effect.
What we are talking about: basic terms and definitions
Sometimes the media do not distinguish between disinformation and misinformation.
Disinformation refers to the deliberate and often covert spread of false or misleading information, whereas misinformation refers to the sometimes-unwitting spread of “incorrect or misleading information.” In addition, some analysts have coined the term malinformation to refer to the deliberately misleading use of correct information to “inflict harm on a person, organization or country.”
A more inflammatory term is fake news, referring to “false information, often of a sensational nature, that mimics news media content,” as the Skeptical Science Debunking Handbook defined it.
One type of disinformation involves falsified video, images, or audio. This can include deepfakes – the use of artificial intelligence to generate synthetic faces or to put words into the mouth of a familiar-sounding voice. But alterations made with less sophisticated technology are more widespread; they are sometimes called cheapfakes.
Spreading false information sometimes employs the same techniques as legitimate business and political speech. Information campaigns can cover all kinds of marketing and outreach operations, not always involving falsehood. As the Data & Society Research Institute notes, “Information campaigns are organized communicative activities that aim to reach large groups of people. … they are deliberate attempts to persuade. The terms advertising, public relations, public diplomacy (or public affairs), information operations, and propaganda all describe deliberate, systematic information campaigns, usually conducted through mass media forms.” “Propaganda,” for example, refers to “ideas, facts, or allegations spread deliberately to further one's cause or to damage an opposing cause,” whether spread by a government or any other group; it does not necessarily connote false information, although the term is often popularly used that way.
Persuasive efforts that are less benign and more manipulative are often referred to as information operations. Originally a military term for “the strategic use of technological, operational, and psychological resources to disrupt the enemy’s informational capacities and protect friendly forces,” the term “information operation” has recently come to refer to “deliberate and systematic attempts to steer public opinion using inauthentic accounts and inaccurate information.” An operation that uses social media or other online platforms, or that includes the use of cyber threat activity against the target’s computer systems, is known as a cyber-enabled information operation.
People using information operations often do so because they perceive themselves to be in a state of virtual warfare in the social or political realm. Information warfare (IW) refers to conflict and competition in the information space, whether in the cyber realm of computer and communications systems or in the psychological realm of public opinion and decision-making. It is generally the realm of state-to-state conflict, although fights in the information space, such as those between tobacco companies and health advocates or between oil companies and environmental advocates over the effects of cigarettes and fossil fuels, have also been described as information wars. Russia has a well-developed doctrine of information confrontation (информационное противоборство), which builds on a long Soviet history of pursuing strategic goals through propaganda and so-called active measures – covert attempts to influence other countries’ political landscape and decision-making. China has also been developing concepts of information warfare since the mid-2000s and has an agency, the United Front Work Department, that seeks to influence people abroad.
Many countries have developed doctrines and strategies for information operations as part of so-called grey-zone (gray-zone), hybrid, non-linear or irregular warfare – interstate competition and conflict in circumstances short of declared war. Information operations are a type of asymmetric warfare that even countries with relatively weaker military capabilities can undertake.
Caveat: terms like “disinformation” and “fake news” sometimes get used indiscriminately. Individuals and governments can dismiss any criticism as “fake news,” or they can rush to label a story as disinformation even before the true facts have been established. Part Two of this report will provide tips for assessing the reliability of online information and discuss the various efforts being made to identify disinformation at scale.
Concepts
Here are some concepts and phenomena related to information operations:
Active measures – covert attempts to influence other countries’ political landscape and decision-making. For a history of Soviet active measures, see Thomas Rid, Active Measures.
Asymmetric warfare – conflict between two countries or groups that are of unequal strength militarily. The weaker party can draw on non-kinetic or non-military strengths such as cyber or information operations.
Coordinated inauthentic behavior (CIB) – a term that Facebook coined as a synonym for influence operations, which it defines as “coordinated efforts to manipulate public debate for a strategic goal where fake accounts are central to the operation.” A sociologist has defined CIB as “a manipulative communication tactic that uses a mix of authentic, fake, and duplicated social media accounts to operate as an adversarial network (AN) across multiple social media platforms.”
Cyber-enabled information operations (CEIO or CyIO) – can be understood in a broad sense, referring to any information operations occurring in cyberspace, or in a narrower sense as information operations that include cyber threat activity against computer systems, such as hack-and-leak operations.
Disinfodemic – defined by UNESCO as an “epidemic” of disinformation. The term derives from “misinfodemic,” a phrase coined by Harvard researchers in 2018 to refer to “the spread of a particular health outcome or disease facilitated by viral misinformation.” In 2019 iDefense, part of Accenture Security, used the term “disinfodemics” to refer to a hypothetical scenario in which “disinformation intended to degrade public health could be utilized by states to target adversarial nations. Actors are likely to use fabricated media content or falsified personas to discredit legitimate sources of health information and further degrade them through cyberattacks.” These concerns proved prescient, as charges of disinformation flew back and forth when the COVID-19 pandemic hit in 2020. The term “disinfodemic” can also refer to any disinformation that takes advantage of an epidemic to “achieve specific political agendas” or “promote hate speech and aggression against various ethnic and minority groups.”
Disinformation-for-hire – also known as “dark PR” or “black PR,” a subset of “influence-for-hire,” or “influence operations as a service.” A seminal BuzzFeed article on the subject called it “an end-to-end online manipulation system, which can influence people on a massive scale — resulting in votes cast, products sold, and perceptions changed,” whether hired by private individuals or firms or by governments. Two disinformation-for-hire groups fought on opposite sides of political contests in Africa’s Sahel region in the past several years, as the Washington Post reported in October 2023.
Inauthentic persona or account – a false persona or account on social media that an information operator uses to amplify messaging in service of their strategy.
Influence-for-hire – see “disinformation-for-hire.”
Information operation – Originally a military term for “the strategic use of technological, operational, and psychological resources to disrupt the enemy’s informational capacities and protect friendly forces,” the term “information operation” has recently come to refer to “deliberate and systematic attempts to steer public opinion using inauthentic accounts and inaccurate information.”
Information warfare (IW) – conflict and competition in the information space, whether in the cyber realm of computer and communications systems, or in the psychological realm of public opinion and decision-making.
Informational nihilism – a term Caroline Orr Bueno uses that ”describes a phenomenon whereby information is stripped of its informational value in order to more effectively weaponize it.”
Liar’s dividend – when people are so wary of disinformation that they come to dismiss genuine information as inauthentic.
Tactics and Techniques that Information Operators Use
Algorithmic manipulation – seeking to influence perceptions by taking advantage of the formulas that “boost popular content” on social media and other online platforms. Types of algorithmic manipulation include “hypernudging, microtargeting, and filtering”; hashtag manipulation; “keyword curation” and “strategic signaling”; and “search engine optimization (SEO) poisoning.”
Amplification – boosting or propagating a message on social media, sometimes using bots or other inauthentic accounts, sometimes with the help of artificial intelligence (AI) chatbots.
Astroturfing – creating a social movement or campaign “falsely made to appear grassroots.” See also “Influence laundering.”
Asymmetric flooding – a tactic to “selectively manipulate levels of attention to politics” by, for example, increasing the amount of political messaging on channels favoring one party just before an election, while distracting the adherents of the opposite party with non-political postings to reduce their incentive to vote.
Copypasta – “an amplification technique that involves specific text being copied and pasted by many accounts,” according to disinformation research firm Logically.
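To make the pattern concrete, here is a minimal Python sketch of how an analyst might surface copypasta in a collection of posts: normalize each post’s text and flag wording that many distinct accounts have posted verbatim. The input format (account and text pairs) and the ten-account threshold are assumptions made for illustration, not any particular firm’s methodology.

```python
# Minimal copypasta-detection sketch (illustrative assumptions throughout).
import re
from collections import defaultdict

def normalize(text: str) -> str:
    """Lowercase, drop URLs and punctuation, collapse whitespace."""
    text = re.sub(r"https?://\S+", "", text.lower())
    text = re.sub(r"[^\w\s]", "", text)
    return re.sub(r"\s+", " ", text).strip()

def find_copypasta(posts, min_accounts=10):
    """posts: iterable of (account_id, text) pairs.
    Return texts posted verbatim by at least min_accounts distinct accounts."""
    accounts_by_text = defaultdict(set)
    for account_id, text in posts:
        key = normalize(text)
        if key:
            accounts_by_text[key].add(account_id)
    return {text: accounts for text, accounts in accounts_by_text.items()
            if len(accounts) >= min_accounts}
```

Real investigations also look at near-duplicates and coordinated timing, but even this simple exact-match grouping can expose the tell-tale wall of identical posts.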
Crisis actors – the term can refer to the legitimate use of people pretending to be disaster victims in trainings for first responders; however, it also refers to the use of paid actors in staged reports of disasters. Russia has used crisis actors, as Natto Thoughts has pointed out. Because awareness of this phenomenon has spread, real disaster victims are sometimes dismissed as crisis actors.
Cross-platform coordinated campaigns – engagement and interaction between accounts across platforms that can produce an effect disproportionate to the seemingly small audiences on any one platform, as the Natto Team has discussed here.
False hacktivism – the deceptive use of personas posing as idealists opposed to corruption or wrongdoing. The Solntsepek persona, which Russian military hackers apparently used to publicize material they had stolen from Ukrainian organizations, is an example; see Natto Thoughts postings here and here.
False flag – defined by Merriam-Webster as “a hostile or harmful action (such as an attack) that is designed to look like it was perpetrated by someone other than the person or group responsible for it.” Russian government hackers have a long history of false-flag operations, such as stealing Iranian hacking tools so as to carry out operations and blame them on Iran, or disguising cyber attacks on Ukrainian computer systems as criminal ransomware.
Franchising – farming out information operations to authentic-sounding voices. In one such operation, the Internet Research Agency (IRA), the troll farm that mercenary/propagandist Yevgeniy Prigozhin controlled, created false personas posing as editors to recruit real journalists to spread their message in target countries. See Natto Thoughts reports "Troll Humor" and “Disrupt and Demoralize, Deniably, Part 2” and see “Troll farm” below.
Gaslighting – defined by Data & Society as “situations in which a person orchestrates deceptions and inaccurately narrates events to the extent that their victim stops trusting their own judgments and perceptions.”
Hack-and-leak attacks – “where emails (or data) are stolen and published online in an attempt to embarrass or discredit an individual candidate or party,” according to the UK’s National Cyber Security Centre. The Russian interference in the 2016 US presidential election is a well-known example. Another example is the anti-Ukrainian Solntsepek campaign, which the Natto Team discussed here and here.
Influence laundering – “using allies inside nominally independent organizations to spread propaganda and cultivate ties with rising leaders,” as the New York Times phrased it in a report on a newly declassified U.S. intelligence analysis of Russian influence operations. One of these Russian government-funded efforts allegedly recruited members of US racial-justice groups to make statements supporting Russia’s annexation of territories in Ukraine, according to the US Justice Department. On another progressive advocacy group that has echoed Chinese and Russian propaganda themes, see this report. See also “Astroturfing.”
Information laundering – as described in the Natto Team’s “Troll Humor,” a tactic for giving a false story a veneer of legitimacy by initially posting it on a fringe website, then reposting and “liking” it on progressively better-known websites and eventually on mainstream media outlets. Further discussed here, here and here.
Rage farming - disinformation researcher Caroline Orr Bueno coined the term to describe “a combination of outrage bait and engagement farming…posting content that is intentionally provocative and meant to stoke outrage — usually among one’s political adversaries, but sometimes among anyone who sees it…” The Russian effort to undermine the Women’s March of 2017 in the United States by stoking rage against a Palestinian-American participant, which Natto Thoughts discussed here, is an example.
Sock puppet – “a false online identity used for deceptive purposes,” according to Merriam-Webster. Discussed in Natto Thoughts reporting on the anti-Ukrainian Solntsepek hack-and-leak campaign.
Troll farm – defined by Collins English Dictionary as “An organization whose employees or members attempt to create conflict and disruption in an online community by posting deliberately inflammatory or provocative comments.” The Internet Research Agency (IRA) in St. Petersburg, which the mercenary/propagandist Yevgeniy Prigozhin controlled, is a famous troll farm; Natto Team has discussed it extensively, including here, here and here.
Typo-squatting – creating “a variation of a reputable or trusted domain using a deliberate typo in the address” or “replacing letters with look-alike characters to create visually identical domains,” leading users to visit the false domains by accident. In one example, during the 2016 US election campaign, Russian military hackers allegedly created the website actblues.com to mimic a Democratic fundraising website and redirected would-be donors to their own site.
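To illustrate the mechanics from a defender’s point of view, here is a minimal Python sketch that screens a domain for typo- or homoglyph-based imitation of a trusted domain. The trusted-domain list, the small homoglyph map, and the similarity threshold are illustrative assumptions; dedicated domain-permutation tools test far more variants.

```python
# Minimal look-alike-domain check (illustrative sketch, not production tooling).
from difflib import SequenceMatcher

TRUSTED_DOMAINS = {"actblue.com"}           # assumed list of domains to protect
HOMOGLYPHS = str.maketrans("0135", "oles")  # digits commonly swapped in for letters

def imitates_trusted(domain: str, threshold: float = 0.85):
    """Return the trusted domain this one appears to imitate, or None."""
    raw = domain.lower()
    if raw in TRUSTED_DOMAINS:
        return None                          # it is the genuine domain itself
    candidate = raw.translate(HOMOGLYPHS)    # undo simple character substitutions
    for trusted in TRUSTED_DOMAINS:
        similarity = SequenceMatcher(None, candidate, trusted).ratio()
        if candidate == trusted or similarity >= threshold:
            return trusted
    return None

print(imitates_trusted("actblues.com"))  # actblue.com (extra letter)
print(imitates_trusted("actb1ue.com"))   # actblue.com (digit 1 standing in for l)
print(imitates_trusted("actblue.com"))   # None (the real domain)
```

In practice, security teams often run comparisons like this against feeds of newly registered domains to catch imitations before users encounter them.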
Useful idiot - defined by Merriam-Webster as “a naive or credulous person who can be manipulated or exploited to advance a cause or political agenda”. See also “Franchising” and “Influence Laundering.”