Information Wars. Richard Stengel
L—Legal Department
NCTC—National Counterterrorism Center
NEA—Near Eastern Affairs
NSC—National Security Council
PA—Public Affairs
PC—Principals Committee
PDB—President’s Daily Brief
P—Under Secretary for Political Affairs
R—Under Secretary for Public Diplomacy and Public Affairs
SCA—South and Central Asian Affairs
S—Secretary of State
Not long ago, a reader emailed me the following question: “Do you think the ratio of disinformation to true information is different now than at other times in human history?” Hmmm. To be honest, I was stumped. Disinformation has been around for as long as we’ve had information. But it’s awfully hard to measure the supply, scale, and scope of disinformation—heck, it’s hard enough just to spot it. Nobody that I know of really quantifies it. What is indisputable is that in recent decades the supply of information has increased exponentially.
In 2010, Eric Schmidt, the former CEO of Google, said we create as much information every two days—about five exabytes—as all the information created from the dawn of civilization until 2003. Various scholars have disputed this, but even if it’s every month rather than every two days, the scale is mind-boggling. So, even if the ratio of disinformation to true information has remained constant, there’s a lot more disinformation in absolute terms than ever before.
But the critical issue is not how much disinformation there is, but how available it is. What’s new is the ease of access, which can make it seem more abundant. Once upon a time, you had to work hard to discover conspiracy theories—find and check out obscure books from the library, look up old newspaper clips on microfiche. (Does anyone under forty know what microfiche is?) Today, conspiracy theories and disinformation are a quick Google search away—or, disinformation finds you, through microtargeting or recommendation engines or your third cousin on Facebook. And, of course, if you search for disinformation or conspiracy theories on Google or read them on Facebook, you can be sure you will get a lot more of them from those same platforms!
One of the strategies of disinformation in general, and Russian disinformation in particular, is to use emotion to get eyeballs. Stories that elicit emotion—the Pope endorsed Donald Trump, Hillary Clinton is running a child sex trafficking ring—are shared much more widely than stories that are more dispassionate. Disinformation with an emotional hook provokes stronger reactions. Because social media platforms optimize for virality, disinformation can create so-called “rumor cascades,” where false claims accelerate at ten times the rate of true ones.
We’re also talking about disinformation more. “People tend to assess the relative importance of issues by the ease with which they are retrieved from memory,” wrote the great Daniel Kahneman, “and this is largely determined by the extent of coverage in the media.” The fact that the media is now attuned to disinformation is, of course, a good thing, but it also makes it seem more plentiful than before.
Even unsuccessful disinformation has a negative effect. Disinformation creates what some have dubbed a “liar’s dividend”; that is, even after a falsehood has been debunked, people still wonder, Well, there must have been something to it, or perhaps at least a kernel of the falsehood was true. The dividend is like a single cancer cell that remains after surgery and then starts multiplying again.
We’re only beginning to grasp the scale of disinformation and cyber warfare. The numbers are startling. The Defense Department is attacked millions of times—every day. Facebook announced that it had removed more than three billion fake accounts last year—yes, that’s billion with a b. Tens of millions of digital records are stolen every day. A business falls victim to a ransomware attack every thirteen seconds. One in ten URLs is malicious. And today, malicious actors don’t just steal data—they manipulate it, too. Bad actors can go into your medical records and input an underlying condition, forcing your insurance rates up. Disinformation has never been easier to create—or disseminate.
Totalitarian leaders are using the entire architecture of disinformation—social media influencers, troll farms, micro-targeted advertising, coordinated bot attacks—to mislead voters. Disinformation-for-hire firms offering these services are springing up around the world. Disinformation is big business. And it’s still at the beginning of its growth curve. We’re only just starting to understand the malign potential of deep fakes and cheap fakes. Pretty soon, anyone with a smartphone—3.5 billion people and counting—will be able to create a misleading image or video clip in seconds.
Hybrid warfare—which uses disinformation—has become part of the modern playbook of militaries around the world. There are no barriers to entry. Hacking and disinformation campaigns cost far less than a single F-35. This is asymmetric warfare offering a big return on a small investment. The U.S. may spend more on its military than the next eight nations combined, but for the cost of a couple of hundred cyber trolls, you can take down an American company or even influence an election. Why physically invade a country, when you can digitally attack it and not risk any casualties? Defensive information weapons haven’t caught up to the offensive ones. It’s far easier to create false information than to detect or neutralize it.
Disinformation is not just a problem of foreign influence. There’s a lot more domestic disinformation than foreign disinformation. For one thing, foreign disinformation operations work only if there is a domestic market for them. When Russian operatives created a Twitter account claiming to represent the official Tennessee GOP, @TEN_GOP, in the 2016 election cycle, the ploy worked because it got hundreds of thousands of American followers. Foreign influence dies if no one heeds it. It works only when Americans amplify it.
Disinformation created by American fringe groups—white nationalists, hate groups, antigovernment movements, left-wing extremists—is growing. These groups have a big advantage over foreign ones—they have built-in domestic audiences of their fellow travelers. Disinformationists supporting presidential candidates are hard at work. And, of course, your Uncle Milton is still forwarding you a junk news story that undocumented immigrants are secretly running the FBI. We tend to underestimate the supply of domestic disinformation because it has always been part of our information ecosystem. But domestic players greatly outnumber foreign ones.
So, yes, the supply of disinformation is growing, but that’s in part because the demand is also growing. People seek it out. The tendency to see conspiracies is in our genes. One reason we seem to have reached a deeply polarized US and THEM culture is that this is how we’ve always seen the world, ever since we lived in small, isolated groups. WE were always worried about THEM, and that helped US survive. Our brains are wired that way. Tribalism is the breeding ground for disinformation, and social media seems to reinforce tribalism. The problem for humanity is that the technology to create and distribute disinformation is evolving a whole lot faster than we are. I don’t have a solution for that.
The ontological problem of disinformation is that it gets in the way of our seeing reality for what it is. Of course, no human being sees reality exactly the way it is—we all have prejudices and biases. But disinformation exaggerates those prejudices and biases and accentuates our divides. The truth is, disinformation doesn’t create divides between people; it widens them. One reason it’s easy to amplify division is that we have so much of it. That’s the ultimate goal of the disinformationists—not so much that we believe them, but that we question those things that are demonstrably true.
Disinformation is in part the cause of what Hannah Arendt once called the curious mixture of “gullibility and cynicism” of voters in modern politics. Disinformation, she suggested, helps create the strange circumstance in which people “believe everything and nothing, think that everything was possible and nothing was true.”