Zucked: How Users Got Used and What We Can Do About It. Roger McNamee


      The stunning outcome of Brexit triggered a hypothesis: in an election context, Facebook may confer advantages to campaign messages based on fear or anger over those based on neutral or positive emotions. It does this because Facebook’s advertising business model depends on engagement, which can best be triggered through appeals to our most basic emotions. What I did not know at the time is that while joy also works, which is why puppy and cat videos and photos of babies are so popular, not everyone reacts the same way to happy content. Some people get jealous, for example. “Lizard brain” emotions such as fear and anger produce a more uniform reaction and are more viral in a mass audience. When users are riled up, they consume and share more content. Dispassionate users have relatively little value to Facebook, which does everything in its power to activate the lizard brain. Facebook has used surveillance to build giant profiles on every user and provides each user with a customized Truman Show, similar to the Jim Carrey film about a person who lives his entire life as the star of his own television show. It starts out giving users “what they want,” but the algorithms are trained to nudge user attention in directions that Facebook wants. The algorithms choose posts calculated to press emotional buttons because scaring users or pissing them off increases time on site. When users pay attention, Facebook calls it engagement, but the goal is behavior modification that makes advertising more valuable. I wish I had understood this in 2016. At this writing, Facebook is the fourth most valuable company in America, despite being only fifteen years old, and its value stems from its mastery of surveillance and behavioral modification.

      When new technology first comes into our lives, it surprises and astonishes us, like a magic trick. We give it a special place, treating it like the product equivalent of a new baby. The most successful tech products gradually integrate themselves into our lives. Before long, we forget what life was like before them. Most of us have that relationship today with smartphones and internet platforms like Facebook and Google. Their benefits are so obvious we can’t imagine foregoing them. Not so obvious are the ways that technology products change us. The process has repeated itself in every generation since the telephone, including radio, television, and personal computers. On the plus side, technology has opened up the world, providing access to knowledge that was inaccessible in prior generations. It has enabled us to create and do remarkable things. But all that value has a cost. Beginning with television, technology has changed the way we engage with society, substituting passive consumption of content and ideas for civic engagement, digital communication for conversation. Subtly and persistently, it has contributed to our conversion from citizens to consumers. Being a citizen is an active state; being a consumer is passive. A transformation that crept along for fifty years accelerated dramatically with the introduction of internet platforms. We were prepared to enjoy the benefits but unprepared for the dark side. Unfortunately, the same can be said for the Silicon Valley leaders whose innovations made the transformation possible.

      If you are a fan of democracy, as I am, this should scare you. Facebook has become a powerful source of news in most democratic countries. To a remarkable degree it has made itself the public square in which countries share ideas, form opinions, and debate issues outside the voting booth. But Facebook is more than just a forum. It is a profit-maximizing business controlled by one person. It is a massive artificial intelligence that influences every aspect of user activity, whether political or otherwise. Even the smallest decisions at Facebook reverberate through the public square the company has created with implications for every person it touches. The fact that users are not conscious of Facebook’s influence magnifies the effect. If Facebook favors inflammatory campaigns, democracy suffers.

      August 2016 brought a new wave of stunning revelations. Press reports confirmed that Russians had been behind the hacks of servers at the Democratic National Committee (DNC) and Democratic Congressional Campaign Committee (DCCC). Emails stolen in the DNC hack were distributed by WikiLeaks, causing significant damage to the Clinton campaign. The chairman of the DCCC pleaded with Republicans not to use the stolen data in congressional campaigns. I wondered if it were possible that Russians had played a role in the Facebook issues that had been troubling me earlier.

      Just before I wrote the op-ed, ProPublica revealed that Facebook’s advertising tools enabled property owners to discriminate based on race, in violation of the Fair Housing Act. The Department of Housing and Urban Development opened an investigation that was later closed, but reopened in April 2018. Here again, Facebook’s architecture and business model enabled bad actors to harm innocent people.

      Like Jimmy Stewart in the movie, I did not have enough data or insight to understand everything I had seen, so I sought to learn more. As I did so, in the days and weeks after the election, Dan Rose exhibited incredible patience with me. He encouraged me to send more examples of harm, which I did. Nothing changed. Dan never budged. In February 2017, more than three months after the election, I finally concluded that I would not succeed in convincing Dan and his colleagues; I needed a different strategy. Facebook remained a clear and present danger to democracy. The very same tools that made Facebook a compelling platform for advertisers could also be exploited to inflict harm. Facebook was getting more powerful by the day. Its artificial intelligence engine learned more about every user. Its algorithms got better at pressing users’ emotional buttons. Its tools for advertisers improved constantly. In the wrong hands, Facebook was an ever-more-powerful weapon. And the next US election—the 2018 midterms—was fast approaching.

      Yet no one in power seemed to recognize the threat. The early months of 2017 revealed extensive relationships between officials of the Trump campaign and people associated with the Russian government. Details emerged about a June 2016 meeting in Trump Tower between inner-circle members of the campaign and Russians suspected of intelligence affiliations. Congress spun up Intelligence Committee investigations that focused on that meeting.

      But still there was no official concern about the role that social media platforms, especially Facebook, had played in the 2016 election. Every day that passed without an investigation increased the likelihood that the interference would continue. If someone did not act quickly, our democratic processes could be overwhelmed by outside forces; the 2018 midterm election would likely be subject to interference, possibly greater than we had seen in 2016. Our Constitution anticipated many problems, but not the possibility that a foreign country could interfere in our elections without consequences. I could not sit back and watch. I needed some help, and I needed a plan, not necessarily in that order.

       The Strangest Meeting Ever

      New technology is not good or evil in and of itself. It’s all about how people choose to use it. —DAVID WONG

      I should probably tell the story of how I intersected with Facebook in the first place. In the middle of 2006, Facebook’s chief privacy officer, Chris Kelly, sent me an email stating that his boss was facing an existential crisis and required advice from an unbiased person. Would I be willing to meet with Mark Zuckerberg?

      Facebook was two years old, Zuck was twenty-two, and I was fifty. The platform was limited to college students, graduates with an alumni email address, and high school students. News Feed, the heart of Facebook’s user experience, was not yet available. The company had only nine million dollars in revenue in the prior year. But Facebook had huge potential—that was already obvious—and I leapt at the opportunity to meet its founder.

      Zuck showed up at my Elevation Partners office on Sand Hill Road in Menlo Park, California, dressed casually, with a messenger bag over his shoulder. U2 singer Bono and I had formed Elevation in 2004, along with former Apple CFO Fred Anderson, former Electronic Arts president John Riccitiello, and two career investors, Bret Pearlman and Marc Bodnick. We had configured one of our conference rooms as a living room, complete with a large arcade video game system, and that is where Zuck and I met. We closed the door and sat down on comfy chairs about three feet apart. No one else was in the room.

      Since this was our first meeting, I wanted to say something before Zuck told me about the existential crisis.

      “If it has not already happened, Mark, either Microsoft or Yahoo is going to offer one billion dollars for Facebook. Your parents, your board of directors, your management team,

