and quips” from the “cynics” cut deep, making it that much harder for wannabe Web barons “to build themselves back up again.” But build themselves back up a handful of them did, heading to the one place insulated against the downturn, Silicon Valley. “The Valley was still awash in cash and smart people,” says Lacy. “Everyone was just scared to use them.”
Web 2.0 was the logical consequence of the Internet going mainstream, weaving itself into everyday life and presenting new opportunities as millions of people rushed online. The “human need to connect” is “a far more powerful use of the Web than for something like buying a book online,” Lacy writes, recounting the evolution of companies like Facebook, LinkedIn, Twitter, and the now beleaguered Digg. “That’s why these sites are frequently described as addictive … everyone is addicted to validations and human connections.”
Instead of the old start-up model, which tried to sell us things, the new one trades on our sociability—our likes and desires, our observations and curiosities, our relationships and networks—which is mined, analyzed, and monetized. To put it another way, Web 2.0 is not about users buying products; rather, users are the product. We are what companies like Google and Facebook sell to advertisers. Of course, social media have made a new kind of engagement possible, but they have also generated a handful of enormous companies that profit off the creations and interactions of others. What is social networking if not the commercialization of the once unprofitable art of conversation? That, in a nutshell, is Web 2.0: content is no longer king, as the digital sages like to say; connections are.
Though no longer the popular buzzword it once was, “Web 2.0” remains relevant, its key tenets incorporated not just by social networking sites but by almost all cultural production and distribution, from journalism to film and music. As traditional institutions go under—consider the independent book, record, and video stores that have gone out of business—they are being replaced by a small number of online giants—Amazon, iTunes, Netflix, and so on—that are better positioned to survey and track users. These behemoths “harness collective intelligence,” as the process has been described, to sell people goods and services directly or indirectly. “The key to media in the twenty-first century may be who has the most knowledge of audience behavior, not who produces the most popular content,” Tom Rosenstiel, the director of the Pew Research Center’s Project for Excellence in Journalism, explained.
Understanding what sites people visit, what content they view, what products they buy and even their geographic coordinates will allow advertisers to better target individual consumers. And more of that knowledge will reside with technology companies than with content producers. Google, for instance, will know much more about each user than will the proprietor of any one news site. It can track users’ online behavior through its Droid software on mobile phones, its Google Chrome Web browser, its search engine and its new tablet software. The ability to target users is why Apple wants to control the audience data that goes through the iPad. And the company that may come to know the most about you is Facebook, with which users freely share what they like, where they go and who their friends are.5
For those who desire to create art and culture—or “content,” to use that horrible, flattening word—the shift is significant. More and more of the money circulating online is being soaked up by technology companies, with only a trickle making its way to creators or the institutions that directly support them. In 2010 publishers of articles and videos received around twenty cents of each dollar advertisers spent on their sites, down from almost a whole dollar in 2003.6 Cultural products are increasingly valuable only insofar as they serve as a kind of “signal generator” from which data can be mined. The real profits flow not to the people who fill the platforms where audiences congregate and communicate—the content creators—but to those who own them.
The original dot-com bubble’s promise was first and foremost about money. Champions of the new economy conceded that the digital tide would inevitably lift some boats higher than others, but they commonly assumed that everyone would get a boost from the virtual effervescence. A lucky minority would work at a company that was acquired or went public and spend the rest of their days relaxing on the beach, but the prevailing image had each individual getting in on the action, even if it was just by trading stocks online.
After the bubble popped, the dream of a collective Internet-enabled payday faded. The new crop of Internet titans never bothered to issue such empty promises to the masses. The secret of Web 2.0 economics, as Lacy emphasizes, is getting people to create content without demanding compensation, whether by contributing code, testing services, or sharing everything from personal photos to restaurant reviews. “A great Web 2.0 site needs a mob of people who use it, love it, and live by it—and convince their friends and family to do the same,” Lacy writes. “Mobs will devote more time to a site they love than to their jobs. They’ll frequently build the site for the founders for free.” These sites exist only because of unpaid labor, the millions of minions toiling to fill the coffers of a fortunate few.
Spelling this out, Lacy is not accusatory but admiring—awestruck, even. When she writes that “social networking, media, and user-generated content sites tap into—and exploit—core human emotions,” it’s with fealty appropriate to a fiefdom. As such, her book inadvertently provides a perfect exposé of the hypocrisy lurking behind so much social media rhetoric. The story she tells, after all, is about nothing so much as fortune seeking, yet the question of compensating those who contribute to popular Web sites, when it arises, is quickly brushed aside. The “mobs” receive something “far greater than money,” Lacy writes, offering up the now-standard rationalization for the inequity: entertainment, self-expression, and validation.7 This time around, no one’s claiming the market will be democratized—instead, the promise is that culture will be. We will “create” and “connect” and the entrepreneurs will keep the cash.
This arrangement has been called “digital sharecropping.”8 Instead of the production or distribution of culture being concentrated in the hands of the few, it is the economic value of culture that is hoarded. A small group, positioned to capture the value of the network, benefits disproportionately from a collective effort. The owners of social networking sites may be forbidden from selling songs, photos, or reviews posted by individual users, for example, but the companies themselves, including user content, might be turned over for a hefty sum: hundreds of millions for Bebo and Myspace and Goodreads, one billion or more for Instagram and Tumblr. The mammoth archive of videos displayed on YouTube and bought by Google was less a priceless treasure to be preserved than a vehicle for ads. These platforms succeed because of an almost unfathomable economy of scale: each search brings revenue from targeted advertising and fodder for the data miners; each mouse click is a trickle in the flood.
Over the last few years, there has been an intermittent but spirited debate about the ethics of this economic relationship. When Flickr was sold to Yahoo!, popular bloggers asked whether the site should compensate those who provided the most viewed photographs; when the Huffington Post was acquired by AOL for $315 million, many of the thousands of people who had been blogging for free were aghast, and some even started a boycott; when Facebook announced its upcoming IPO, journalists speculated about what the company, ethically, owed its users, the source of its enormous valuation.9 The same holds for a multitude of sites: Twitter wouldn’t be worth billions if people didn’t tweet, Yelp would be useless without freely provided reviews, Snapchat nothing without chatters. The people who spend their time sharing videos with friends, rating products, or writing assessments of their recent excursion to the coffee shop—are they the users or the used?
The Internet, it has been noted, is a strange amalgamation of playground and factory, a place where amusement and labor overlap in confusing ways. We may enjoy using social media while also experiencing them as obligatory; more and more jobs require employees to cultivate an online presence, and social networking sites are often the first place an employer turns when considering a potential hire. Some academics call this phenomenon “playbor,” an awkward coinage that tries to get at the strange way “sexual desire, boredom, friendship” become “fodder for speculative profit” online, to quote media scholar Trebor Scholz.10 Others use the term “social factory” to describe Web 2.0, envisioning it as a machine that