The People’s Platform: Taking Back Power and Culture in the Digital Age. Astra Taylor
infrastructure, uncompetitive business and exploitative labor practices, and shady political lobbying initiatives, all of which have made major technology firms the subjects of increasing scrutiny from academics, commentators, activists, and even government officials in the United States and abroad.4
People are beginning to recognize that Silicon Valley platitudes about “changing the world” and maxims like “don’t be evil” are not enough to ensure that some of the biggest corporations on Earth will behave well. The risk, however, is that we will respond to troubling disclosures and other disappointments with cynicism and resignation when what we need is clearheaded and rigorous inquiry into the obstacles that have stalled some of the positive changes the Internet was supposed to usher in.
First and foremost, we need to rethink how power operates in a post-broadcast era. It was easy, under the old-media model, to point the finger at television executives and newspaper editors (and even book publishers) and the way they shaped the cultural and social landscape from on high. In a networked age, things are far more ambiguous, yet new-media thinking, with its radical sheen and easy talk of revolution, ignores these nuances. The state is painted largely as a source of problematic authority, while private enterprise is given a free pass; democracy, fuzzily defined, is attained through “sharing,” “collaboration,” “innovation,” and “disruption.”
In fact, wealth and power are shifting to those who control the platforms on which all of us create, consume, and connect. The companies that provide these and related services are quickly becoming the Disneys of the digital world—monoliths hungry for quarterly profits, answerable to their shareholders, not to us, their users, and more influential, more ubiquitous, and more insinuated into the fabric of our everyday lives than Mickey Mouse ever was. As such they pose a whole new set of challenges to the health of our culture.
Right now we have very little to guide us as we attempt to think through these predicaments. We are at a loss, in part, because we have wholly adopted the language and vision offered up by Silicon Valley executives and the new-media boosters who promote their interests. They foresee a marketplace of ideas powered by profit-driven companies that will provide us with platforms to creatively express ourselves and on which the most deserving and popular will succeed.
They speak about openness, transparency, and participation, and these terms now define our highest ideals, our conception of what is good and desirable, for the future of media in a networked age. But these ideals are not sufficient if we want to build a more democratic and durable digital culture. Openness, in particular, is not necessarily progressive. While the Internet creates space for many voices, the openness of the Web reflects and even amplifies real-world inequities as often as it ameliorates them.
I’ve tried hard to avoid the Manichean view of technology, which assumes either that the Internet will save us or that it is leading us astray, that it is making us stupid or making us smart, that things are black or white. The truth is subtler: technology alone cannot deliver the cultural transformation we have been waiting for; instead, we need to first understand and then address the underlying social and economic forces that shape it. Only then can we make good on the unprecedented opportunity the Internet offers and begin to make the ideal of a more inclusive and equitable culture a reality. If we want the Internet to truly be a people’s platform, we will have to work to make it so.
I moved to New York City in 1999 just in time to see the dot-com dream come crashing down. I saw high-profile start-ups empty out their spacious lofts, the once ebullient spaces vacant and echoing; there were pink-slip parties where content providers, designers, and managers gathered for one last night of revelry. Although I barely felt the aftershocks that rippled through the economy when the bubble burst, plenty of others were left thoroughly shaken. In San Francisco the boom’s rising rents pushed out the poor and working class, as well as those who had chosen voluntary poverty by devoting themselves to social service or creative experimentation. Almost overnight, the tech companies disappeared and the office space and luxury condos were vacated, jilting the city and its inhabitants despite the irreversible accommodations that had been made on behalf of the start-ups. Some estimate that 450,000 jobs were lost in the Bay Area alone.1
As the economist Doug Henwood has pointed out, a kind of amnesia blots out the dot-com era, blurring it like a bad hangover. It seems so long ago: before tragedy struck lower Manhattan, before the wars in Afghanistan and Iraq started, before George W. Bush and then Barack Obama took office, before the economy collapsed a second time. When the rare backward glance is cast, the period is usually dismissed as an anomaly, an embarrassing by-product of irrational exuberance and excess, an aberrational event that gets chalked up to collective folly (the crazy business schemes, the utopian bombast, the stock market fever), but “never as something emerging from the innards of American economic machinery,” to use Henwood’s phrase.2
At the time of the boom, however, the prevailing myth was that the machinery had been forever changed. “Technological innovation,” Alan Greenspan marveled, had instigated a new phase of productivity and growth that was “not just a cyclical phenomenon or a statistical aberration, but … a more deep-seated, still developing, shift in our economic landscape.” Everyone would be getting richer, forever. (Income polarization was actually increasing at the time, the already affluent becoming ever more so while wages for most U.S. workers stagnated at levels below 1970s standards.)3 The wonders of computing meant skyrocketing productivity, plentiful jobs, and the end of recessions. The combination of the Internet and IPOs (initial public offerings) had flattened hierarchies, computer programming jobs were reconceived as hip, and information was officially more important than matter (bits, boosters liked to say, had triumphed over atoms). A new economy was upon us.
Despite the hype, the new economy was never that novel. With some exceptions, the Internet companies that fueled the late nineties fervor were mostly about taking material from the off-line world and simply posting it online or buying and selling rather ordinary goods, like pet food or diapers, and prompting Internet users to behave like conventional customers. Due to changes in law and growing public enthusiasm for high-risk investing, the amount of money available to venture capital funds ballooned from $12 billion in 1996 to $106 billion in 2000, leading many doomed ideas to be propped up by speculative backing. Massive sums were committed to enterprises that replicated efforts: multiple sites specialized in selling toys or beauty supplies or home improvement products, and most of them flopped. Barring notable anomalies like Amazon and eBay, online shopping failed to meet inflated expectations. The Web was declared a wasteland and investments dried up, but not before many venture capitalists and executives profited handsomely, soaking up underwriting fees from IPOs or exercising their options before stocks went under.4 Although the new economy evaporated, the experience set the stage for a second bubble and cemented a relationship between technology and the market that shapes our digital lives to this day.
As business and technology writer Sarah Lacy explains in her breathless account of Silicon Valley’s recent rebirth, Once You’re Lucky, Twice You’re Good, a few discerning entrepreneurs extracted a lesson from the bust that they applied to new endeavors with aplomb after the turn of the millennium: the heart of the Internet experience was not e-commerce but e-mail, that is to say, connecting and communicating with other people as opposed to consuming goods that could easily be bought at a store down the street. Out of that insight rose the new wave of social media companies that would be christened Web 2.0.
The story Lacy tells is a familiar one to those who paid attention back in the day: ambition and acquisitions, entrepreneurs and IPOs. “Winning Is Everything” is the title of one chapter; “Fuck the Sweater-Vests” another. You’d think it was the nineties all over again, except that this time around the protagonists aspired to market valuations in the billions, not millions. Lacy admires the entrepreneurs all the more for their hubris; they are phoenixes, visionaries who emerged unscathed from the inferno, who walked on burning coals to get ahead.