Title: Zucked: How Users Got Used and What We Can Do About It
Author: Roger McNamee
Publisher: HarperCollins
ISBN: 9780008319021
Facebook’s user count reached one hundred million in the third quarter of 2008. This was astonishing for a company that was only four and a half years old, but Facebook was just getting started. Only seven months later, the user count hit two hundred million, aided by the launch of the Like button. The Like button soon defined the Facebook experience. “Getting Likes” became a social phenomenon. It gave users an incentive to spend more time on the site and joined photo tagging as a trigger for addiction to Facebook. To make its advertising valuable, Facebook needs to gain and hold user attention, which it does with behavior modification techniques that promote addiction, according to a growing body of evidence. Behavior modification and addiction would play a giant role in the Facebook story, but they were not visible during my time as a mentor to Zuck and would remain unknown to me until 2017.
It turns out everyone wants to be liked, and the Like button provided a yardstick of social validation and social reciprocity—packaged as a variable reward—that transformed social networking. It seemed that every Facebook user wanted to know how many Likes they received for each post, and that tempted many users to return to the platform several times a day. Facebook amplified the signal with notifications, teasing users constantly. The Like button helped boost the user count to 305 million by the end of September 2009. Like buttons spread like wildfire to sites across the web and, together with Facebook Connect, enabled Facebook to track its users wherever they browsed.
The acquisition of FriendFeed in August 2009 gave Facebook an application for aggregating feeds from a wide range of apps and blogs. It also provided technology and a team that would protect Facebook’s flank from the new kid on the block, Twitter. Over the following year, Facebook acquisitions would enable photo sharing and the importing of contacts. Such acquisitions made Facebook more valuable to users, but that was nothing compared to the value they created for Facebook’s advertising. On every metric, Facebook prospered. Revenue grew rapidly. Facebook’s secret sauce was its ability to imitate and improve upon the ideas of others, and then scale them. The company demonstrated an exceptional aptitude for managing hypergrowth, a skill that is as rare as it is valuable. In September 2009, the company announced that it had turned cash flow positive. This is not the same as turning profitable, but it was actually a more important milestone. It meant that Facebook generated enough revenue to cover all its cash expenses. It would not need more venture capital to survive. The company was only five and a half years old.
With Sheryl on board as chief operating officer in charge of delivering revenues, Facebook quickly developed its infrastructure to enable rapid growth. This simplified Zuck’s life so he could focus on strategic issues. Facebook had transitioned from startup to serious business. This coming-of-age had implications for me, too. Effectively, Zuck had graduated. With Sheryl as his partner, I did not think Zuck would need mentoring from me any longer. My domain expertise in mobile made me valuable as a strategy advisor, but even that would be a temporary gig. Like most successful entrepreneurs and executives, Zuck is brilliant (and ruthless) about upgrading his closest advisors as he goes along. In the earliest days of Facebook, Sean Parker played an essential role as president, but his skills stopped matching the company’s needs, so Zuck moved on from him. He also dropped the chief operating officer who followed Parker and replaced him with Sheryl. The process is Darwinian in every sense. It is natural and necessary. I have encountered it so many times that I can usually anticipate the right moment to step back. I never give it a second thought.
Knowing that we had accomplished everything we could have hoped for at the time I began mentoring him, I sent Zuck a message saying that my job was done. He was appreciative and said we would always be friends. At this point, I stopped being an insider, but I remained a true believer in Facebook. While failures like Beacon had foreshadowed problems to come, all I could see was the potential of Facebook as a force for good. The Arab Spring was still a year away, but the analyst in me could see how Facebook might be used by grassroots campaigns. What I did not grasp was that Zuck’s ambition had no limit. I did not appreciate that his focus on code as the solution to every problem would blind him to the human cost of Facebook’s outsized success. And I never imagined that Zuck would craft a culture in which criticism and disagreement apparently had no place.
The following year, 2010, was big for Facebook in surprising ways. By July, Facebook had five hundred million users, half of whom visited the site every day. Average daily usage was thirty-four minutes. Users who joined Facebook to stay in touch with family soon found new functions to enjoy. They spent more time on the site, shared more posts, and saw more ads.
October saw the release of The Social Network, a feature film about the early days of Facebook. The film was a critical and commercial success, winning three Academy Awards and four Golden Globes. The plot focused on Zuck’s relationship with the Winklevoss twins and the lawsuit that resulted from it. The portrayal of Zuck was unflattering. Zuck complained that the film did not accurately tell the story, but hardly anyone besides him seemed to care. I chose not to watch the film, preferring the Zuck I knew to a version crafted in Hollywood.
Just before the end of 2010, Facebook improved its user interface again, edging closer to the look and feel we know today. The company finished 2010 with 608 million monthly users. The rate of user growth remained exceptionally high, and minutes of use per user per day continued to rise. Early in 2011, Facebook received an investment of five hundred million dollars for 1 percent of the company, pushing the valuation up to fifty billion dollars. Unlike the Microsoft deal, this transaction reflected a financial investor’s assessment of Facebook’s value. At this point, even Microsoft was making money on its investment. Facebook was not only the most exciting company since Google, it showed every indication that it would become one of the greatest tech companies of all time. New investors were clamoring to buy shares. By June 2011, DoubleClick announced that Facebook was the most visited site on the web, with more than one trillion visits. Nielsen disagreed, saying Facebook still trailed Google, but it appeared to be only a matter of time before the two companies would agree that Facebook was #1.
In March 2011, I saw a presentation that introduced the first seed of doubt into my rosy view of Facebook. The occasion was the annual TED Conference in Long Beach, the global launch pad for TED Talks. The eighteen-minute Talks are thematically organized over four days, providing brain candy to millions far beyond the conference. That year, the highlight for me was a nine-minute talk by Eli Pariser, the board president of MoveOn.org. Eli had an insight that his Facebook and Google feeds had stopped being neutral. Even though his Facebook friend list included a balance of liberals and conservatives, his tendency to click more often on liberal links had led the algorithms to prioritize such content, eventually crowding out conservative content entirely. He worked with friends to demonstrate that the change was universal on both Facebook and Google. The platforms were pretending to be neutral, but they were filtering content in ways that were invisible to users. Having argued that the open web offered an improvement on the biases of traditional content editors, the platforms were surreptitiously implementing algorithmic filters that lacked the value system of human editors. Algorithms would not act in a socially responsible way on their own. Users would think they were seeing a balance of content when in fact they were trapped in what Eli called a “filter bubble” created and enforced by algorithms. He hypothesized that giving algorithms gatekeeping power without also requiring civic responsibility would lead to unexpected, negative consequences. Other publishers were jumping on board the personalization bandwagon. There might be no way for users to escape from filter bubbles.
Eli’s conclusion? If platforms are going to be gatekeepers, they need to program a sense of civic responsibility into their algorithms. They need to be transparent about the rules that determine what gets through the filter. And they need to give users control of their bubble.
I was gobsmacked. It was one of the most…