Title: Zucked
Author: Roger McNamee
Publisher: HarperCollins
Genre: Biography & Memoir
ISBN: 9780008319021
This is a story about trust. Technology platforms, including Facebook and Google, are the beneficiaries of trust and goodwill accumulated over fifty years by earlier generations of technology companies. They have taken advantage of our trust, using sophisticated techniques to prey on the weakest aspects of human psychology, to gather and exploit private data, and to craft business models that do not protect users from harm. Users must now learn to be skeptical about products they love, to change their online behavior, to insist that platforms accept responsibility for the impact of their choices, and to push policy makers to regulate the platforms to protect the public interest.
This is a story about privilege. It reveals how hypersuccessful people can be so focused on their own goals that they forget that others also have rights and privileges. How it is possible for otherwise brilliant people to lose sight of the fact that their users are entitled to self-determination. How success can breed overconfidence to the point of resistance to constructive feedback from friends, much less criticism. How some of the hardest working, most productive people on earth can be so blind to the consequences of their actions that they are willing to put democracy at risk to protect their privilege.
This is also a story about power. It describes how even the best of ideas, in the hands of people with good intentions, can still go terribly wrong. Imagine a stew of unregulated capitalism, addictive technology, and authoritarian values, combined with Silicon Valley’s relentlessness and hubris, unleashed on billions of unsuspecting users. I think the day will come, sooner than I could have imagined just two years ago, when the world will recognize that the value users receive from the Facebook-dominated social media/attention economy revolution masked an unmitigated disaster for our democracy, for public health, for personal privacy, and for the economy. It did not have to be that way. It will take a concerted effort to fix it.
When historians finish with this corner of history, I suspect that they will cut Facebook some slack about the poor choices that Zuck, Sheryl Sandberg, and their team made as the company grew. I do. Making mistakes is part of life, and growing a startup to global scale is immensely challenging. Where I fault Facebook—and where I believe history will, as well—is for the company’s response to criticism and evidence. They had an opportunity to be the hero in their own story by taking responsibility for their choices and the catastrophic outcomes those choices produced. Instead, Zuck and Sheryl chose another path.
This story is still unfolding. I have written this book now to serve as a warning. My goals are to make readers aware of a crisis, help them understand how and why it happened, and suggest a path forward. If I achieve only one thing, I hope it will be to make the reader appreciate that he or she has a role to play in the solution. I hope every reader will embrace the opportunity.
It is possible that the worst damage from Facebook and the other internet platforms is behind us, but that is not where the smart money will place its bet. The most likely case is that the technology and business model of Facebook and others will continue to undermine democracy, public health, privacy, and innovation until a countervailing power, in the form of government intervention or user protest, forces change.
TEN DAYS BEFORE the November 2016 election, I had reached out formally to Mark Zuckerberg and Facebook chief operating officer Sheryl Sandberg, two people I considered friends, to share my fear that bad actors were exploiting Facebook’s architecture and business model to inflict harm on innocent people, and that the company was not living up to its potential as a force for good in society. In a two-page memo, I had cited a number of instances of harm, none actually committed by Facebook employees but all enabled by the company’s algorithms, advertising model, automation, culture, and value system. I also cited examples of harm to employees and users that resulted from the company’s culture and priorities. I have included the memo in the appendix.
Zuck created Facebook to bring the world together. What I did not know when I met him but would eventually discover was that his idealism was unbuffered by realism or empathy. He seems to have assumed that everyone would view and use Facebook the way he did, not imagining how easily the platform could be exploited to cause harm. He did not believe in data privacy and did everything he could to maximize disclosure and sharing. He operated the company as if every problem could be solved with more or better code. He embraced invasive surveillance, careless sharing of private data, and behavior modification in pursuit of unprecedented scale and influence. Surveillance, the sharing of user data, and behavior modification are the foundation of Facebook's success. Users are fuel for Facebook's growth and, in some cases, the victims of it.
When I reached out to Zuck and Sheryl, all I had was a hypothesis that bad actors were using Facebook to cause harm. I suspected that the examples I saw reflected systemic flaws in the platform’s design and the company’s culture. I did not emphasize the threat to the presidential election, because at that time I could not imagine that the exploitation of Facebook would affect the outcome, and I did not want the company to dismiss my concerns if Hillary Clinton won, as was widely anticipated. I warned that Facebook needed to fix the flaws or risk its brand and the trust of users. While it had not inflicted harm directly, Facebook was being used as a weapon, and users had a right to expect the company to protect them.
The memo was a draft of an op-ed that I had written at the invitation of the technology blog Recode. My concerns had been building throughout 2016 and reached a peak with the news that the Russians were attempting to interfere in the presidential election. I was increasingly freaked out by what I had seen, and the tone of the op-ed reflected that. My wife, Ann, wisely encouraged me to send the op-ed to Zuck and Sheryl first, before publication. I had been one of Zuck’s many advisors in Facebook’s early days, and I played a role in Sheryl’s joining the company as chief operating officer. I had not been involved with the company since 2009, but I remained a huge fan. My small contribution to the success of one of the greatest companies ever to come out of Silicon Valley was one of the true highlights of my thirty-four-year career. Ann pointed out that communicating through an op-ed might cause the wrong kind of press reaction, making it harder for Facebook to accept my concerns. My goal was to fix the problems at Facebook, not embarrass anyone. I did not imagine that Zuck and Sheryl had done anything wrong intentionally. It seemed more like a case of unintended consequences of well-intended strategies. Other than a handful of email exchanges, I had not spoken to Zuck in seven years, but I had interacted with Sheryl from time to time. At one point, I had provided them with significant value, so it was not crazy to imagine that they would take my concerns seriously. My goal was to persuade Zuck and Sheryl to investigate and take appropriate action. The publication of the op-ed could wait a few days.
Zuck and Sheryl each responded to my email within a matter of hours. Their replies were polite but not encouraging. They suggested that the problems I cited were anomalies that the company had already addressed, but they offered to connect me with a senior executive to hear me out. The man they chose was Dan Rose, a member of their inner circle with whom I was friendly. I spoke with Dan at least twice before the election. Each time, he listened patiently and repeated what Zuck and Sheryl had said, with one important addition: he asserted that Facebook was technically a platform, not a media company, which meant it was not responsible for the actions of third parties. He said it like that should have been enough to settle the matter.
Dan Rose is a very smart man, but he does not make policy at Facebook. That is Zuck's role. Dan's role is to carry out Zuck's orders. It would have been better to speak with Zuck, but that was not an option, so I took what I could get. Quite understandably, Facebook did not want me to go public