The world’s biggest social network faces oblivion if it doesn’t take painful steps to change. The Center’s director sketches Facebook’s possible path forward.
By Jeffrey Cole
In my two most recent columns, from the end of last year, I looked at the mess Facebook has created for itself. All its wounds are self-inflicted and come from arrogance, greed, and carelessness. The world’s largest and most successful social network is at a crisis point, and its survival can no longer be taken as assured. An even more important question, which I posed at the end of my last column, is whether Facebook should survive.
In this third and (for now) final column, I want to look at how Facebook can turn itself around, survive, and even flourish. Since the end of the year, it has dug itself into an even deeper hole. Another misstep, and it will be the first of the big five (Apple, Amazon, Google, Microsoft and Facebook) to disappear and the biggest company ever to crash and burn.
Facebook needs to regain the trust of the government, advertisers and, most importantly, its users. Trust takes years to build and can be lost in a flash. That is what has happened here.
To survive, Facebook must take four essential steps.
Step 1: Fess up
Facebook must make a full and public accounting of what went wrong. Not a hackneyed press statement or hollow commitment to members of Congress who have no idea how a social network (or the internet, for that matter) works.
Facebook must undertake a detailed mea culpa, conducted externally, followed by public commitments to how it will change in the future. Anything less will not be enough.
Facebook must make a clear and unambiguous commitment to the privacy of its users over the needs of its advertisers or shareholders. Nothing less will warrant its survival.
Step 2: Zuck and Sheryl? Time to go
Facebook must make serious changes to its management.
Clearly, Facebook’s founder and CEO, Mark Zuckerberg, has lost his users’ trust. He has not demonstrated the talent for, or interest in, running the company as a real chief executive committed to his users. A new CEO has to take the leadership of Facebook. Zuckerberg should be moved into a position more aligned with his true talents: key thinker and software architect. (Chief Nerd?)
Sheryl Sandberg, the COO, needs to go. She was brought in after Facebook was founded to apply much-needed managerial discipline (she came from Google). In short, as when Google brought in Eric Schmidt as CEO, Sandberg was there to manage the company for inexperienced founders—the adult supervision.
Her main responsibility was to ensure that exactly what has now happened never happened.
While running Facebook, Sandberg was far more concerned with cultivating her own brand as a working mother who could have it all, as a best-selling author (of Lean In), and as a public personality.
All that was fine when things were good, but when the going got tough at Facebook (first with Cambridge Analytica), Sandberg was primarily concerned with the damage to her own reputation, not Facebook’s.
In the days after Cambridge Analytica, the most serious crisis Facebook ever experienced, both Zuckerberg and Sandberg simply disappeared. There was no adult supervision in Facebook’s darkest hour.
Zuckerberg and Sandberg wanted to be seen as digital geniuses who changed the world. When things fell apart, they had nothing to say.
Step 3: Get transparent
Facebook needs to take immediate and concrete steps to assure users of its commitment to privacy and that the mistakes of 2018 (and before) will never be repeated.
There are a number of ways Facebook can try to salvage its reputation. A good start would be the “Consumer’s Privacy Bill of Rights” that the Center articulated seven years ago. Our tracking work clearly showed that even after 12 years of going online, shopping, using credit cards, and sharing private information, the vast majority of internet users are still “very concerned” about their security and privacy.
We proposed the following Bill of Rights that companies with access to private information should provide for their users:
1. Tell users/customers/readers/viewers/listeners what information is being collected and what is being done with it. No right could be more essential and central to customer trust: the right to know what is being collected and why, how it is protected and, most importantly, who it is being shared with.
2. Privacy statements that are not written by lawyers for lawyers. There is no problem with privacy policies written by lawyers: they are the ones who need to vet the details. But those statements should be written in simple language, be easy to read and not overly long. Steve Jobs once admitted that he could not understand Apple’s own privacy statements and didn’t bother reading them.
It’s difficult not to conclude that companies collecting data create massively long, complex-to-read privacy statements in order to guarantee that impatient users quickly go to the end and click “I agree.” The point of the statements as they are written now is to ensure that they are neither read nor understood.
3. Companies sharing private information with third parties must provide an opt-out, or even better, an opt-in. There must be a way to enjoy the benefits of the service without handing over all the details of your private life.
4. Users should be compensated for letting companies use and, in some cases, share private data. That compensation may be financial, but it doesn’t have to be. Other forms of compensation might include points or coupons, advance information or notices of sales, or other benefits.
In fairness to Facebook and Google, they already provide a form of compensation: they provide valuable services for “free” (no direct payment). If a user does not want Facebook or Google to collect or share any information, the company should be able to require that user to pay a monthly fee ($5 a month?) to search the web or be part of a massive social network. The fact that neither company directly charges for its service is a form of compensation, but that does not give them license to share or sell very personal data to anyone.
Step 4: Hire an ombudsperson
Facebook should not be trusted to police itself when it comes to protecting its users’ privacy. At the Center, we believe in governmental regulation as a last resort (although some industry-wide guidelines are desperately needed). The government should not be inspecting Facebook’s records or constantly looking over its shoulder unless the company refuses to reform itself.
What is needed is a highly respected outsider: an ombudsperson or monitor to help Facebook craft strong and clear policy guidelines and then to continue monitoring to ensure the company follows those guidelines. When newspapers or others have lost the trust of their customers (for example, the Washington Post’s experience with a false story, “Jimmy’s World,” in 1980), that trust has been restored through the creation of an ombudsperson, whose role has been to create change that ensures mistakes are corrected and then to monitor that progress.
Facebook has experienced the greatest crisis any digital company has ever faced. It has lost the trust of its users. It could be the first of the internet giants to disappear. Without management change and clear new policies, that disappearance is practically guaranteed. With change, Facebook has a chance to restore its users’ trust and become a model of reform.
The clock is ticking.
Jeffrey Cole is the founder and director of The Center for the Digital Future at USC Annenberg.
See all columns from the center.
January 16, 2019