Facebook’s Survival Dilemma
Facebook is the most vulnerable of all Big Tech companies. What does it take to survive?
Big Technology is a weekly newsletter dedicated to covering the tech world with honest, nuanced reporting. Join the 10,000+ subscribers who tune into Big Technology for tech news without the spin. Here’s an easy way to subscribe:
Of the many sentences the Wall Street Journal published about Facebook this week, one stood out: “The fear was that eventually users might stop using Facebook altogether.”
Facebook’s executives, according to the Journal’s document-based reporting, feared the service’s decline in 2017 — and for good reason. Comments, likes, reshares, and original posts were all falling that year. Without this activity, Facebook would be a shell, and people might stop coming back. The executives needed to do something.
Their answer was to shift Facebook’s News Feed algorithm to prioritize “meaningful social interactions.” Instead of optimizing for time spent on Facebook (i.e., showing plenty of videos), they’d push posts that sparked discussion to the top of the feed. This could keep users engaged and — importantly — coming back. Facebook spun the changes as an attempt to improve well-being, leaving out the business motivations.
The new algorithm worked, sort of. Though the number of people using Facebook each day increased and its decline in comments slowed, divisive content went to the top of the News Feed. The change was so noticeable to publishers and political parties that some responded by posting more outrage-stoking content, giving Facebook’s userbase more fodder to rage over. “Our approach,” wrote one Facebook data science team in a memo, “has had unhealthy side effects on important slices of public content, such as politics and news.”
This left Facebook with a tough dilemma. The company could optimize for harmony and risk decline, or accept the outrage and stay relevant. As it considered this, Snapchat and YouTube threatened to lure away its users, and Musical.ly was preparing to merge with TikTok. It was vulnerable.
Facebook settled on half measures. It did put limits on how the algorithm treated some civic and health content. But Facebook CEO Mark Zuckerberg turned down suggestions to limit the impact of its share button more broadly, concerned that doing so would hurt engagement.
“Research shows certain partisan divisions in our society have been growing for many decades, long before platforms like Facebook even existed,” Facebook spokesperson Ariana Anthony told me in an email. “It also shows that meaningful engagement with friends and family on our platform is better for people’s well-being than the alternative.”
The share button, as noted previously, sparks thoughtless, emotion-based sharing that causes misinformation and outrage to spread. It’s social media’s worst feature. So, in the end, the company made its choice.
To its credit, Facebook looks for the problems in its services. But when the company doesn’t take action even when it knows what’s wrong, that’s telling. It would be misguided to argue Facebook isn’t responsible for plenty of good around the world. But it’s also getting hard to deny that the company’s relevance might be tied to some of its most damaging aspects. And it will always pick survival.
Meet Big Technology’s Headline Sponsor: Unfinished Live
Unfinished Live is a festival convening technologists, journalists, artists, and changemakers for thought-provoking conversations about building an equitable, sustainable future. You’ll hear from Ethereum co-founder Gavin Wood, Glitch CEO Anil Dash, journalists Casey Newton and Anne Helen Petersen, and many, many more. I’ll be there too, recording a live episode of Big Technology Podcast.
Get your free ticket by going to live.unfinished.com and using the promo code BIGTECH.
News Briefs: Facebook Files Edition
This edition’s News Briefs features the rest of the Wall Street Journal’s “Facebook Files” stories. They’re the biggest stories in Big Tech this week. Let’s break them down:
Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show (Wall Street Journal)
Facebook possesses some damning internal research about the way Instagram influences teen girls’ mental health. 20% of teens, for instance, say Instagram makes them feel worse about themselves. And for a small but meaningful percentage of teens, Instagram is the source of suicidal thoughts. That Facebook did this research is a testament to the company. More should do the same. But it hasn’t addressed these problems in a meaningful way. And Facebook, incredibly, is still planning to release a version of Instagram for kids.
Facebook Says Its Rules Apply to All. Company Documents Reveal a Secret Elite That’s Exempt. (Wall Street Journal)
After Facebook created a program, called XCheck, to give high-profile users a special review before taking down their posts, it ballooned to more than 5.8 million users. XCheck is now so massive that Facebook rarely conducts the additional reviews, simply letting content from these accounts stand. The result is a two-tier system for content moderation, one for VIP users and another for the rest of us.
Facebook Employees Flag Drug Cartels and Human Traffickers. The Company’s Response Is Weak, Documents Show. (Wall Street Journal)
Drug cartels and human traffickers make use of Facebook in developing countries, where the company’s enforcement of its policies is often inadequate. This is yet another consequence of scale for Facebook. A platform with nearly 3 billion users is simply too big to manage effectively.
Curated with our Sponsor: Important, Not Important
National committee will advise the President on AI competition and ethics (Important, Not Important #247)
There’s a new committee in town called the National Artificial Intelligence Advisory Committee (or NAIAC, for short), which will advise the President and officials on AI-related issues like competitiveness, employment, workforce accountability, and bias. Frankly, it’s astonishing that such a committee didn’t exist until now. But it’s good we have one. To stay up to date on AI, science, and environmental news (and to have it curated and decoded), subscribe to Important, Not Important. You can check it out here.
Instagram boss Adam Mosseri on teenagers, TikTok and paying creators (Recode Media Podcast)
QAnon and anti-vaxxers brainwashed kids stuck at home — now teachers have to deprogram them (CNBC)
'Concerned Citizen' At Theranos CEO Elizabeth Holmes' Trial Turns Out To Be Family (NPR)
She’s One Of Congress’s Leading Progressives — Just Not In Her Own Office, Staffers Say (BuzzFeed News)
Advertise with Big Technology?
Advertising on Big Technology makes everything you do easier. You’ll get in front of the tech world’s key decision-makers, helping you build brand awareness as you look to grow and tell your story. This newsletter has placements available in November and December, including some new ad formats. Email me at firstname.lastname@example.org to learn more.
This week on Big Technology Podcast: Inside The Theranos Trial — With NYT’s Erin Griffith
Erin Griffith is the New York Times reporter at the trial for Theranos founder Elizabeth Holmes. She joins Big Technology Podcast to bring us inside the courtroom, explaining why Holmes is on trial and whether she'll be a rare founder to face consequences for misleading investors. We also discuss whether Holmes is symbolic of the venture capital world's downsides or an outlier.
You can listen on Apple, Spotify, or wherever you get your podcasts.
Thanks again for reading, and see you next Thursday!