The Case To Reform The Share Button, According To Facebook’s Own Research
New leaked document shows Facebook’s Share button spreads misinformation pervasively after two hops down the chain.
Big Technology is a weekly newsletter dedicated to covering the tech world with honest, nuanced reporting. Join the 10,000+ subscribers who tune into Big Technology for tech news without the spin.
In spring 2019, Facebook researchers looked into whether the Share button helped amplify misinformation. In a report called “Deep Reshares and Misinformation,” they confirmed their suspicions.
The report noted that people are four times more likely to see misinformation when they encounter a post via a share of a share — kind of like a retweet of a retweet — compared to a typical photo or link on Facebook. Add a few more shares to the chain, and people are five to ten times more likely to see misinformation. It gets worse in certain countries. In India, people who encounter “deep reshares,” as the researchers call them, are twenty times more likely to see misinformation.
“Our data,” the researchers concluded, “reveals that misinformation relies much more on deep reshares for distribution than major publishers do.”
A simple product tweak, the research indicated, would likely help Facebook constrain its misinformation problem more than an army of content moderators — all without removing a single post. Adding friction after the first share, or blocking resharing altogether beyond that point, could help mitigate the spread of misinformation on Facebook.
The study found that 38% of all “viewpoint views” (Facebook speak for views) of link posts with misinformation take place after two reshares. For photos, the number is even higher: 65% of views of photo misinformation take place after two reshares. Facebook pages, meanwhile, don’t rely on deep reshares for distribution. Only about 20% of page content is viewed at a reshare depth of two or higher.
“I'm an advocate of significant friction around sharing,” Aviv Ovadya, a misinformation researcher and Harvard Belfer TAPP Fellow, told me. “This analysis supports that conclusion, that sharing as functionality, especially beyond one's friends of friends, helps lower quality content more than it helps higher quality content.”
Like Twitter’s Retweet and WhatsApp’s Forward, the Facebook Share button encourages people to thoughtlessly pass along posts that exploit their emotions and biases. The WhatsApp Forward amplified so much bad information that the company slowed it down. The Twitter Retweet amplified so many links people didn’t click that the company now asks to “read before you retweet.” Facebook put some sharing friction in place — including interstitials if you’re about to share an old article — but the research makes the need for more aggressive action clear.
What Facebook did with its research is unclear, however. The company, now Meta, didn’t respond to a request for comment. Twitter, too, did not respond to an email asking whether it had similar research. In the internal comments discussing the study, Facebook employees discussed the merits of blocking shares after a certain depth or simply using this data to assist Facebook’s teams looking for misinformation. It doesn’t appear the company took meaningful action.
This lack of action prompted Frances Haugen’s lawyer, Lawrence Lessig, to suggest that Apple should threaten Facebook with removal from the App Store if it didn’t put limits on reshares. This action would raise all manner of anticompetitive issues and likely create a bad precedent. But Lessig, speaking on Big Technology Podcast, advocated for it nonetheless.
“Facebook regulating itself? We tried that, turns out that that doesn't work. But we can see other companies stepping up and trying to create a standard of safety,” Lessig said. “They have an extraordinary opportunity to leverage their brand around safety in a way that could actually help make the internet platform safer.”
The research report comes from Frances Haugen’s disclosures to the Securities and Exchange Commission, which were also provided to Congress in redacted form by her legal team. The redacted versions received by Congress were obtained by a consortium of news organizations, including Big Technology. This study became available to the consortium on Wednesday.
After some delay, it appears Haugen’s documents are on their way to being made available more widely. “The plan is to get these documents in every place around the world where Facebook is affecting them,” said Lessig.
Though wonky, these less explosive documents reveal a detailed picture of how Facebook operates behind the scenes. They’re valuable tools to understand how the company works, and the ways it might ameliorate its problems. In this case, the solution is clear: rein in the share button.
Meet Big Technology’s Headline Sponsor: On Deck Customer Success
Providing a premium customer experience is a crucial differentiator in today’s economy.
What exactly are the models and best incentives to build your customer success teams on? What does a five-star customer service experience even look like?
Introducing On Deck Customer Success (ODCS), a continuous community for senior customer experience operators who want to build customer-centric cultures and maximize their potential through professional development and access to a trusted peer network.
On Deck Customer Success is the place for confidential, meaningful conversations. You’ll get invaluable support from a network of capable CS leaders who understand the challenges you face and want to see you succeed in navigating your career.
The deadline for this cohort is November 30th, and spots are filling quickly!
Apple’s Federighi rails against app sideloading in single-note keynote (TechCrunch)
Apple rarely shows up to events that aren’t its own. But Craig Federighi, its head of software engineering, spoke here at Web Summit and railed against regulation. Apple taking its show on the road can mean only one thing: It’s legitimately afraid of what’s coming on the antitrust front. Outside of speeches, the company could head off its challenges by playing fair with competitors in its App Store.
Ordering food on an app is easy. Delivering it could mean injury and theft (NPR)
Behind delivery apps like Grubhub, Doordash, and Uber Eats, there’s a harsh reality that we rarely encounter. Drivers are managed by the "Patrón Fantasma," or ghost boss (aka: the algorithm), and can be injured, attacked, or robbed as they work. The apps demand such speed that drivers are forced to use e-bikes. These workers became part of society’s critical infrastructure as governments around the world told people to “stay home.” Their plight in the aftermath is worth our attention.
We Haven’t Reached ‘Peak Newsletter.’ Not by a Long Shot. (Politico)
Newsletters are taking over. This is my (admittedly) biased take, but Politico columnist Jack Shafer explains why in a terrific column. The stories you get in newsletters, he writes, are there because you’ve requested them, not because some homepage editor decided they were important. And for advertisers, it’s often better to reach a smaller, targeted list with frequency than a massive but amorphous audience. This is why The Atlantic, The New York Times, and others are investing in email. It’s why it’s taking off at Substack. And it’s why I bet my career on it.
We’ve Entered the Age of the Hybrid Workplace (Sponsored)
Remote working at scale has been normalized, and successful organizations are focused on ways to deliver, manage, and improve it.
As a Big Technology subscriber, you're invited to get free access to “The State of the Digital Workplace” report with data and insights from 500+ executives!
You can check out the report here.
Advertise with Big Technology?
Advertising on Big Technology makes everything you do easier. You’ll get in front of the tech world’s key decision-makers, helping you build brand awareness as you look to grow and tell your story. This newsletter has placements available in Q1, including some new ad formats. Email me at email@example.com to learn more.
This week on Big Technology Podcast: The Motivations of Facebook Whistleblower Frances Haugen — With Her Lawyer Lawrence Lessig
Lawrence Lessig is Frances Haugen's lawyer and a Harvard Law School professor. He joins Big Technology Podcast to address the various questions about Haugen's motivations, backers, and intent that have percolated since she came forward. We start by addressing whether the leaked documents should be available to all and move into the conspiracies about her. A lively discussion follows.
You can listen on Apple, Spotify, or wherever you get your podcasts.
Thanks again for reading and see you next Thursday!