Frances Haugen was a product manager with Facebook for two years before she became disillusioned with the social media behemoth. On her way out the door, she combed through the company’s internal social network and left with a bevy of bombshell documents. In doing so, she may have finally given Congress a road map to end Facebook’s total lack of accountability.
The documents Haugen absconded with became the basis for The Wall Street Journal’s “Facebook Files” series, which details how the company has long known about the harm its platforms can cause to people and our social fabric. After weeks of secrecy, Haugen revealed herself as the Facebook whistleblower Sunday on CBS’ “60 Minutes.” On Tuesday, she testified at a Senate hearing, and she made me more hopeful than I have ever been that our lawmakers might be up to the task of regulating social media.
Most of what Haugen told the Senate Commerce Committee’s subcommittee on consumer protection has been detailed in the media already. She confirmed that Facebook, which acquired Instagram in 2012, knew from its own research that the photo-sharing app is “toxic for many teen girls.” She repeatedly noted that Facebook devotes minimal resources to keeping its platform from being used to incite ethnic violence in Ethiopia and other developing countries.
But the most important message she gave the subcommittee can be summed up like this: Forget the content; focus on the algorithms.
For too long, the question of what to do about Facebook has been framed as a choice between limiting free speech and letting violent rhetoric spread unchecked. Instead, Haugen argues, the answer lies in stopping Facebook’s practice of letting computers decide what people want to see.
As it stands, Facebook’s primary algorithm uses “engagement-based ranking” to help determine what pops up in your news feed. In other words, if you like, comment on or share a piece of content, artificial intelligence programs pick up on what makes that content special and find things they think are similar to show you.
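To make that mechanism concrete, here is a minimal, purely hypothetical sketch of engagement-based ranking. The Post fields, the weights and the function names are invented for illustration; this is not Facebook’s actual system, only the general idea of sorting a feed by predicted reactions.

```python
# Hypothetical sketch of engagement-based ranking: score each candidate post
# by its predicted engagement and show the highest-scoring posts first.
# All weights and field names are invented for illustration.
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    predicted_likes: float     # estimated chance the viewer likes it
    predicted_comments: float  # estimated chance the viewer comments
    predicted_shares: float    # estimated chance the viewer shares it


def engagement_score(post: Post) -> float:
    # Reactions that spark more activity (comments, shares) get heavier
    # hypothetical weights than passive likes, so provocative posts score high.
    return (1.0 * post.predicted_likes
            + 5.0 * post.predicted_comments
            + 10.0 * post.predicted_shares)


def rank_feed(candidates: list[Post]) -> list[Post]:
    # The feed is simply the candidates sorted by predicted engagement,
    # so whatever provokes the strongest reactions rises to the top.
    return sorted(candidates, key=engagement_score, reverse=True)


if __name__ == "__main__":
    feed = rank_feed([
        Post("a friend", "vacation photos", 0.6, 0.1, 0.05),
        Post("a page", "outrage-bait headline", 0.3, 0.5, 0.4),
    ])
    for post in feed:
        print(f"{post.author}: {post.text} (score={engagement_score(post):.2f})")
```

Under these invented weights, the outrage-bait post outranks the friend’s vacation photos even though fewer people would “like” it, which is the dynamic Haugen described.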
In 2018, the company shifted the news feed’s algorithm to focus on what it called “meaningful social interaction,” downplaying news articles and pushing posts from friends, family members and like-minded users toward the top of people’s feeds. The idea was to calm things down after the tumult of the 2016 election. The result, as BuzzFeed co-founder Jonah Peretti noted in an email to Facebook, was that Facebook became a demonstrably angrier place, where the worst content bubbled to the top and was shared more aggressively:
Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook. “Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” wrote a team of data scientists, flagging Mr. Peretti’s complaints, in a memo reviewed by the Journal. “This is an increasing liability,” one of them wrote in a later memo.
The point of the current formula is to get people to stay on the site longer by showing users content that Facebook already knows they’ll engage with, whether it’s from a friend or a former schoolmate or an influencer with tens of thousands of followers. “It’s not even for you. It’s so you will give little hits of dopamine to your friends so they create more content,” Haugen explained to the committee.
Facebook co-founder, CEO and Chairman Mark Zuckerberg is aware of all of these factors. And as the owner of almost 58 percent of Facebook’s voting shares, he is uniquely positioned to change the system he built. Instead, Haugen argued, the company has fixated on metrics and short-term growth rather than the broader consequences of its actions.








