Facebook Whistleblower Drops Bombshell, Files SEC Complaints Alleging Culture Of Profit Over User Safety

Frances Haugen
Frances Haugen, a data scientist and former product manager at Facebook, has revealed herself as the source of a treasure trove of seemingly damning documents alleging that the social network "chooses profit over safety." It does this by "optimizing for content that gets engagement, or reaction," even when it knows that content is harmful, based on the social network's own research, she says.

Haugen joined Facebook in 2019 after the company recruited her. She told 60 Minutes that she accepted the job only after being assured she could help Facebook quell misinformation, having lost a friend to a rabbit hole of conspiracy theories.

We all know those kinds of people, and they tend to post and share conspiracy theories on social media sites (Facebook included). This is a notion Apple CEO Tim Cook touched on earlier this year, suggesting that "rampant disinformation and conspiracy theories [are] juiced by algorithms."

Disturbed by what she saw, Haugen began secretly copying tens of thousands of documents, which she anonymously shared with The Wall Street Journal. She also filed at least eight complaints with the Securities and Exchange Commission (SEC) last month.

Some of the documents contained internal studies, including one on the harmful effects of Instagram, which Facebook owns. That particular study found that 13.5 percent of teen girls say Instagram makes their thoughts of suicide worse, and 17 percent say it makes their eating disorders worse. Incidentally, Facebook recently shelved its Instagram Kids project after a fiery backlash.

"What's super tragic is Facebook's own research says, as these young women begin to consume this—this eating disorder content, they get more and more depressed. And it actually makes them use the app more. And so, they end up in this feedback cycle where they hate their bodies more and more," Haugen says.

Furthermore, Haugen says Facebook's research makes it clear that Instagram not only "harms teenagers," but is "distinctly worse than other forms of social media."

She also alleges that Facebook doesn't alter its algorithm to be safer because it knows doing so would result in people spending less time on the site and, in turn, clicking on ads less often, which would hurt Facebook's revenue.

"One of the consequences of how Facebook is picking out that content today is it is -- optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions," Haugen says.

The allegations are disturbing, to say the least. Back in March, Facebook co-founder Mark Zuckerberg testified before Congress that the social network has "removed content that could lead to imminent real-world harm," and that it has an unprecedented fact-checking system in place.

Despite that, Haugen says Facebook turned on temporary safety systems designed to reduce misinformation ahead of the 2020 election, then, once the election was over, changed the settings back to what they were before "to prioritize growth over safety."

Interestingly, Haugen also says she has a lot of empathy for Zuckerberg, saying he "never set out to make a hateful platform." Maybe so, but if her allegations are true, it appears the social network has become one anyway.

Image Source: CBS News (60 Minutes)