Well, it stands to reason that Facebook wasn’t going to allow such comments to spread into the blogosphere without releasing an official statement. Today, the company has responded while not specifically addressing the comments that were made by Palihapitiya:
Chamath has not been at Facebook for over 6 years. When Chamath was at Facebook we were focused on building new social media experiences and growing Facebook around the world. Facebook was a very different company back then, and as we have grown, we have realized how our responsibilities have grown too. We take our role very seriously and we are working hard to improve. We’ve done a lot of work and research with outside experts and academics to understand the effects of our service on well-being, and we’re using it to inform our product development. We are also making significant investments more in people, technology and processes, and, as Mark Zuckerberg said on the last earnings call, we are willing to reduce our profitability to make sure the right investments are made.
In a nutshell, Facebook is trying to downplay or discredit Palihapitiya’s comments because he hasn’t worked at the company for more than half a decade. That’s a long time in the world of tech, especially for a company like Facebook, which has grown to become the dominant player in social media. And its CEO, Mark Zuckerberg, has become involved in more philanthropic work (with some saying that he might even run for public office one day).
In other words, the company says that the questionable practices and methodology Facebook might have used in the past have faded away, and that we’re now seeing the “Softer Side of Facebook,” if we can borrow a Sears tagline from many years ago.
Facebook’s response, however, doesn’t get to the core of what Palihapitiya is describing regarding society’s attachment to social media. As we saw with the 2016 U.S. Presidential Election, the platform can be used for harm by spreading false stories that become ingrained in users’ minds as fact. This is a point that Zuckerberg himself has acknowledged. In a statement provided in late November 2016, Zuckerberg wrote:
The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible. We need to be careful not to discourage sharing of opinions or mistakenly restricting accurate content. We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.
Facebook has implemented a number of changes to combat the fake news problem since then, but the problem won’t be completely eliminated from the platform. As Palihapitiya pointed out, “short-term, dopamine-driven feedback loops” are being seeded into the minds of users and, in effect, are “destroying how society works.”
“No civil discourse, no cooperation; misinformation, mistruth. And it’s not an American problem—this is not about Russian ads. This is a global problem.”
Palihapitiya’s comments came not long after former Facebook President Sean Parker also blasted the social media giant. Parker likened Facebook’s practices to brain hacking, keeping users’ minds so continuously stimulated that they would keep returning to the site for their next “fix.”
"We need to sort of give you a little dopamine hit every once in a while, because someone liked or commented on a photo or a post or whatever,” said Parker. “And that's going to get you to contribute more content, and that's going to get you… more likes and comments.”