In 2004, Mark Zuckerberg created Facebook in his Harvard University dorm with the goal of connecting Harvard students. Fast forward to today: the platform has over 2 billion active users and is confronting the consequences it has for society, both good and bad.
Last week Frontline aired a two-part documentary called The Facebook Dilemma, which examined social media platforms’ impact on privacy and democracy in the U.S. Facebook’s stated mission has always been to give people the power to share and to make the world more open and connected. While social media platforms like Facebook have the power to do good, they also produce unintended, and sometimes harmful, consequences in the real world.
In the documentary, Frontline interviews Wael Ghonim, an Egyptian activist whose Facebook page played a key role in sparking the Arab Spring uprisings. He discussed an unintended consequence of Facebook’s algorithms: the polarization of societies. By nature, posts that are more extreme or aggressive in tone toward an opponent get people more excited and receive more likes and shares, and the ranking algorithm then places them higher in news feeds, where they are shown to even more people. “These tools are just enablers for whomever. They don’t separate between what’s good and bad. They just look at engagement metrics,” said Ghonim.
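To make that feedback loop concrete, here is a minimal sketch of engagement-driven ranking in Python. Everything in it is hypothetical: Facebook’s actual News Feed model is proprietary and far more complex, and the weights below are invented purely for illustration. The point is that a score built only from engagement metrics is blind to whether a post is accurate or inflammatory.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and comments count for more than
    # likes because they spread the post further. Nothing here measures
    # accuracy or tone, only how strongly people reacted.
    return 1.0 * post.likes + 3.0 * post.shares + 2.0 * post.comments

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort so the most "engaging" posts surface first in the feed.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Measured policy analysis", likes=40, shares=2, comments=5),
    Post("Outrage-bait attack on the other side", likes=90, shares=60, comments=80),
])
print([p.text for p in feed])  # the inflammatory post ranks first
```

Because inflammatory posts tend to collect more reactions, a ranker like this rewards them automatically; no one has to intend the polarization for it to happen.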
Under U.S. law (Section 230 of the Communications Decency Act), internet companies aren’t held legally responsible for what people write on their platforms, which means it is up to each company to establish ground rules around what can and cannot be said. “We relied on what we thought was the public’s common sense and common decency to police the site,” said Tim Sparapani, former Facebook Director of Public Policy. “This was the greatest experiment in free speech in human history.”
As Facebook’s user base grew exponentially, the company did not develop the resources to police the content being uploaded, or to understand the impact the platform might have on different cultures and communities around the world.
Compounding this lax oversight of content, Facebook partnered with data-gathering companies shortly before going public in 2012, allowing it to monetize users’ data and become one of the most valuable advertising tools for other businesses.
“The opportunities for deception and dissemination of misinformation are enormous,” said Rand Waltzman, a former program manager at the Defense Advanced Research Projects Agency (DARPA), an agency of the U.S. Department of Defense. Waltzman’s years of research found that social media platforms “allow you to take misinformation and turn it into a serious weapon. It’s the scale that makes it a weapon.”
We’ve seen Facebook weaponized in the 2016 U.S. presidential election, during which Russian-backed operatives disseminated fake political posts that reached over 126 million Americans. Facebook insists it is a technology company, not a media company. Yet for many users, Facebook is a primary news source; like it or not, the platform has been handed the role of news editor.
How can Facebook maintain an open environment that upholds the First Amendment while protecting its users from the dangerous effects of disinformation campaigns and polarization?
According to a 2018 Pew Research Center survey of 4,594 U.S. adults, 65% feel that social media companies often fail to anticipate how their products and services will impact society. While I believe Facebook has a responsibility to uphold the First Amendment, I also believe it should take responsibility for how its platform is used and for the real-world impact it is having.