Facebook has been making changes to its platform in recent months. At its annual F8 conference, the company announced a redesign of the Facebook app and a shift in focus to issues of privacy, according to Social Media Today. The newest change, however, is one that directly impacts freedom of speech. The change is a response to the role Facebook played in mass shootings like the one in Christchurch and, most recently, the one at a synagogue in Poway, California. In those cases and in others, extremist groups have used Facebook as a weapon to spread their influence, taking to Facebook Live to broadcast their violence to the world. To prevent this, Facebook announced a new policy, known as ‘Dangerous Individuals and Organizations,’ that will ban extremist groups from having a presence on the platform.
This new policy will remove users who “proclaim a violent mission or are engaged in violence,” according to Facebook’s community standards. Those impacted include Louis Farrakhan, the leader of the Nation of Islam, and Alex Jones, a right-wing commentator.
This change may seem like a move in the right direction for Facebook. Some are praising the company for finally implementing policies that address Facebook’s influence on mass shootings and the spread of terrorism. However, others are wondering whether this change is worth sacrificing the principles of freedom of speech. Plus, who’s to say this change will actually reduce the number of mass shootings? If these types of people are no longer able to use Facebook, can’t they just move to a different platform?
Well, to address these issues, we must first consider the fact that Facebook and other social media platforms are not bound by the First Amendment. Facebook is a private company and has every right to decide who gets to use its platform. The First Amendment only prohibits the government from suppressing speech.
Considering this, we might not want to get too concerned over free speech on Facebook, because for First Amendment protections to apply to social media, social media would need to be under government control. So what it really comes down to is whether people want powerful CEOs deciding what content to censor, or whether they would rather have the government make those decisions. This is a question many have debated as social media has grown over the years.
In March, the Joe Rogan Experience podcast (episode #1258) discussed this issue with Twitter CEO Jack Dorsey; Vijaya Gadde, Twitter’s global lead for legal, policy, and trust and safety; and Tim Pool, an independent journalist. In the podcast, the four discussed how Twitter decides what type of content to remove and where to focus. Dorsey mentioned that currently, Twitter is most concerned with fake news and content that could influence elections.
“Do you really want corporations to police what’s true and not true?” says Gadde.
“But you guys do that,” says Pool.
“We try not to do that. We don’t want to do that. But the places that we focus on is where we think that people are going to be harmed by this in a direct and tangible way that we feel a responsibility to correct,” Gadde says in response.
The conversation goes on to debate different types of content Twitter has removed in the past, such as deadnaming, misgendering, and racial slurs. Pool discussed his concern that companies like Twitter might be censoring content they simply disagree with, such as political content. He makes a valid point, but in the end, Twitter has the right to censor such content if it wants to, which doesn’t mean that it does. Regardless, it is the decision of a private company.
Do you think those who run companies like Facebook and Twitter should be able to police and censor content? Or do you think the government should intervene? Let us know in the comments.
If you are being defamed, cyberbullied, or threatened on social media or the Internet in general, contact the Internet attorneys at RM Warner Law. We can guide you through the path to justice. Get in touch today.