Social Media Censorship and the Law

Censorship is something that concerns Americans because it can stifle free expression, infringing on the First Amendment's guarantee of free speech. As debates, dialogues, and news reports increasingly take place on social media platforms like Facebook and Twitter, censorship practices on these sites have become an important part of the free speech discussion.

The following are some examples of content that has been removed from social media sites:

  • A photo of Copenhagen's iconic "Little Mermaid" statue;
  • A girlfriend's livestream of the fatal shooting of her boyfriend by a police officer;
  • A picture of a cat in a business suit;
  • A photo of a plus size model; and
  • A meme depicting and identifying a convicted rapist.

Given the wide range of content that has been taken down by social media companies, it's difficult to determine what can and can't be posted on social media. The free speech provisions of the First Amendment prevent federal, state, and local governments from engaging in most forms of internet censorship, including on social media. However, social media companies, like all private, non-governmental entities, have far more freedom to restrict content on their sites.

Social media sites must manage information that comes from third-party users, but they generally aren't liable for their users' content under Section 230 of the Communications Decency Act. They're also protected from monetary liability for their users' copyright infringement by the "safe harbor" provisions of the Digital Millennium Copyright Act, so long as they comply with its notice-and-takedown procedures. This protection from liability gives companies like Facebook, Instagram, and Snapchat incentives to monitor their sites and remove certain content.

Content Removal From Social Media Sites

Social media companies engage in "content filtering" or "content monitoring." Sometimes footage or posts are removed due to reports from users who flag content that's considered inappropriate according to the rules of the site. Other content removal is based on newsworthy events.

For instance, it's common practice to scrub a user's social media presence if they're a suspect or have been arrested in connection with a violent event like a mass shooting. Platforms like YouTube use software programs to remove content that violates their policies (for example, pro-ISIS videos), and this is often done before the objectionable videos have even been viewed.

Terms of Service Violations

There are several ways to get banned from social media sites: for example, by trolling, submitting self-promotional links or computer-generated spam, or posting sexually explicit or violent material. These practices typically fall under a platform's more general terms of service violations.

By becoming a user of any social media platform, you agree to the company's terms of service, which are the rules governing how users must conduct themselves on the site. If you violate a company's terms of service, the company can remove your content or your account. Additionally, companies have the authority to change their terms of service at any time without informing users.

Terms of service violations are important not only because they allow for the removal of content from a website, but also because, under some interpretations, they can trigger liability under the Computer Fraud and Abuse Act.

The Computer Fraud and Abuse Act

The Computer Fraud and Abuse Act (CFAA) is a broad federal law that criminalizes numerous activities related to computers and computer networks. The CFAA is not only a criminal law; it also gives private individuals and corporations the right to sue for damages.

The Act forbids anyone from accessing a computer "without authorization." Some courts have interpreted this to mean that violating a company's terms of service amounts to accessing that company's computers without authorization. Because this reading threatens to burden free expression (such as journalists or researchers using automated web-browsing tools), critics, including free speech advocates, have objected to the CFAA's role in social media censorship.

Social Media Censorship Concerns

Social media firms clearly have a legal right to restrict content on their sites, but this works both ways: consumers have the right to avoid those services and move to other platforms if they don't like a social network's censorship practices. In practice, however, platforms like Facebook, Instagram, Twitter, and YouTube dominate the industry, so there are few other places to go.

The limited choice in social media platforms has raised antitrust and unfair-competition concerns. Some critics believe that social media outlets are biased against conservatives and suppress content expressing those viewpoints. However, left-leaning political advocates also take issue with social media dominance, which suggests the problems are more than merely political.

Another complaint about social media censorship is the lack of transparency and consistency about what content gets taken down. Facebook addressed some of these criticisms by disclosing the specific rules its content moderators apply when deciding whether to remove content. The rules cover topics including graphic violence, nudity, sexual activity, and hate speech.

For instance, Facebook's community standards detail the removal of content used to promote violence against someone, or that threatens someone based on race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, disability, or disease. Twitter's rules are substantially similar but add "age" as another protected category.

Social Media Appeals Processes

If your content was taken down, you typically have the ability to appeal the decision with the social media provider. The appeals process for a takedown depends on the social media platform. For instance, Twitter suspends an entire account for violations, even if the offending content was a single tweet. When it comes to Facebook takedowns, the following appeals process will apply:

  • Facebook will notify you when your content is removed with an alert.
  • You can then click "Request Review" and your request is referred to a member of Facebook's community team.
  • Your appeal is reviewed by an actual human (not AI software) within 24 hours.
  • If Facebook determines that the takedown was a mistake, the content will be restored.

Get More Information about Social Media Censorship and the Law from an Attorney

The laws and regulations governing social media content censorship are complex and difficult to navigate. If you're affected by social media censorship, you should discuss your situation with an internet/social media lawyer right away.

Next Steps

Contact a qualified civil rights attorney to help you protect your rights.