The U.S. Supreme Court appears torn over whether to trigger sweeping changes to the internet. On Monday, the justices heard arguments over Florida and Texas laws that restrict how platforms like Facebook and YouTube moderate speech. If the court allows the laws to take effect, content on social media could look very different, with platforms forced to carry objectionable or hateful material that is now blocked or removed.
The high stakes of Monday’s arguments gave new urgency to longstanding questions about free speech and online regulation. Are social platforms akin to newspapers, with First Amendment protection for their editorial control over content? Or are they common carriers, like telephone or telegraph companies, required to transmit lawful speech without interference?
A ruling is expected by June, when the court typically issues many of its decisions, and could have broad implications for social networks like Facebook, YouTube, X and TikTok well beyond Florida and Texas. “These cases have the potential to impact free speech online for a generation,” said Alex Abdo, director of litigation at Columbia University’s Knight First Amendment Institute, which filed a brief in the case without taking sides.
Florida and Texas passed the laws at issue in 2021, shortly after social media platforms banned former President Donald Trump following the Jan. 6 insurrection. Conservatives have long argued that their views are unfairly suppressed on major platforms, and they see laws barring companies from aggressive moderation as a way to restore fairness online.
The laws were quickly put on hold after challenges from NetChoice and the Computer and Communications Industry Association, two tech industry trade groups representing social platforms. If the Supreme Court now allows the laws to take effect, the state governments of Florida and Texas would gain new power over social platforms and the content posted on them, a significant shift from the status quo, in which platforms set their own terms of service and generally employ content moderators to enforce them.
Polar opposites
Monday’s arguments lasted nearly four hours and highlighted the legal murkiness inherent in regulating the internet. The justices raised questions about how social media companies should be classified and treated under the law, and the states and the plaintiffs offered opposing views of social media’s role in mass communication.
The laws themselves leave gaps in how exactly their mandates should be carried out. Cliff Davidson, a Portland-based attorney at Snell & Wilmer, said the justices’ questions showed the court’s frustration at being “caught between two diametrically opposed positions, each of which carries enormous costs and benefits for free speech.”
David Greene, senior staff attorney and civil liberties director at the Electronic Frontier Foundation, a digital rights group, filed a brief urging the court to strike down the laws, arguing that letting social platforms moderate content without government interference has clear benefits for the public. “When platforms have First Amendment rights to moderate the user-generated content they publish, they can create unique forums that accommodate diverse opinions, interests and beliefs,” he said.