Supreme Court liability case could cloud the online world

The Supreme Court’s announcement on Monday that it would hear a pair of challenges to a foundational law governing online speech has triggered earthquake warnings from internet experts.

Why it matters: A court decision to narrow or strike down the law, Section 230 of the Communications Decency Act of 1996, would upend the legal landscape for every company or organization whose work involves user contributions – not just social networks, but online marketplaces, review sites, neighborhood groups and more.

Driving the news: The Supreme Court’s new term, which began on Monday, includes two cases that mark the court’s first comprehensive review of social media companies’ immunity from lawsuits over their moderation practices and user-posted content.

  • In Gonzalez v. Google, the court will consider whether the law protects internet platforms when algorithms target users with recommended content.
  • In Twitter, Inc. v. Taamneh, the court will decide whether platforms can be found to have violated anti-terrorism laws if they have policies against pro-terrorism content but do not remove every such post.

The big picture: Congress has considered numerous laws amending Section 230, but passed only one narrow exception in 2018 intended to combat online sex trafficking.

  • Critics of the law are now hoping the court’s right-wing supermajority will accomplish what the legislature failed to do.

Yes, but: The drive to punish platforms for perceived “censorship” could end up undermining freedom of expression online.

  • “If the Court were to significantly narrow Section 230 in a way that makes online services potentially liable for third-party content, it could result in a much lesser ability for people to speak freely online,” Samir Jain, director of policy at the Center for Democracy and Technology, told Axios.
  • “Section 230 has really been an essential part of enabling free speech on the internet, especially through social media and other online services,” said Jain, who was one of the litigators in a 1997 case, Zeran v. America Online, Inc., which concluded that Section 230 gives internet service providers broad immunity from suit.
  • “The court could easily take this case and then rule in a way that affects big issues that aren’t actually raised by it,” Daphne Keller, director of platform regulation at the Stanford Cyber Policy Center, told Axios. “That could mean news feeds are purged of anything that creates a fear of legal risk, so they become super sanitized.”

The intrigue: The idea that Section 230 needs to be revisited or scrapped altogether for the digital age has supporters on both sides of the aisle.

  • Democrats tend to believe that platforms should do more to limit the spread of dangerous content online, while Republicans believe platforms unfairly censor conservative speech.
  • President Joe Biden has proposed changing the law, but hasn’t made it a priority.
  • Justice Clarence Thomas said in 2020 that “in an appropriate case, we should consider whether the text of this increasingly important law aligns with the current state of immunity enjoyed by internet platforms.”
  • “It’s time social media platforms were held accountable for the content they recommend to children and families,” tweeted Jim Steyer, CEO of Common Sense Media, who has lobbied the Biden administration to take on Big Tech. “Until SCOTUS rules on the scope of Section 230, Big Tech will continue to act with impunity.”

The catch: Changing Section 230 has bipartisan support, but every change carries huge potential risks.

  • Experts say a court ruling finding that platforms can be sued for how they filter, amplify or recommend content could lead to mixed results, with some companies leaving more harmful content up while others take down more material than necessary.
  • “Allowing the suit to proceed for the conduct challenged in this case would not automatically make the platforms liable,” Matt Wood, president and general counsel of Free Press, said in a press release. “It would simply allow plaintiffs to take on the difficult task of proving in court that a platform knowingly provided substantial assistance to a terrorist organization.”
  • But the prospect of litigation could have a “chilling effect,” he said. Such effects would hit smaller companies, which lack big legal teams and budgets, even harder, according to an analyst note from New Street Research.

Between the lines: The Gonzalez case specifically asks whether federal law protects a platform when its algorithm targets a user with recommended content, with Gonzalez alleging that Google helped ISIS recruit through YouTube.

  • Whatever the court decides, the implications will go far beyond recommendation algorithms.
  • “You can’t separate amplification from other trust and safety operations,” Matthew Schruers, president of the Computer & Communications Industry Association, which represents Big Tech companies, told Axios.
  • Neither Twitter nor Google would comment on the cases.

What’s next: The Supreme Court will hear arguments this term, with a decision likely by next summer.