Tech’s online content shield dented by product liability claims

Plaintiffs seeking to hold online platforms responsible for facilitating crimes committed by their users are finding success using a new avenue: product liability.

The platforms enjoy broad liability protection for user posts under Section 230 of the Communications Decency Act of 1996, a law that has been credited with the rise of social media and of the internet as we know it. It has also been criticized for shielding some of the world’s biggest companies when they turn a blind eye to abuse and harm perpetrated through their services.

Federal judges overseeing lawsuits against Omegle.com LLC and Snap Inc. reached opposing conclusions, days apart, on whether those companies are protected by the Section 230 liability shield. Omegle, a website that randomly pairs users for video chats, and Snapchat, the person-to-person photo-sharing app, each allegedly facilitated the sexual abuse of a minor by an adult.

While Snap escaped trial in Texas, Omegle will have to keep fighting in Oregon, where a judge agreed the company was being treated as the maker of a faulty product, not the host of user-created content.

“These social media companies have dangerously designed products and should be held liable for the harms they cause to their users,” said Carrie A. Goldberg, a victims’ rights attorney representing the plaintiff in the Omegle case.

But supporters of Section 230 protections say the Oregon judge got the Omegle ruling wrong. They argue the new trend of product liability litigation could have harmful consequences for freedom of expression on the internet and for the ability of vulnerable people to express themselves.

If courts start to limit the protection in these cases, “Section 230 could still be stripped away, including when users are using the platform as intended,” said Cathy Gellis, an internet lawyer who has written amicus briefs supporting the law.

Publisher or product manufacturer?

Section 230, enacted in 1996 as part of a sweeping package of communications law amendments, provides a safe harbor for interactive computer services that host user content. The law was intended to foster the growth of the internet by allowing platforms to moderate user-provided content without being treated as publishers, which would expose them to lawsuits over user posts.

Twitter Inc. and Meta Platforms Inc.’s Facebook, for example, cannot be held liable if they host a defamatory post created by one of their users.

For more than two decades, the Section 230 liability shield was nearly impenetrable in courtrooms. But in 2021, the Ninth Circuit ruled that Snap could be sued over the design of a feature in its app.

In that case, Lemmon v. Snap, two boys were killed in a car crash after trying to take photos using Snapchat’s “speed filter,” which shows how fast a user is moving. The parents sued Snap, arguing that the speed filter contributed to the deaths.

The appeals court concluded that Section 230 did not apply because the parents treated “Snap as a manufacturer of products, accusing it of negligently designing a defective product,” not as a publisher.

The product liability argument has since gained traction among plaintiffs’ attorneys. In recent months, lawsuits filed in courts across the country have accused TikTok of encouraging young users to participate in the dangerous “blackout” challenge and claimed that Facebook caused addiction, depression, and suicide in children.

“Product liability law is a well-known vehicle for inducing companies to make safer products by internalizing the cost of safety,” said Matthew Bergman, an attorney with decades of asbestos litigation experience who founded the Social Media Victims Law Center last year. He is counsel in many of the new lawsuits.

Goldberg was among the first to deploy the product liability theory, in a 2017 lawsuit against Grindr that ultimately failed on appeal in the Second Circuit. But she said “the flurry of litigation we’re seeing is basically an endorsement of this concept.”

The legal tactic has drawn skepticism. Eric Goldman, professor of internet law at Santa Clara University School of Law, said the product analogy is inappropriate in the context of online platforms, which are almost always based on speech.

“We’re talking about services that help people talk to each other,” Goldman said. “That’s not an appropriate analogy, because ultimately if we say plaintiffs can decide whether speech products are properly designed, that just leads to outright censorship.”


Split decisions

In the Snap case, U.S. District Judge Lee H. Rosenthal of the Southern District of Texas rejected the product liability theory. Her July 7 ruling found that Snap could not be held liable for facilitating the sexual grooming of a high school student by a teacher at the school.

Lawyers for the plaintiff argued that Snap’s app was negligently designed because it makes it easy for users to lie about their age to create an account, and that it lacked safety features that would have prevented communication between the plaintiff and the teacher.

The judge, however, said the suit still treated Snap as a publisher, so the company could not be held liable.

Only a week later, however, U.S. District Judge Michael W. Mosman in Oregon largely denied Omegle’s motion to dismiss similar claims. The platform could be accused of helping facilitate sexual abuse by randomly matching a minor with a sexual predator, the judge said.

Mosman was convinced by Goldberg’s product liability argument. “Plaintiff’s case does not rely on third-party content,” the judge wrote. “The plaintiff’s assertion is that the product is designed to connect people who should not be connected (underage children and adult males) and does so before any content is exchanged.”

Omegle said in a statement to Bloomberg Law that it disagrees with the court’s decision and that the lawsuit seeks to “hold Omegle accountable for the wrongful actions of a third-party user of its chat service,” which the company says is barred by Section 230.

Goldman said he believed the judge “cut corners” and misapplied the Lemmon precedent, which Goldman said was narrowly drawn. The harm in this case stems from communication between users, which falls within the scope of the liability shield, he argued.

If Omegle randomly paired adults and minors but never allowed them to communicate, the platform could still be held liable under Mosman’s interpretation, which doesn’t make sense, Goldman said. “The only time harm could happen isn’t because of the pairing, it’s because of the resulting conversation.”

Reducing Section 230 protections will also result in fewer safe spaces for vulnerable people on the internet, Gellis argued. Without the protections, platforms that don’t have perfect security features or moderation policies will shut down in the face of increased legal liability, the internet law attorney said.

When Congress amended Section 230 in 2018 to remove the safe harbor in online prostitution cases, the legal repercussions fell on ordinary consumers, Gellis said. Consenting sex workers lost access to useful online tools and platforms, such as Craigslist’s personals section, which the company removed under threat of legal action, she said.

“The bigger picture is that more people are being helped by Section 230 than hurt by Section 230,” Gellis said.

But Goldberg said platforms should be held accountable for the “catastrophic injuries” in the cases her firm brings. “Holding Omegle responsible for this kind of extreme abuse really has no impact on the free exchange of ideas on the internet,” she said. “There must be limits.”