Supreme Court Sidesteps Ruling on Scope of Internet Liability Shield

The Supreme Court handed twin victories to technology platforms on Thursday, sidestepping an effort to limit a powerful liability shield for user posts and ruling that a law allowing suits for aiding terrorism did not apply to the ordinary activities of social media companies.

The court’s unanimous decision in one of the cases, Twitter v. Taamneh, No. 21-1496, effectively resolved both cases and allowed the justices to duck difficult questions about the scope of a 1996 law, Section 230 of the Communications Decency Act.

In a brief, unsigned opinion in the case concerning the liability shield, Gonzalez v. Google, No. 21-1333, the court said it would not “address the application of Section 230 to a complaint that appears to state little, if any, plausible claim for relief.” The court instead returned the case to the appeals court “to consider plaintiffs’ complaint in light of our decision in Twitter.”

The Twitter case concerned Nawras Alassaf, who was killed in a terrorist attack at the Reina nightclub in Istanbul in 2017 for which the Islamic State claimed responsibility. His family sued Twitter, Google and Facebook, saying they had allowed ISIS to use their platforms to recruit and train terrorists.

Justice Clarence Thomas, writing for the court, said the “plaintiffs’ allegations are insufficient to establish that these defendants aided and abetted ISIS in carrying out the relevant attack.”

He wrote that the defendants transmitted staggering amounts of content. “It appears that for every minute of the day, approximately 500 hours of video are uploaded to YouTube, 510,000 comments are posted on Facebook, and 347,000 tweets are sent on Twitter,” Justice Thomas wrote.

And he acknowledged that the platforms use algorithms to steer users toward content that interests them.

“So, for example,” Justice Thomas wrote, “a person who watches cooking shows on YouTube is more likely to see cooking-based videos and advertisements for cookbooks, whereas someone who likes to watch professorial lectures might see collegiate debates and advertisements for TED Talks.”

“But,” he added, “not all of the content on defendants’ platforms is so benign.” In particular, “ISIS uploaded videos that fund-raised for weapons of terror and that showed brutal executions of soldiers and civilians alike.”

The platforms’ failure to remove such content, Justice Thomas wrote, was not enough to establish liability for aiding and abetting, which he said required plausible allegations that they “gave such knowing and substantial assistance to ISIS that they culpably participated in the Reina attack.”

The plaintiffs had not cleared that bar, Justice Thomas wrote. “Plaintiffs’ claims fall far short of plausibly alleging that defendants aided and abetted the Reina attack,” he wrote.

The platforms’ algorithms did not change the analysis, he wrote.

“The algorithms appear agnostic as to the nature of the content, matching any content (including ISIS’ content) with any user who is more likely to view that content,” Justice Thomas wrote. “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting. Once the platform and sorting-tool algorithms were up and running, defendants at most allegedly stood back and watched; they are not alleged to have taken any further action with respect to ISIS.”

A contrary ruling, he added, would expose the platforms to potential liability for “each and every ISIS terrorist act committed anywhere in the world.”

The court’s decision in the Twitter case allowed the justices to avoid ruling on the scope of Section 230, a 1996 law intended to nurture what was then a nascent creation called the internet.

Section 230 was a reaction to a decision holding an online message board liable for what a user had posted because the service had engaged in some content moderation. The provision said, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Section 230 helped enable the rise of huge social networks like Facebook and Twitter by ensuring that the sites did not assume legal liability with every new tweet, status update and comment. Limiting the sweep of the law could expose the platforms to lawsuits claiming they had steered people to posts and videos that promoted extremism, urged violence, harmed reputations and caused emotional distress.

The case against Google was brought by the family of Nohemi Gonzalez, a 23-year-old college student who was killed in a restaurant in Paris during terrorist attacks there in November 2015, which also targeted the Bataclan concert hall. The family’s lawyers argued that YouTube, a subsidiary of Google, had used algorithms to push Islamic State videos to interested viewers.

A growing bipartisan group of lawmakers, academics and activists has become skeptical of Section 230, saying it has shielded giant tech companies from consequences for disinformation, discrimination and violent content across their platforms.

In recent years, they have advanced a new argument: that the platforms forfeit their protections when their algorithms recommend content, target ads or introduce new connections to their users. These recommendation engines are pervasive, powering features like YouTube’s autoplay function and Instagram’s suggestions of accounts to follow. Judges have mostly rejected this reasoning.

Members of Congress have also called for changes to the law. But political realities have largely stopped those proposals from gaining traction. Republicans, angered by tech companies that remove posts by conservative politicians and publishers, want the platforms to take down less content. Democrats want the platforms to remove more, like false information about Covid-19.

Halimah DeLaine Prado, Google’s general counsel, welcomed the court’s decision, or lack of one, in the Gonzalez case. “Companies, scholars, content creators and civil society organizations who joined with us in this case will be reassured by this result,” she said in a statement.

David McCabe contributed reporting.



Source: www.nytimes.com
