This is the first case the Supreme Court will hear regarding Section 230 of the 1996 Communications Decency Act, the law that protects social media platforms like YouTube, Twitter and Facebook from being sued over third-party content on their sites.
The estate of a young woman killed in Paris by ISIS has sued Google/YouTube. The plaintiffs allege that ISIS posted “hundreds of radicalizing videos inciting violence and recruiting potential supporters” to YouTube. They also argue that YouTube’s algorithms promoted this content to “users whose characteristics indicated that they would be interested in ISIS videos.”
Justice Clarence Thomas has suggested in previous dissents and statements in other cases that it is time for the Supreme Court to decide whether Section 230 gives tech companies overly broad liability protections. . .
In the Gonzalez case, the court could rule that platforms are not allowed to use computer algorithms to recommend content to users — something platforms like YouTube, Twitter and Facebook rely on to generate ad revenue and increase user engagement. . .
While Section 230 protects Google against liability for third-party postings of videos, Gonzalez’s petition alleges that Google recommended ISIS videos to users. The question before the court is whether Section 230 grants immunity for algorithmic recommendations that push certain content to users, or whether it applies only to editorial actions — like content removal — taken by the platforms.
Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider” (47 U.S.C. § 230). In other words, online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do. The protected intermediaries include not only regular Internet Service Providers (ISPs), but also a range of “interactive computer service providers,” including basically any online service that publishes third-party content.
If the court finds in favor of the Gonzalez family, the result could seriously impact the revenue of companies like YouTube and Facebook, and might – depending on how the Court rules – also affect what consumers are allowed to share on those platforms.
If that is the case, the question becomes: Who would decide what content is acceptable? I can’t help but think that the platforms could use a decision touching on third-party content as an excuse to block speech they oppose — which, as we know, is already under way. A SCOTUS ruling regarding content could be used by platforms to expand this censorship. If SCOTUS restricts its ruling to the algorithms that social media platforms use to push certain content, that might be a good thing.
How might a ruling allowing lawsuits affect posts like those here by me on this blog – and by you in our comment section? Could I be sued if a comment by one of you is thought to be a threat against a person or group? Would the comment sections of blogs and publications disappear? Would small blogs that discuss politics and controversial issues disappear altogether because of the legal risks of hosting them?
More on this subject from the National Law Review.