YouTube Algorithm Sparks Supreme Court Case

By John Lister

The Supreme Court will rule on a key challenge to the extent to which tech companies are legally responsible for user content. The outcome could affect the long-running "publisher vs platform" debate.

The case centers on Section 230 of the Communications Decency Act, which broadly says Internet companies aren't legally responsible for content their users post, including cases of defamation.

The validity and interpretation of that rule have been challenged many times since it was created in 1996, partly because technology has evolved. Critics of the rule say it was written when the main issue was whether web hosting companies were responsible for material on their customers' websites.

The big question now is how it applies to websites that host user-generated content such as social media and video sites. The companies that run such sites argue they are merely platforms and don't exercise editorial control in the same way as publishers.

Moderation May Remove Protection

Many legal challenges have argued that because such sites carry out content moderation, for example blocking some posts for violating terms of use, they do in fact act as editors and bear responsibility for the material they publish. Courts have struggled to define a clear line beyond which sites exercise enough control to become "publishers".

The latest case comes at the topic from a slightly different angle: rather than concentrating on content moderation, it deals with the way tech companies decide which content to show or promote to users. This often involves algorithms that try to figure out what content will be most likely to appeal to a user's interests or spark an emotional response that makes them more likely to engage with and share the material.
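To make that idea concrete, here is a minimal, purely hypothetical sketch of how an engagement-driven ranker might work. It does not reflect YouTube's actual system, which is proprietary; the Video fields, the weighting, and every name here are invented for illustration only.

```python
# Toy engagement-based ranker: a hypothetical sketch, NOT YouTube's system.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    topic: str
    predicted_watch_time: float   # model's estimate, in minutes (hypothetical)
    predicted_share_rate: float   # model's estimate, 0.0 to 1.0 (hypothetical)


def score(video: Video, user_topic_affinity: dict[str, float]) -> float:
    """Combine predicted engagement signals into one ranking score."""
    affinity = user_topic_affinity.get(video.topic, 0.0)
    # Invented weighting: favor content the user lingers on and shares.
    return affinity * (video.predicted_watch_time + 10 * video.predicted_share_rate)


def recommend(videos: list[Video],
              user_topic_affinity: dict[str, float],
              k: int = 3) -> list[Video]:
    """Return the k highest-scoring candidates for this user."""
    return sorted(videos, key=lambda v: score(v, user_topic_affinity), reverse=True)[:k]


if __name__ == "__main__":
    user = {"cooking": 0.9, "news": 0.2}
    candidates = [
        Video("Knife skills", "cooking", 8.0, 0.05),
        Video("Daily briefing", "news", 4.0, 0.10),
    ]
    for v in recommend(candidates, user, k=2):
        print(v.title, round(score(v, user), 2))
```

The point of the sketch is that a system like this optimizes for predicted engagement, not for the nature of the content itself, which is exactly the kind of active selection the case puts in question.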

Terror Attacks Follow Videos

This case is brought by the family of a woman killed in a terrorist attack. They say YouTube bears some responsibility because it recommended videos that encouraged extreme views and were posted by known members of terror groups. How much of a role the videos played in encouraging the attack is a question specific to this case, not the wider legal point the Supreme Court will address. (Source: nbcnews.com)

YouTube argued that the case should be thrown out because Section 230 means it automatically bears no responsibility. The family argue that because YouTube chose (through its algorithm) to promote the videos, making them more likely to be seen by particular users, Section 230 doesn't apply. (Source: scotusblog.com)

What's Your Opinion?

Is Section 230 still relevant today? How much responsibility should tech companies bear for user content on their sites? What's the dividing line between publisher and platform?


Comments

matt_2058:

IMO, Section 230 is still relevant as summarized in this piece. At the same time, I do not think it should apply when an entity curates information in such a way that it targets a user.

It seems like the rule was written for internet providers and site hosting companies. As I read it, it does not alleviate responsibility for DIRECT and pointed involvement in a questionable activity. When reading "Internet companies" in the first quote, I believe it refers to ISPs, not to "internet company" as the term is used today, meaning any business with an internet presence.

"Section 230 of the Communications Decency Act, which broadly says Internet companies aren't legally responsible for content they post, including cases of defamation."
&
"The validity and interpretation of that rule has been challenged many times since it was created in 1996, partly because technology has evolved. Critics of the rule say it was written when the main issue was whether web hosting companies were responsible for material on their customer's websites."

Chief:

Section 230 does not allow a platform to be sued for content.
Let the platforms know they will lose their protection the second they create "fact checkers".

If someone is being "provocative" are they following the platform rules?

What's the point of having an internet if Big Brother controls it?

People need to re-learn the concept of "sticks and stones" and get away from this whole "emotional wellbeing" folderol.

If it's libel, deal with it in court.
If it's emotional, see a shrink.