Google Bows To Indian Court Over Religious Content


Both Facebook and Google have removed online material alleged to pose a risk of social unrest in India, following legal threats to block the sites completely.

The two companies are among 21 firms hit with a civil lawsuit regarding offensive content.

The lawsuit, brought by a private citizen, claims the material involves religious issues that could provoke unrest.

Google's problems appear tied to its Blogger and YouTube sites. The company says it took the material down in response to a court order, in line with its standard policy for handling legal requests in any country.

The material has been blocked for users in India but remains accessible elsewhere.

Google Rejects Nearly Half of Demands

In its latest published figures, covering January to June 2011, Google received 68 government requests to remove 358 items. In the absence of court orders, the company complied in just over half of these cases.

Google noted that many of the requests involved offensive language about religious leaders, and it removed only clips "that appeared to violate local laws prohibiting speech that could incite enmity between communities." (Source: google.com)

Two separate court cases on the subject are currently under way in India.

Aside from the civil case, there is also a criminal case sparked by a formal complaint from an actor and journalist.

The journalist argues that the websites have hosted images offensive to multiple religions, including cartoons portraying the prophet Mohammed and "distorted" images of the Hindu goddess Saraswati. (Source: yahoo.com)

A local court will decide next week if that case should be thrown out before it reaches a scheduled trial.

Ethnically Harmful Content Barred By Law

Indian law gives Internet firms 36 hours after being notified to remove offensive material. The rules say the threshold for offensiveness is that the material is "ethnically harmful" or "grossly harmful." (Source: wsj.com)

Indian officials have previously argued that web firms should screen everything users upload before it goes live, so that offensive material never becomes available online.

Firms like Facebook and Google say they have too many users to make this approach practical.

In fact, Google said last year users uploaded a total of 60 hours of video every minute, which would have required an average of 3,600 people working around the clock to pre-screen all those clips.
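That estimate follows from simple arithmetic: 60 hours of footage is 3,600 minutes, so every minute of real time brings in 3,600 minutes of new video, and keeping pace would take roughly 3,600 people watching continuously. A minimal back-of-the-envelope sketch in Python (our own illustration, not Google's calculation):

# Rough check of the pre-screening estimate (illustrative only).
HOURS_UPLOADED_PER_MINUTE = 60   # video uploaded to YouTube per real-time minute (Google's figure)
MINUTES_PER_HOUR = 60

# Minutes of new footage arriving during each minute of real time.
footage_minutes = HOURS_UPLOADED_PER_MINUTE * MINUTES_PER_HOUR  # 3,600

# One reviewer can watch one minute of video per real-time minute,
# so keeping up would take this many people working around the clock.
reviewers_needed = footage_minutes
print(reviewers_needed)  # prints 3600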
