WhatsApp Restricts Message Forwarding to Combat Fake News

By John Lister

A messaging service owned by Facebook is purposely limiting the number of times users can forward a message. It's an attempt to slow the spread of false information.

The change to WhatsApp follows a six-month trial in India prompted by several cases where bogus stories led to lynch mobs.

WhatsApp lets users send text messages, video, images and documents. One of its core features is a group system that lets a set of friends, family members, work colleagues or people with a shared interest send a single message that reaches everyone in the group. The maximum group size is 256.

One Message Could Reach 5,000 People

Until now, a single message could be forwarded to 20 groups. As the BBC calculates, that means somebody could theoretically send the same message to 5,120 people, assuming no duplicate group members. With the new limit of five forwards, the same message can now reach a maximum of 1,280 users instead of 5,120. (Source: bbc.co.uk)
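For readers who want to check the BBC's figures, the arithmetic is simple. The sketch below is purely illustrative (the function name is ours, not part of WhatsApp):

```python
# Back-of-the-envelope check of the forwarding math, assuming the
# maximum group size of 256 and no duplicate members across groups.
MAX_GROUP_SIZE = 256

def max_reach(forward_limit, group_size=MAX_GROUP_SIZE):
    """Maximum number of recipients of a single forwarded message."""
    return forward_limit * group_size

print(max_reach(20))  # old limit of 20 groups -> 5120 people
print(max_reach(5))   # new limit of 5 groups  -> 1280 people
```

In practice the real reach is lower, since people typically belong to overlapping groups.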

Of course, the people who receive the message will still be able to forward it on again. However, the trial in India showed a dramatic slowdown in the rate at which the audience for a specific message grew.

India was selected for the test following reports of as many as two dozen people being killed by mobs after messages falsely accused them of kidnapping children. The Washington Post said the problem stemmed partly from the fact that so many people in India are getting online for the first time and lack the experience needed to spot bogus stories. (Source: washingtonpost.com)

Messages Can't Be Filtered

Reducing the pace of message forwarding is an important step because of the way WhatsApp works. It uses end-to-end encryption, meaning only the users who send and receive messages can read them. That's a big selling point in countries where users fear government snooping or political crackdowns. However, it also means WhatsApp cannot automatically scan messages to spot potentially dangerous false claims.

As well as reducing the forwarding limit, WhatsApp is removing the button that lets users forward a message with a single click or tap. The idea is a gentle nudge: requiring an extra step gives users a moment to think twice about whether the contents are genuine.

What's Your Opinion?

Is this a worthwhile change? Does it go far enough? Should messaging services try to deal with social problems associated with their products or is it entirely the responsibility of customers how they use it?


Comments

kitekrazy:

I guess that would be a PPOV (political point of view). See what happened to the Catholic kids. There are certain legal battles and complaints that some of these social-type apps censor or remove any conservative viewpoints.

The media competes to produce the news first and accuracy is never a thought. It takes a day or two to clarify false information. By then it isn't "news".