Online Politics Riddled with 'Bots' Perpetuating Fake News

By John Lister

Political discussion on social media is often driven and manipulated by automated "bots," a university department claims. The word "bot" is short for "robot" and refers to online activity carried out by an automated program.

The Oxford Internet Institute says it studied events in nine countries and found that bots played a role in every case. However, social media companies have questioned such studies, saying they are often based on flawed research methods.

The study looked at activity on Facebook and Twitter among users in Brazil, Canada, China, Germany, Poland, Russia, Taiwan, Ukraine and the US. It covered posts relating to security, major political events and elections.

Social Media News Feeds Gamed

Though the methods varied, the bots weren't actually writing or creating content. Instead, they were automatically sharing content - usually false or misleading information - on a massive scale. In particular, they took advantage of the way social media sites decide which content users see as a priority.
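The gaming described above can be illustrated with a toy sketch. Assuming a simplified trending algorithm that ranks stories purely by raw share volume (the function and sample data below are hypothetical, not any platform's actual code), a small botnet resharing one bogus story can easily drown out organic activity:

```python
from collections import Counter

def trending(shares, top_n=3):
    """Rank stories by raw share count - a simplified stand-in for the
    engagement-driven ranking the article describes."""
    counts = Counter(shares)
    return [story for story, _ in counts.most_common(top_n)]

# Organic activity: genuine stories shared by a handful of real users.
organic = ["real_story"] * 5 + ["other_story"] * 3

# A small botnet resharing one bogus story dwarfs the organic signal.
bot_shares = ["bogus_story"] * 50

print(trending(organic + bot_shares))
# The bogus story tops the list purely on volume.
```

Because the ranking sees only volume, it cannot distinguish fifty automated reshares from fifty genuinely interested readers, which is exactly the weakness the bots exploit.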

In one example, the bots would retweet bogus stories that had originally been placed on online news sites with little or no editorial quality control. In another, the bots helped spread a false claim about a missing Malaysia Airlines plane, originally posted by an account that purported to belong to an air traffic controller.

In both cases, the idea wasn't just to spread the content to more people online, but also to push it into "trending" lists of popular stories, in turn increasing the chances that mainstream media such as TV stations would repeat or discuss the claims.

Russian Accounts Often Bogus

The problem was biggest in Russia, where the researchers estimated that almost half of accounts that regularly post political content are operated by bots rather than real people. (Source: washingtonpost.com)

The lead researcher of the study says the risk of censorship means social media sites, rather than governments, should be the ones to tackle the problem. Philip Howard said sites should adjust their algorithms to take more account of how reliable cited media sources are, and introduce a more random element so that users see opposing viewpoints. He also called for sites to make their algorithms available for independent vetting to check whether they are flawed. (Source: bbc.co.uk)
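Howard's two suggestions can be sketched together in a few lines. This is a hypothetical illustration only (the reliability table, weights, and function names are invented for the example, not any platform's real ranking): engagement is down-weighted by how reliable the cited source is, and a small random term lets lower-ranked items occasionally surface.

```python
import random

# Hypothetical reliability weights (0 to 1). A real system would derive
# these from editorial-standards signals, not a hard-coded table.
RELIABILITY = {"established_outlet": 0.9, "anonymous_blog": 0.2}

def feed_score(shares, source, jitter=0.0):
    """Score an item for a news feed: engagement scaled by source
    reliability, plus an optional random element that lets less popular
    (possibly opposing) viewpoints surface occasionally."""
    base = shares * RELIABILITY.get(source, 0.5)
    return base + random.uniform(0, jitter)

# With reliability weighting, a modestly shared story from a reliable
# outlet can outrank a heavily bot-shared story from a dubious one.
print(feed_score(30, "established_outlet"))   # 27.0
print(feed_score(100, "anonymous_blog"))      # 20.0
```

Under raw share counts the anonymous blog's story would win 100 to 30; the reliability weight reverses that ordering, which is the effect Howard is asking for.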

What's Your Opinion?

Do you think bots are a serious problem when it comes to social media discussion? Is the answer to give users more control over what content they see and how it is selected? How do you decide which shared stories to take seriously?
