Google Autocomplete Could Breach Court Orders

by John Lister

Google has inadvertently revealed the names of rape victims whose identity is legally secret. It's all down to over-enthusiastic behavior by the search engine's "autocomplete" feature.

Autocomplete kicks in when a user starts typing a term into the Google search bar: the search bar presents a drop-down menu of suggested terms based on what has been typed so far. The user can click or tap any of these terms to run the search without typing the query out in full. As they type more characters, the list of suggestions updates to become more relevant.

Publishing Names Is Illegal

The list of terms is automatically generated based on two factors: the popularity of terms that other users have previously searched for, and the content of web pages indexed by Google. The theory is that combining this information might make it easier to suggest a relevant search term even if it's for a current topic that hasn't had a lot of historical searches.
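To make the idea concrete, here is a minimal toy sketch of that two-factor approach. Google's real algorithm is not public, so the data, scores, and function names below are all assumptions for illustration: past queries are ranked by popularity, while phrases drawn from indexed page content get a baseline score so that fresh topics can surface even without search history.

```python
from collections import Counter

# Hypothetical data: historical queries with search counts, plus phrases
# extracted from indexed web pages (a fresh topic with few past searches).
query_counts = Counter({"weather today": 500, "weather radar": 300,
                        "web design": 200})
indexed_phrases = ["weather warning update"]

def suggest(prefix, limit=3):
    # Factor 1: popularity of past queries matching the typed prefix.
    scored = {q: n for q, n in query_counts.items() if q.startswith(prefix)}
    # Factor 2: phrases from indexed content get a baseline score so a
    # current topic can appear even without historical searches.
    for p in indexed_phrases:
        if p.startswith(prefix):
            scored.setdefault(p, 100)
    # Return the highest-scoring suggestions first.
    return [q for q, _ in sorted(scored.items(), key=lambda kv: -kv[1])][:limit]

print(suggest("weather"))
```

In this toy version, typing "weather" surfaces the two popular historical queries first, followed by the phrase pulled from indexed content.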

The problem comes in cases, highlighted by a British newspaper, in which the suggested search terms identified rape victims. Under British law, victims' names must be kept confidential (unless they waive anonymity) even if the accused person is acquitted at trial. The only exception is if the accuser is later convicted of perjury.

Website Forum Posts Fed Into Algorithm

It appears the autocomplete suggestions were generated by the algorithm based on websites such as discussion forums where users had (usually illegally) revealed the victims' names. For example, somebody searching for the name of the accused or convicted person might be shown an autocomplete suggestion that included the name of the victim. (Source: thetimes.co.uk)

The problem likely arose because Google's vetting systems aren't as tight for autocomplete as for its search results. In many cases, the web pages that named the victim were labeled by Google so that they didn't show up on search result pages. However, Google's systems don't appear to have removed the pages from the database it uses to influence the autocomplete suggestions. (Source: independent.co.uk)
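The reported gap can be sketched in a few lines. This is purely illustrative, with invented data and function names, not Google's actual pipeline: a page flagged for removal is correctly filtered out of search results, but the same flag is never consulted when the autocomplete corpus is built, so the flagged text can still feed suggestions.

```python
# Hypothetical index: one legitimate page and one flagged forum post.
pages = [
    {"url": "news.example/story", "text": "trial verdict report", "flagged": True if False else False},
    {"url": "forum.example/post", "text": "victim name revealed", "flagged": True},
]

def search_results(query):
    # Search respects the flag: flagged pages never appear in results.
    return [p["url"] for p in pages
            if query in p["text"] and not p["flagged"]]

def autocomplete_corpus():
    # The suggestion corpus skips the flag check, so flagged content
    # remains available to influence autocomplete suggestions.
    return [p["text"] for p in pages]

print(search_results("victim"))
print(autocomplete_corpus())
```

Running this, the query returns no search results, yet the flagged phrase is still present in the corpus the suggestion code draws from: the same mismatch the reports describe.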

What's Your Opinion?

Should this situation count as Google 'publishing' the names and thus breaking the law? Does Google need to rethink the way it generates autocomplete suggestions? Is there a limit to how a company can vet automatically generated material?


Comments

Dennis Faas:

This would be very difficult to vet based on an algorithm, unless they somehow flag the accused names from only reliable sources and not forum posts, though this really isn't ideal.

Also worth noting is that I've used Google Image Search (images.google.com) and have searched for someone accused of a crime, and have seen pictures of victims on the same page. A good example would be to search for images of Bill Cosby, who was recently convicted of "aggravated indecent assault". If you keep viewing the images they will eventually show the victims.

eric:

meh, i don't think it would be terribly difficult for Google to tell the auto-complete algorithm to not pull from the list of sites that are blocked in search results. Probably it would just delay the auto-complete algorithm by a noticeable amount.

Chief:

What's the point of having search engines if they are all censored?
The entire reason anything exists is because someone has published it.
A search engine is just like a telephone or an automobile.
Either may be used for legal as well as illegal activity.
Blame the persons outing the information illegally - not the carriers.