
Google Fined $65,000 By A French Court Over Slanderous Search Suggestion

December 30, 2011

France — It seems that France is not a big admirer of Google’s automated search suggestions, and that lack of human oversight is now costing the company. According to the news site The Local, Google France has been ordered to pay €50,000 ($65,000) to a French insurance company called Lyonnaise de Garantie, after its search engine automatically added the word “escroc” (“crook” or “swindler”) after the company’s name.

The American internet giant’s search engine includes Google Suggest, an auto-complete technology which suggests the rest of the phrase based on the first few characters or words typed in. But occasionally, those suggestions can be detrimental to the reputation of a company or individual. That is why there is a whole new industry around online reputation management for search results and even search suggestions.

Image source: Search Engine Land

Google has been sued and convicted numerous times in France over these search suggestions. In this case, a Paris court ordered Google France to pay €50,000 (about $65,000) over the suggestion shown for queries about the insurance company Lyonnaise de Garantie: according to The Local, typing the company’s name into the search engine automatically appended the French word “escroc”, meaning crook or swindler.

French news site The Local reports:

A Paris court held that the addition of the offending word “was offensive towards the company.” The court said that Google should be able to exercise “human control” over the functioning of words suggested by its search engine.

Google said the auto-complete functionality was not the “expression of a human thought”, an “opinion” or a “value judgement or criticism” but was the result of its automatic algorithm.

Google explains how the feature works in a help center article:

As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities. In addition, if you’re signed in to your Google Account and have Web History enabled, you may see search queries from relevant searches that you’ve done in the past. All of the predicted queries that are shown in the drop-down list have been typed previously by Google users.

For certain queries, Google will show separate predictions for just the last few words. Below the word that you’re typing in the search box, you’ll see a smaller drop-down list containing predictions based only on the last words of your query. While each prediction shown in the drop-down list has been typed before by Google users, the combination of your primary text along with the completion may be unique.

Predicted queries are algorithmically determined based on a number of purely objective factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.
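To make that description concrete, here is a minimal sketch of a prefix-based suggestion index of the kind Google outlines: predictions come only from queries users have previously typed and are ranked purely by popularity, with no human curation. Everything below (the SuggestionIndex class, the sample queries) is an illustrative assumption, not Google’s actual code or data.

from collections import defaultdict

class SuggestionIndex:
    """Toy prefix-based autocomplete: predictions come from previously
    typed queries and are ranked purely by how often they were typed."""

    def __init__(self):
        # query text -> number of times users have typed it
        self.query_counts = defaultdict(int)

    def record_query(self, query: str) -> None:
        """Log a query typed by some user; past queries are the only data source."""
        self.query_counts[query.lower().strip()] += 1

    def suggest(self, prefix: str, limit: int = 5) -> list[str]:
        """Return the most popular previously typed queries that start
        with the given prefix -- no human intervention involved."""
        prefix = prefix.lower().strip()
        matches = [(count, q) for q, count in self.query_counts.items()
                   if q.startswith(prefix)]
        matches.sort(key=lambda pair: (-pair[0], pair[1]))
        return [q for _, q in matches[:limit]]

if __name__ == "__main__":
    index = SuggestionIndex()
    # Simulated search activity from many users (invented queries).
    for q in ["acme insurance", "acme insurance reviews",
              "acme insurance reviews", "acme insurance claims"]:
        index.record_query(q)
    print(index.suggest("acme ins"))
    # ['acme insurance reviews', 'acme insurance', 'acme insurance claims']

In a real system the ranking would mix in many more signals (freshness, personalization, language), but the key point the court objected to is visible even in this toy: the output is determined entirely by what other people have typed.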

Google does have what it refers to as a “narrow set of removal policies” in place for porn, violence, hate speech, and terms frequently used to find content that infringes copyrights. In a separate incident, the French outlet BFM Business reported that the company ran into a similar problem when an individual found his name automatically followed by “violeur” (“rapist”).
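As a purely hypothetical illustration of how such a “narrow” removal layer might sit on top of the automatic predictions, the short sketch below filters out predictions containing blocked terms before they are shown. The blocked-term set and the filter_predictions function are invented for this example and do not reflect Google’s actual policy mechanism.

# Hypothetical removal layer: the terms and function below are invented
# for illustration; they are not Google's actual policy engine.
BLOCKED_TERMS = {"escroc", "violeur"}  # terms removed under policy or legal order

def filter_predictions(predictions: list[str]) -> list[str]:
    """Drop any algorithmically predicted query containing a blocked term,
    leaving the ranking of everything else untouched."""
    return [p for p in predictions
            if not any(term in p.split() for term in BLOCKED_TERMS)]

# Example: the offending suggestion is removed, other predictions are kept.
print(filter_predictions(["lyonnaise de garantie escroc",
                          "lyonnaise de garantie assurance"]))
# -> ['lyonnaise de garantie assurance']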

As of this writing, however, the offending suggestion appears to have been removed.