
Friday, June 7, 2013

Google Ordered by German Court to Limit Defamatory Autocomplete Results

It looks like I was too mired in finals to catch this story when it came out, but I just learned of it today.  Around last November I was clued into the issues surrounding Google’s autocomplete by this story, which prompted me to write an essay on whether similar litigation would succeed in the United States.  The essay is forthcoming in the University of Illinois Journal of Law, Technology & Policy, but I have not put it on SSRN just yet.

As I mention in the essay, lawsuits against websites and other internet content providers are often futile when the offending content has been provided by a third party.  This is due to §230 of the Communications Decency Act, which states that internet content providers are not to be treated as publishers of information provided by third parties.  Courts have interpreted this provision to broadly immunize websites from lawsuits arising from third-party content.  For this reason, there is an added obstacle to lawsuits against Google in the United States that may not be present in other countries: autocomplete results are formed by algorithms that monitor popular searches and suggest terms that are commonly associated with names or topics.  The users doing those searches may be considered third-party content providers from whose conduct Google is immunized.  I argue in the essay that this obstacle may be overcome in the case of autocomplete defamation, however, because the information surfaced through autocomplete is not provided for the purpose of posting on the internet; it is derived from individual searches that people do not expect to be broadcast to the world.
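
To make the third-party angle concrete, here is a minimal, purely illustrative sketch of how an autocomplete feature might derive and rank suggestions from the searches other users have typed.  This is my own toy example, not a description of Google's actual algorithm; the query log and the suggest() helper are hypothetical.

```python
from collections import Counter

# Hypothetical log of searches typed by many different users.
# A real system would aggregate queries at enormous scale.
query_log = [
    "jane doe lawyer",
    "jane doe lawyer",
    "jane doe fraud",      # an unflattering association typed by some users
    "jane doe fraud",
    "jane doe fraud",
    "jane doe biography",
]

def suggest(prefix, log, limit=3):
    """Return the most frequently logged queries that start with the prefix."""
    counts = Counter(q for q in log if q.startswith(prefix))
    return [query for query, _ in counts.most_common(limit)]

# The suggestions are derived entirely from what other users searched for,
# which is why those users might be viewed as the third-party "content providers."
print(suggest("jane doe", query_log))
# ['jane doe fraud', 'jane doe lawyer', 'jane doe biography']
```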

Courts are generally leery of imposing liability on websites because they worry that liability will chill internet speech.  An additional concern, particularly relevant for websites like Google, is that the breadth of content they deal with is massive.  There is a strong argument that Google cannot be expected to monitor the output of its autocomplete algorithm with respect to every search term.  This argument is weaker, however, if Google is required to remove offending content only upon notification by the person defamed.  That appears to be what the German court is requiring.
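
To illustrate why a notice-based duty is narrower than general monitoring, here is a hypothetical sketch in which only suggestions that have been specifically flagged by a complaint are filtered out.  The blocklist and filter function are my own illustration and are not based on how Google actually implements such removals.

```python
# Hypothetical set of suggestions a defamed person has complained about.
# Filtering only these flagged entries is far narrower than proactively
# reviewing the autocomplete output for every possible search term.
blocked_suggestions = {"jane doe fraud"}

def filter_suggestions(suggestions, blocked):
    """Drop any suggestion that has been flagged through a complaint."""
    return [s for s in suggestions if s not in blocked]

raw_suggestions = ["jane doe fraud", "jane doe lawyer", "jane doe biography"]
print(filter_suggestions(raw_suggestions, blocked_suggestions))
# ['jane doe lawyer', 'jane doe biography']
```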


Successful lawsuits like the one in Germany are relevant to potential lawsuits in the United States because, if Google complies with the resulting court orders, that compliance may serve as a model for how Google could regulate its content in the United States should plaintiffs successfully sue for defamation.  While theoretical arguments about chilling speech and restricting features may have persuasive impact, situations like the one in Germany provide an opportunity to observe the actual effects of restrictions on autocomplete and to see whether these theoretical concerns play out in the real world.
