The Internet is a remarkable platform for giving each of us a voice to reach a global audience. In some cases, unfortunately, people abuse this freedom by publishing unlawful content. Europe’s E-Commerce Directive provides clear rules for dealing with this content without sacrificing the Internet’s broader free expression mission. Importantly, the law says platforms should not be forced to become Internet police, monitoring all content to prevent certain material from ever getting online.
In a Paris courtroom today, former Formula One head Max Mosley's lawyers asked a judge to upset this balance by imposing an alarming new model of automated censorship. He wants web companies to build software filters in an attempt to automatically detect and delete certain content. Specifically, Mr. Mosley demands that Google build a filter to screen Google’s index and proactively block pages containing the images at issue from our results – without anyone, much less a judge, ever seeing a page or understanding the context in which the image appears.
We sympathize with Mr. Mosley, and with anyone who believes their rights have been violated. We offer well-established tools to help people to remove specific pages from our search results when those pages have clearly been determined to violate the law. In fact, we have removed hundreds of pages for Mr. Mosley, and stand ready to remove others he identifies.
But the law does not support Mr. Mosley’s demand for the construction of an unprecedented new Internet censorship tool. In repeated rulings, Europe’s highest court has noted that filters are blunt instruments that jeopardise lawful expression and undermine users’ fundamental right to access information. A set of words or images may break the law in one context, but be lawful in another. As an example, a filter might end up censoring news reports about Mr. Mosley’s own court case.
Besides being a dangerous new censorship tool, the filter would fail to solve Mr. Mosley’s problems. Pages removed from search results remain live on the Internet, accessible to users by other means – from following links on social networks to simply navigating to the address in a browser. As an example, one page Mr. Mosley sought to remove comes from a blog which, according to public sources, receives the vast majority of its visits from sources other than web search.
This is not just a case about Google, but about the entire Internet industry. If Mr. Mosley’s proposal prevails, any start-up could face the same daunting and expensive obligation to build new censorship tools -- despite the harm to users’ fundamental rights and the ineffectiveness of such measures.
We don’t hold paper makers or the people who build printing presses responsible if their customers use those products to break the law. The true responsibility for unlawful content lies with the people who produce it; how web companies work to reduce this content is set out in the E-Commerce Directive. We hope that the courts of France and Germany, where Mr. Mosley has also filed suit, will reject his request for a censorship machine.
Posted by Daphne Keller, Associate General Counsel