Cross-party committee tells government to consider legislation if Google does not develop filtering technology to preserve privacy
The UK Parliament’s Joint Committee on Privacy and Injunctions has asked Google to censor its searches, filtering out results deemed to breach privacy. Should the search giant fail to comply, the committee has suggested the government introduce legislation forcing Google to filter searches.
The report, written by MPs and peers, addresses privacy issues on sites including Twitter and Facebook, but Google gets the brunt of criticism for its “totally unconvincing” stance on developing censoring technology.
Twitter’s role in privacy breaches was highlighted by the Ryan Giggs incident last year, in which details of the footballer’s so-called “superinjunction” — an order whose very existence was not supposed to be disclosed — were anonymously leaked through the site and repeated upwards of 75,000 times.
While the flouting of the injunction was highly publicised, destroying the footballer’s privacy in the process, the committee notes that measures are in place on Twitter to take action against those in contempt of court. The company recently announced it would begin censoring content on a country-by-country basis to comply with national laws, and a precedent has been set in identifying and charging users who break court orders, opening the way for further contempt proceedings in future.
The committee suggests that in the case of Twitter and Facebook, “when granting an injunction, courts should be proactive in directing the claimant to serve notice on Internet content platforms, such as Twitter and Facebook”. It also proposed more stringent action in civil contempt of court cases to act as a deterrent.
In Google’s case, the proposed action to ensure court orders are met was more direct. Citing the Max Mosley case in 2011, in which the former F1 boss took the search engine to court to demand proactive filtering of links to illegally obtained images of him, the committee said Google should develop technology to comply actively with such privacy orders.
“We recommend that if legislation is necessary to require them to do so it should be introduced,” the report said.
Google said that while it does not currently have a system to censor links to, and copies of, offending content, such proactive monitoring is not required under the Electronic Commerce Directive, and that filtering by algorithm would run counter to the company’s policy.
“As a policy matter, I do not think that would be a good idea,” said Daphne Keller, associate general counsel at Google, “simply because an algorithm or a computer programme that tried to do something like that would not have the ability that a judge does or any person does to see the context, to see if a particular phrase is actually appearing in a news report or in political commentary.”
The committee dismissed Google’s objections to proactive monitoring as “totally unconvincing” and pressed for a proactive response.
According to the full report, however, some members of the committee suggested leaving content removals in human hands, citing the potential pitfalls of automated censorship. Their proposed amendment, which made no reference to legislation, was rejected by 13 votes to six.