18 Nov 2013

Google and Bing pledge to block online child abuse images

Technology giants Google and Microsoft agree to block 100,000 search terms that lead to child abuse images, in a move welcomed by David Cameron. But experts warn the impact will be minimal.

The leading US tech companies have agreed to introduce new software that will not only block certain searches, but will also trigger warnings that the requested images are illegal.

The restrictions will be launched in the UK first, before being expanded to other English-speaking countries and 158 other languages in the next six months.

A further 13,000 search terms linked with child sex abuse will trigger warnings from Google and charities, telling the user that the content could be illegal and pointing them towards help. Microsoft, which runs the search engine Bing, and Google made the announcement on the morning of an internet safety summit at Downing Street.
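Neither company has described how the warning system works. As a rough illustration of the behaviour reported above, and assuming nothing more than a simple blocklist lookup, a query matching one of the flagged terms would return a warning and a pointer to support instead of ordinary results. Every name, message and helper in this sketch is hypothetical.

```python
# Illustrative sketch only: assumes a plain blocklist lookup, not the actual
# system run by Google or Microsoft. All identifiers here are invented.

FLAGGED_TERMS = {"example flagged term"}  # stand-in for the ~13,000 terms

HELP_MESSAGE = (
    "Child sexual abuse imagery is illegal. "
    "Confidential support and counselling services are available."
)

def handle_query(query: str) -> dict:
    """Return either ordinary results or a warning, based on the query text."""
    normalised = query.strip().lower()
    if normalised in FLAGGED_TERMS:
        # Suppress results entirely and show the warning with a route to help.
        return {"results": [], "warning": HELP_MESSAGE}
    return {"results": run_search(normalised), "warning": None}

def run_search(query: str) -> list:
    """Placeholder for the normal search pipeline."""
    return [f"result for {query!r}"]
```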

Prime Minister David Cameron hailed the decision by the two internet giants, which account for 95 per cent of search engine traffic, as “significant progress”.

He said he would work with the National Crime Agency (NCA), and would resort to legislation if the companies failed to block child abuse material.

When he called on internet firms to tackle child abuse images in June, the companies had insisted that it “couldn’t be done, shouldn’t be done”, Mr Cameron added.


The ‘dark web’

However, some child protection experts have said that most illegal child abuse images are shared on hidden peer-to-peer (P2P) and encrypted networks, which are beyond the reach of mainstream search engines.

Joe McNamee, executive director at European Digital Rights, told Wired.co.uk: “The UK government has focused on getting private companies to take unproven technologies to address problems that have never been subject to any credible amount of independent analysis.

“Is it responsible and defensible to replace real action against crime with superficial actions against the symptoms of the crime? Is it responsible and defensible to adopt any policy on such an important issue with so little proper analysis? Is it reasonable and defensible to use unproven – and possibly counterproductive – technologies to do this?”

Claire Perry MP, the prime minister’s adviser on preventing sexualisation of children, insisted that the change was a “massive step forward”, but she said that the next step was to tackle sharing of abuse pictures on what is commonly called the “dark web”, using methods such as peer-to-peer networks.

The government is expected to announce a special joint task force involving UK and US intelligence forces, aimed at tackling child abuse footage on peer-to-peer networks.

‘We’ve listened’

Google chief executive Eric Schmidt told the Daily Mail that Google had been working with Microsoft and law enforcement agencies since the summer following strong warnings from the government to take action.

“We’ve listened, and in the last three months put more than 200 people to work developing new, state-of-the-art technology to tackle the problem,” he said. “We’ve fine tuned Google search to prevent links to child sexual abuse material from appearing in our results.

“While no algorithm (instructions for software) is perfect – and Google cannot prevent paedophiles adding new images to the web – these changes have cleared up the results for over 100,000 queries that might be related to the sexual abuse of kids.”

Google’s new technology will also be able to remove thousands of copies of an illegal video in one hit.
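Google has not said how that bulk removal works. Duplicate detection of this kind is typically done by fingerprinting: each known illegal video gets a digest, and any copy whose digest matches is taken down in one pass. The sketch below illustrates only that general idea; the function names are invented, and the plain SHA-256 digest used here for brevity would catch only exact copies, whereas production systems rely on perceptual fingerprints that survive re-encoding and cropping.

```python
import hashlib
from pathlib import Path

# Illustrative sketch of hash-based duplicate detection, not Google's system.

def file_digest(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in 1 MB chunks."""
    sha = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            sha.update(chunk)
    return sha.hexdigest()

def find_copies(candidates: list[Path], known_digests: set[str]) -> list[Path]:
    """Return every candidate file whose digest matches a known illegal video."""
    return [path for path in candidates if file_digest(path) in known_digests]
```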

Calls for internet companies to take action against searches for illegal content intensified following the trials of child killers Mark Bridger and Stuart Hazell earlier this year. Bridger, who murdered five-year-old April Jones, and Hazell, who killed 12-year-old Tia Sharp, had both used the internet to search for child abuse images before the killings.