Are Search Engines Doing Enough To Deter Child Abuse?

Written by Chris Crum

Are search engines like Google and Bing doing enough to combat child exploitation and deter those seeking out images of it? That’s a question on a lot of people’s minds this week, as Microsoft has said it will show new pop-up warnings on Bing aimed at deterring people searching for such content. Google, on the other hand, has reportedly elected not to take this path, suggesting that its own methods for combating the problem work better.

Do you think Bing is doing the right thing? Should Google follow suit? Share your thoughts in the comments.

Last week, UK Prime Minister David Cameron gave a speech about the Internet and pornography, calling on search engines to do more to keep children safe.

“Government needs to do more,” Cameron said. “We need to give CEOP (the Child Exploitation and Online Protection Centre) and the police all the powers they need to keep pace with the changing nature of the internet.”

He then announced that, starting next year, existing databases held by individual police forces will be linked to form one large database of illegal images of children.

“The internet service providers and the search engine companies have a vital role to play and we’ve already reached a number of important agreements with them,” said Cameron, adding that a new UK-US taskforce is being formed “to lead a global alliance with the big players in the industry” to eliminate child exploitation images.

Cameron said that in Britain, Google, Microsoft and Yahoo have already been engaged on a major campaign to deter people who are searching for child abuse images. He wouldn’t go into detail about the campaign, he said, because it could “undermine its effectiveness”. He did say that it is “robust,” “hard-hitting,” and a “serious deterrent” to people looking for these images.

Currently, reported images are immediately added to a list and blocked by search engines. But Cameron doesn’t think it’s good enough for the search engines to wait until images are reported. He said they’re “not doing enough to take responsibility,” and even said they’re “denying responsibility”.

Cameron rejects the argument that search engines shouldn’t be involved in finding out where illegal images are “because the search engines are just the pipe that delivers the images, and that holding them responsible would be a bit like holding the Post Office responsible for sending illegal objects in anonymous packages.”

“That analogy isn’t really right, because the search engine doesn’t just deliver the material that people see, it helps to identify it,” Cameron said.

“Companies like Google make their living out of trawling and categorising content on the web, so that in a few key strokes you can find what you’re looking for out of unimaginable amounts of information,” he said. “That’s what they do. They then sell advertising space to companies based on your search patterns. So if I go back to the Post Office analogy, it would be like the Post Office helping someone to identify and then order the illegal material in the first place and then sending it on to them, in which case the Post Office would be held responsible for their actions.”

“So quite simply we need the search engines to step up to the plate on this issue,” he added. “We need a situation where you cannot have people searching for child abuse images and being aided in doing so. If people do try and search for these things, they are not only blocked, but there are clear and simple signs warning them that what they are trying to do is illegal, and where there is much more accountability on the part of the search engines to help find these sites and block them.”

He said the UK government has already insisted that warning pages are placed wherever child abuse sites have been identified and taken down.

Cameron said, “There are some searches which are so abhorrent and where there could be no doubt whatsoever about the sick and malevolent intent of the searcher – terms that I can’t say today in front of you with the television cameras here, but you can imagine – where it’s absolutely obvious the person at the keyboard is looking for revolting child abuse images. In these cases, there should be no search results returned at all. Put simply, there needs to be a list of terms – a blacklist – which offer up no direct search returns.”

“So I have a very clear message for Google, Bing, Yahoo! and the rest: you have a duty to act on this, and it is a moral duty,” he added. “I simply don’t accept the argument that some of these companies have used to say that these searches should be allowed because of freedom of speech.”

He then asked the search engines to commit to returning no results for a blacklist of search terms that would be supplied by CEOP.
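To make the idea concrete, here is a rough sketch of what a term-blacklist check of the kind Cameron describes might look like. It is purely illustrative: the example term, warning text, and function names are all invented for this sketch, and it does not reflect CEOP’s actual list or how any search engine implements its filtering.

```python
# Hypothetical sketch of a search-term blacklist check.
# Nothing here reflects CEOP's real list or any search engine's actual code.

BLACKLISTED_TERMS = {
    # In practice this set would be populated from a list supplied by an
    # authority such as CEOP, not hard-coded.
    "example blocked phrase",
}

WARNING_MESSAGE = (
    "Searching for child abuse imagery is illegal. "
    "No results will be shown for this query."
)


def normalize(query: str) -> str:
    """Lowercase and collapse whitespace so trivial variations still match."""
    return " ".join(query.lower().split())


def run_search(query: str) -> list[str]:
    """Placeholder for a real search backend."""
    return [f"result for: {query}"]


def handle_query(query: str) -> tuple[list[str], str | None]:
    """Return (results, warning). Blacklisted queries get no results."""
    if normalize(query) in BLACKLISTED_TERMS:
        return [], WARNING_MESSAGE
    return run_search(query), None


if __name__ == "__main__":
    results, warning = handle_query("Example  Blocked Phrase")
    print(warning if warning else results)
```

A real deployment would obviously be far more involved (handling misspellings, phrase variants, and list updates, among other things); the sketch is only meant to show the basic shape of “blacklisted query in, warning page out, no results returned.”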

“There’s one further message I have for the search engines. If there are technical obstacles to acting on this, don’t just stand by and say nothing can be done, use your great brains to overcome them,” he said. “You’re the people who’ve worked out how to map almost every inch of the earth from space. You’ve designed algorithms to make sense of vast quantities of information. You’re the people who take pride in doing what they say can’t be done.”

Cameron then suggested the search companies hold hackathons to tackle child safety.

You can read the full transcript of Cameron’s speech here.

Peter Davies, chief executive of the CEOP, had this to say, following Cameron’s speech: “Anything which helps stop the distribution of this material or deters those who feed the market by accessing it online can only be a good thing and, working with the world’s leading technology companies like Microsoft, Google and Facebook, we’re ready to hear their ideas on other ways to stop illegal child abuse material being viewed online, and to support their work.”

“But let’s not be blinded to the fact that our work is not just about stopping people from accessing the images that already exist on the internet. We need to continue our work on stopping them from being produced and distributed in the first place by catching child sex offenders and safeguarding children to stop them suffering more horrendous abuse,” Davies added.

According to the BBC, Bing has become the first search engine to introduce pop-up warnings for people in the UK seeking out child abuse images. Yahoo is reportedly considering doing something similar, while Google does not intend to. BBC News shares statements from both Microsoft and Google on the matter:

Microsoft said the notifications aimed “to stop those who may be drifting towards trying to find illegal child abuse content on the web via search engines”.

A spokesman said: “This is in addition to Microsoft’s existing and longstanding policy of removing any verified links to illegal content of this sort from Bing as quickly as possible.”

“Microsoft has been, and remains, a strong proponent of proactive action in reasonable and scalable ways by the technology industry in the fight against technology-facilitated child exploitation. We have teams dedicated globally to abuse reporting on our services and the development of new innovations to combat child exploitation more broadly.”

Interestingly, just a few months ago, we had to report that Bing was actually suggesting people search for some pretty questionable things, like “sex games online for children,” “sex games for kids,” “sex games for kids in bed,” “sex kids movies,” “sex kids free,” “sex kids site,” “sex kids picture,” and “sex children to children movie” among others. That is, these terms were appearing in the autosuggest search box. Even just typing “sex” into Bing would include a suggestion for “sex games online for children”. These types of suggestions did not occur on Google.

[Image: Bing suggestions]

The whole thing was brought to our attention via a reddit thread. The Bing suggestions were even showing up in Facebook’s Graph Search, thanks to the partnership between Facebook and Bing.

[Image: Facebook Graph Search]

When asked about all of this, a Microsoft spokesperson simply told us, “We’re reviewing the guidelines for search suggestions related to this type of query.”

Since then (that was back in April), Bing’s suggestions appear to have improved significantly, at least for the queries referenced in our article. The Facebook issue appears to have been resolved as a result of Bing’s efforts.

Here’s the Google spokesperson’s statement on child abuse material, as shared by BBC News:

“We use purpose-built technology and work with child safety organisations to find, remove and report it, because we never want this material to appear in our search results. We are working with experts on effective ways to deter anyone tempted to look for this sickening material.”

Additionally, Yahoo says it is working with the CEOP and others.

Some are criticizing Google’s approach. According to the Daily Mail, “Google has infuriated child safety campaigners and experts by refusing to take part [in the alert system Bing is using], because it believes its own methods in tackling the problem are more effective.”

The piece also quotes John Carr, a government adviser on online child safety, as saying, “What Bing and Yahoo! are doing is brilliant. If they show it can be done effectively, it will be very difficult for Google to continue to refuse as well.”

I guess we’ll see.

So far, Google hasn’t had a whole lot to say about the matter. You would think a post on its Europe Policy blog would be in order, as that is where Google typically responds to issues raised by European governments.

Do you think the search engines are doing enough to deter criminals looking for child abuse images? What would be the most effective way to combat this issue? Let us know what you think in the comments.
