CHANGE ONE WORD. EVERYTHING ELSE IS DONE ALREADY.

28 June 2017

In her recent webcast, Dawn Hawkins of NCOSE (the National Center on Sexual Exploitation) told her audience that the big tech companies actually have policies against publishing sexually explicit (pornographic) material but do not really enforce them. Ms Hawkins added that, to her knowledge, Google has already put together the technology needed to enforce its policy. A recent article in The Financial Times echoes Ms Hawkins's statement.

In ‘Four ways Google will help to tackle extremism’ (The Financial Times, 18 June 2017, London edition), Kent Walker described how Google has built a network of technological solutions and paid activists to remove violent or extremist terrorist content. What if we changed just one word but kept the same approach: the same technological solutions and the same network of paid and volunteer activists, applied to stop the presence of sexually explicit (pornographic) material on the web?

Consider the following, with the one changed word shown in square brackets: “Terrorism [Pornography] is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all. Google and YouTube are committed to being part of the solution. We are working with government, law enforcement and civil society groups to tackle the problem of violent extremism online. There should be no place for terrorist [pornographic] content on our services.

         While we and others have worked for years to identify and remove content that violates our policies, the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done. Now.

         We have thousands of people around the world who review and counter abuse of our platforms. Our engineers have developed technology to prevent re-uploads of known terrorist [pornographic] content using image-matching tools. We have invested in systems that use content-based signals to help identify new videos for removal. And we have developed partnerships with expert groups, counter-extremism agencies and other technology companies to help inform and strengthen our efforts.

         Today, we are pledging to take four additional steps.

         First, we are increasing our use of technology to help identify extremist and terrorism [pornography]-related videos. (...) We have used video analysis models to find and assess more than 50 per cent of terrorism [pornography]-related content we have removed over the past six months. We will now devote more engineering resources to apply our most advanced machine learning research to train new “content classifiers” to help us more quickly identify and remove such content.

         Second, because technology alone is not a silver bullet, we will greatly increase the number of independent experts in YouTube’s Trusted Flagger programme. Machines can help identify problematic videos, but human experts still play a role in nuanced decisions about the line between violent propaganda and religious or newsworthy speech ...”
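The “image-matching tools” mentioned in the quoted passage are, in essence, systems that fingerprint material already removed and block close copies at upload time. Here is a minimal sketch of that idea in Python; the average-hash function, the 8x8 downscale, the distance threshold, and the file names are illustrative assumptions on my part, not a description of Google's actual system.

```python
# Minimal sketch of hash-based re-upload blocking (illustrative only, not Google's system).
# Assumes Pillow is installed; the 8x8 average hash and the distance threshold are arbitrary choices.
from PIL import Image

def average_hash(path, size=8):
    """Downscale to a size x size greyscale image and hash each pixel against the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def is_known(upload_hash, blocklist, max_distance=5):
    """Flag an upload whose hash is close to any previously removed item."""
    return any(hamming(upload_hash, h) <= max_distance for h in blocklist)

# Hypothetical usage: the frame files and 'new_upload.jpg' are placeholder names.
blocklist = {average_hash(p) for p in ["removed_frame_1.jpg", "removed_frame_2.jpg"]}
if is_known(average_hash("new_upload.jpg"), blocklist):
    print("Matches previously removed content; route to review or removal.")
```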
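The “content classifiers” the passage promises to train can be read, in the simplest terms, as models that score each upload and route likely violations to human reviewers. The sketch below illustrates that pattern with scikit-learn on placeholder data; the feature vectors, the labels, and the 0.9 review threshold are assumptions made for illustration, not anything Google has published.

```python
# Minimal sketch of a "content classifier" (illustrative assumption, not the actual pipeline).
# Assumes scikit-learn and NumPy are installed, and that each video has already been reduced
# to a fixed-length feature vector by some upstream model (not shown here).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder data: 1,000 feature vectors of length 128 with binary labels
# (1 = violates policy, 0 = acceptable). A real system would use human-reviewed examples.
X = rng.normal(size=(1000, 128))
y = rng.integers(0, 2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score new uploads and send only high-probability matches to human reviewers.
probs = clf.predict_proba(X_test)[:, 1]
flagged = probs > 0.9
print(f"{flagged.sum()} of {len(probs)} test items flagged for human review")
```

In both sketches the machine only flags; the final decision stays with human reviewers, which matches the division of labour described in the quoted article.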

Could this be a way forward in dealing with sexual exploitation online?