IWF CTO on How Technology Can Address Child Abuse Images Online


As our work and social lives went digital during the Covid-19 pandemic, a much darker trend was also developing: a sharp increase in the number of people accessing child sexual abuse images online. The Internet Watch Foundation (IWF), a charity dedicated to finding and removing such images from the web, detected 8.8 million attempts to access illegal material in the first month of the 2020 lockdown in the UK alone. The true magnitude of the problem is likely to be much greater.

The Internet Watch Foundation evaluates hundreds of thousands of images of child abuse online each year. (Photo by Serghei Turcanu / iStock)

For the IWF, the pandemic has exacerbated an already worrying increase in the volume of child sexual abuse material shared and viewed online. It says the number of images and videos it detects each year has increased 1,420% since 2011, and said in November that its analysts had detected 200,000 illegal images in 2021, the first time it has reached that dark milestone in a single calendar year.

But if technology is part of this problem, it can also be part of the solution. Tech Monitor spoke to IWF CTO Dan Sexton about how his team is developing bespoke software to support the organization’s work.

IWF CTO on the importance of staying on site

Sexton joined the IWF as CTO in February, having previously headed IT services in the engineering department at the University of Cambridge. “I started my career on an IT help desk about 20 years ago, asking people to turn their computers off and on,” he recalls. “Since then, I have worked in the public sector in local authorities and universities, as well as in software development in the private sector.”

As CTO, Sexton is responsible for the organization’s technical department, which includes a three-person team overseeing the IT infrastructure and a four-person development team that is currently expanding. “My job has three main areas: one is internal IT infrastructure; another is software development, as we do a lot of product development tailor-made for the area we are working in; and the last part is being a voice of expertise for our management team, advising our board of directors and talking to external partners from government and industry,” he explains.

This last part of the job is one of the most enjoyable, says Sexton. “A lot of the people who work in this space are very policy-oriented, and having a voice with some technical background behind it has been welcomed,” he says. “Being able to translate very technical things in a way that people understand has been really rewarding, and I have always felt like a welcome voice in the room.”


Working with the tech community (IWF member companies include Apple, Google and Microsoft, as well as some of the biggest names in telecommunications and social media), government organizations and law enforcement, IWF analysts receive and assess reports of abusive material online, submitted either by these agencies or by the public through the organization’s hotline. They then endeavor to remove the offending images and ensure that they are not reposted.

To do this effectively, the IWF must store a large amount of highly sensitive and distressing images on its systems, and Sexton says that, unsurprisingly, security is a high priority. It also means the IWF has been unable to join the many organizations that have rushed to embrace cloud computing. “The really sensitive items are all kept on site and we have a dedicated, secure, air-gapped network where all data and images are stored,” he says. “We also work a lot with CAID [the Home Office’s Child Abuse Image Database], so we need to make sure we’re following the same practices as they do when it comes to storing and processing large quantities of this type of material.”

The IWF is one of the few organizations authorized to store and copy this type of data. “It’s a huge responsibility and it means cybersecurity is an integral part of my role,” Sexton says.

How IWF’s hash technology tracks down illegal images of child abuse

At the heart of the IWF’s work are the analysts who receive reports from the public and track down child sexual abuse material, says Sexton. “These guys take public reports and proactively go out onto the internet to find websites that host or store child sexual abuse content, rate that content and try to get it removed,” he says. “My team is looking at the tools we can develop to help them do their jobs as efficiently as possible.”

One way of doing this is hashing: assigning a unique value to an offending image that can then be shared with IWF members. “One of the tools we have developed is called IntelliGrade, which enables automatic image evaluation and categorization,” explains the IWF CTO. “We generate both perceptual and cryptographic hashes that our members can integrate with their automated detection systems to automatically find, block and delete this material.”
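To illustrate the distinction Sexton draws, the sketch below computes both kinds of hash for an image file in Python. This is a minimal illustration, not the IWF’s implementation: it assumes the open-source Pillow and imagehash packages, and uses a standard perceptual hash (pHash) purely as a stand-in for whatever algorithms IntelliGrade actually employs. A cryptographic hash matches only byte-identical files, while a perceptual hash stays similar when an image is resized or recompressed.

```python
# Minimal sketch of the two hash families mentioned above.
# Assumes the Pillow and imagehash packages; the IWF's actual
# algorithms and tooling are not public here.
import hashlib

from PIL import Image
import imagehash


def cryptographic_hash(path: str) -> str:
    """SHA-256 over the raw file bytes: matches exact copies only."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def perceptual_hash(path: str) -> imagehash.ImageHash:
    """pHash over the decoded pixels: visually similar images
    produce hashes that differ in only a few bits."""
    return imagehash.phash(Image.open(path))


if __name__ == "__main__":
    # "example.jpg" is a hypothetical local file.
    print(cryptographic_hash("example.jpg"))
    print(perceptual_hash("example.jpg"))
```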

This helps amplify the work of IWF analysts, says Sexton. “We want to make our analysts’ workflows as efficient as possible,” he says. “They process hundreds of thousands of images per month, and using technology allows them to get through those images faster and gather more information. I think this is one of the ways we can have a bigger impact on tackling this problem: it’s not just about saying ‘these images are illegal’, we are creating tools that make it possible to record items such as estimated age, gender and type of activity. In this way, we can create richer datasets that give us more information about the problem we are facing.”
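As a rough illustration of the richer record Sexton describes, the structure below pairs an image’s hashes with grading metadata. Every field name and category here is invented for illustration; IntelliGrade’s actual schema is not public.

```python
# Hypothetical record pairing image hashes with grading metadata.
# All field names and categories are illustrative, not IntelliGrade's.
from dataclasses import dataclass


@dataclass(frozen=True)
class GradedImageRecord:
    sha256: str               # cryptographic hash of the exact file
    perceptual_hash: str      # hex string of the perceptual hash
    severity: str             # grading band assigned by the analyst
    estimated_age_range: str  # e.g. an assessed range such as "11-13"
    sex: str                  # as assessed by the analyst
    activity_type: str        # type of activity depicted
```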

Sexton says investing in proprietary technology pays off for the IWF. “There are tools used by other hotlines and law enforcement agencies,” he says. “But having an in-house software development team to create tools specifically for our processes means we’re able to quickly change direction as we expand our work and, [if we] want to collect more data, we can schedule it in.

“One of the main goals right now is near-duplicate images,” he continues. “We find that analysts assess one image, but then another may appear that is almost identical. Rather than forcing them to re-evaluate it, we are looking at ways to flag that it is the same and copy the information across. Having the ability to do this internally has been really good, because we’ve been able to respond quickly to the needs of the analysts.”
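Near-duplicate detection of this kind is commonly built on perceptual hashes: two images whose hashes differ in only a few bits are flagged as likely variants of the same picture. The sketch below shows the general idea, again using the open-source imagehash library and a threshold chosen purely for illustration, not any parameter the IWF uses.

```python
# General-idea sketch of near-duplicate flagging via perceptual hashes.
# The 8-bit threshold is an illustrative choice, not an IWF parameter.
from PIL import Image
import imagehash

MAX_BIT_DIFFERENCE = 8  # hashes this close count as near-duplicates


def is_near_duplicate(path_a: str, path_b: str) -> bool:
    """Subtracting two ImageHash objects yields their Hamming
    distance, i.e. how many of the hash bits differ."""
    hash_a = imagehash.phash(Image.open(path_a))
    hash_b = imagehash.phash(Image.open(path_b))
    return (hash_a - hash_b) <= MAX_BIT_DIFFERENCE
```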

At the moment, development is mostly handled by the IT team, but Sexton says he wants to put more power in the hands of the analysts themselves by using low-code or no-code solutions. “My software manager wants to put more capability in the hands of the analysts themselves,” he says. “I can see tremendous value in that. Some of my staff actually worked as analysts themselves before joining the tech team, which has been very helpful as they know what analysts need, and it keeps a close relationship between the teams.”

The role of machine learning and AI in tackling images of child abuse online

Developing more automated tools to help IWF analysts is a major goal for Sexton going into 2022. He adds that the organization can also play a role in helping third parties develop artificial intelligence tools to combat the problem of abusive images.

“We’re looking at machine learning classifiers that can look at an image and say what’s in it,” he says. “That could potentially remove some aspects of human evaluation and help our analysts work a little faster. There are also the training and testing elements of these models, as we see other security technology vendors developing systems that rely on learning from datasets. This is a potential future for us, as we are one of the few organizations that holds this type of dataset. We are looking to improve internally, but also to have a greater impact externally.”
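For readers less familiar with the term, a classifier of the kind Sexton describes is typically built by fine-tuning a network pretrained on a generic dataset using labelled examples. The skeleton below is a completely generic transfer-learning setup in Python with PyTorch, included only to illustrate the concept; it implies nothing about the models or data the IWF or its partners actually use.

```python
# Generic transfer-learning skeleton for a two-class image classifier.
# Purely illustrative of the concept; unrelated to any IWF model or data.
import torch
import torch.nn as nn
from torchvision import models

# Start from a network pretrained on a generic dataset (ImageNet)...
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# ...freeze the feature extractor and replace the final layer with a
# two-class head ("matches the target category" / "does not").
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()


def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One gradient step on a batch of labelled images
    (shapes [N, 3, 224, 224] and [N] respectively)."""
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```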

Indeed, the IWF CTO says he is motivated by the opportunity to “make a social impact and do something beyond yourself” through his work at the IWF. “I think, like most people who work here, part of the motivation is that larger goal,” Sexton says. “When I was at the university, there was an aspect of that, because you contribute to society through excellence in teaching and research. But in an IT context, it was mostly about helping others in their work.

“The IWF is very different in that you can really see the direct impact you have in helping the most vulnerable in society using new and emerging technologies.”

News editor

Matthew Gooding is news editor of Tech Monitor.
