The RCMP is working with artificial intelligence to identify new online child exploitation images and rescue at-risk children.
The internet holds a vast collection of sexually explicit photos of children that have been nearly impossible for the RCMP to find, and the force is looking to change that.
To do this, the RCMP will now work alongside Two Hat Security and researchers from the University of Manitoba, using their artificial intelligence technology.
“For every single one of our files, there’s a child at the end of it,” said Cpl. Dawn Morris-Little, an investigator at the RCMP-led National Child Exploitation Coordination Centre (NCECC) in Ottawa.
“Images that look homemade or images that are unknown — those take priority because you don’t know when it was created, and those children could still be at risk.”
The artificial intelligence technology, called computer vision, mimics human vision by using algorithms to scan unknown photos and pick out the ones that have a high probability of being child exploitation.
“What would take weeks for an investigator would take the algorithm minutes or hours to scan,” explained Brad Leitch, head of product development at Two Hat Security.
“The algorithm can eliminate the photos of trees and doughnuts and Eiffel Towers pretty successfully and put those high-probability, exploitative images at the top of the list so we can identify victims and make prosecutions more quickly.”
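For readers curious how such a ranking might work in practice, the sketch below shows the general idea in Python. The classifier itself is a hypothetical stand-in (Two Hat's actual system is proprietary and not described in detail here): each unknown image is assigned a probability score, and the list is sorted so the highest-probability images surface first.

```python
from pathlib import Path
from typing import Callable

def rank_unknown_images(image_dir: str,
                        score: Callable[[bytes], float]) -> list[tuple[float, Path]]:
    """Score every image in a directory and return them highest-probability first.

    `score` is a hypothetical stand-in for a trained computer-vision classifier
    that returns the probability an image contains exploitative material.
    """
    scored = []
    for path in sorted(Path(image_dir).glob("*.jpg")):
        scored.append((score(path.read_bytes()), path))
    # High-probability images "bubble up" to the top of the review queue;
    # photos of trees, doughnuts and landmarks score low and sink to the bottom.
    scored.sort(key=lambda item: item[0], reverse=True)
    return scored
```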
To keep moving forward, an event is taking place July 6 and 7 in Vancouver. Sponsored by the RCMP, Microsoft, Magnet and Two Hat Security, it is an attempt to build a bridge between three diverse disciplines – law enforcement, academia and the technology sector – for the greater good.
The goal is to work together to build technology and global policy that helps stop online child exploitation.
“We are hopeful that by encouraging teamwork and partnerships across these three vital industries, we will come closer to ridding the internet of online child exploitation,” reads a Two Hat release.
Since 2011, the RCMP has used software called PhotoDNA to help identify known and documented explicit photos.
The RCMP reports that PhotoDNA works by converting photos into a hash code, which is like a unique fingerprint for each image. That hash code is added to a database, and if it’s ever found again anywhere in the world, online or on a hard drive, PhotoDNA will flag it.
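A simplified sketch of that hash-and-lookup workflow appears below. PhotoDNA itself computes a robust perceptual hash that survives resizing and re-encoding; the SHA-256 stand-in used here only matches byte-identical files, so this illustrates the database flow rather than PhotoDNA's actual algorithm.

```python
import hashlib

known_hashes: set[str] = set()  # shared database of fingerprints of documented images

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in fingerprint; PhotoDNA actually uses a perceptual hash."""
    return hashlib.sha256(image_bytes).hexdigest()

def register_known_image(image_bytes: bytes) -> None:
    """Add a documented image's fingerprint to the database."""
    known_hashes.add(fingerprint(image_bytes))

def flag_if_known(image_bytes: bytes) -> bool:
    """Return True if this image matches a previously documented one."""
    return fingerprint(image_bytes) in known_hashes
```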
But, with the rise of smartphones and tablets, creating new child exploitation content has never been easier.
In 2016, the NCECC received 27,000 cases, almost double the number reported in 2015. Those cases contained approximately 25 million images of child sexual abuse.
This new content can’t be identified by PhotoDNA, since it hasn’t been added to its database yet.
“The numbers are only going up, so we need to be handling these cases in a much smarter way,” added Sgt. Arnold Guerin, who works in the technology section of the Canadian Police Centre for Missing and Exploited Children (CPCMEC), which includes the NCECC.
“New technology can provide us with tools to review cases in an automated way, and bubble up to the top the ones that need to be dealt with right away.”
Guerin said that often minutes matter in child exploitation investigations.
“If we seize a hard drive that has 28 million photos, investigators need to go through all of them,” said Guerin. “But how many are related to children? Can we narrow it down? That’s where this project comes in: we can train the algorithm to recognize child exploitation.”
The RCMP adds that achieving 100 per cent accuracy with the algorithm isn’t the goal; investigators will still have to go through all the material to make sure nothing is missed. The algorithm is meant to prioritize what police look at first, to make sure they’re using their time and resources efficiently.
It can also reduce workloads and help protect the health and wellness of investigators.
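Putting the two ideas together, a triage pass over a seized drive might look roughly like the sketch below: images already documented in the hash database are flagged first, unknown images are ranked by the classifier, and nothing is discarded, only reordered. The `is_known` and `score` functions are hypothetical stand-ins for the tools described above.

```python
from pathlib import Path
from typing import Callable

def triage(drive_root: str,
           is_known: Callable[[bytes], bool],
           score: Callable[[bytes], float]) -> list[Path]:
    """Order every image on a seized drive for review: known matches first,
    then unknown images ranked by classifier probability. Nothing is dropped;
    investigators still review everything, just in priority order."""
    known, unknown = [], []
    for path in Path(drive_root).rglob("*.jpg"):
        data = path.read_bytes()
        if is_known(data):
            known.append(path)
        else:
            unknown.append((score(data), path))
    unknown.sort(key=lambda item: item[0], reverse=True)
    return known + [path for _, path in unknown]
```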
“We see images that no one wants to see, so maintaining our mental health is a priority,” said Morris-Little. “Anything that takes the human element out of cases is going to reduce the risk of mental health injury to an investigator.”
Guerin said technology like computer vision can act as a shield, sifting through material before it gets to an investigator.
The computer vision product is still in development, but Guerin hopes the RCMP will be able to use it later this year.
“If I could reduce the amount of toxicity officers have to endure every day, then I’m keeping them as healthy as possible, while also keeping more kids safe,” he added.
carmen.weld@bpdigital.ca