Griffeye releases new AI that can identify child sexual abuse content in videos
Law enforcement agencies worldwide will benefit from cutting-edge artificial intelligence that detects child sexual abuse in video material. The technology will help investigators automatically filter out relevant information and speed up victim identification.
Gothenburg, Sweden, June 5, 2020 – Griffeye, whose software is used by law enforcement agencies around the world to manage, analyze and share digital media in child sexual abuse (CSA) investigations, today announced an upgrade to its AI technology, Griffeye Brain, that will make it possible for law enforcement to automatically detect CSA content in both images and videos.
Griffeye Brain was first released in May 2018 to detect CSA content in still images. Since then, it has helped police officers working on child exploitation cases identify relevant content in massive datasets that can contain tens of millions of images and videos.
“Since the first release of Griffeye Brain two years ago, it has been transforming investigations by automating the processing and sorting of image content so that the investigator can prioritize analysis and investigative work such as victim identification,” said Johann Hofmann, CEO of Griffeye. “Now, with the added capability to also detect CSA content in videos, we hope to further help speed up investigations and the safeguarding of victims while at the same time lowering the physical and mental burden for investigators who have to sift through this harrowing material.”
The proliferation of CSA content has reached mammoth proportions as technology has developed. Not only is it easier than ever before to produce the material, but cloud services and the dark web also enable criminals to share and store vast amounts of content anonymously. Online CSA crime has also grown sharply during the coronavirus pandemic: the National Center for Missing and Exploited Children received 4.1 million reports of children being abused online in April 2020, up from roughly 1 million in April 2019 – a fourfold increase. It is not unusual for cases today to contain millions of images and videos, with videos making up the highest proportion of content. Police investigators have to trawl through this material to find critical evidence that can help identify perpetrators and safeguard victims. Automating this process to pick out content relevant to specific cases is therefore critical in enabling law enforcement to save children's lives faster.
“As technology continues to feed the growth and complexity of child sexual abuse investigations, we must apply solutions that can scale to meet these ever rising threats,” said John Madsen, Law Enforcement Liaison Officer at Griffeye. “I think the solution lies in both collaboration and in leveraging artificial intelligence. A portion of the workflow can now be handled by AI, leaving charging decisions and victim identification to the investigator, and increasing efficiency in many of these cases.”
The Griffeye Brain CSA classifier has been trained on real CSA case data at law enforcement agencies in the US, Australia and Europe. Training the algorithm on live case data is necessary for it to work and produce high-quality results.
Griffeye revolutionized law enforcement investigation processes with the release of its digital investigation platform in 2015. Today, Griffeye is world-leading in its field and used by over 4,000 police agencies across the world for processing, sorting and analyzing large volumes of images and videos – especially in cases containing child sexual abuse material.
- Email: Griffeye@clarity.pr
- Phone: +44 (0) 7895 831 744