Overview

Why CEASE?

Every year, the National Center for Missing & Exploited Children reviews 25 million child sexual abuse images.

And in 2016, the Internet Watch Foundation found 57,335 URLs confirmed to contain child sexual abuse imagery. Over half of the children depicted were aged 10 or younger. *

Child sexual abuse material has always existed in some form. But with the advent of the internet, it’s become much easier to upload, share, download, and even sell material with only a few clicks of a mouse. Predators have taken advantage of advances in technology to connect on forums, newsgroups, social networks, and more.

With CEASE, we are tackling this problem head on. In collaboration with law enforcement and major universities, we are building cutting-edge artificial neural networks to detect new, uncataloged child sexual abuse material on the internet.

By detecting CSAM automatically, CEASE will help law enforcement sift through millions of reported images and prioritize the ones that require immediate action. Social networks will also be able to identify CSAM before it’s even posted.

* Statistics from Thorn and the Internet Watch Foundation Annual Report 2016

Partnering with the best

Our partnership with the RCMP grants us permanent access to millions of CSAM images. This permanent dataset allows us to train a model to a very high degree of accuracy.

Thanks to a generous grant from Mitacs, we are collaborating with major universities across Canada. Interns from Université Laval, the University of Manitoba, the University of British Columbia Okanagan, and Simon Fraser University are joining the Two Hat Security team in our Kelowna office to train the model.

From random forest to artificial neural network (ANN)

Our first iteration of CEASE, built on a random forest classifier, is already highly accurate in identifying CSAM. Officers can load a dataset of positive and negative images to rapidly train the model. With a clean, simple interface, officers can easily navigate the system, and can even import and export models to facilitate sharing among agencies.
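
For illustration, a simplified version of that phase-one workflow could look like the sketch below. The folder paths, the downscale-and-flatten feature extraction, and the use of scikit-learn and joblib are placeholders chosen for clarity; this is not the production CEASE pipeline.

```python
# Illustrative sketch only: train a classifier on labeled positive/negative
# image folders, then export the trained model so it can be imported elsewhere.
# Paths, feature extraction, and libraries are assumptions for this example.
import glob

import joblib
import numpy as np
from PIL import Image
from sklearn.ensemble import RandomForestClassifier

def image_to_features(path, size=(64, 64)):
    """Downscale an image and flatten its pixels into a feature vector."""
    with Image.open(path) as img:
        img = img.convert("RGB").resize(size)
        return np.asarray(img, dtype=np.float32).ravel() / 255.0

def load_dataset(positive_dir, negative_dir):
    """Build (features, labels) arrays from two folders of example images."""
    X, y = [], []
    for label, folder in ((1, positive_dir), (0, negative_dir)):
        for path in glob.glob(f"{folder}/*"):
            X.append(image_to_features(path))
            y.append(label)
    return np.array(X), np.array(y)

# Train on the loaded examples, then export the model for sharing.
# "examples/positive" and "examples/negative" are hypothetical folders.
X, y = load_dataset("examples/positive", "examples/negative")
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, y)
joblib.dump(model, "cease_rf_model.joblib")      # export for another agency
restored = joblib.load("cease_rf_model.joblib")  # import on their side
```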

We are currently in phase two of CEASE. Over the next several months, five interns from leading Canadian universities will build the next iteration: an ensemble of artificial neural networks working together to reach accuracy in the high 90-percent range.
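
To show the ensemble idea in miniature: several independently initialized networks are trained on the same labeled features, and their predicted probabilities are averaged at prediction time. The scikit-learn MLPs, layer sizes, and decision threshold below are placeholders chosen for illustration, not a description of the production system.

```python
# Illustrative sketch of an ensemble of neural networks with soft voting.
# Architectures, seeds, and the 0.9 threshold are assumptions for this example.
import numpy as np
from sklearn.neural_network import MLPClassifier

def train_ensemble(X, y, n_members=5):
    """Train several MLPs with different random seeds on the same data."""
    members = []
    for seed in range(n_members):
        net = MLPClassifier(hidden_layer_sizes=(256, 64),
                            max_iter=500, random_state=seed)
        net.fit(X, y)
        members.append(net)
    return members

def ensemble_predict_proba(members, X):
    """Average the members' class probabilities (soft voting)."""
    probs = np.stack([m.predict_proba(X) for m in members])
    return probs.mean(axis=0)

# Hypothetical usage: flag an image only when the averaged probability is high.
# ensemble = train_ensemble(X_train, y_train)
# scores = ensemble_predict_proba(ensemble, X_new)[:, 1]
# flagged = scores > 0.9
```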

The future of CEASE

On the internet, images of child abuse can be shared in a fraction of a second. At times, the scope of the problem feels insurmountable. But with recent groundbreaking advances in artificial intelligence, we believe that a solution is possible.

One day, law enforcement will use CEASE to identify CSAM in that same fraction of a second. More children will be saved from further victimization than ever before. One day, social networks will have access to technology that detects CSAM before it’s ever posted on their platform.

And with the help of our partners, that day is coming soon.

Want to be part of the solution? We would love to hear from you.

Still have questions? Check out our FAQ.

Contact Us

Hello! Send us your question, and we'll get back to you as soon as possible.
