Protect Children From Online Predators

How to Prevent Child Exploitation in Your Social Product

There are two things that you should never, ever allow in your social product: grooming (the act of befriending a child and gaining their trust in order to exploit them sexually) and child sexual abuse material (CAM/CSAM).

Have you thought about how you will detect predatory behavior or illegal imagery in your product?

Most people have (thankfully) never had to deal with child exploitation or pedophilia, and have no idea what to do when the worst-case scenario happens on their platform. Having an action plan to stop child exploitation is non-negotiable, both legally and morally. You need a system in place to prevent, catch, and act on potential child sexual exploitation incidents.

Our content filter detects grooming and blocks it instantly, before it ever reaches its intended victim. The conversation is then flagged and sent to an escalation queue for immediate review and, if necessary, action. We’ve partnered with major law-enforcement agencies to study real online conversations between child predators and their victims, so we can detect grooming behavior on social platforms more accurately. Grooming is subtle. It starts slowly, as pedophiles use seemingly innocent language to gain a victim’s trust before abusing them. In those early stages, it’s crucial to detect the warning signs.
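
For engineering teams wondering what this flow can look like in code, here is a minimal, purely illustrative sketch of a detect-block-escalate pipeline. The function names, thresholds, and queue below are hypothetical placeholders, not Community Sift’s actual API, and the keyword heuristic merely stands in for a real grooming-detection model.

```python
# Illustrative sketch only. `classify_grooming_risk`, the thresholds, and the
# queue are hypothetical placeholders, not Community Sift's real API.
from dataclasses import dataclass
from queue import Queue


@dataclass
class ChatMessage:
    sender_id: str
    recipient_id: str
    text: str


def classify_grooming_risk(text: str) -> float:
    """Placeholder heuristic; a production system would use a model trained
    on labeled grooming conversations, not a keyword list."""
    red_flags = ["our secret", "don't tell your parents", "how old are you"]
    hits = sum(phrase in text.lower() for phrase in red_flags)
    return min(1.0, hits * 0.5)


escalation_queue: Queue = Queue()

BLOCK_THRESHOLD = 0.8   # assumed tuning values, not product defaults
REVIEW_THRESHOLD = 0.4


def handle_message(msg: ChatMessage) -> bool:
    """Return True if the message may be delivered to the recipient."""
    risk = classify_grooming_risk(msg.text)
    if risk >= BLOCK_THRESHOLD:
        escalation_queue.put(msg)   # held for immediate human review
        return False                # never reaches the intended victim
    if risk >= REVIEW_THRESHOLD:
        escalation_queue.put(msg)   # delivered, but flagged for follow-up
    return True
```

The important property of this pattern is that blocking happens inline, before delivery, while human review of flagged conversations happens asynchronously from the escalation queue.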

The language of child exploitation becomes more explicit as abuse progresses. Community Sift labels, removes, and flags the more obvious high-risk content in real time, giving you the tools to protect children in your product.

On the image front, we provide an advanced pornography-detection tool, another powerful weapon in the battle against child victimization. Just like our text filter, it works in real time: pornographic images are detected and blocked before they ever reach your community. By combining the text and image classification features of Community Sift, you are armed with a more complete picture of high-risk users.
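
As a rough illustration of what that “more complete picture” can mean in practice, the sketch below blends per-channel signals into a single user-level risk score. The field names, weights, and scoring rule are assumptions made for illustration, not Community Sift’s actual model.

```python
# Hypothetical sketch: blending text and image risk signals per user.
# Weights and structure are illustrative assumptions, not the real model.
from collections import defaultdict
from typing import Dict, List, Tuple

# Each entry is (channel, score), where channel is "text" or "image".
user_events: Dict[str, List[Tuple[str, float]]] = defaultdict(list)


def record_event(user_id: str, channel: str, score: float) -> None:
    user_events[user_id].append((channel, score))


def user_risk(user_id: str) -> float:
    """Combine the strongest text and image signals seen for a user."""
    events = user_events[user_id]
    text_part = max((s for c, s in events if c == "text"), default=0.0)
    image_part = max((s for c, s in events if c == "image"), default=0.0)
    # Assumption: a single explicit image is weighted more heavily than a
    # single borderline message.
    return min(1.0, 0.4 * text_part + 0.6 * image_part)


# Example: a user who sends borderline messages AND posts a flagged image
# surfaces with a much higher combined score than either signal alone.
record_event("user_123", "text", 0.5)
record_event("user_123", "image", 0.9)
print(user_risk("user_123"))  # 0.74
```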

Protecting children from pedophiles and sexual abuse is our most important mandate at Two Hat. We are now using the cutting-edge technology behind the Community Sift content filter to develop additional software that uses deep learning and machine learning to detect CAM/CSAM automatically. The ultimate goal is to assist law enforcement and prosecutors in finding predators more quickly and bringing them to justice. The end game is to save lives.

When coupled with the social internet, grooming and child exploitation can lead to cyberbullying, self-harm, and, in the most tragic cases, suicide. We strongly recommend that social products look closely at bullying and conversations about self-harm and suicide in their communities, because this cycle is very real and has a tragic, heartbreaking effect on its victims.

We’ve identified the four stages and effects of online child abuse as follows:

1. The predator identifies a vulnerable target to victimize, then works to gain their trust (a process known as “grooming”).
2. The predator tricks the child into meeting, either online or in real life. Images are taken and distributed online.
3. Tragically, the images are shared on social networks, often leading to cyberbullying and harassment from the victim’s peers or strangers. The victim is forced to relive the abuse, again and again.
4. Finally, the victim suffers from depression, engages in self-harm, and, in the very worst cases, commits suicide.

Unfortunately, we’ve seen this heartbreaking pattern at work again and again. Amanda Todd, Ryan Halligan, Jessica Logan, and Tyler Clementi were all victims of sexual exploitation that was later posted on social media. All were bullied mercilessly. All committed suicide.

We cannot allow this to continue. We will not allow this to continue.

Child exploitation by pedophiles is the toughest issue we will ever have to face. Let’s be honest — even sitting down to write this is painful. In a word, it sucks. We wish we lived in a world where we didn’t have to think about it. It’s visceral, and emotional, and unthinkable, and it hurts us where we’re most vulnerable — children and families.

When it comes to child sexual abuse, there are no gray areas. As a business, you have a legal and moral obligation to prevent it from happening in your product. Community Sift has done much of this hard (and, let’s face it, heartbreaking) work for you. We can help ensure that both your most vulnerable users and your reputation are protected from predators. We want to save as many lives as possible, and we have committed millions of dollars and years of development to tackling this most devastating of social issues. It would be our honour to help you solve this problem.

Let’s work together to eliminate child exploitation.

Request A Demo

Want to see what Community Sift can do for your social product? Request a demo.
