WEBVTT

00:00:05.340 --> 00:00:07.888 align:center
[This talk contains mature content]

00:00:10.245 --> 00:00:11.865 align:center
Five years ago,

00:00:11.889 --> 00:00:15.143 align:center
I received a phone call
that would change my life.

00:00:16.299 --> 00:00:18.965 align:center
I remember so vividly that day.

00:00:19.749 --> 00:00:21.650 align:center
It was about this time of year,

00:00:21.674 --> 00:00:23.632 align:center
and I was sitting in my office.

00:00:24.196 --> 00:00:27.262 align:center
I remember the sun
streaming through the window.

00:00:28.096 --> 00:00:29.416 align:center
And my phone rang.

00:00:30.069 --> 00:00:31.256 align:center
And I picked it up,

00:00:32.098 --> 00:00:36.076 align:center
and it was two federal agents,
asking for my help

00:00:36.100 --> 00:00:38.806 align:center
in identifying a little girl

00:00:38.830 --> 00:00:44.197 align:center
featured in hundreds of child
sexual abuse images they had found online.

00:00:45.649 --> 00:00:48.225 align:center
They had just started working the case,

00:00:48.249 --> 00:00:51.119 align:center
but what they knew

00:00:51.143 --> 00:00:55.966 align:center
was that her abuse had been broadcast
to the world for years

00:00:55.990 --> 00:01:01.139 align:center
on dark web sites dedicated
to the sexual abuse of children.

00:01:02.109 --> 00:01:06.494 align:center
And her abuser was incredibly
technologically sophisticated:

00:01:06.518 --> 00:01:11.106 align:center
new images and new videos every few weeks,

00:01:11.130 --> 00:01:15.179 align:center
but very few clues as to who she was

00:01:15.203 --> 00:01:16.879 align:center
or where she was.

00:01:17.828 --> 00:01:19.214 align:center
And so they called us,

00:01:19.238 --> 00:01:21.928 align:center
because they had heard
we were a new nonprofit

00:01:21.952 --> 00:01:25.410 align:center
building technology
to fight child sexual abuse.

00:01:26.100 --> 00:01:28.387 align:center
But we were only two years old,

00:01:28.411 --> 00:01:31.543 align:center
and we had only worked
on child sex trafficking.

00:01:32.448 --> 00:01:34.571 align:center
And I had to tell them

00:01:34.595 --> 00:01:35.792 align:center
we had nothing.

00:01:36.784 --> 00:01:40.739 align:center
We had nothing that could
help them stop this abuse.

00:01:41.767 --> 00:01:45.341 align:center
It took those agents another year

00:01:45.365 --> 00:01:48.366 align:center
to ultimately find that child.

00:01:49.357 --> 00:01:51.632 align:center
And by the time she was rescued,

00:01:51.656 --> 00:01:58.144 align:center
hundreds of images and videos
documenting her rape had gone viral,

00:01:58.168 --> 00:01:59.859 align:center
from the dark web

00:01:59.883 --> 00:02:02.900 align:center
to peer-to-peer networks,
private chat rooms

00:02:02.924 --> 00:02:06.144 align:center
and to the websites you and I use

00:02:06.168 --> 00:02:08.770 align:center
every single day.

00:02:09.720 --> 00:02:13.535 align:center
And today, as she struggles to recover,

00:02:13.559 --> 00:02:17.698 align:center
she lives with the fact
that thousands around the world

00:02:17.722 --> 00:02:20.810 align:center
continue to watch her abuse.

00:02:22.498 --> 00:02:24.926 align:center
I have come to learn
in the last five years

00:02:24.950 --> 00:02:27.467 align:center
that this case is far from unique.

00:02:28.623 --> 00:02:32.466 align:center
How did we get here as a society?

00:02:33.994 --> 00:02:37.753 align:center
In the late 1980s, child pornography --

00:02:37.777 --> 00:02:43.030 align:center
or what it actually is,
child sexual abuse material --

00:02:43.054 --> 00:02:44.905 align:center
was nearly eliminated.

00:02:45.713 --> 00:02:50.217 align:center
New laws and increased prosecutions
made it simply too risky

00:02:50.241 --> 00:02:51.813 align:center
to trade it through the mail.

00:02:52.737 --> 00:02:56.882 align:center
And then came the internet,
and the market exploded.

00:02:57.814 --> 00:03:01.170 align:center
The amount of content in circulation today

00:03:01.194 --> 00:03:04.038 align:center
is massive and growing.

00:03:04.925 --> 00:03:08.169 align:center
This is a truly global problem,

00:03:08.193 --> 00:03:10.026 align:center
but if we just look at the US:

00:03:10.050 --> 00:03:12.761 align:center
in the US alone last year,

00:03:12.785 --> 00:03:18.059 align:center
more than 45 million images and videos
of child sexual abuse material

00:03:18.083 --> 00:03:21.761 align:center
were reported to the National Center
for Missing and Exploited Children,

00:03:21.785 --> 00:03:26.166 align:center
and that is nearly double
the amount the year prior.

00:03:27.131 --> 00:03:32.417 align:center
And the details behind these numbers
are hard to contemplate,

00:03:32.441 --> 00:03:38.161 align:center
with more than 60 percent of the images
featuring children younger than 12,

00:03:38.185 --> 00:03:42.687 align:center
and most of them including
extreme acts of sexual violence.

00:03:43.354 --> 00:03:48.654 align:center
Abusers are cheered on in chat rooms
dedicated to the abuse of children,

00:03:48.678 --> 00:03:51.161 align:center
where they gain rank and notoriety

00:03:51.185 --> 00:03:54.091 align:center
with more abuse and more victims.

00:03:54.747 --> 00:03:57.342 align:center
In this market,

00:03:57.366 --> 00:04:01.324 align:center
the currency has become
the content itself.

00:04:02.527 --> 00:04:06.333 align:center
It's clear that abusers have been quick
to leverage new technologies,

00:04:06.357 --> 00:04:09.372 align:center
but our response as a society has not.

00:04:10.175 --> 00:04:14.346 align:center
These abusers don't read
user agreements of websites,

00:04:14.370 --> 00:04:18.172 align:center
and the content doesn't honor
geographic boundaries.

00:04:19.160 --> 00:04:25.273 align:center
And they win when we look
at one piece of the puzzle at a time,

00:04:25.297 --> 00:04:29.314 align:center
which is exactly how
our response today is designed.

00:04:29.338 --> 00:04:32.825 align:center
Law enforcement works in one jurisdiction.

00:04:32.849 --> 00:04:36.263 align:center
Companies look at just their platform.

00:04:36.287 --> 00:04:38.991 align:center
And whatever data they learn along the way

00:04:39.015 --> 00:04:41.017 align:center
is rarely shared.

00:04:41.906 --> 00:04:47.507 align:center
It is so clear that this
disconnected approach is not working.

00:04:48.147 --> 00:04:52.257 align:center
We have to redesign
our response to this epidemic

00:04:52.281 --> 00:04:53.803 align:center
for the digital age.

00:04:54.206 --> 00:04:57.148 align:center
And that's exactly
what we're doing at Thorn.

00:04:57.815 --> 00:05:01.432 align:center
We're building the technology
to connect these dots,

00:05:01.456 --> 00:05:03.778 align:center
to arm everyone on the front lines --

00:05:03.802 --> 00:05:06.626 align:center
law enforcement, NGOs and companies --

00:05:06.650 --> 00:05:10.159 align:center
with the tools they need
to ultimately eliminate

00:05:10.183 --> 00:05:12.676 align:center
child sexual abuse material
from the internet.

00:05:14.075 --> 00:05:15.346 align:center
Let's talk for a minute --

00:05:15.370 --> 00:05:16.889 align:center
(Applause)

00:05:16.913 --> 00:05:18.217 align:center
Thank you.

00:05:18.241 --> 00:05:20.581 align:center
(Applause)

00:05:22.304 --> 00:05:24.821 align:center
Let's talk for a minute
about what those dots are.

00:05:25.827 --> 00:05:29.038 align:center
As you can imagine,
this content is horrific.

00:05:29.062 --> 00:05:32.910 align:center
If you don't have to look at it,
you don't want to look at it.

00:05:32.934 --> 00:05:37.903 align:center
And so, most companies
or law enforcement agencies

00:05:37.927 --> 00:05:39.590 align:center
that have this content

00:05:39.614 --> 00:05:43.066 align:center
can translate every file
into a unique string of numbers.

00:05:43.090 --> 00:05:44.564 align:center
This is called a "hash."

00:05:44.588 --> 00:05:46.731 align:center
It's essentially a fingerprint

00:05:46.755 --> 00:05:49.153 align:center
for each file or each video.

00:05:49.177 --> 00:05:53.790 align:center
And what this allows them to do
is use the information in investigations

00:05:53.814 --> 00:05:56.841 align:center
or for a company to remove
the content from their platform,

00:05:56.865 --> 00:06:02.039 align:center
without having to re-examine
every image and every video each time.

00:06:02.700 --> 00:06:04.851 align:center
The problem today, though,

00:06:04.875 --> 00:06:08.626 align:center
is that there are hundreds
of millions of these hashes

00:06:08.650 --> 00:06:12.260 align:center
sitting in siloed databases
all around the world.

00:06:12.718 --> 00:06:13.869 align:center
In a silo,

00:06:13.893 --> 00:06:16.969 align:center
it might work for the one agency
that has control over it,

00:06:16.993 --> 00:06:21.123 align:center
but not connecting this data means
we don't know how many are unique.

00:06:21.147 --> 00:06:24.663 align:center
We don't know which ones represent
children who have already been rescued

00:06:24.687 --> 00:06:27.576 align:center
or need to be identified still.

00:06:27.600 --> 00:06:31.770 align:center
So our first, most basic premise
is that all of this data

00:06:31.794 --> 00:06:34.197 align:center
must be connected.

00:06:34.822 --> 00:06:40.991 align:center
There are two ways in which this data,
combined with software on a global scale,

00:06:41.015 --> 00:06:44.423 align:center
can have transformative
impact in this space.

00:06:44.968 --> 00:06:47.590 align:center
The first is with law enforcement:

00:06:47.614 --> 00:06:51.245 align:center
helping them identify new victims faster,

00:06:51.269 --> 00:06:52.485 align:center
stopping abuse

00:06:52.509 --> 00:06:55.413 align:center
and stopping those producing this content.

00:06:55.945 --> 00:06:58.611 align:center
The second is with companies:

00:06:58.635 --> 00:07:02.256 align:center
using it as clues to identify
the hundreds of millions of files

00:07:02.280 --> 00:07:03.874 align:center
in circulation today,

00:07:03.898 --> 00:07:05.085 align:center
pulling it down

00:07:05.109 --> 00:07:11.927 align:center
and then stopping the upload
of new material before it ever goes viral.

00:07:14.198 --> 00:07:15.844 align:center
Four years ago,

00:07:15.868 --> 00:07:17.407 align:center
when that case ended,

00:07:18.804 --> 00:07:22.543 align:center
our team sat there,
and we just felt this, um ...

00:07:24.139 --> 00:07:27.477 align:center
... deep sense of failure,
is the way I can put it,

00:07:27.501 --> 00:07:31.152 align:center
because we watched that whole year

00:07:31.176 --> 00:07:32.496 align:center
while they looked for her.

00:07:32.520 --> 00:07:36.487 align:center
And we saw every place
in the investigation

00:07:36.511 --> 00:07:38.899 align:center
where, if the technology
had existed,

00:07:38.923 --> 00:07:41.227 align:center
they would have found her faster.

00:07:42.188 --> 00:07:44.124 align:center
And so we walked away from that

00:07:44.148 --> 00:07:47.103 align:center
and we went and we did
the only thing we knew how to do:

00:07:47.127 --> 00:07:49.761 align:center
we began to build software.

00:07:50.193 --> 00:07:52.445 align:center
So we've started with law enforcement.

00:07:52.469 --> 00:07:56.890 align:center
Our dream was an alarm bell on the desks
of officers all around the world

00:07:56.914 --> 00:08:01.458 align:center
so that if anyone dare post
a new victim online,

00:08:01.482 --> 00:08:04.971 align:center
someone would start
looking for them immediately.

00:08:05.828 --> 00:08:08.785 align:center
I obviously can't talk about
the details of that software,

00:08:08.809 --> 00:08:11.418 align:center
but today it's at work in 38 countries,

00:08:11.442 --> 00:08:14.416 align:center
having reduced the time it takes
to get to a child

00:08:14.440 --> 00:08:16.770 align:center
by more than 65 percent.

00:08:16.794 --> 00:08:21.164 align:center
(Applause)

00:08:25.946 --> 00:08:28.961 align:center
And now we're embarking
on that second horizon:

00:08:28.985 --> 00:08:34.650 align:center
building the software to help companies
identify and remove this content.

00:08:35.697 --> 00:08:38.229 align:center
Let's talk for a minute
about these companies.

00:08:38.774 --> 00:08:44.006 align:center
So, I told you -- 45 million images
and videos in the US alone last year.

00:08:44.784 --> 00:08:48.671 align:center
Those come from just 12 companies.

00:08:50.387 --> 00:08:56.815 align:center
Twelve companies, 45 million files
of child sexual abuse material.

00:08:56.839 --> 00:08:59.639 align:center
These come from those companies
that have the money

00:08:59.663 --> 00:09:04.220 align:center
to build the infrastructure that it takes
to pull this content down.

00:09:04.244 --> 00:09:06.655 align:center
But there are hundreds of other companies,

00:09:06.679 --> 00:09:09.345 align:center
small- to medium-size companies
around the world,

00:09:09.369 --> 00:09:11.423 align:center
that need to do this work,

00:09:11.447 --> 00:09:16.872 align:center
but they either, one, can't imagine that
their platform would be used for abuse,

00:09:16.896 --> 00:09:22.741 align:center
or, two, don't have the money to spend
on something that is not driving revenue.

00:09:23.436 --> 00:09:26.725 align:center
So we went ahead and built it for them,

00:09:26.749 --> 00:09:31.718 align:center
and this system now gets smarter
as more companies participate.

00:09:32.469 --> 00:09:34.194 align:center
Let me give you an example.

00:09:34.963 --> 00:09:38.841 align:center
Our first partner, Imgur --
if you haven't heard of this company,

00:09:38.865 --> 00:09:42.007 align:center
it's one of the most visited
websites in the US --

00:09:42.031 --> 00:09:47.039 align:center
millions of pieces of user-generated
content uploaded every single day,

00:09:47.063 --> 00:09:49.921 align:center
on a mission to make the internet
a more fun place.

00:09:50.516 --> 00:09:52.368 align:center
They partnered with us first.

00:09:52.392 --> 00:09:55.735 align:center
Within 20 minutes
of going live on our system,

00:09:55.759 --> 00:09:59.331 align:center
someone tried to upload
a known piece of abuse material.

00:09:59.355 --> 00:10:01.463 align:center
They were able to stop it,
they pulled it down,

00:10:01.487 --> 00:10:04.953 align:center
and they reported it to the National Center
for Missing and Exploited Children.

00:10:04.977 --> 00:10:06.885 align:center
But they went a step further,

00:10:06.909 --> 00:10:11.042 align:center
and they went and inspected the account
of the person who had uploaded it.

00:10:11.590 --> 00:10:16.301 align:center
Hundreds more pieces
of child sexual abuse material

00:10:16.325 --> 00:10:18.143 align:center
that we had never seen.

00:10:18.656 --> 00:10:22.188 align:center
And this is where we start
to see exponential impact.

00:10:22.212 --> 00:10:23.980 align:center
We pull that material down,

00:10:24.004 --> 00:10:27.554 align:center
it gets reported to the National Center
for Missing and Exploited Children

00:10:27.578 --> 00:10:30.089 align:center
and then those hashes
go back into the system

00:10:30.113 --> 00:10:32.577 align:center
and benefit every other company on it.

00:10:32.601 --> 00:10:37.385 align:center
And when the millions of hashes we have
lead to millions more and, in real time,

00:10:37.409 --> 00:10:41.947 align:center
companies around the world are identifying
and pulling this content down,

00:10:41.971 --> 00:10:46.532 align:center
we will have dramatically increased
the speed at which we are removing

00:10:46.556 --> 00:10:50.850 align:center
child sexual abuse material
from the internet around the world.

00:10:50.874 --> 00:10:56.346 align:center
(Applause)

00:10:58.712 --> 00:11:01.932 align:center
But this is why it can't just be
about software and data,

00:11:01.956 --> 00:11:03.728 align:center
it has to be about scale.

00:11:03.752 --> 00:11:07.265 align:center
We have to activate thousands of officers,

00:11:07.289 --> 00:11:09.666 align:center
hundreds of companies around the world

00:11:09.690 --> 00:11:13.298 align:center
if technology is to allow us
to outrun the perpetrators

00:11:13.322 --> 00:11:17.447 align:center
and dismantle the communities
that are normalizing child sexual abuse

00:11:17.471 --> 00:11:19.023 align:center
around the world today.

00:11:19.568 --> 00:11:22.218 align:center
And the time to do this is now.

00:11:22.792 --> 00:11:28.589 align:center
We can no longer say we don't know
the impact this is having on our children.

00:11:29.192 --> 00:11:33.650 align:center
The first generation of children
whose abuse has gone viral

00:11:33.674 --> 00:11:35.384 align:center
are now young adults.

00:11:35.955 --> 00:11:38.540 align:center
The Canadian Centre for Child Protection

00:11:38.564 --> 00:11:41.260 align:center
recently completed a study
of these young adults

00:11:41.284 --> 00:11:45.920 align:center
to understand the unique trauma
they try to recover from,

00:11:45.944 --> 00:11:48.767 align:center
knowing that their abuse lives on.

00:11:49.717 --> 00:11:54.563 align:center
Eighty percent of these young adults
have thought about suicide.

00:11:55.070 --> 00:11:59.132 align:center
More than 60 percent
have attempted suicide.

00:12:00.076 --> 00:12:05.293 align:center
And most of them live
with the fear every single day

00:12:05.317 --> 00:12:09.780 align:center
that as they walk down the street
or they interview for a job

00:12:09.804 --> 00:12:12.094 align:center
or they go to school

00:12:12.118 --> 00:12:14.543 align:center
or they meet someone online,

00:12:14.567 --> 00:12:18.225 align:center
that that person has seen their abuse.

00:12:19.051 --> 00:12:23.956 align:center
And that fear became reality
for more than 30 percent of them.

00:12:24.760 --> 00:12:29.346 align:center
They had been recognized
from their abuse material online.

00:12:30.526 --> 00:12:33.802 align:center
This is not going to be easy,

00:12:33.826 --> 00:12:36.669 align:center
but it is not impossible.

00:12:36.693 --> 00:12:39.369 align:center
Now it's going to take the will,

00:12:39.393 --> 00:12:40.982 align:center
the will of our society

00:12:41.006 --> 00:12:44.560 align:center
to look at something
that is really hard to look at,

00:12:44.584 --> 00:12:46.927 align:center
to take something out of the darkness

00:12:46.951 --> 00:12:49.046 align:center
so these kids have a voice;

00:12:50.614 --> 00:12:55.560 align:center
the will of companies to take action
and make sure that their platforms

00:12:55.584 --> 00:12:58.897 align:center
are not complicit in the abuse of a child;

00:12:59.709 --> 00:13:03.660 align:center
the will of governments to invest
in their law enforcement

00:13:03.684 --> 00:13:08.778 align:center
and the tools they need to investigate
a digital-first crime,

00:13:08.802 --> 00:13:12.885 align:center
even when the victims
cannot speak for themselves.

00:13:14.250 --> 00:13:17.948 align:center
This audacious commitment
is part of that will.

00:13:18.773 --> 00:13:24.180 align:center
It's a declaration of war
against one of humanity's darkest evils.

00:13:24.767 --> 00:13:26.707 align:center
But what I hang on to

00:13:26.731 --> 00:13:30.180 align:center
is that it's actually
an investment in a future

00:13:30.204 --> 00:13:33.278 align:center
where every child can simply be a kid.

00:13:33.861 --> 00:13:35.055 align:center
Thank you.

00:13:35.400 --> 00:13:41.544 align:center
(Applause)

