[This talk contains mature content]
Five years ago,
I received a phone call that would change my life.
I remember so vividly that day.
It was about this time of year,
and I was sitting in my office.
I remember the sun streaming through the window.
And my phone rang.
And I picked it up,
and it was two federal agents, asking for my help
in identifying a little girl
featured in hundreds of child sexual abuse images they had found online.
They had just started working the case,
but what they knew
was that her abuse had been broadcast to the world for years
on dark web sites dedicated to the sexual abuse of children.
And her abuser was incredibly technologically sophisticated:
new images and new videos every few weeks,
but very few clues as to who she was
or where she was.
And so they called us,
because they had heard we were a new nonprofit
building technology to fight child sexual abuse.
But we were only two years old,
and we had only worked on child sex trafficking.
And I had to tell them
we had nothing.
We had nothing that could help them stop this abuse.
It took those agents another year
to ultimately find that child.
And by the time she was rescued,
hundreds of images and videos documenting her rape had gone viral,
from the dark web
to peer-to-peer networks, private chat rooms
and to the websites you and I use
every single day.
And today, as she struggles to recover,
she lives with the fact that thousands around the world
continue to watch her abuse.
I have come to learn in the last five years
that this case is far from unique.
How did we get here as a society?
In the late 1980s, child pornography --
or what it actually is, child sexual abuse material --
was nearly eliminated.
New laws and increased prosecutions made it simply too risky
to trade it through the mail.
And then came the internet, and the market exploded.
The amount of content in circulation today
is massive and growing.
This is a truly global problem,
but if we just look at the US:
in the US alone last year,
more than 45 million images and videos of child sexual abuse material
were reported to the National Center for Missing and Exploited Children,
and that is nearly double the amount the year prior.
And the details behind these numbers are hard to contemplate,
with more than 60 percent of the images featuring children younger than 12,
and most of them including extreme acts of sexual violence.
Abusers are cheered on in chat rooms dedicated to the abuse of children,
where they gain rank and notoriety
with more abuse and more victims.
In this market,
the currency has become the content itself.
It's clear that abusers have been quick to leverage new technologies,
but our response as a society has not.
These abusers don't read user agreements of websites,
and the content doesn't honor geographic boundaries.
And they win when we look at one piece of the puzzle at a time,
which is exactly how our response today is designed.
Law enforcement works in one jurisdiction.
Companies look at just their platform.
And whatever data they learn along the way
is rarely shared.
It is so clear that this disconnected approach is not working.
We have to redesign our response to this epidemic
for the digital age.
And that's exactly what we're doing at Thorn.
We're building the technology to connect these dots,
to arm everyone on the front lines --
law enforcement, NGOs and companies --
with the tools they need to ultimately eliminate
child sexual abuse material from the internet.
Let's talk for a minute --
(Applause)
Thank you.
(Applause)
Let's talk for a minute about what those dots are.
As you can imagine, this content is horrific.
If you don't have to look at it, you don't want to look at it.
And so, most companies or law enforcement agencies
that have this content
can translate every file into a unique string of numbers.
This is called a "hash."
It's essentially a fingerprint
for each file or each video.
And what this allows them to do is use the information in investigations
or for a company to remove the content from their platform,
without having to relook at every image and every video each time.
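The idea of a hash as a file fingerprint can be sketched in a few lines. Production systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding; the cryptographic hash below only catches byte-identical copies, but it illustrates the workflow the talk describes: hash once, then match against a database of known material without ever re-viewing it. The file contents and hash set here are purely illustrative.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return a hex fingerprint for a file's bytes (SHA-256 for illustration)."""
    return hashlib.sha256(data).hexdigest()

# A platform keeps a set of hashes of previously reported files
# and checks every new upload against it.
known_hashes = {file_hash(b"previously reported file")}

def should_block(upload: bytes) -> bool:
    """True if this upload matches a known fingerprint."""
    return file_hash(upload) in known_hashes

print(should_block(b"previously reported file"))  # True: exact copy is caught
print(should_block(b"some new, unknown file"))    # False: no match on record
```

Because only fingerprints are compared, moderators and investigators never need to open the files themselves to recognize known content.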
The problem today, though,
is that there are hundreds of millions of these hashes
sitting in siloed databases all around the world.
In a silo,
it might work for the one agency that has control over it,
but not connecting this data means we don't know how many are unique.
We don't know which ones represent children who have already been rescued
or need to be identified still.
So our first, most basic premise is that all of this data
must be connected.
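The cost of the silos described above is easy to see with a toy example. If each agency only counts its own records, overlapping files are counted several times; connecting the databases (here, a simple set union over made-up hash IDs) reveals how many files are actually unique.

```python
# Three hypothetical agencies, each holding a siloed hash database.
agency_a = {"h1", "h2", "h3"}
agency_b = {"h2", "h4"}
agency_c = {"h3", "h4", "h5"}

# Counted separately, the silos suggest more files than really exist.
total_rows = len(agency_a) + len(agency_b) + len(agency_c)

# Connected, the union shows the true number of unique files.
unique_files = agency_a | agency_b | agency_c

print(total_rows)        # 8 records across the three silos
print(len(unique_files)) # 5 distinct files once the data is connected
```

Only the connected view can answer questions like which files represent children already rescued versus those still to be identified.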
There are two ways in which this data, combined with software on a global scale,
can have transformative impact in this space.
The first is with law enforcement:
helping them identify new victims faster,
stopping abuse
and stopping those producing this content.
The second is with companies:
using it as clues to identify the hundreds of millions of files
in circulation today,
pulling it down
and then stopping the upload of new material before it ever goes viral.
Four years ago,
when that case ended,
our team sat there, and we just felt this, um ...
... deep sense of failure, is the way I can put it,
because we watched that whole year
while they looked for her.
And we saw every place in the investigation
where, if the technology would have existed,
they would have found her faster.
And so we walked away from that
and we went and we did the only thing we knew how to do:
we began to build software.
So we've started with law enforcement.
Our dream was an alarm bell on the desks of officers all around the world
so that if anyone dared to post a new victim online,
someone would start looking for them immediately.
I obviously can't talk about the details of that software,
but today it's at work in 38 countries,
having reduced the time it takes to get to a child
by more than 65 percent.
(Applause)
And now we're embarking on that second horizon:
building the software to help companies identify and remove this content.
Let's talk for a minute about these companies.
So, I told you -- 45 million images and videos in the US alone last year.
Those come from just 12 companies.
Twelve companies, 45 million files of child sexual abuse material.
These come from those companies that have the money
to build the infrastructure that it takes to pull this content down.
But there are hundreds of other companies,
small- to medium-size companies around the world,
that need to do this work,
but they either: 1) can't imagine that their platform would be used for abuse,
or 2) don't have the money to spend on something that is not driving revenue.
So we went ahead and built it for them,
and this system now gets smarter with the more companies that participate.
Let me give you an example.
Our first partner, Imgur -- if you haven't heard of this company,
it's one of the most visited websites in the US --
millions of pieces of user-generated content uploaded every single day,
in a mission to make the internet a more fun place.
They partnered with us first.
Within 20 minutes of going live on our system,
someone tried to upload a known piece of abuse material.
They were able to stop it, pull it down
and report it to the National Center for Missing and Exploited Children.
But they went a step further,
and they went and inspected the account of the person who had uploaded it.
Hundreds more pieces of child sexual abuse material
that we had never seen.
And this is where we start to see exponential impact.
We pull that material down,
it gets reported to the National Center for Missing and Exploited Children
and then those hashes go back into the system
and benefit every other company on it.
And when the millions of hashes we have lead to millions more and, in real time,
companies around the world are identifying and pulling this content down,
we will have dramatically increased the speed at which we are removing
child sexual abuse material from the internet around the world.
(Applause)
But this is why it can't just be about software and data,
it has to be about scale.
We have to activate thousands of officers,
hundreds of companies around the world
if technology is to allow us to outrun the perpetrators
and dismantle the communities that are normalizing child sexual abuse
around the world today.
And the time to do this is now.
We can no longer say we don't know the impact this is having on our children.
The first generation of children whose abuse has gone viral
are now young adults.
The Canadian Centre for Child Protection
just did a recent study of these young adults
to understand the unique trauma they try to recover from,
knowing that their abuse lives on.
Eighty percent of these young adults have thought about suicide.
More than 60 percent have attempted suicide.
And most of them live with the fear every single day
that as they walk down the street or they interview for a job
or they go to school
or they meet someone online,
that that person has seen their abuse.
And the reality came true for more than 30 percent of them.
They had been recognized from their abuse material online.
This is not going to be easy,
but it is not impossible.
Now it's going to take the will,
the will of our society
to look at something that is really hard to look at,
to take something out of the darkness
so these kids have a voice;
the will of companies to take action and make sure that their platforms
are not complicit in the abuse of a child;
the will of governments to invest with their law enforcement
for the tools they need to investigate a digital-first crime,
even when the victims cannot speak for themselves.
This audacious commitment is part of that will.
It's a declaration of war against one of humanity's darkest evils.
But what I hang on to
is that it's actually an investment in a future
where every child can simply be a kid.
Thank you.
(Applause)
【TED】Julie Cordua: How we can eliminate child sexual abuse material from the internet
Published November 14, 2019 by 吳承叡