Big data is not an unmitigated good. Like many things in society, in fact probably all
things, it comes with risks as well; it comes with a dark side. And one dark side,
of course, is privacy. That exists today, it'll exist tomorrow, and maybe it gets bigger with
big data as well. But there is something else at play, something else that's a little
more troubling still. And that is, if you will, propensity: big data algorithms making
a prediction of what you are likely to do before you've actually done it.
Now the criminal justice system has never really dealt with this sort of problem before.
Typically you have to commit a crime before you are penalized for that crime. But what
if it is simply a prediction that you have a likelihood of committing a crime? Would
society be remiss not to intervene? If I could tell with 98 percent statistical accuracy
that you are likely to shoplift in the next 12 months, public safety requires that I intervene.
And maybe I don't put you into jail; it's not Minority Report, it's not pre-crime. I
have a social worker knock on your door and say, "We'd like to help you. We'd like to
get you an after-school job if you're a teenager. We'd like to support you."
Well, that sounds like a benefit, but in reality, if you think about it, this person
is going to be stigmatized in the eyes of his peers, school teachers, and parents. In fact, he'll
probably feel stigmatized in his own eyes and feel bad, and we might even push him
towards the very sort of behavior that we want to prevent. The point is that he will have
been a victim of a prediction about him. And he can rightly say, "I will be the two percent
that will not shoplift; I'll exercise my moral choice." So the solution seems to
be that in a big data world we want to somehow sanctify the notion of human volition,
of human free will, and to preserve that as a central attribute.