[MUSIC PLAYING]
SPEAKER 1: This is the [INAUDIBLE].
[MUSIC PLAYING]
DAVID MALAN: Hello, world.
This is the CS50 Podcast.
My name is David Malan.
And I'm here again, with CS50's own Brian Yu.
BRIAN YU: Good to be back.
DAVID MALAN: So in the news, of late, has been this app for iOS
and Android called FaceApp, as some of you might have heard.
Brian, have you used this app before?
BRIAN YU: So people keep talking about this.
I don't have it myself, on my phone.
But one of our teaching fellows for CS50 does have it
and was actually showing it to me earlier, today.
It's kind of scary, what it can do.
So the way it seems to work is that you open up the app, on your phone.
And you can choose a photo from your photo library, a photo of yourself
or a friend.
And you submit it.
And then you can apply any number of these different image filters
to it, effectively.
But they can do a variety of different things.
So they can show you what they think you would look like when you are older,
and show you an elderly version of yourself,
or what you looked like when you were younger.
They can change your hairstyle.
They can do all sorts of different things to the background of the image,
for instance.
So it's a pretty powerful image tool.
And it does a pretty good job of trying to create a realistic-looking photo.
DAVID MALAN: Yeah.
It's striking.
And I discovered this app two years after everyone else did, it seems.
Because someone sent me a modified photo of myself
recently, whereby I was aged in the photo.
And it was amazing.
The realism of the skin, I thought, was compelling.
And what was most striking to me, so much so
that I forwarded it to a couple of relatives afterward,
is that I look like a couple of my older relatives do in reality.
And it was fascinating to see that the app was seeing these familial traits
in me that even I don't see when I look in the mirror right now.
But apparently, if you age me and you make my skin
a different texture, over time, oh my god,
I'm going to actually look like some of my relatives, it seems.
BRIAN YU: Yeah.
It's incredible what the app can do.
A human trying to do this type of thing on their own
might not be able to do it at all.
And I think that really just speaks to how powerful machine learning has
gotten, at this point.
These machines have probably been trained on huge data
sets of younger and older pictures,
trying to understand, fundamentally, how you translate
a younger photo into an older one.
And now, they've just gotten really good at being able to do that in a way
that humans, on their own, never would've been able to.
DAVID MALAN: So this, then, is related to our recent chat
about machine learning more generally,
where, I assume, the training data in this case
is just lots and lots and lots of photos of people, young and old,
and of all sorts.
BRIAN YU: Yeah.
That would be my guess.
So FaceApp doesn't publicly announce exactly how their algorithm is working.
But I would imagine that it's probably just a lot of training data,
where you give the algorithm a whole bunch of younger photos and older
photos.
And you try to train the algorithm to be
able to figure out how to turn the younger photo into the older photo,
such that you can give it a new younger photo as input and have it
predict what the older photo is going to look like.
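To make that idea concrete, here is a minimal sketch of what "train on pairs of younger and older photos, then predict the older photo from a new younger one" could look like, assuming a PyTorch-style setup. The network, loss, and data below are illustrative assumptions only; FaceApp has not published its actual method.

# Illustrative sketch of paired younger-to-older image training.
# NOT FaceApp's real algorithm; the model, loss, and data are stand-ins.
import torch
import torch.nn as nn

# Tiny convolutional network that maps one RGB image to another of the same size.
model = nn.Sequential(
    nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, kernel_size=3, padding=1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()  # pixel-wise difference between predicted and real older photo

# Stand-in for a real dataset of (younger photo, older photo) pairs of the same people.
young = torch.rand(8, 3, 64, 64)  # batch of younger photos
old = torch.rand(8, 3, 64, 64)    # corresponding older photos

for step in range(100):
    optimizer.zero_grad()
    predicted_old = model(young)        # try to "age" the younger photos
    loss = loss_fn(predicted_old, old)  # how far off from the real older photos?
    loss.backward()
    optimizer.step()

# After training on a large real dataset, a brand-new younger photo could be
# passed through model(...) to predict what an older version might look like.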
DAVID MALAN: It's amazing.
It's really quite fascinating, too, to allow
people to imagine what they might look like in different clothes,
or, I suppose, with different makeup on, and so forth.
Computers can do so much of this.
But it's actually quite scary too, because a corollary
of being able to mutate people's faces in this way, digitally,
is that you can surely identify people, as well.
And I think that's one of the topics that's been getting a lot of attention
here, certainly in the US, whereby a few cities, most recently, have actually
outlawed, outright, the police's use of facial recognition
to bring in suspects.
For instance, Somerville, Massachusetts, which is right around the corner
from Cambridge, Massachusetts, here, did this.
And I mean, that's actually the flip side of the cool factor.
I mean, honestly, I was pretty caught up in it,
when I received this photo of myself some 20, 30, 40 years down the road.
Sent it along happily to some other people.
And then didn't really stop to think, until a few days later,
when I started reading about FaceApp and the implications thereof,
that actually, this really does forebode a scary future,
where all too easily can computers, and whatever humans own them,
pick us out in a crowd or track, really
in the extreme, our every movement.
I mean, do you think that policy is really the only solution to this?
BRIAN YU: So I think that certainly, technology
is going to get good enough that facial recognition is
going to keep getting better.
Because it's already really, really good.
And I know this from whenever photos get posted on Facebook
and I'm in the background corner, a very small part
of the image. Facebook, pretty immediately, is able to tell me,
oh, that's me in the photo,
when I don't even know if I would have noticed myself in it.
DAVID MALAN: I know.
Even when it just seems to be like a few pixels, off to the side.
BRIAN YU: Yeah.
So technology is certainly not going to be the factor that holds anyone back,
when it comes to facial recognition.
So if a city wants to protect itself against the potential implications
of this, then I think policy is probably the only way to do it.
Though it seems like the third city to ban facial recognition,
and the one to do so most recently, is Oakland.
And it looks like their main concern is the misidentification of individuals,
and how that might lead to the misuse of force, for example.
And certainly, facial recognition technology is not perfect, right now.
But it is getting better and better.
So I can understand why more and more people might
feel like they could begin to rely on it, even though it's not 100% accurate
and may never be 100% accurate.
DAVID MALAN: But that too, in and of itself, seems worrisome.
Because if towns, or cities, are starting
to ban it on the basis of the chance of misidentification,
surely the technology, as you say, is only going to get better, and better,
and better.
And so that argument, you would think, is going to get weaker, and weaker,
and weaker.
Because I mean, even just a few years ago, Facebook was,
you noted, claiming that they could identify
humans in photos with an accuracy-- correct me if I'm wrong-- of 97.25%.
Whereas humans, when trying to identify other humans in photos,
had an accuracy level of 97.5%.
So almost exactly the same statistic.
So at that point, if the software is just as good, if not better,
than humans' own identification, it seems like a weak foundation
on which to ban the technology.
And really, our statement should be stronger than just, oh,
there's this risk of misidentification,
but rather that this is not something we want societally, no?
BRIAN YU: Yeah.
I think so, especially now that facial recognition technology
has gotten better.
But when Facebook did that study, I think that was back in 2014, or so.
So I would guess that Facebook's facial recognition
abilities have gotten even better than that, over the course of the past five
years, or so.
So facial recognition is probably better when
a computer is doing it than when humans are doing it,
by now, or at least close to as good.
And so, given that, I do think that when it
comes to trying to decide how we want to shape the policies in our society,
we should not just be looking at how accurate these things are,
but also at what kinds of technologies
we want to be playing a role in our policing system, and in the way
that society runs, and the rules there.
DAVID MALAN: And I imagine this is going to play out differently
in different countries.
And I feel like you've already seen evidence of this,
if you travel internationally.
Because customs agencies, in a lot of countries,
are already photographing, even with those silly little webcams,
when you swipe your passport and sign into a country.
They've been logging people's comings and goings for some time.
So really, the technology is just facilitating all the more of that
kind of tracking.
I mean, in the UK, for years, they've been
known as having