
  • [MUSIC PLAYING]

  • SPEAKER 1: This is the [INAUDIBLE].

  • [MUSIC PLAYING]

  • DAVID MALAN: Hello, world.

  • This is the CS50 Podcast.

  • My name is David Malan.

  • And I'm here again with CS50's own Brian Yu.

  • BRIAN YU: Good to be back.

  • DAVID MALAN: So in the news, of late, has been this app for iOS

  • and Android called FaceApp, as some of you might have heard.

  • Brian, have you used this app before?

  • BRIAN YU: So people keep talking about this.

  • I don't have it myself, on my phone.

  • But one of our teaching fellows for CS50 does have it

  • and was actually showing it to me earlier today.

  • It's kind of scary, what it can do.

  • So the way it seems to work is that you open up the app, on your phone.

  • And you can choose a photo of yourself, from your photo library, a photo of you

  • or a friend.

  • And you submit it.

  • And then you can apply any number of these different image filters

  • to it, effectively.

  • But they can do a variety of different things.

  • So they can show you what they think you would look like when you are older,

  • and show you an elderly version of yourself,

  • or what you looked like when you were younger.

  • They can change your hairstyle.

  • They can do all sorts of different things to the background of the image,

  • for instance.

  • So it's a pretty powerful image tool.

  • And it does a pretty good job of trying to create a realistic looking photo.

  • DAVID MALAN: Yeah.

  • It's striking.

  • And I discovered this app two years after everyone else did, it seems.

  • Because someone sent me a modified photo of myself

  • recently, whereby I was aged in the photo.

  • And it was amazing.

  • The realism of the skin, I thought, was compelling.

  • And what was most striking to me, so much so,

  • that I forwarded it to a couple of relatives afterward,

  • is that I looked the way a couple of my older relatives do in reality.

  • And it was fascinating to see that the app was seeing these familial traits

  • in myself, that even I don't see when I look in the mirror, right now.

  • But apparently, if you age me and you make my skin

  • a different texture, over time, oh my god,

  • I'm going to actually look like some of my relatives, it seems.

  • BRIAN YU: Yeah.

  • It's incredible what the app can do.

  • A human trying to do this type of thing on their own

  • might not be able to do it at all.

  • And I think that really just speaks to how powerful machine learning has

  • gotten, at this point.

  • These machines have probably been trained on huge data

  • sets of younger and older pictures,

  • trying to understand, fundamentally, how you translate

  • a younger photo into an older photo.

  • And now, they've just gotten really good at being able to do that in a way

  • that humans, on their own, never would've been able to.

  • DAVID MALAN: So this is, then, related to our recent chat

  • about machine learning, more generally.

  • Where, I assume, the training data in this case

  • is just lots, and lots, and lots of photos of people, young and old,

  • and of all sorts.

  • BRIAN YU: Yeah.

  • That would be my guess.

  • So FaceApp doesn't publicly announce exactly how their algorithm is working.

  • But I would imagine that it's probably just a lot of training data,

  • where you give the algorithm a whole bunch of younger photos and older

  • photos.

  • And you try and train the algorithm to be

  • able to figure out how to turn the younger photo into the older photo,

  • such that you can give it a new younger photo as input and have it

  • predict what the older photo is going to look like.
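
To make the training setup Brian is guessing at a bit more concrete, here is a minimal, hypothetical Python sketch (using PyTorch) of supervised image-to-image learning on paired younger/older photos. FaceApp has not disclosed how its algorithm works, so the random stand-in data, the tiny convolutional model, and every size below are purely illustrative.

```python
# Minimal sketch (not FaceApp's actual algorithm): train a tiny
# image-to-image model on hypothetical paired (young, old) photos,
# then use it to predict an "aged" version of a new photo.
import torch
import torch.nn as nn

# Stand-in data: 64 paired 3x64x64 images. In reality these would be
# aligned face crops of people at younger and older ages.
young = torch.rand(64, 3, 64, 64)
old = torch.rand(64, 3, 64, 64)

# A toy convolutional "translator"; real systems use far larger
# generative models (often GAN-based).
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, kernel_size=3, padding=1), nn.Sigmoid(),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(5):
    optimizer.zero_grad()
    predicted_old = model(young)        # model's guess at the aged photos
    loss = loss_fn(predicted_old, old)  # distance from the real older photos
    loss.backward()
    optimizer.step()

# Given a brand-new "younger" photo, predict what the older one might look like.
new_photo = torch.rand(1, 3, 64, 64)
aged_photo = model(new_photo)
```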

  • DAVID MALAN: It's amazing.

  • It's really quite fascinating too, to allow

  • people to imagine what they might look like in different clothes,

  • or, I suppose, with different makeup on, and so forth.

  • Computers can do so much of this.

  • But it's actually quite scary too, because a corollary

  • of being able to mutate people's faces in this way, digitally,

  • is that you can surely identify people, as well.

  • And I think that's one of the topics that's been getting a lot of attention

  • here, certainly in the US, whereby a few cities, most recently, have actually

  • outlawed, outright, the police's use of facial recognition

  • to bring in suspects.

  • Somerville, Massachusetts, for instance, which is right around the corner

  • from Cambridge, Massachusetts, here, did this.

  • And I mean, that's actually the flip side of the cool factor.

  • I mean, honestly, I was pretty caught up in it,

  • when I received this photo of myself some 20, 30, 40 years down the road.

  • Sent it along happily to some other people.

  • And then didn't really stop to think, until a few days later,

  • when I started reading about FaceApp and the implications thereof.

  • That actually, this really does forebode a scary future,

  • where all too easily can computers, and whichever humans own them,

  • pick us out of a crowd or track, really

  • in the extreme, our every movement.

  • I mean, do you think that policy is really the only solution to this?

  • BRIAN YU: So I think that, certainly, facial recognition

  • technology is going to keep getting better.

  • Because it's already really, really good.

  • And I know this from whenever photos get posted on Facebook

  • and I'm in the background corner of a very small part

  • of the image: Facebook, pretty immediately, is able to tell me,

  • oh, that's me in the photo,

  • when I don't even know if I would have noticed myself in the photo.

  • DAVID MALAN: I know.

  • Even when it just seems to be like a few pixels, off to the side.

  • BRIAN YU: Yeah.

  • So technology is certainly not going to be the factor that holds anyone back,

  • when it comes to facial recognition.

  • So if a city wants to protect itself against the potential implications

  • of this, then I think policy is probably the only way to do it.

  • Though it seems like the third city, and the most recent, to ban facial

  • recognition is Oakland.

  • And it looks like their main concern is the misidentification of individuals,

  • and how that might lead to the misuse of force, for example.

  • And certainly, facial recognition technology is not perfect, right now.

  • But it is getting better and better.

  • So I can understand why more and more people might

  • feel like they could begin to rely on it, even though it's not 100% accurate

  • and may never be 100% accurate.

  • DAVID MALAN: But that too, in and of itself, seems worrisome.

  • Because if towns, or cities, are starting

  • to ban it on the basis of the chance of misidentification,

  • surely the technology, as you say, is only going to get better, and better,

  • and better.

  • And so that argument, you would think, is going to get weaker, and weaker,

  • and weaker.

  • Because I mean, even just a few years ago, Facebook,

  • you noted, was claiming that they could identify

  • humans in photos with an accuracy-- correct me if I'm wrong-- of 97.25%.

  • Whereas humans, when trying to identify other humans in photos,

  • had an accuracy level of 97.5%.

  • So almost exactly the same statistic.

  • So at that point, if the software is just as good, if not better,

  • than humans' own identification, it seems like a weak foundation

  • on which to ban the technology.

  • And really, our statement should be stronger than just, oh,

  • there's this risk of misidentification.

  • But rather, this is not something we want societally, no?

  • BRIAN YU: Yeah.

  • I think so, especially now that facial recognition technology

  • has gotten better.

  • And when Facebook did that study, I think that was back in 2014, or so.

  • So I would guess that Facebook's facial recognition

  • abilities have gotten even better than that, over the course of the past five

  • years, or so.

  • So facial recognition is probably better when

  • a computer is doing it than when humans are doing it,

  • by now, or at least close to as good.

  • And so given that, I do think that when it

  • comes to trying to decide how we want to shape the policies in our society,

  • we should not just be looking at how accurate these things are,

  • but also at what kinds of technologies

  • we want to be playing a role in our policing system, and in the way

  • that society runs, and the rules there.

  • DAVID MALAN: And I imagine this is going to play out differently

  • in different countries.

  • And I feel like you've already seen evidence of this,

  • if you travel internationally.

  • Because customs agencies, in a lot of countries,

  • are already photographing you, even with those silly little webcams,

  • when you swipe your passport and sign into a country.

  • They've been logging people's comings and goings, for some time.

  • So really, the technology is just facilitating all the more of that logging

  • and tracking.

  • I mean, in the UK, for years, they've been

  • known for having hundreds, thousands of CCTVs, closed-circuit televisions.

  • Which, I believe, historically were used really

  • for monitoring, either in real time or after the fact, based on recordings.

  • But now, you can imagine software just scouring a city, almost like Batman.

  • I was just watching, I think, The Dark Knight, the other day,

  • where Bruce Wayne is able to oversee everything going on in Gotham,

  • or listen in, in that case, via people's cell phones.

  • It just feels like we're all too close to the point where

  • you could do a Google search for someone, essentially on Google Maps,

  • and find where they are.

  • Because there are so many cameras watching.

  • BRIAN YU: Yeah.

  • And so, those privacy concerns, I think, are

  • part of what this whole recent controversy

  • with facial recognition and FaceApp has been about.

  • And in particular, with FaceApp, the worry

  • has been that when FaceApp is running these filters to take your face

  • and modify it to be some different face,

  • it's not just a program that's running on your phone

  • to be able to do that sort of thing.

  • It's that you've taken a photo, and that photo

  • is being uploaded to FaceApp servers.

  • And now your photo is on the internet, somewhere.

  • And potentially, it could stay there and be used for other purposes.

  • And who knows what might happen to it.

  • DAVID MALAN: Yeah.

  • I mean, you, and some other people on the internet,

  • dug into the privacy policy that FaceApp has.

  • And if we read just a few sentences here,

  • one of the sections in the "Terms of Service"

  • says that, "You grant FaceApp consent to use the user content,

  • regardless of whether it includes an individual's name, likeness, voice,

  • or persona sufficient to indicate the individual's identity.

  • By using the services, you agree that the user content

  • may be used for commercial purposes.

  • You further acknowledge that FaceApp's use

  • of the user content for commercial purposes

  • will not result in any injury to you or any other person you

  • authorized to act on your behalf."

  • And so forth.

  • So you essentially are turning over your facial property, and any photos

  • thereof, to other people.

  • And in my case, it wasn't even me who opted into this.

  • It was someone else who uploaded my photo.

  • And, at the time, I perhaps didn't take enough offense or feel enough concern.

  • But that too is an issue, ever more so, when folks are using services

  • like this, not to mention Facebook and other social media apps,

  • and are actually providing, not only their photos, but here is my name,

  • here is my birthday, here are photos from what I did yesterday,

  • and God knows how much more information about you.

  • I mean, we've all tragically opted into this, under the guise of,

  • oh, this is great.

  • We're being social with other people online.

  • When really, we're providing a lot of companies with treasure

  • troves of information about us.

  • And now, governmental agencies seem to be hopping on board, as well.

  • BRIAN YU: Yeah.

  • Facebook, especially.

  • It's just scary how much they know about exactly who you are

  • and what your internet browsing habits are like.

  • It's all too often that I'll be reading about something, on the internet,

  • that I might be interested in purchasing.

  • And all of a sudden, I go and check Facebook,

  • and there's an advertisement for the very thing

  • that I was just thinking about purchasing.

  • Because Facebook has their cookies installed

  • on so many websites that are just tracking every website you visit.

  • And they can link that back to you and know exactly what you've been doing.
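
As a rough, hypothetical illustration of the cookie-based tracking Brian describes, the short Python sketch below simulates a third-party tracker that sees the same cookie ID across many sites and links those visits into one profile. The cookie ID, site names, and email address are made up, and this is only a sketch of the general mechanism, not Facebook's actual systems.

```python
# Minimal simulation of cross-site tracking via a shared third-party cookie.
# Purely illustrative; every identifier below is invented.
from collections import defaultdict

# Browsing history keyed by the tracker's cookie ID.
profiles = defaultdict(list)

def record_visit(cookie_id, site):
    """Called whenever a page embedding the tracker's script or pixel loads."""
    profiles[cookie_id].append(site)

# The same cookie ID shows up on every site that embeds the tracker.
record_visit("cookie-abc123", "shoe-store.example")
record_visit("cookie-abc123", "news.example")
record_visit("cookie-abc123", "camera-reviews.example")

# Once the user logs in on the tracker's own site, the cookie ID can be
# tied to a real identity, and the whole browsing history comes with it.
identities = {"cookie-abc123": "user@example.com"}
for cookie_id, sites in profiles.items():
    who = identities.get(cookie_id, "unknown user")
    # Ads for shoes or cameras can now follow this user to other sites.
    print(f"{who} visited: {', '.join(sites)}")
```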

  • DAVID MALAN: Yeah.

  • I know.

  • And I was thinking that the other day, because I

  • was seeing ads for something that I actually went ahead and bought

  • from some website.

  • I don't even remember what it was.

  • But I was actually annoyed that the technology wasn't smart enough

  • to opt me out of those same adverts, once I had actually

  • completed the transaction.