
  • Good morning.

  • Uh, hello.

  • My name is Kyle Oba.

  • Uh, thank you to all the organizers and volunteers. You know, this has been an amazing experience so far.

  • Very excited to be here.

  • You can probably tell I'm a little bit nervous, so I'm just gonna dive right in.

  • All right. So I work at a company called Pas de Chocolat.

  • We work on local problems with local clients, and we try to find local solutions as much as possible. We're a research, design, and development company.

  • Uh, my partner and I started this company before moving to Hawaii, but she's actually from Hawaii.

  • And so, um, that's where this whole, like, local-centered design and research comes from.

  • We partner with other researchers, designers and programmers, but enough about me.

  • Um, I'm sorry.

  • You can't see my slides.

  • Oh, no.

  • Um, should I just try to restart that?

  • Okay.

  • Oh, boy.

  • Okay, cool.

  • Thank you.

  • Thank you.

  • Uh, that's my presentation.

  • Uh, you know, it's a funny thing, uh, it used to work.

  • Okay.

  • So cool.

  • Um, all right.

  • So diving right in.

  • So the unprecedented.

  • So Shoshana Zuboff recently published a book called The Age of Surveillance Capitalism.

  • I'm not gonna pretend to have read the whole thing because it's like over 700 pages, and I'm a busy person.

  • But, um, yeah, so that's her idea of the unprecedented, and in her own words, the unprecedented is necessarily unrecognizable.

  • So when we encounter something that's unprecedented, we automatically interpret it through the lenses that we're familiar with, or through familiar categories, thereby rendering invisible precisely that which is unprecedented.

  • When I read this, I was like, wow, this is amazing.

  • Um, I originally was going to center this talk all around context, but I really like her idea here.

  • Um And so if you were living or visiting Hawaii about a year ago, exactly almost, you would have received this on your phone.

  • It's all caps, so you know it's important.

  • Ballistic missile threat inbound to Hawaii.

  • Seek immediate shelter.

  • And then if you were like, is that real?

  • It says this is not a drill, which, you know.

  • Usually people aren't drilling at eight in the morning on a Saturday.

  • But whatever.

  • Uh, yes.

  • So, from my own experience.

  • You know, I saw this.

  • You know, it was Saturday morning.

  • I just kind of calmly like doing stuff in the kitchen.

  • I looked at my phone, and, I guess I'm gonna die, right?

  • And, uh, you know, usually when I see this type of message on my phone, and maybe some of you got one last night, it's like, there's a flood coming.

  • Maybe you know, there's a hurricane on its way.

  • Maybe maybe you should care.

  • Maybe right.

  • And so that's kind of the lens that I used to interpret this message, which, like Zuboff is saying, was totally the wrong lens to use.

  • It was just a lens that was familiar to me.

  • So I immediately started filling my water filter in the kitchen with water, which seems reasonable.

  • But then, like, 38 minutes later, they say, just kidding, that's not real.

  • And I really, like, had time to evaluate, like, my own behavior, and it really didn't make any sense.

  • I mean, like, if we get hit by a missile, is pouring water gonna help? I mean, I don't know, it didn't really matter.

  • Okay, cool.

  • So that's the idea of the unprecedented how we require these lenses in order to interpret new things.

  • And so obviously this talk is a little bit about artificial intelligence, which is kind of a weird, broad term.

  • So, like, how is this relevant to us today?

  • So as builders and designers as a lot of us are, we need to create that new lens for ourselves in order to interpret that which is unprecedented to us, into the community or into society.

  • Um, yeah.

  • One of the common themes to a lot of the work that we do is that we like to try to find the unrelatable things, the invisible things, the undefined things, and try to make them more relatable and visible, and to articulate them more fully.

  • And it's only through the creation of these new lenses that we can then further these discussions with our communities.

  • So, as an example of that, I wanted to take you all through a project that we did with the Honolulu Museum of Art, which is just down the street.

  • Uh, and I highly recommend it.

  • Um, so about the time that that alert came out, we had the opportunity to do this project with the museum, and we called it a design intervention, and we wanted to sort of forward the discussion of surveillance technologies.

  • So at the time, they had this program called Classified, that was produced by the Doris Duke Theatre, which is in the museum.

  • Uh, well, highlights of the program were screenings of films, including Laura Poitras's Citizenfour, the documentary about Edward Snowden.

  • And there was also a panel discussion.

  • So the panel discussion took place on, you know, just one particular day. It involved Kate Crawford, one of the co-founders of the AI Now Institute; Trevor Paglen, an artist who works with surveillance technologies; Laura Poitras, the director; Edward Snowden, who actually Skyped in; and Ben Wizner, who's his ACLU attorney. And it was moderated by Hasan Elahi, another artist who works with surveillance technology.

  • Oh, jeez, or surveillance in general, um, and so we were invited to sort of come and create an experience.

  • And so we dubbed it a design intervention to complement what was happening in the panel.

  • So the museum called it the My Profile tour.

  • Um, I'm not sure how I feel about that name, but you know, it stuck.

  • So that's what it is.

  • It's "my profile," too, like my profile picture.

  • It's a little 2004 of a name, I think, a little bit, but it's cool.

  • Uh, and, uh, so the purpose of our design intervention was to sort of raise awareness about things that can go wrong.

  • So the fallibility of face detection, face recognition technology and problematic data privacy issues.

  • So this is, like, about a year ago?

  • Um, so we weren't Well, I'll just leave it at that.

  • So we weren't really sure what was gonna happen with those issues, but we knew they were problematic anyway, eh?

  • So this was this was mentioned to use public to specific technologies and the use cases of machine machine learning in the same giant.

  • Amorphous A, I think on to replace sort of those marketing buzzwords with some a little bit more articulate discussion and specific discussion about these issues.

  • All right, so what did we build?

  • So the, I don't know, elevator pitch is that we built individualized tours based on guests' faces, matched to art currently on display in the museum.

  • Okay.

  • And I know a lot of you are thinking, isn't that what Google did in their app for Google Arts & Culture?

  • And now, I'm not lying when I tell you this, and I know that that's what liars say.

  • But anyway.

  • They released their app, like, two weeks before we went live, and it was just sort of uncanny.

  • I actually didn't find out about their app until after the panel, and I was like, oh, wow, that's disappointing.

  • But at the same time, it was really interesting because they were doing something very, very similar.

  • So I'm gonna kind of walk you through how we did it and some of the differences.

  • But I think it was an uncanny coincidence, and I think a lot of it had to do with how good certain types of technology, like face recognition and detection, were getting at about that time, and how easy they were to apply.

  • So, um, all right, so, cool, the My Profile tour: how did it work?

  • So, Step one, you had to opt in.

  • Not everybody had to do this. Uh, you can imagine, at a panel discussion where Edward Snowden is Skyping in, in the state of Hawaii, where he used to work, it's a little bit weird to have somebody ask to take a picture of you before you go into the panel.

  • So you had to opt in, and then if you did opt in, we took a picture of you, and it said, hey, this is the picture we're taking.

  • Would you like to opt in?

  • Uh, if you did, we kind of labeled you in the real world.

  • We gave you a sticker with a number on it, and it was sort of, like, symbolic.

  • Like, this is happening to you.

  • Uh, yeah.

  • I mean, you know, let's be fair.

  • Not every website does this.

  • Okay, well, they don't give you a sticker.

  • Well, actually, stickers are a thing, but anyway. Okay.

  • Uh oh, yeah.

  • Incidentally, ask me later.

  • I have stickers, uh... All right, cool.

  • So the fourth step: the customized tour.

  • So the panel was an hour long.

  • So before you walked in, we took your picture.

  • When you walked out, we handed you a customized tour.

  • There was a lot that went into sort of printing and formatting and laying these things out ahead of time, so that only the live, customized content would need to be applied.

  • Uh, cool.

  • So the process.

  • So I know this is a little small for you to see. I'm sorry about that.

  • This is sort of, like, a version of what we put out in a handout.

  • And so basically the top part of it is stuff that happened ahead of time.

  • And the bottom part is stuff that happened on site that day, and I'll walk you through that.

  • So step one of I think almost every one of these machine learning slash you know, whatever you wanna call it, artificial intelligence projects is data collection, right?

  • Or getting data from somewhere.

  • And so, as you can imagine, when you're working with the museum who hasn't really done this type of thing before, data collection is hard.

  • Data collection is hard.

  • In my experience, no matter what project I'm working on. So we had to walk through the entire museum, find out where all the art was, write down all the numbers associated with that art, figure out what they were called and what room they were in, and then get a photo of everything. That was a lot of work.

  • And then we sort of processed the images of the art as well as the sculpture.

  • So anything that they had that was gonna be on display at that time, that wasn't in, sort of, like, a special exhibition where we weren't allowed to photograph the art.

  • Um, we passed it through a face detection neural network, which essentially just says, hey, I think there's a face here, and I think it's inside of this rectangle, and then you can highlight that or crop it out.

  • So in this case, we just cropped out the faces, and then we used a second neural network, which was a landmark prediction neural network, which essentially says, hey, based on you telling me this is a face, I think this is where the eyebrows are, the eyes, the nose, the mouth, the jaw, et cetera.

  • And there's the image on the bottom there, which is me.
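The detect-then-crop step described above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the bounding box below is hard-coded rather than coming from a real detector, and the (top, right, bottom, left) ordering is just the convention dlib-style tools use.

```python
import numpy as np

def crop_face(image, box):
    # `box` is (top, right, bottom, left): the rectangle a face
    # detector reports. We keep only the pixels inside it.
    top, right, bottom, left = box
    return image[top:bottom, left:right]

# A stand-in 100x100 RGB "photo"; a real detector would supply the box.
img = np.zeros((100, 100, 3), dtype=np.uint8)
face = crop_face(img, (10, 60, 50, 20))
print(face.shape)  # (40, 40, 3): a 40x40 crop, 3 color channels
```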

  • I just wanted to throw in here that, although we did a lot of this in Python, because the beginning of every web project using machine learning is to install Python on your machine,

  • Uh, this is something that you can actually do in the web.

  • So I wanted to put this link here.

  • This is an example that runs in ml5.js and is rendered with p5.js, or, you know, sort of, like, put into the browser with p5.js.

  • It's super cool.

  • So if you're like if you love javascript, you can do all of this stuff from the comfort of your JavaScript.

  • But just remember, you have to install Python, because that's just required.

  • But don't be afraid of Python.

  • It's just executable pseudocode.

  • All right, Uh, next.

  • Let's see.

  • That's a joke for the Python people.

  • All right, so this prepares us for a numerical comparison, so we could actually pass that set of landmarks to a face recognition neural network, which then essentially renders that image into a bunch of numbers, like a big vector, right?

  • And so now that we have the face and we know where the landmarks are, we've turned it into a bunch of numbers.

  • We can start matching it to things.

  • Okay, so, like, we then stash all that stuff away.

  • And then on the day of the event, people come in, we take their picture, right? And then we find the landmarks on their face, and then we can render them into numbers and match them to the art that they're similar to. Cool.
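The matching step then reduces to nearest-neighbor search over those vectors. Here's a toy sketch, assuming the embeddings already exist; the 4-dimensional vectors below are made up for illustration (the real face embeddings are much longer).

```python
import numpy as np

# Made-up stand-ins for the precomputed artwork face embeddings.
art_vectors = np.array([
    [0.0, 0.0, 0.0, 0.0],  # artwork 0
    [1.0, 1.0, 1.0, 1.0],  # artwork 1
    [0.9, 1.1, 1.0, 0.8],  # artwork 2
])

def closest_art(guest_vector, art_vectors):
    # Euclidean distance from the guest's embedding to each artwork,
    # then the index of the nearest one.
    dists = np.linalg.norm(art_vectors - guest_vector, axis=1)
    return int(np.argmin(dists))

guest = np.array([1.0, 1.0, 0.9, 0.9])
print(closest_art(guest, art_vectors))  # 1: artwork 1 is nearest
```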

  • And then, since we have scary facial recognition software, we can then say, hey, do you want your tour?

  • Oh, we know who you are.

  • So here's your tour.

  • Right.

  • So that was another thing we had. Um, so it wasn't just about matching faces to art.

  • We also did a number of other things.

  • So this is the tour folded up that we handed to them.

  • So it has their face, it has their landmarks.

  • And it has a number of super problematic classifications that we said were attributes of you, based on your face.

  • And this was not meant to be, like, true; it was meant to be, like, super problematic and something that people should question.

  • So special thanks to Kyle McDonald, who's an artist, um, who trained the neural network to work with these classifications on the Labeled Faces in the Wild data set.

  • Um, it's another one of those things where you can pop that thing in the... anyway,

  • I won't go into it.

  • Um, so these labels were super problematic, and I think, in some cases, kind of insulting. It might tell you you're not attractive, right?

  • And, uh, it might tell you you look tired, you know, stuff like that.

  • And you know, that's not something you want to hear from a computer, or anybody, but nonetheless, computers will do that.

  • And so on the next page, this is the stuff that, besides the labels, was custom generated. So here we were able to say, okay, we think you look like this painting.

  • We think you look like this sculpture. And then, based on everything that was in the frame when you took your photo, we're matching you using a different neural network that does object matching.

  • And I'll go into that a little bit later.

  • And then, because of the classifications that we gave you, we're going to say, this piece of art is something you might be interested in, based on what was strongest in your classifications. And then we created a path through other pieces of art that sort of provided this trail that you could follow from one piece of art to the next.

  • This was an educational project.

  • So what we really wanted to do was to provide people a way to question the technology and have it be personal, so that people could have a further, more articulate discussion in the community.

  • So we also wanted to highlight where our technologies failed.

  • I'll go into that a little bit.

  • Um, and then also, we provided details about how the art was chosen, at a very technical level, if people wanted to get mad as well. So, cool.

  • So one of the things that we really wanted to do was to say, you know, what are the ways that we can use to find art that we can match to people?

  • And so one of the techniques that we came across was something called reverse image search. And so reverse image search essentially takes an image,

  • uses an object detection neural network, and you kind of short-circuit it to provide you with numbers instead of names of things.

  • And then, with those numbers, you can sort of compare them to other objects and find other objects in the data set that look the same.

  • So, yeah, that picture kind of explains itself.

  • One of the object detection neural networks we used was called VGG16.

  • It's 16 layers.

  • You slice off the last layer, the one that basically tells you what the thing is, and you're left with a giant vector for every object.

  • That's numbers.
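With each object reduced to a feature vector like that, reverse image search is just a similarity ranking over those vectors. Here's a sketch with made-up 3-dimensional features (real VGG16 features have thousands of dimensions), using cosine similarity as one common choice of metric:

```python
import numpy as np

def most_similar(query, features, top_k=2):
    # Normalize everything, rank by cosine similarity to the query,
    # and return the indices of the top_k closest items (best first).
    q = query / np.linalg.norm(query)
    F = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = F @ q
    return [int(i) for i in np.argsort(-sims)[:top_k]]

# Tiny made-up "feature vectors" for three objects in a collection.
features = np.array([
    [1.0, 0.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.9, 0.1, 0.0],
])
print(most_similar(np.array([1.0, 0.05, 0.0]), features))  # [0, 2]
```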

  • Um, here's another example of finding stuff that's similar to the vase on the left.

  • So once you've got these numbers, you basically have coordinates in this multi-dimensional space, and you can use traditional, uh, distance algorithms to see what's close to what.

  • And then, once you do something called PCA, or principal component analysis, you can slam 4,000 dimensions down into two dimensions, using this algorithm, which essentially maintains closeness of points from the multi-dimensional space in the two-dimensional translation.
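The PCA step can be sketched with plain NumPy: center the vectors, take the singular value decomposition, and project onto the two strongest directions. This is a generic illustration, not the project's code:

```python
import numpy as np

def pca_2d(X):
    # Center the data, then use SVD: the first two right singular
    # vectors are the directions of greatest variance, and projecting
    # onto them squashes each point down to two coordinates.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:2].T

# Five points in a made-up 4-dimensional feature space.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
print(pca_2d(X).shape)  # (5, 2): each point now lives in 2-D
```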

  • And then, once you're in two dimensions, you can use your traditional graph algorithms to find shortest paths, like you would to find a route through a city.
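Finding a route between two artworks is then a standard shortest-path problem. Here's a sketch using Dijkstra's algorithm over a tiny made-up graph, where the node names are hypothetical artworks and the edge weights stand in for distances between pieces in feature space:

```python
import heapq

def shortest_path(graph, start, goal):
    # Dijkstra's algorithm over an adjacency dict {node: {neighbor: weight}}.
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry, already found a shorter way
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # Walk the predecessor links back from the goal to recover the route.
    path = [goal]
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

# Hypothetical artworks: going via "study" is cheaper than the direct hop.
graph = {
    "portrait": {"study": 1.0, "vase": 4.0},
    "study": {"vase": 1.5},
    "vase": {},
}
print(shortest_path(graph, "portrait", "vase"))  # ['portrait', 'study', 'vase']
```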

  • So in this case, we have a picture of a person, you may have heard of him, and the vase on the right-hand side.

  • And we tried to find items in the museum collection that would sort of, like, transition from one to the next.

  • I'm not sure how convincing this is.

  • I kind of think it's convincing.

  • Maybe it's, like, bias on my part for selecting this, but anyway, um, you kind of get the picture.

  • This is how we selected the art.

  • Uh, so, lessons learned. Ah, we had to try a lot of different ways to find things that would work, to convincingly match you to a piece of art.

  • And when we got to this point, we were like, yeah, that's pretty good.

  • Zuckerberg looks like that guy.

  • So, okay, we're like, we're done.

  • All right.

  • Uh, but there are also mistakes that were made, and I should've put, like, a warning or something on the previous slide. So, we started out with this thing called the OpenCV Haar classifier to find faces.

  • Turned out, it's not that good.

  • Or at least, it was good for a while.

  • But then, you know, these convolutional neural networks came out.

  • We ended up using one by a library called dlib, which has a Python API.

  • Um, it got much better, but before that we were getting results like this, and, I mean, it's like an automatic screenshot save, right?

  • Right.

  • Uh, also privacy policy.

  • This was one of the less successful parts of the project.

  • We put a giant wall sized terms and conditions on the wall to remind people that we were taking their images and making conclusions about them and recording it on a computer and basically saying all the horrible things we could do with that information.

  • And then, at the very end, we say, like, but we're not going to do that.

  • And we're gonna delete your data in 30 days.

  • But you should really think about this. Uh, and I don't know what it is about putting terms and conditions in front of people, but they don't want to read them.

  • Um, so almost nobody read this, which is a little disappointing for our graphic designer that we work with.

  • But it did teach us a lesson: that it was really incumbent on us to make this more approachable and consumable.

  • Uh, another super difficult thing to deal with was using the classifications and labeled data that exist out there currently, right?

  • If you look at this, it's probably too small for you to see.

  • But if you look at this up close, you'll see that there are only four ethnicities available in the Labeled Faces in the Wild attribute classifiers: uh, Black, White, Asian, and Indian.

  • I thought Indians were kind of Asian, but whatever. Uh, and you know, we're in Hawaii.

  • We're doing this project in Hawaii.

  • There's a lot of Southeast Asians, right?

  • If you just say Asian, you're like, I don't know, that's almost everybody. And, um, yeah, and there's no Hawaiian, right?

  • So, you know, that's kind of leaving out a giant portion of the population. And then also, I probably don't need to go into it, but, you know, the art collections that you're using this for also have a bias.

  • Uh, kind of taking a wider view.

  • You know, I found this when we were doing our research.

  • There's obvious bias here.

  • I don't need to go into it but the data set has a bias.

  • The engineers have a bias.

  • There's bias out there, people. Anyway.

  • And then, uh, so this is Hawaii.

  • Hawaii is weird.

  • And this is why it's my concern.

  • Hawaii is super unique: in the United States, we're number one in terms of the percentage of non-white population.

  • Even looking at this chart, there's no category for Hawaiians.

  • There's also no category for mixed, mixed race or ethnicity.

  • So why is that, right?

  • Because the bar graph just didn't really allow for that. And then also, locally, our police department just bought 1,200 Axon body cameras.

  • Which sounds like a lot of body cameras, actually.

  • And I'm a little worried about that, because, you know, Axon has an AI division headquartered in Scottsdale, Arizona.

  • Uh, and, you know, if bias is a thing, maybe this is something we should be concerned about.

  • Hawaii looks very different than Scottsdale, Arizona.

  • So I'm gonna leave you with a couple of quotes. "A worthwhile exercise would be to delete the word technology from our vocabulary in order to see how quickly capitalism's objectives are exposed."

  • Marketing is powerful, right?

  • AI, I believe, is one of those terms which, if not more clearly defined, is essentially just marketing, right?

  • And AI can mean anything; it's all-powerful and inevitable.

  • And as long as we allow that term to just go around without properly defining it, we're basically giving in to this sort of desire to just say it's inevitable.

  • It's all powerful.

  • It's gonna happen.

  • Let's just let it happen, right?

  • So where is it?

  • So there's an opportunity for us.

  • I'm gonna go through this real quick, but this is actually happening: like, people are getting tickets in China because their picture gets taken as they cross the intersection, right?

  • We don't really have to let that happen.

  • But we do.

  • Uh, also, um, this quote, by Evgeny Morozov: "Inefficiency is precisely what shelters us from the inhumanity of Taylorism and market fundamentalism.

  • When inefficiency is the result of a deliberative commitment by a democratically run community, there's no need to eliminate it, even if the latest technologies can accomplish that in no time."

  • I love this guy's books.

  • I highly recommend them. Essentially, sometimes inefficiency is a good thing, and we should think about that when we eliminate it. And then, finally: "Technology is not, and never can be, a thing in itself, isolated from economics and society."

  • This means that technological inevitability does not exist.

  • And I love this quote.

  • Um, it's everything that we love about the work that we do.

  • It's learning about the communities, learning about the technology that they use, and understanding when it's appropriate to apply the technology and when it's appropriate to step back and just sort of listen.

  • Um, so I encourage you all, you know, to end on a high note,

  • uh, to take what I have said and go forth to your local communities and see how you can create your own lens, uh, and use your lens to shape the future with these technologies, because none of this is inevitable.

  • So thank you very much.

HI and AI - Kyle Oba | JSConf Hawaii 2019