Big brother is guiding us | Nart Wielaard | TEDxHaarlem

  • What are you doing?

  • So I don't know.

  • I followed the GPS and it looked like it had the biking roads, but then the bike road stopped.

  • And we went inside this tunnel and I noticed it had the markings the whole way, like, no way.

  • So I figured this is no place to bike.

  • But if you look at the GPS, if you can film it, because I don't want to look like an asshole. Look, the GPS shows going through the tunnel. Oh yeah, but your GPS, is this for the car or is this for walking?

  • You show us.

  • So I hoped this was for the bike.

  • And press here, yeah, and go walking.

  • And this is just the way.

  • Jesus, this way we could have died.

  • If a car came our way, we could have died.

  • And this is dangerous, huh?

  • Oh, if you can fucking hear me, I'm sorry for the, for the bad words.

  • But that was scary.

  • And it was dangerous.

  • And people, like, saw us and thought we were assholes.

  • But it's Apple guiding us this way.

  • The police are coming.

  • What?

  • Yes, that's OK.

  • I'm telling them the same.

  • I assure you, if anything happens with the police, I'm suing Apple, that's all. This is dangerous, and I'm, I'm blaming Apple for this, and I'm blaming them publicly. You have also brains, right?

  • Yeah, you have also brains, right?

  • When I first saw this Internet video, to me, it's more than just another funny Internet video.

  • To me, it's a video that shows a sign of the times, a sign of the times about how technology impacts our lives every day.

  • In this case, it's Apple's algorithm.

  • The technology under the hood of this, uh, navigation system shows tourists around a city that's foreign to them, and it goes wrong.

  • But algorithms like this, they're virtually everywhere in our society.

  • We may not be aware, but they are there.

  • For instance, when you take your smartphone and log on to your social media profile or any other news feed, there's an algorithm at work.

  • This algorithm makes sure that you have a tailor-made news feed, a news feed that's personalized for you.

  • Which is great, of course, because we all want personalized services in life.

  • But it also means that this algorithm may actually define your view of the world, because this algorithm selects what you see and what you don't.
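
To make this concrete, here is a deliberately tiny sketch, in Python, of the kind of selection logic described above. Everything in it (the post topics, the interest scores, the cut-off) is invented for illustration; no platform's actual ranking code is this simple or public.

```python
# Hypothetical feed ranking: score posts against a profile of past behavior,
# then show only the top few. Whatever falls below the cut-off never reaches you.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str

def rank_feed(posts: list[Post], interest: dict[str, float], limit: int = 3) -> list[Post]:
    """Order posts by how well their topic matches the user's interests."""
    scored = sorted(posts, key=lambda p: interest.get(p.topic, 0.0), reverse=True)
    return scored[:limit]  # everything below the cut-off is simply never seen

posts = [Post("alice", "politics"), Post("bob", "cats"),
         Post("carol", "cycling"), Post("dave", "finance")]
interest = {"cats": 0.9, "cycling": 0.7, "politics": 0.1}

for post in rank_feed(posts, interest):
    print(post.author, post.topic)
```

The point is the slice at the end of rank_feed: the algorithm does not just order your world, it silently truncates it.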

  • Another simple example is when you go into a supermarket and you use a self-scan register.

  • That's also an algorithm at work.

  • Because this algorithm feeds itself with data and then selects which groceries should be checked, which customers should be checked, and which can just move on.

  • Another example is tax filings.

  • It's that time of year again when you have to do your tax filing, and of course, tax authorities also use algorithms.

  • They use algorithms for risk analysis based again on data.

  • They determine which filings should be subject to further inspection because there's a risk of fraud or faulty filings.
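
The self-scan register and the tax authority's risk analysis share one pattern: compute a risk score from data, then inspect only the cases above a threshold. A minimal sketch of that pattern, with the signals, weights, and threshold all invented for the example:

```python
# Hypothetical risk-based selection: score each filing, inspect only the risky ones.

def risk_score(filing: dict) -> float:
    """Combine a few (made-up) signals into a single fraud-risk score."""
    score = 0.0
    if filing["deductions"] > 0.5 * filing["income"]:
        score += 0.6  # unusually high deductions relative to income
    if filing["amended_before"]:
        score += 0.3  # history of corrected (faulty) filings
    return score

def select_for_inspection(filings: list[dict], threshold: float = 0.5) -> list[dict]:
    """Everyone below the threshold 'can just move on'; the rest get checked."""
    return [f for f in filings if risk_score(f) >= threshold]

filings = [
    {"id": 1, "income": 40_000, "deductions": 2_000, "amended_before": False},
    {"id": 2, "income": 40_000, "deductions": 25_000, "amended_before": True},
]
print([f["id"] for f in select_for_inspection(filings)])  # -> [2]
```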

  • And even when buying a house these days, there's apparently tooling available so that when you place your first bid on a house, you can actually be guided by an algorithm, an algorithm that tells you what the best first bid on this house is.

  • So that's one.

  • Another thing I see in this Internet video is that we trust technology by default.

  • I mean, if you have ever been through this tunnel yourself, the Piet Heintunnel, you know this is not a place for cyclists.

  • We all know it's a place for cars, of course.

  • But, um, we may actually find these American tourists a bit stupid. We should also look in the mirror, though, because isn't this what we do all the time ourselves as well?

  • A very tiny, simple example.

  • Most of us probably use a spell checker when using Word.

  • Do you ever think about the suggestions?

  • Probably not.

  • Probably you just follow them blindly.

  • We have trust in technology by default, and even in very complex situations we have this trust.

  • Take the self-driving car.

  • There have been experiments showing that when you step into a self-driving vehicle for the first time (and this vehicle is actually one very complex, large algorithm), for the first fifteen minutes or so you're a bit uncomfortable.

  • You're like, what's going on here?

  • But after that, you're fully relaxed.

  • You don't notice anything around you at all.

  • You probably start watching a Netflix movie.

  • So that's the second thing, that second sign of the times.

  • The third is that should something go wrong, and we've seen this in the video, it's never your own fault.

  • It's always someone else's fault.

  • In this case, it's Apple's fault.

  • So let's sue Apple.

  • So three things about this video.

  • First of all, I noticed that we are addicted to technology.

  • Second, we trust this technology by default, and third, we always blame someone else.

  • Now, the point is that if this technology is so dominant, we're moving into a new era.

  • We used to say that Big Brother was watching us.

  • And this was about privacy; George Orwell wrote about it.

  • But in reality we're moving into a world where Big Brother is guiding us.

  • Where Big Brother is guiding us in our decisions.

  • And the question I have for you today is: is Big Brother guiding us in the right direction?

  • Now, to understand this topic, we must dig into the technology just a little bit, and I will keep it very simple today.

  • I will use a very simple example.

  • Think of algorithms as baking an apple pie.

  • If your grandmother makes an apple pie, she's basically using an algorithm because an algorithm is nothing more than a set of instructions to get to a desired outcome, which is, in this case, the yummy apple pie.

  • Now, suppose you tell your grandmother that you love her apple pie, but you would love it even better without the raisins inside.

  • If she loves you, she might listen to you.

  • And in technology terms, we would call such a feedback loop machine learning, or even artificial intelligence.
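
For readers who like the analogy spelled out, here is a toy sketch of both ideas just described: the recipe as a fixed set of instructions, and the feedback loop that adjusts it. It is nothing more than an illustration of the definition above; every name in it is made up.

```python
# An algorithm is a fixed set of instructions that turns inputs into a
# desired outcome. The feedback step below is the crude "machine learning"
# part: the recipe adjusts itself based on what the eater says.

def bake_apple_pie(ingredients: list[str]) -> str:
    """A fixed sequence of steps -> a desired outcome: that's an algorithm."""
    dough = "dough(" + ", ".join(ingredients) + ")"
    return f"yummy apple pie from {dough}"

def learn_from_feedback(ingredients: list[str], feedback: str) -> list[str]:
    """A crude feedback loop: drop any ingredient the eater complains about."""
    return [i for i in ingredients if f"without the {i}" not in feedback]

recipe = ["apples", "flour", "butter", "raisins"]
print(bake_apple_pie(recipe))

# "I love your pie, but I'd love it even better without the raisins."
recipe = learn_from_feedback(recipe, "even better without the raisins")
print(bake_apple_pie(recipe))  # the next pie comes out raisin-free
```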

  • We also must think about trust and what trust actually is.

  • And this is important because there is a very common misconception about trust.

  • Many of us think that we trust people, organizations, governments, and companies because we know things about them, because we have information on them.

  • But in reality, this couldn't be further from the truth, because in most cases we place blind trust in people or in organizations. Again, a very simple example.

  • If you go into a supermarket and you buy a pack of milk, what you probably do is only check the date.

  • You have no information whatsoever on the quality of the milk; you don't know where it came from.

  • You're not interested; you just buy it, you consume it, that's it.

  • And actually, when you buy this pack of milk, it's a leap into an unknown world.

  • So that is trust: every time you do it, you leap into an unknown world, and we do that hundreds of times every day. Just think of sitting here in this theater.

  • None of you has thought beforehand about the quality of this building.

  • And whether it might collapse on us today?

  • Of course not.

  • We couldn't do that.

  • So we have a blind trust in the quality of this building.

  • We do that hundreds of times every day.

  • But there's also another version of trust, and I call that dumb trust.

  • And dumb trust is, of course, what happened in the video a few minutes ago.

  • Dumb trust happens to us all the time as well, when we're using digital technology such as apps on our smartphones, because there is one vital difference between buying a pack of milk and using an app on your smartphone: when I buy this pack of milk, I don't have any information on the milk, but I do know that there is a system of checks and balances.

  • I do know that there is quality inspection.

  • I do know that there are experts looking into the production, looking into the logistics of this milk.

  • And that's why I can give my trust blindly.

  • Now let's move over to the digital world, the digital realm.

  • How are these checks and balances in this digital world?

  • Well, they're not so good.

  • Of course, we have oversight bodies that look into digital technology.

  • But they mainly look at things like privacy.

  • There is hardly any audit or inspection on the question:

  • Does this technology actually do what it's supposed to do?

  • There are no checks and balances.

  • There are no quality inspectors.

  • There are no oversight bodies looking into that.
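
To make the missing "checks and balances" tangible: in its simplest form, a quality inspection could be an executable check that an algorithm does what it is supposed to do. The routing function below is hypothetical, but the assertion captures exactly the question raised here, namely whether a navigation system ever sends cyclists through a car-only tunnel.

```python
# A sketch of the simplest possible "quality inspection": an automated test
# asserting that a (hypothetical) routing function behaves as intended.

def route_allows_cycling(route_segments: list[dict]) -> bool:
    """A bike route is only valid if every segment is legal for bikes."""
    return all(seg["bikes_allowed"] for seg in route_segments)

def test_no_bike_routes_through_car_tunnels():
    proposed = [
        {"name": "canal path", "bikes_allowed": True},
        {"name": "Piet Heintunnel", "bikes_allowed": False},  # cars only
    ]
    # The inspection question from the talk, as an executable assertion:
    assert not route_allows_cycling(proposed), "router proposed an illegal bike route"

test_no_bike_routes_through_car_tunnels()
print("inspection passed: illegal route correctly rejected")
```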

  • So when I talk about this, you might think that I'm a bit of a pessimist, because I talk about the negative side, about the risks, about what could go wrong.

  • But I'm not; I'm an optimist.

  • When it comes to digital technology, I really think that we're just getting started with artificial intelligence and every other digital technology, and that in the next 5 to 10 years we will see lots of opportunities to improve our lives, to make our lives easier, and to bring real value, for instance real improvements in healthcare.

  • But we must make sure that the beautiful face wins out over the ugly face, because historically, technology has always had two faces: a beautiful face and an ugly face.

  • We could even think back to the Stone Age when, um, people for the first time in history had stone instruments; these were perfect for cooking or cutting trees.

  • But they were also a very efficient way to kill your neighbor. Or think of nuclear energy.

  • This nuclear invention is a fantastic way to produce massive amounts of energy.

  • But it also brings along a lot of toxic waste, which takes ages to diminish in nature, which is a real problem.

  • And the same is true for all this beautiful digital technology that is in our world now.

  • And we must find a way to make sure that the beautiful face of this digital technology will dominate over the ugly face.

  • Because, as I just said, this whole concept of blind trust is everywhere around us, and it's actually beautiful.

  • It's great.

  • We should cherish it.

  • It's the only way we can live.

  • So if we really want the beautiful face of digital technology to dominate, we should implement more checks and balances.

  • We should look into the quality of all these systems, into the question: do they actually do what they are supposed to do?

  • Now, of course, the million-dollar question today is:

  • How are we going to fix that?

  • And to be honest, I don't have a simple success recipe today.

  • It's really a wicked problem, and it has many perspectives.

  • But I do know two things.

  • One is that I hope and expect that in the next 5 to 10 years, we will start asking governments, politicians, and companies to improve their checks and balances on developing digital technology, that they will actually implement quality inspections, and that they will make sure that what they develop is robust.

  • And there are, at this moment, politicians working on this.

  • There are politicians working on creating oversight bodies in this area.

  • Let's encourage them, because it's very important.

  • And the second thing is that I also hope that the next time you use a self-scan register, or the next time you use a news feed on your iPhone or your social media, you at least think back to today, that you at least realize that there's an algorithm at work that may guide you, and that you at least start asking questions, if all goes right.

  • Because the best advice that we can give in this domain was given by the police officer in the video I just showed you. In perfect Dunglish, she said: "You have also brains." Thank you.
