Moral Machines: How Culture Changes Values

  • Who would you save, the pedestrian in the road or the driver in the car?

  • It's not easy, and yet that's the kind of decision which millions of autonomous cars would have to make in the near future. We programme the machine, but who do we tell it to save?

  • That is the set-up of the Moral Machine experiment.

  • There are so many moral decisions that we make during the day without realising.

  • In driverless cars, these decisions will have to be implemented ahead of time.

  • The goal was to open this discussion to the public.

  • Some decisions might seem simple - should the car save a family of 4 or a cat?

  • But what about a homeless person and their dog instead of a businessman?

  • Or how about two athletes and an old woman instead of two schoolchildren?

  • The problem was that there were so many combinations, so many possible accidents, that it seemed impossible to investigate them all using classic social science methods.

  • Not only that, but how do people's culture and background affect the decisions that they make?

  • The only option we had, really, was to turn it into a viral website. Of course, it's easier said than done, right?

  • But that is exactly what the team managed to do. They turned these situations into an online task that people across the globe wanted to share and take part in.

  • They gathered almost 40 million moral decisions from millions of online participants across 233 countries and territories.

  • The results are intriguing. First, there are three fundamental principles which hold true across the world.

  • The main results of the paper, for me, are, first, the big three in people's preferences: save humans, save the greater number, save the kids.

  • The second most interesting finding was the clusters of countries with different moral profiles.

  • The first cluster included many western countries, the second cluster had many eastern countries, and the third cluster had countries from Latin America and also from former French colonies.

  • The cultural differences we find are sometimes hard to describe because they're multidimensional, but some of them are very striking, like the fact that eastern countries do not have such a strong preference for saving young lives.

  • Eastern countries seem to be more respectful of older people, which I thought was a very interesting finding.

  • And it wasn't just age. One cluster showed an unexpectedly strong preference for saving women over men.

  • I was also struck by the fact that France and the French subcluster were so interested in saving women. I'm still not quite sure what's going on there.

  • Another surprising finding concerned people's social status. On one side we put male and female executives, and on the other side we put a homeless person. The higher the economic inequality in a country, the more people were willing to spare the executives at the cost of the homeless people.

  • This work provides new insight into how morals change across cultures, and the team see particular relevance to the field of artificial intelligence and autonomous vehicles.

  • In the grand scheme of things, I think these results are going to be very important for aligning artificial intelligence with human values. We sometimes change our minds.

  • Other people next to us don't think the same things we do. Other countries don't think the same things we do.

  • So aligning AI with human moral values is only possible if we understand these differences, and that's what we tried to do. I very much hope that we can converge, that we avoid a future where you have to learn about the new ethical setting of your car every time you cross a border.

