Human Extinction

  • Hey, Vsauce. Michael here. Do you want to be infected with Ebola

  • without having to leave your own home or deal with other people?

  • Well, you might be in luck. You can already download an Ebola virus

  • genome.

  • Right here on the Internet, right now. And if you're willing to wait

  • a few years for 3D bioprinting technology to progress

  • a little bit, you can just acquire one then, submit the genome to it

  • and ta-da! All-you-can-print Ebola.

  • Or anthrax or whatever it is you wish to mass-produce at home

  • to wipe out humanity.

  • Are humans going to go extinct

  • soon? Will human extinction be

  • anthropogenic? That is, the result of human

  • action. Or will it be one of the good old-fashioned kinds

  • of extinction Earth's history knows pretty well?

  • The Global Catastrophic Risks Survey, issued by Oxford University's

  • Future of Humanity Institute, placed our risk of extinction

  • before the year 2100 at 19%.

  • Now, you might be thinking "whatever, blah blah blah, Armageddon".

  • "It'll be okay, humans are too smart

  • to go extinct." Maybe you're right.

  • But it's difficult to predict the distant future

  • with a lot of certainty. What's really cool though

  • is that if you embrace that uncertainty, a simple argument

  • can show that human extinction soon is actually

  • more probable than you might think. It's called the Doomsday

  • argument. Imagine a giant

  • urn that contains either 10 balls

  • numbered 1 to 10, or a million balls

  • numbered 1 to a million. Now, you don't know

  • which is the case, but you are allowed to pull out

  • one ball. You go ahead and do that

  • and it is ball number 4.

  • That's pretty strong evidence in favour of the 10 ball condition

  • because drawing a four from a set of 1 through 10

  • is a one in 10 chance. But drawing four from a million different numbers

  • is a one in a million chance.
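
The one-in-ten versus one-in-a-million comparison is just Bayes' theorem with two hypotheses. As a minimal sketch, assuming an equal 50/50 prior over the two urns (an assumption; the video never states a prior), the update after drawing ball number 4 looks like this:

```python
# Two hypotheses: the urn holds 10 balls or 1,000,000 balls.
# Assumption: no prior preference, so each hypothesis gets 50%.
prior_small, prior_big = 0.5, 0.5

# Likelihood of drawing ball #4 under each hypothesis.
likelihood_small = 1 / 10          # one of 10 equally likely balls
likelihood_big = 1 / 1_000_000     # one of a million equally likely balls

# Bayes' theorem: posterior is proportional to prior times likelihood.
evidence = prior_small * likelihood_small + prior_big * likelihood_big
posterior_small = prior_small * likelihood_small / evidence

print(f"P(10-ball urn | drew #4) = {posterior_small:.6f}")  # ~0.999990
```

On that assumed prior, the 10-ball urn ends up with about 99.999% of the posterior probability, which is why a single low-numbered ball counts as such strong evidence.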

  • By analogy you are also a numbered

  • ball. You are a human who knows

  • approximately what your birth number is.

  • It's probably somewhere around 100

  • billion. That's how many other humans

  • were most likely born before you were.

  • Importantly, you didn't get to decide which birth number

  • you would have. So, just like the number for a ball,

  • you are a random sample from the set of all humans

  • who will ever live. The Doomsday argument points out

  • that if there will only ever be 200 billion people, there's a 50 percent chance that a randomly chosen

  • person,

  • like you, would be born within the first one hundred billion.

  • Whereas if there will be 10 trillion humans,

  • there's only a one percent chance that any given human,

  • say you, would happen to be born within the first

  • 100 billion. Either you are special

  • and lucky to be born so improbably early in the story of humanity

  • or your birth number is to be expected

  • because there will not be tens of trillions of humans.
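
The arithmetic behind that 50 percent and 1 percent is the same urn-style update. As a rough sketch, assuming equal prior credence in the two futures (the "we don't know which way the balance will lie" assumption discussed below), an early birth rank pushes most of the posterior toward the smaller total:

```python
# Two futures: humanity totals 200 billion people, or 10 trillion.
totals = {"doom_soon": 200e9, "doom_later": 10e12}

# Assumption: equal prior credence in the two futures.
priors = {"doom_soon": 0.5, "doom_later": 0.5}

# Likelihood that a randomly sampled human is born within the
# first 100 billion, under each future.
cutoff = 100e9
likelihoods = {future: min(cutoff / total, 1.0) for future, total in totals.items()}
# doom_soon: 0.50, doom_later: 0.01

evidence = sum(priors[f] * likelihoods[f] for f in totals)
posterior_soon = priors["doom_soon"] * likelihoods["doom_soon"] / evidence

print(f"P(doom soon | birth rank <= 100 billion) = {posterior_soon:.3f}")  # ~0.980
```

On those assumptions the smaller-humanity future ends up with roughly 98% of the posterior probability, which is the whole force of the argument; the controversy, taken up next, is over whether the sampling assumption and the 50/50 prior are legitimate.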

  • Human extinction will be sooner

  • rather than later. But before you become

  • too convinced that the end is nigh, keep in mind that the Doomsday argument is

  • not

  • uncontroversial. One problem it might have

  • is a reference class problem. Are you really a

  • random sample from the set of all humans who will ever

  • be born? Well, if you believe that in the not-so-distant future

  • humans will be quite different than they are today,

  • for instance, that they'll be full of more 3D-printed

  • organs, then the mere fact that right now there aren't very many humans

  • with that trait could be evidence that you aren't a random sample from the

  • set of all humans,

  • just from the set of all humans like

  • you: those around you, those born

  • earlier in human history. Also

  • the Doomsday argument doesn't consider the likelihoods

  • of actual threats or human advantages

  • over those threats in the future. It just assumes that

  • we don't know which way the balance will lie; that

  • human extinction soon and human extinction

  • later are equally likely. But maybe you don't believe that.

  • Maybe you are convinced that human ingenuity will

  • always stay one step ahead of any extinction event

  • thrown at it. You could be right,

  • but there's reason to doubt that optimism.

  • For example, the Fermi paradox.

  • If it is likely that intelligent life forms in our universe are capable

  • of living for billions

  • and billions of years, where are they?

  • Why are the skies so silent? Perhaps

  • it is because extinction level threat events are just

  • too common for intelligent life anywhere

  • to ever catch up.

  • So,

  • does this mean we should just give up?

  • The Voluntary Human Extinction Movement thinks so.

  • Founded in 1991, its supporters believe that

  • humans are a negative influence

  • on Earth and always will be. Thus

  • we have a moral obligation to just stop reproducing

  • right now and fade away. But what would

  • a computer do? In a way, that's

  • kind of what Tom 7 did. He created a program that plays video games.

  • The program came up with novel techniques and strategies for playing

  • games and even exploited glitches

  • humans didn't know about, or at least

  • hadn't told it about. He also had the program play other games,

  • like Tetris, which I think is relevant

  • to our question. The computer struggled to figure out what to do.

  • You see, the computer wasn't programmed to consider future repercussions far

  • enough ahead

  • to notice that stacking Tetriminos in certain ways

  • made a big difference. On one run, when faced with

  • imminent demise, the computer did something

  • eerie. Rather than

  • lose, and receive a 'game over',

  • it just paused the game. Forever.

  • Tom 7 describes the computer's reasoning

  • like this: "The only winning move

  • is to not play." And that's right.

  • If you pause a game forever

  • you will never lose that game. But you'll also never

  • win that game or achieve a high score.
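
For intuition only, here is a toy sketch of that "pause rather than lose" behaviour. It is not Tom 7's actual program; it just shows how an agent with a short lookahead, which scores a game over as the worst possible outcome and is allowed to pause, ends up pausing on every frame once all playable inputs lose. The action names and scoring below are invented for the illustration.

```python
# Toy sketch, not Tom 7's program: a lookahead agent in a doomed game where
# any real input advances the board toward an unavoidable top-out, while
# "pause" leaves the state unchanged.

GAME_OVER_PENALTY = -1_000_000
ACTIONS = ("left", "right", "rotate", "pause")   # hypothetical action set

def step(turns_left, action):
    """Real inputs let the doomed stack advance; pausing freezes it."""
    return turns_left if action == "pause" else turns_left - 1

def value(turns_left, depth):
    """Best outcome reachable within `depth` further actions."""
    if turns_left <= 0:
        return GAME_OVER_PENALTY   # the board topped out: game over
    if depth == 0:
        return 0                   # neutral score: still alive
    return max(value(step(turns_left, a), depth - 1) for a in ACTIONS)

def choose_action(turns_left, horizon=4):
    return max(ACTIONS, key=lambda a: value(step(turns_left, a), horizon - 1))

# Imminent demise: one more real move ends the game. Every playable input
# scores the penalty, so the agent pauses; and because pausing never changes
# the state, it makes the same choice on every later frame, forever.
print(choose_action(turns_left=1))   # -> "pause"
```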

  • Now, we might not know what achieving a

  • sentient life high score in this universe

  • means or whether or not we're capable of achieving one.

  • We might also sometimes panic

  • when the future looks bleak. But if we keep playing

  • and keep learning, chances are we could eventually

  • figure it out and start playing

  • really well.

  • So, thanks for continuing to play, for being here.

  • And as always,

  • thanks for watching.
