
I want you to imagine walking into a room, a control room with a bunch of people, a hundred people, hunched over a desk with little dials, and that that control room will shape the thoughts and feelings of a billion people. This might sound like science fiction, but this actually exists right now, today. I know because I used to be in one of those control rooms.

I was a design ethicist at Google, where I studied how you ethically steer people's thoughts. Because what we don't talk about is how the handful of people working at a handful of technology companies will, through their choices, steer what a billion people are thinking today.

Because when you pull out your phone, and they design how this works or what's on the feed, it's scheduling little blocks of time in our minds. If you see a notification, it schedules you to have thoughts that maybe you didn't intend to have. If you swipe over that notification, it schedules you into spending a little bit of time getting sucked into something that maybe you didn't intend to get sucked into.

When we talk about technology, we tend to talk about it as this blue-sky opportunity. It could go any direction. And I want to get serious for a moment and tell you why it's going in a very specific direction. Because it's not evolving randomly. There's a hidden goal driving the direction of all of the technology we make, and that goal is the race for our attention.

Because every news site -- TED, elections, politicians, games, even meditation apps -- all have to compete for one thing, which is our attention, and there's only so much of it. And the best way to get people's attention is to know how someone's mind works. And there's a whole bunch of persuasive techniques that I learned in college, at a lab called the Persuasive Technology Lab, to get people's attention.

A simple example is YouTube. YouTube wants to maximize how much time you spend. And so what do they do? They autoplay the next video. And let's say that works really well; they're getting a little bit more of people's time. Well, if you're Netflix, you look at that and say, well, that's shrinking my market share, so I'm going to autoplay the next episode. But then if you're Facebook, you say, that's shrinking all of my market share, so now I have to autoplay all the videos in the newsfeed before waiting for you to click play.

So the internet is not evolving at random. The reason it feels like it's sucking us in the way it is, is because of this race for attention. We know where this is going. Technology is not neutral, and it becomes this race to the bottom of the brain stem of who can go lower to get it.

Let me give you an example of Snapchat. If you didn't know, Snapchat is the number one way that teenagers in the United States communicate. So if you're like me, and you use text messages to communicate, Snapchat is that for teenagers, and there's, like, a hundred million of them that use it. And they invented a feature called Snapstreaks, which shows the number of days in a row that two people have communicated with each other. In other words, what they just did is they gave two people something they don't want to lose. Because if you're a teenager, and you have 150 days in a row, you don't want that to go away.

And so think of the little blocks of time that that schedules in kids' minds. This isn't theoretical: when kids go on vacation, it's been shown they give their passwords to up to five other friends to keep their Snapstreaks going, even when they can't do it. And they have, like, 30 of these things, and so they have to get through taking photos of just pictures or walls or ceilings just to get through their day. So it's not even like they're having real conversations.

We have a temptation to think about this as, oh, they're just using Snapchat the way we used to gossip on the telephone; it's probably OK. Well, what this misses is that in the 1970s, when you were just gossiping on the telephone, there weren't a hundred engineers on the other side of the screen who knew exactly how your psychology worked and orchestrated you into a double bind with each other.

Now, if this is making you feel a little bit of outrage, notice that that thought just comes over you. Outrage is a really good way also of getting your attention, because we don't choose outrage; it happens to us. And if you're the Facebook newsfeed, whether you'd want to or not, you actually benefit when there's outrage. Because outrage doesn't just schedule a reaction in emotional time and space for you; we want to share that outrage with other people.

So we want to hit share and say, "Can you believe the thing that they said?" And so outrage works really well at getting attention, such that if Facebook had a choice between showing you the outrage feed and a calm newsfeed, they would want to show you the outrage feed, not because someone consciously chose that, but because that worked better at getting your attention.

And the newsfeed control room is not accountable to us. It's only accountable to maximizing attention. It's also accountable, because of the business model of advertising, to anybody who can pay the most to actually walk into the control room and say, "That group over there, I want to schedule these thoughts into their minds." So you can target, you can precisely target a lie directly to the people who are most susceptible. And because this is profitable, it's only going to get worse.

So I'm here today because the costs are so obvious. I don't know a more urgent problem than this, because this problem is underneath all other problems. It's not just taking away our agency to spend our attention and live the lives that we want, it's changing the way that we have our conversations, it's changing our democracy, and it's changing our ability to have the conversations and relationships we want with each other. And it affects everyone, because a billion people have one of these in their pocket.

So how do we fix this? We need to make three radical changes to technology and to our society.

The first is we need to acknowledge that we are persuadable. Once you start understanding that your mind can be scheduled into having little thoughts or little blocks of time that you didn't choose, wouldn't we want to use that understanding and protect against the way that that happens? I think we need to see ourselves fundamentally in a new way. It's almost like a new period of human history, like the Enlightenment, but almost a kind of self-aware Enlightenment: that we can be persuaded, and there might be something we want to protect.

The second is we need new models and accountability systems so that as the world gets better and more and more persuasive over time -- because it's only going to get more persuasive -- the people in those control rooms are accountable and transparent to what we want. The only form of ethical persuasion that exists is when the goals of the persuader are aligned with the goals of the persuadee. And that involves questioning big things, like the business model of advertising.

Lastly, we need a design renaissance, because once you have this view of human nature, that you can steer the timelines of a billion people -- just imagine, there's people who have some desire about what they want to do and what they want to be thinking and what they want to be feeling and how they want to be informed, and we're all just tugged into these other directions. And you have a billion people just tugged into all these different directions. Well, imagine an entire design renaissance that tried to orchestrate the exact and most empowering, time-well-spent way for those timelines to happen.

And that would involve two things: one would be protecting against the timelines that we don't want to be experiencing, the thoughts that we wouldn't want to be happening -- so that when that ding happens, it isn't a ding that sends us away; and the second would be empowering us to live out the timeline that we want.

So let me give you a concrete example. Today, let's say your friend cancels dinner on you, and you are feeling a little bit lonely. And so what do you do in that moment? You open up Facebook. And in that moment, the designers in the control room want to schedule exactly one thing, which is to maximize how much time you spend on the screen. Now, instead, imagine if those designers created a different timeline that was the easiest way, using all of their data, to actually help you get out with the people that you care about. Just think: alleviating all loneliness in society, if that was the timeline that Facebook wanted to make possible for people.

Or imagine a different conversation. Let's say you wanted to post something supercontroversial on Facebook, which is a really important thing to be able to do, to talk about controversial topics. And right now, when there's that big comment box, it's almost asking you, what key do you want to type? In other words, it's scheduling a little timeline of things you're going to continue to do on the screen. And imagine instead that there was another button there saying, what would be most time well spent for you? And you click "host a dinner." And right there underneath the item it said, "Who wants to RSVP for the dinner?" And so you'd still have a conversation about something controversial, but you'd be having it in the most empowering place on your timeline, which would be at home that night with a bunch of friends over to talk about it.

So imagine we're running, like, a find-and-replace on all of the timelines that are currently, persuasively, steering us towards more and more screen time, and replacing all of those timelines with what we want in our lives.

It doesn't have to be this way. Instead of handicapping our attention, imagine if we used all of this data and all of this power and this new view of human nature to give us a superhuman ability to focus and a superhuman ability to put our attention to what we cared about and a superhuman ability to have the conversations that we need to have for democracy.

The most complex challenges in the world require us not just to use our attention individually; they require us to use our attention and coordinate it together. Climate change is going to require that a lot of people are able to coordinate their attention in the most empowering way together. And imagine creating a superhuman ability to do that.

Sometimes the world's most pressing and important problems are not these hypothetical things that we could create in the future. Sometimes the most pressing problems are the ones that are right underneath our noses, the things that are already directing a billion people's thoughts. And maybe instead of getting excited about the new augmented reality and virtual reality and these cool things that could happen -- which are going to be susceptible to the same race for attention -- we could fix the race for attention on the thing that's already in a billion people's pockets.

Maybe instead of getting excited about the most exciting new cool fancy education apps, we could fix the way kids' minds are getting manipulated into sending empty messages back and forth.

(Applause)

Maybe instead of worrying about hypothetical future runaway artificial intelligences that are maximizing for one goal, we could solve the runaway artificial intelligence that already exists right now, which is these newsfeeds maximizing for one thing.