Sandy Pentland: "Social Physics: How Good Ideas Spread" | Talks at Google

  • BRADLEY HOROWITZ: I want to welcome you all.

  • My name's Bradley Horowitz, I'm VP

  • of Social for Google, Social Product Management.

  • And I'm here today to welcome Sandy Pentland

  • to come in and speak with us.

  • I'm going to be brief.

  • I encourage you all to Google Sandy.

  • You will pull up his long list of credentials,

  • which include academic credentials, business

  • credentials, across many, many disciplines

  • for many, many years.

  • World Economic Forum, Forbes' Most Influential,

  • it goes on and on.

  • So I'm going to try to give you a little anecdote of something

  • that I don't think you'll get on the web.

  • Nothing too embarrassing.

  • I'm a student, both former and current, of Sandy's.

  • Former in the sense that I was a media lab Ph.D. student.

  • He was my adviser.

  • And current in the sense that I stay

  • closely attuned to everything that Sandy does.

  • As we were walking over here from building 1900,

  • we were sort of doing what old friends do,

  • which is play the name game and checking in

  • on all the old connections and friends that we share.

  • How's Roz doing?

  • How's Stan doing?

  • How's Ali doing?

  • What about Thad?

  • And we went through them, and turns out

  • everybody's doing fine.

  • You know, many of you are here, actually in the second row,

  • in the front row.

  • Many of you have gone off to become professors

  • at MIT or Berkeley or Georgia Tech.

  • And it was just so great.

  • And thinking about that for a moment,

  • I recognized that I was just walking

  • through one of the vintages, the sort of early '90s vintage

  • of Sandy's students who have all gone off to do great things.

  • And Sandy has consecutively piled

  • on top of that, round after round,

  • of graduate students and students that he has inspired.

  • And in addition to all of those lists of accomplishments,

  • one of the things that really touches me most

  • about Sandy and his work is that he's

  • such an inspirational educator.

  • He not only has enthusiasm for his own work,

  • he's able to impart that to others

  • and create generations of people that care passionately

  • about technology and science.

  • And it's just so great to be in the company, which we will all

  • get to share for an hour right now,

  • of a person who can inspire and lead that way.

  • And so with that, I'll hand it over to Sandy.

  • Welcome.

  • SANDY PENTLAND: Well, thank you.

  • [CLAPPING]

  • SANDY PENTLAND: Now, I'll have to inspire and cause

  • passion, which is actually part of what

  • I'm going to talk about.

  • So how does that happen?

  • So maybe this is good.

  • So I'm going to talk about two things.

  • One is basic science about who people

  • are, how we use electronic media, how we use face to face

  • media, how we evolved as a social species.

  • And then I want to move to how we can use this knowledge

  • to make things better, to have a more

  • sustainable digital ecology, to make government work.

  • Wouldn't that be amazing? [LAUGHS]

  • And so on and so forth.

  • And as Bradley mentioned, I do a lot of things.

  • I thought I'd stick this in.

  • I love this picture.

  • This is the Borgs back in the 1990s.

  • And that's Thad in the front there,

  • Thad Starner, who I think most of you know.

  • And then two other things I do that are of real relevancy

  • here, maybe three.

  • Is one is for the last five years,

  • I've run a group-- helped run a group at Davos--

  • around personal data, privacy, and big data.

  • And that's, of course, a very relevant topic for this crowd,

  • but particularly, going forward.

  • And the group includes people like the Chairman

  • of the Federal Trade Commission, the vice president of the EU,

  • Politburo members from China, et cetera, et cetera.

  • So it's a conversation between CEOs of major companies

  • and chief regulators and advocacy.

  • And I'll talk about that at the end

  • and where I think that things are going

  • and what you might want to do about it.

  • And I just joined the Google ATAP Board

  • because it used to be owned by Motorola,

  • but when Motorola got sold, they intelligently

  • moved the really creative interesting part over here

  • to Google.

  • And as Bradley mentioned, that started a bunch

  • of companies, which are doing well.

  • So the thing that I'm really concerned about,

  • the thing that's passionate, is making the world work better.

  • And a sort of story for this is about 15 years ago,

  • I was setting up a series of laboratories in India.

  • And, you know, we had huge government sponsorship.

  • We had a board of directors, which

  • are some of the brightest most successful people in the world.

  • And it was a complete disaster.

  • And it had to do with a lot of things.

  • It had to do with all of the sort of macho,

  • signaling charisma in the room with the board of directors.

  • But it also had to do with the way the government failed

  • to work, or did work.

  • And looking back on that, I can sort of

  • see in that premonitions of the US Congress today.

  • All right?

  • So we went and visited the Indian Congress,

  • where we saw people throwing shoes at each other

  • and throwing cash in the air.

  • And we look at the US Congress today,

  • and it's somewhat similar, unfortunately.

  • So I want to make things better.

  • And what occurs to me is if we knew

  • how to make our organizations work,

  • then we could really do things.

  • Like we could solve global warming tomorrow

  • if we all knew how to sort of talk about it rationally,

  • come to a good decision, and then carry that through.

  • And the fact that that sounds like ludicrous fantasy--

  • oh, yeah, everybody agrees.

  • Sure.

  • Not in our lifetime-- tells you just how profound

  • the problem is.

  • And that's why I think one of the most important things

  • that's happened in the last decade, something you've all

  • been part of, is this era of big data,

  • which is not about big at all.

  • It's about personal data.

  • Detailed data about the behavior of every person on Earth,

  • where they go, what they buy, what they say online,

  • all sorts of things like that.

  • Suddenly we could watch people the way,

  • say, you would watch an ant hill or Jane Goodall

  • would watch apes.

  • We can do that, and that has profound impact

  • that is hard to appreciate.

  • I made this little graph, which comes out

  • of-- inspired by Nadav's thesis here,

  • which is duration of observation.

  • These are social science experiments, and the biggest

  • medical experiments.

  • So this is like the Framingham heart study.

  • 30 years, 30,000 people.

  • But they only talked to people like once every three years.

  • So the bit rate was like one number per month.

  • So you had no idea what these people were doing.

  • They could have been eating fried chicken all the time.

  • You don't know, right?

  • Or most of the things we know about psychology

  • come from down here.

  • This is a number of bits per second, duration.

  • This is a bunch of freshmen in Psych 101

  • filling out some surveys.

  • And that's what we take to be social science,

  • political science, and medical science.

  • But now we have these new ways of doing things.

  • And so what in my group we've done

  • is we've built little badges, like you all

  • have little name badges.

  • And so we can actually know where you go

  • and who you talk to.

  • And we do this with organizations.

  • We'll track everybody for a month.

  • We never listen to the words, but we

  • do know the patterns of communication.

  • And I'll show you a little bit about that.

  • Similarly, we put software in phones,

  • and we look at the patterns of communication

  • within a community.

  • So I go into a community and give everybody brand

  • new phones.

  • And some of the people here have been integrally involved

  • in these experiments.

  • And we'll look at their Facebook activity, their credit card

  • record, their sleep pattern, their communication pattern,

  • who they hang with, who they call,

  • and ask, what do all these communication patterns

  • have to do with outcomes?

  • All right?

  • Do they spend too much?

  • What things do they choose to buy, and so forth.

  • And what you find from the big data,

  • and, of course, modern machine learning sorts of things,

  • is that you can build quantitative predictive models

  • of human behavior, which you all know.

  • But I think you know the wrong part.

  • And I'm going to tell you about the other part that's

  • much stronger than what you typically do, OK?

  • And so you can predict behavior.

  • And people go, well, wait a second.

  • What about free will?

  • That may not have occurred to you,

  • but that's a traditional thing to ask.

  • And I'll tell you a little bit about that along

  • the way because it turns out that a lot of our behavior

  • is very habitual, and that's the part

  • that we can model mathematically.

  • So the big picture here, and this

  • is part of the reason I got off onto this research,

  • is I go to places like Davos, and you

  • listen to the president of this and the CEO of that.

  • And when they talk about changing policy, talk

  • about doing anything, they use economics metaphors.

  • And the thing about [INAUDIBLE] economic metaphors

  • is that they're all about individuals.

  • So you've heard about rational individuals.

  • And everybody rags on the rational part.

  • I'm not going to do that.

  • I'm going to rag on the individual part, OK?

  • Because I don't think we are individuals.

  • What we desire, the ways we learn to go about doing it,

  • what's valuable, are consensual things.

  • So they actually aren't captured by this sort

  • of model, this independent model.

  • That matters because those interactions are

  • the sources, not only of fads and economic bubbles,

  • but they're really the social fabric that we live in.

  • So everybody knows about the invisible hand--

  • that we are led by the invisible hand

  • to advance the interest of society.

  • What that means is that markets are supposed to allocate things

  • efficiently and fairly, right?

  • If you've thought about it, you know

  • this doesn't work in the real world, [LAUGHS] OK?

  • And the question is, why?

  • So one of the things that-- there's

  • several things to say about this.

  • Most people think that this statement is something

  • that he made in "The Wealth of Nations."

  • And I'm just going to [INAUDIBLE]

  • "The Wealth of Nations."

  • But it's not.

  • He made it in a book called "The Theory of Moral Sentiments," which

  • very few people read.

  • And it went on to say something very different.

  • It went on to say that "it's human nature

  • to exchange not only goods, but ideas, assistance, and favors.

  • And it's these exchanges that guide people

  • to create solutions for the good of the community."

  • So Adam Smith did not believe that markets

  • were socially efficient.

  • He believed that it was the social fabric of relationships

  • that caused the pressures of the market

  • to be allocated where they're needed.

  • And, in fact, a lot of mathematicians

  • now believe that, despite Nobel prizes

  • about market efficiency and markets

  • being good for governance, it's not.

  • It's really-- you have to have the right sort of regulation

  • and replication mechanism.

  • But this is another solution.

  • Adam Smith way back when said, if we could model the peer

  • to peer relationships, we could understand

  • how market things eventually resolve

  • to be much more efficient.

  • And that's what we're doing.

  • We're doing something that you could

  • call, sort of, Economics 2.0.

  • Instead of the individual approximation,

  • we're now modeling not only individual behavior,

  • but peer to peer behavior at the same time.

  • So that's the sort of big context for what's up here.

  • So let me give you an example of what that means.

  • So in a typical situation, you have some people

  • that influence each other.

  • So, you know, their political views, what's cool to do.

  • You pick it up from the people around you.

  • And we have a lot of evidence on this

  • from the experiments we've done in our group,

  • showing that people's attitudes about, you know,

  • what music to listen to, what apps to download,

  • what spending behavior to have, are largely

  • predicted by their exposure to what other people do.

  • And you may not want to believe this

  • because it's not the rhetoric in our society.

  • And you guys are the last people to be saying this

  • to because you guys are like the best and smartest in the world.

  • But it really is true that about 50% of the variance

  • comes from these peer to peer relationships.

  • And we know that when we do incentives,

  • when we try to do these-- CEOs and governors

  • try to set up governance schemes to make people do things,

  • they always talk about individual incentives.

  • That's part of this mindset that comes from the 1700s,

  • is that we're all rational individuals.

  • So we'll give this guy money to behave differently.

  • But when you do that, what happens is, of course,

  • you're putting that incentive in opposition

  • to the social pressure he gets from other people.

  • And if they're aligned, it works wonderfully.

  • But if they're not aligned, which happens all the time,

  • incentives don't work.

  • Individual incentives don't work.

  • And the moment the incentive goes away, you know,

  • you stop paying them, they revert to what

  • the social fabric does.

  • So when you begin to think about this mathematical framework

  • that includes the social fabric, an obvious thing

  • occurs to you, which is that, well,

  • instead of giving one person the incentive,

  • maybe I can modify the social fabric.

  • What we have is exchanges between people,

  • and we have incentives being applied to the individuals.

  • And now what I'm going to do is I'm

  • going to modify the incentives between the people, OK?

  • And you can write down these equations just the way

  • you write down economic equations.

  • So this was in "Nature Scientific Reports"

  • just last year.

  • So you could write it all down, you know, with utilities

  • and peer pressure and externality cost.

  • And you'd discover something interesting, is

  • when you add these second order terms in,

  • you find that incentives that modify the interactions

  • are generically more than twice as efficient

  • as incentives that go to individuals.

  • Generically that way.

  • And in the data that I'll show you,

  • it's sort of four to 20 times more powerful.
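
The equations aren't reproduced in the talk, so here is a minimal sketch of the kind of utility model being described, with notation invented here rather than taken from the paper:

```latex
% Minimal sketch (invented notation). Agent i chooses an action a_i with
% private value v_i(a_i).
% Individual incentive: pay agent i directly at rate r.
U_i(a_i) = v_i(a_i) + r\, a_i
% Network incentive: pay each neighbor j of i at rate r when i acts, so
% every neighbor applies social pressure weighted by tie strength \rho_{ij}.
U_i(a_i) = v_i(a_i) + \sum_{j \in N(i)} \rho_{ij}\, r\, a_i
% When the tie strengths around i sum to more than one, the same budget r
% exerts more total pressure on a_i than the direct payment would, which
% is the "more than twice as efficient" claim, in spirit.
```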

  • So this is the thing that I want to really get across to you,

  • is that this sort of power of economics that has mathematics

  • and prediction is very limited.

  • It doesn't include anything that has strong peer

  • to peer effects, like bubbles, like fads.

  • But you can now write those down because we have enough data

  • and enough math to be able to do it.

  • So let me give you some examples of what that means.

  • These are simple examples.

  • You can do much more complicated ones.

  • So this was, again, sort of done in Nadav's thesis.

  • We got a community, where we divided them into two pieces.

  • It's actually three pieces, but I'll just

  • talk about two pieces.

  • In one piece, we had a little app

  • that showed them how active they were.

  • And this was in the winter in Boston,

  • where people tend to just put the blanket over their head

  • and go away, right?

  • So we wanted them to get out and around and stuff.

  • And so we can show them their activity level,

  • and we can give them money.

  • So you can say, oh, if I was more active than last week,

  • I made $3.

  • Wonderful, OK?

  • But in the other part of the community,

  • we assign people buddies.

  • And you would be his buddy.

  • If he is more active, you'd get rewarded.

  • Not him, you.

  • Your buddy gets rewarded for you being active.

  • And what you do is you pick people

  • that have a lot of interactions with each other to do this.

  • And it sounds a little creepy, but actually, people

  • got into it.

  • Almost everybody signed up for it.

  • And what you got is you got everybody's, like,

  • looking at the other guy and saying,

  • well, are you being active?

  • Because they're reminded on their phone all the time,

  • and they're getting a couple bucks for it.

  • It's not a big thing, OK?

  • But remember that that incentive scheme, that social network

  • scheme, if I'm not incenting the individuals,

  • I'm incenting the network, is generically more than twice

  • as efficient.

  • In fact, in this experiment, the way we did it,

  • we found it was four times more efficient.

  • And if we'd done it the right way, if we went back and did it

  • again, as sort of a post hoc analysis,

  • we would have gotten eight to 10 times more efficient, OK?

  • Pretty interesting.
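
To make the mechanics concrete, here is a toy sketch of a buddy scheme in Python; the pairing rule, the payout rate, and the data are hypothetical illustrations, not the experiment's actual protocol.

```python
# Hypothetical buddy-incentive sketch: pair each person with a frequent
# contact, then pay the *buddy* when the target becomes more active.

def assign_buddies(interaction_counts):
    """Pair each person with the contact they interact with the most."""
    return {
        person: max(contacts, key=contacts.get)
        for person, contacts in interaction_counts.items()
        if contacts
    }

def buddy_rewards(buddies, last_week, this_week, rate=3.0):
    """Pay each buddy in proportion to the target's activity gain."""
    rewards = {}
    for person, buddy in buddies.items():
        gain = this_week[person] - last_week[person]
        if gain > 0:
            rewards[buddy] = rewards.get(buddy, 0.0) + rate * gain
    return rewards

counts = {"ana": {"bo": 12, "cy": 3}, "bo": {"ana": 12}, "cy": {"ana": 3}}
buddies = assign_buddies(counts)          # pairs follow interaction strength
last = {"ana": 10, "bo": 4, "cy": 7}
now = {"ana": 14, "bo": 4, "cy": 9}
print(buddy_rewards(buddies, last, now))  # buddies are paid for others' gains
```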

  • Oh, and one other thing.

  • It stuck.

  • When we ran out of money, we turned off

  • the money, no more incentives.

  • People kept being more active for the period

  • that we could observe, OK?

  • Because what we'd done is changed the social fabric.

  • We made being active a topic of conversation,

  • a topic of social pressure, of prestige, of interest.

  • And, of course, that has momentum to it.

  • Here's another example that's online.

  • We went to a canton in Switzerland,

  • which was trying to save power.

  • Keep power below a certain level because they

  • had hydroelectric power up to one level.

  • And beyond that, they had to turn on diesels.

  • And noisy, expensive, polluting.

  • And they tried educating people, and they

  • tried financial incentives, and nothing really worked.

  • And so we convinced them to sign people up as buddies.

  • So you would sign up with him as your buddy,

  • and if you saved energy, he would get a reward.

  • Now, the rewards were, like, pathetic.

  • The budget was $0.50 per week, OK?

  • And little dancing bears on your website.

  • It was, like, really stupid.

  • But what happened is that for the people that signed up,

  • you got a 17% reduction in energy usage.

  • Now, that sounds good, but let me give you the comparable.

  • There have been some places where people have raised prices

  • to get 17% reduction.

  • On average, you have to double the price of energy

  • to get that reduction in energy use.

  • That's the price elasticity curve.

  • So for $0.50 a week, we could get the same effect as doubling

  • the price.

  • This one is by a friend of mine, James Fowler.

  • Done with Facebook in 2010.

  • He sent 61 million people a message

  • about getting out to vote in the election.

  • The sort of simple summary of it is it had almost no effect.

  • People didn't do it.

  • A few people did.

  • He could go back and sample people

  • and say, how much of an effect did this have.

  • He also included an "I Voted" button,

  • which would then show your face to your Facebook friends, OK?

  • This also had no effect, except with one particular class

  • of people, which is the people you had strong face

  • to face relationships with.

  • If you appeared in the same images with this other person

  • regularly, then among that group,

  • that "I Voted" button would generate two

  • to three more people voting.

  • So a yield greater than one, a cascade of behavior.

  • So, again, what was happening there

  • is social pressure, the peer to peer things.

  • The centralized thing didn't do it.

  • It was peer to peer pressure between people

  • with strong ties.

  • And, in fact, actually, it's not captured

  • by the electronic network at all.

  • These are things that were outside of that, OK?

  • So that's an example that I think

  • is really interesting that you can build on.

  • And somebody mentioned, actually,

  • how many of you know the red balloon contest?

  • [INAUDIBLE] knows it.

  • [LAUGHTER]

  • SANDY PENTLAND: [INAUDIBLE] OK.

  • But so DARPA had this social network grand challenge

  • where we had to find 10 red balloons somewhere in the US.

  • And everybody tried using the economic incentives.

  • You know, you get some money if you

  • found a balloon and reported it.

  • We used something like this mechanism.

  • We were able to recruit what we estimate

  • to be about two million people in 24 hours, and won.

  • Again, not giving people directly the incentive.

  • In that case, it's a little more complicated,

  • but giving people these peer to peer things.

  • So that's cool.

  • Why do you think humans are this way?

  • Well, let me give you an example that I think really

  • tells us why the action is not between our ears.

  • The action is in our social networks, OK?

  • We are a social species.

  • We evolved that way.

  • Why?

  • Let me give you a really graphic example of that.

  • So this is a site called eToro.

  • It's a social network site.

  • On this site, people buy and sell dollars and euros

  • and gold and silver, and stuff like that, OK?

  • And unlike almost every other trading platform,

  • it's a social platform.

  • So I can see what every other person,

  • what each of these 1.6 million people is doing.

  • You can't see the dollar amount, but I

  • can see that they're shorting euros,

  • long dollar, leveraged 25, right?

  • One day contract, and I can see how much money

  • they made at the end.

  • I could see their return on investment.

  • So here are people playing with their own money,

  • substantial amounts of their own money, millions of them

  • all over the world, doing this very regularly.

  • On average maybe one transaction a day,

  • right?

  • Or one every couple of days.

  • And we can make a graph of the social network.

  • And that's what this is.

  • This is 1.6 million people.

  • These are the same 1.6 million people.

  • And wherever there's a dot, this person

  • decided to follow this other person.

  • And follow in eToro has a different meaning

  • than Facebook.

  • Follow means I'm going to take 10% of my money,

  • and whatever that person does, my 10% of the money

  • will be invested exactly the same way.

  • So this is follow with teeth, OK?

  • And this is the graph of following.

  • So these are people learning from each other,

  • looking at each other's strategies, and trading.

  • And you see some people are going it alone.

  • They read the newspaper, they look at the wire,

  • they browse the web, then they trade.

  • Other people are in this orgy of social dialogue, right?

  • All following each other.

  • And, in fact, if you look at it, you

  • see that there are all these loops in there.

  • So, you know, I follow you, you follow him, he follows me.

  • Uh oh. [LAUGHS]

  • And what happens in this loop is that you

  • get very few new ideas.

  • It's the same ideas going around and around.

  • And this is the sort of thing that's

  • a precursor of an economic bubble.
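
Those loops are easy to find mechanically. A toy sketch, assuming the networkx library and a made-up follow graph: any strongly connected component with more than one trader is a closed circuit where the same ideas can circulate.

```python
# Find the feedback loops ("echo chambers") in a directed follow graph.
import networkx as nx

follows = [("a", "b"), ("b", "c"), ("c", "a"),      # a tight loop
           ("d", "a"), ("e", "d"), ("f", "news")]   # one-way followers

G = nx.DiGraph(follows)
echo_chambers = [c for c in nx.strongly_connected_components(G) if len(c) > 1]
print(echo_chambers)  # [{'a', 'b', 'c'}]: ideas go around and around here
```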

  • So the question is, which of these sorts of social strategies

  • gives greater return on investment?

  • Why are we a social species?

  • The way that people almost always analyze it

  • is greater information.

  • That these guys all read the newspapers and everything.

  • They have all the information in the world, OK?

  • These guys read the same newspapers and everything,

  • but they also look at each other.

  • What would you expect to happen?

  • Well, you can write down the equations here.

  • And what you can do is you can look

  • at the propagation of new strategies

  • across this population.

  • So when this guy comes up with a new thing to do,

  • how likely is it to propagate throughout the social network?

  • And in that way, you can quantify

  • the number and diversity of new strategies any one person sees.

  • These people will see almost no new strategies

  • because they're all on their own.

  • These people will see very few new strategies

  • because they're all listening to each other.

  • It's the same thing around and around.

  • And these people are much more diverse.

  • If we look at that return on investment,

  • we get a curve like this.

  • So, again, this is a mathematical function

  • that has to do with the number of new strategies.

  • So the rate of propagation of strategies

  • through the medium, the social network.

  • If you look at the number of new things,

  • this is very low, this is very low,

  • this has many new types of strategies.

  • And this vertical axis is return on investment.

  • So this is, like, one of these no-BS types of measures, OK?

  • Real people, their own money, doing it

  • on their own choice, making money, or not.

  • People who trade by themselves are market neutral.

  • You might expect that on average.

  • They mirror the market.

  • They lose a little bit of money in trading costs.

  • People who are in these momentum echo

  • chambers don't do very well either.

  • And what isn't shown here is sometimes

  • there are crashes that blow them all up.

  • So they actually do pretty badly on the long term.

  • But people in the middle make 30% more money.

  • So this is not something that is in traditional economics.

  • What we're talking about here is a blend

  • of social strategies for learning

  • from other people, plus individual information.

  • It's the peer to peer interactions.

  • And probably the reason that we are a social species,

  • this learning from each other, is

  • because it has this much more efficient output.

  • And there's a big literature about this.

  • Don't just believe me.

  • This is a wonderful example because I can quantify it.

  • And every dot here is all of the trades

  • by millions of people for a whole day.

  • So this is, like, more data than you know, right?

  • And if I did just one asset class,

  • like dollars versus euros, it wouldn't have this spread

  • that it does.

  • It would be a nice band.

  • So as you get more diverse learning

  • from your social environment, your return on investment

  • goes up until you begin getting too many loops.

  • And then it goes back down.

  • Now I like this example because I think this example applies

  • to the government, it applies to making decisions in companies.

  • If you begin thinking about it, we're

  • all living these social networks,

  • and what we're trying to do is make good decisions.

  • Here, I'm showing you that a mixture of social learning

  • plus individual learning-- I can tell you a lot more about it,

  • it's in the book-- gets you better decisions.

  • And not just better decisions by one person,

  • this is better decisions of the entire 1.6 million people.

  • Now, that's a really different thing.

  • I should also mention that one of the things we

  • did with this platform is when we discovered that they were

  • in this echo chamber state, that's

  • not good for the platform or them, OK?

  • Everyone's going to lose money.

  • So we looked at the loop structure,

  • and we figured out what the optimal way

  • to break it up was.

  • And we gave coupons to key people,

  • small group of key people, that would cause this echo chamber

  • to break up in an optimal manner.

  • And that doubled the return on investment of the people

  • in the entire network.

  • And that lasted for about three days, four days,

  • and they went back to being stupid again.

  • But [LAUGHS] that's their problem.

  • We've done this sort of repeatedly.

  • We know it works.

  • So you can actually control the flow

  • of ideas in a network like this and improve

  • the average function of the people in the network.
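
The talk doesn't give the actual targeting rule, so the sketch below uses an illustrative heuristic: inside each echo chamber, offer the most-followed members a coupon to follow someone outside it, which injects new strategies where they will spread furthest.

```python
# Hypothetical coupon-targeting sketch for breaking up echo chambers.
import networkx as nx

def coupon_targets(G, per_chamber=2):
    chambers = [c for c in nx.strongly_connected_components(G) if len(c) > 1]
    targets = []
    for chamber in chambers:
        ranked = sorted(chamber, key=G.in_degree, reverse=True)  # most followed
        outside = [v for v in G if v not in chamber]
        for member in ranked[:per_chamber]:
            if outside:
                # Coupon: reward `member` for following someone outside the loop.
                targets.append((member, outside[0]))
    return targets

G = nx.DiGraph([("a", "b"), ("b", "c"), ("c", "a"), ("x", "y")])
print(coupon_targets(G))  # [('a', 'x'), ('b', 'x')]
```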

  • It's a very different way of thinking

  • about things than the normal way because you're not

  • concerned about individuals.

  • You're not concerned about their education and their decision

  • making.

  • You're concerned about the pattern of learning

  • and the performance of an ensemble, rather

  • than the individuals.

  • So one of the other things that this big data tells us

  • is that this process can be broken up into two pieces.

  • And to illustrate that, I'll show this diagram that's

  • from Danny Kahneman's Nobel Prize lecture.

  • He's the father of behavioral economics.

  • And he makes the point that people

  • have two ways of thinking.

  • There's a slow way of thinking.

  • [INAUDIBLE] probably knows about "Thinking,

  • Fast and Slow," very popular.

  • The slow way of thinking is the serial reasoning that we do.

  • And there's this fast way of thinking,

  • which is associations.

  • You take the experience you had, you

  • say, how is this situation like my previous experiences?

  • Maybe you interpolate a little bit,

  • and you make your decision very, very fast.

  • This is a very old mechanism, 100, 200 million years old.

  • This is pretty much unique to humans.

  • Interestingly, this is the much better mechanism by far

  • if you have the right set of experiences.

  • If you don't have the right set of experiences,

  • this is a disaster waiting to happen because you're

  • going to associate the wrong things with the wrong things,

  • and follow right off the cliff.

  • And when I look at the learning that people

  • have from each other in these social networks,

  • I see a qualitatively different type of behavior.

  • So when I look at slow learning--

  • so this is a learning that people integrate

  • into their conscious representations.

  • So the new song you heard, the new fact,

  • the new product that came out.

  • People are very promiscuous about this.

  • It only takes one exposure to integrate that

  • into your ensemble of things you know about.

  • And this is the way almost everything

  • that you guys build is based on.

  • Oh, we're going to have more information, right?

  • But information is not behavior.

  • It turns out that to get behavior change, which

  • is what I call idea flow, you need something different.

  • So this is an exploration.

  • We are trying to find new possibilities and new facts.

  • But it's relatively isolated from behavior change.

  • You could learn about all sorts of things

  • and never change your behavior.

  • This is why it's hard to stop smoking,

  • this is why it's hard to stop overeating,

  • why all sorts of things are hard is

  • that our habits, our actual behaviors, that reside here

  • are largely independent of this.

  • Now, there's some leakage.

  • If you concentrate real hard, some early adopters, yes, it

  • does happen.

  • But as I showed with the voting experiment,

  • the transfer from here to here is very weak.

  • On the other hand, what is the case

  • is that if you see multiple people experimenting

  • with the same idea, people whom you have strong relationships

  • with, then you will with very high certainty

  • tend to adopt that behavior, OK?

  • So what you're doing is this social learning.

  • If I see for some of my peers that

  • doing this results in a better outcome,

  • then without even thinking about it, I'll begin to adopt that.

  • If I hear about it, you know, through email or on the web

  • or something, it's very unlikely.
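
This adoption rule is essentially a threshold on reinforcement from strong ties. A minimal sketch in Python; the threshold of three echoes the invitation statistic quoted just below, and everything else is hypothetical.

```python
# Complex-contagion sketch: one exposure (broadcast) rarely changes
# behavior, but several exposures from strong ties almost always do.

def will_adopt(exposures, strong_ties, threshold=3):
    """exposures: people seen doing the behavior; strong_ties: close contacts."""
    reinforced = sum(1 for person in exposures if person in strong_ties)
    return reinforced >= threshold

strong = {"maya", "raj", "lena"}
print(will_adopt(["maya", "ad_banner", "stranger"], strong))  # False: 1 strong tie
print(will_adopt(["maya", "raj", "lena"], strong))            # True: 3 strong ties
```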

  • We have a database of all of-- I can't

  • say the name of the company, but a competitor

  • bought it for $1 billion.

  • It's a social network for use inside companies.

  • So we have the deployment for over 1,000 companies in there,

  • using that social network.

  • And what we find can be summed up in an interesting statistic.

  • If you get invitations to join this intracompany social

  • network from as many as 12 people in a half an hour,

  • you're still unlikely to sign up,

  • unless those people are people you already

  • have strong relationships with.

  • If they're people you know face to face or people

  • you work with regularly, then as few as three invitations

  • makes it almost certain that you'll sign up.

  • So that's just like the voting thing,

  • it's what I'm talking about here.

  • Behavior change, which is what you usually care about,

  • has to do with this social reinforcement mechanism

  • that I call engagement.

  • It's community vetting of ideas and behaviors

  • that results in the adoption of a new behavior.

  • It's not the broadcast, in fact, that we often think about.

  • So let me show you an example of this.

  • So this is data from a German bank.

  • It has five departments: managers, development, sales,

  • support, and customer service is the last one.

  • And this is all the email, and the red stuff

  • is all the face to face.

  • We get this off of little badges we put on people.

  • So you probably can track this stuff,

  • but you've never tracked the face to face stuff.

  • Nobody does.

  • And what we find is that the sort of punch line

  • is that the email, the pattern of email,

  • has very little to do with productivity

  • or creative output.

  • But the pattern of rich channels of communication

  • has a huge amount.

  • So I'll show you a slightly distracting thing first,

  • and then I'll tell you the real punchline here.

  • So these guys are going to do an ad campaign.

  • They're starting now where the boss sends out lots of email

  • to have lots of meetings to figure out how to do it.

  • During that time, nobody talks to customer service.

  • They deploy the thing, it's a disaster, and as a consequence,

  • they deploy it now.

  • And then they have all day meetings with customer service

  • to figure out how to fix it, OK?

  • So the real punchline, because we've

  • done some dozens of companies now,

  • is that you can see the pattern of rich channel communication,

  • and that predicts typically 30%, and sometimes 40%

  • of the variation in productivity of work groups.

  • 30% to 40% is bigger by far than anything that you look at.

  • I mean, you'd have to like kill the people to get

  • that big of an effect.

  • And the mathematical formulation of it

  • is basically a probability that if I talk to you

  • and I talk to you, what's the likelihood that you

  • two also talk to each other?

  • It's those loops.

  • And it's this learning from each other,

  • keeping people in the loop, nice little mathematical

  • relationship, that predicts this productivity.
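
That loop measure is essentially the clustering coefficient of the interaction graph: of the pairs of people who both talk to me, what fraction also talk to each other? A minimal sketch, assuming networkx and made-up interaction data:

```python
import networkx as nx

talks = [("me", "ann"), ("me", "ben"), ("me", "cam"), ("ann", "ben")]
G = nx.Graph(talks)

# Fraction of my contacts' pairs that also talk to each other:
print(nx.clustering(G, "me"))    # 1 closed pair out of 3 possible -> 0.333...
print(nx.average_clustering(G))  # group-level version, the productivity signal
```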

  • And there's another thing, so that's

  • that engagement I was talking about.

  • There's another [INAUDIBLE] exploration,

  • and that's the stuff that your boss tells

  • you is not in your job description.

  • That's going and talking to the people in sales,

  • or the janitors, or the people at the other company.

  • Just picking up new ideas, new facts, new observations,

  • and bringing them back to your work group

  • to bang against each other and see if they make

  • sense to do that social learning process, OK?

  • I wrote a paper for Harvard Business

  • Review that lays this out.

  • It's called the "New Science of Building Great Teams."

  • And it won Paper of the Year award, which is nice.

  • But it also won Paper of the Year

  • from the Academy of Management, which is the academic side.

  • And that's the first time, I believe,

  • that Harvard Business Review and the academic business guys

  • have ever agreed.

  • So maybe it's worth taking a look at.

  • Anyway, so that's companies.

  • Let's look at the real world.

  • So this is a company I helped found in 2006.

  • It was sold to AT&T's mobile advertising unit

  • recently.

  • People moving around in San Francisco.

  • Big dots are the most popular places.

  • Maybe some of you guys have seen this before.

  • I like it.

  • I show it often.

  • Looks like a nicely mixed city, but actually,

  • if you analyze it, if you cluster people by their paths

  • and by their exposure to each other, what you find

  • is you find that it's a very segregated city.

  • There's these groups of people that hardly ever

  • are exposed to each other.

  • And then there's other groups within the group,

  • they don't know each other, but they go to the same places,

  • they see the same things, and they have the same habits.

  • So in other words, they have very strong engagement

  • within the groups and they learn habits of behavior from that.

  • Now, sometimes that's good.

  • So, for instance, sometimes it's sort of trivial.

  • It's like you might discover that one group here,

  • you get a fad for red dresses.

  • No particular reason.

  • It's just what people in this group do, OK?

  • In another group, though, what you

  • find is you find that they have a different attitude

  • about paying back credit cards than maybe you do.

  • And so they don't have such good risk scores.

  • Again, it's not anything that they thought about.

  • It's just what people in their group do.

  • They learn from each other.

  • [INAUDIBLE] George just, like, threw it away, got a new one.

  • Nobody came after him.

  • It's the smart thing to do, right?

  • And then the other thing that you find,

  • which is very important, is chronic diseases

  • vary by group because of behavior.

  • Chronic diseases are mostly a function of your behavior.

  • You eat too much, you drink too much,

  • you don't exercise enough, all those things.

  • And a lot of other things we don't know.

  • But we don't know why that particular group is

  • susceptible to diabetes, but we know

  • they're very much more susceptible than other people.

  • It seems to be that they learned bad habits from each other, OK?

  • So I'm going to give a TED talk in a little bit,

  • so I thought I'd put this in.

  • So what TED does-- and it's "ideas worth spreading," right?

  • They make these wonderful videos and blast them out to everybody.

  • And what that's doing is it's increasing

  • people's exploration.

  • You got a million views of this little movie, da da da da,

  • right?

  • But it doesn't change behavior.

  • That's my prediction.

  • That's what I see in all of this data.

  • What changes behavior is all the peer

  • to peer communication that comes afterwards,

  • where you say, what do you think of that?

  • Another guy says, oh, yeah, that's awesome.

  • And then the third guy says, well, OK.

  • And you get this validation among your peer group,

  • and that's what leads you to actually change your behavior.

  • So if you like what I'm saying, if you think it's interesting,

  • talk to your peer group, [LAUGHS] right?

  • Maybe it'll change your behavior.

  • So you can actually do something interesting today,

  • which is you can use these ideas to map

  • stuff in entire cities.

  • So this is mixing of different communities in Mexico.

  • And I should start it over again.

  • So the red stuff is where the most mixing happens,

  • and the yellow stuff is where very little mixing happens.

  • And if it's blank, we don't have the data.

  • So you can see on Sundays, there's very little mixing.

  • People stay home.

  • Monday, a lot of people come out.

  • So according to the things that I've told you,

  • it's this mixing between communities

  • that's the source of a lot of the innovation

  • and creative output in a community.

  • It's the banging together of ideas

  • that causes innovation and better social outcomes.

  • And you can now do this-- and this is just cell tower

  • activity.

  • This is not any personal data at all.

  • This is just at the level of cell towers.

  • You say, well, which parts of the community

  • do the people at this cell tower come from, OK?

  • You don't know the individual people.

  • But you can use this in interesting ways.
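
One simple way to compute such a mixing map, sketched here with hypothetical data: for each tower, take the distribution of home communities of the people seen there and measure its entropy, so high entropy means heavy mixing (the red areas) and zero means none (the yellow ones).

```python
from collections import Counter
from math import log2

def mixing_entropy(home_communities):
    """Entropy (bits) of the home-community mix seen at one cell tower."""
    counts = Counter(home_communities)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

downtown = ["north", "south", "east", "west", "north", "south"]
suburb = ["north", "north", "north", "north"]
print(mixing_entropy(downtown))  # ~1.9 bits: heavy mixing
print(mixing_entropy(suburb))    # 0.0 bits: no mixing at all
```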

  • If I use that same method-- this is

  • something we did in the Ivory Coast.

  • I helped convince the carrier, Orange,

  • to release all of their cell tower data for the Ivory Coast.

  • Ivory Coast is a very poor country

  • that also had a civil war.

  • So the government can't go in the northern half

  • of the country.

  • What they do is they have poverty statistics

  • for the lower half.

  • And using this method, you can fit the statistics

  • for poverty in the lower half and extrapolate them

  • to the upper half.

  • And the poverty that you're measuring is interesting.

  • So this has to do with two factors.

  • It's this exploration outside the community and engagement

  • within the community, OK?

  • And so this MPI is a multi-factor thing.

  • And it's a combination of poverty,

  • but also life expectancy, crime, and infant mortality

  • because they all co-vary with one another.

  • So that's an example.

  • So you saw it with Mexico City.

  • You see it here.

  • This is something a former student

  • of mine, Nathan [INAUDIBLE] did.

  • He took all the data from councils in the UK.

  • So these are neighborhoods that are

  • administrative units in the UK.

  • And he looked at their socioeconomic outcome

  • index, which is, again, poverty, crime, infant mortality stuff,

  • and compared it to the land line phone records,

  • and measured two things, which are very similar to what

  • I call exploration and engagement

  • and generated this graph.

  • So when you get a community that doesn't talk to itself

  • and doesn't talk much outside of itself, all the babies die.

  • Well, not all the babies.

  • A lot of babies die.

  • When you get a community where they're richly integrating

  • into the rest of society and they talk among themselves,

  • very few babies die.

  • And, of course, this is a richer community,

  • that's a poorer community, that has

  • less crime, that has more crime.

  • So we've seen this now in several places.

  • We've seen it in England, we've seen it in Ivory Coast,

  • we've seen it in Cambridge.

  • And you can begin doing things with this.

  • So for instance, you can use this

  • to be able to predict GDP in cities.

  • So we took the GDP for 150 cities in Europe and 150 cities

  • in the US, and we measured the amount of banging together

  • of ideas that you got in rich channels of communication,

  • face to face primarily.

  • And we did that by using things like Foursquare

  • to ask, how often do people come together

  • from different communities and at what distance?

  • You can fit a nice mathematical function.

  • It's the same form in Europe as it is in the US.

  • It varies by the density of the city and the transportation

  • infrastructure.

  • So if it's a very dense setting with a really good

  • transportation infrastructure, you

  • run into a lot of different people,

  • there's a lot of ideas banging together,

  • and you get a lot of innovation.

  • And so what this is showing is a measure

  • of this face to face engagement and exploration.

  • And this is GDP per head.

  • I put per kilometer, but I think actually that one is per head.

  • And you can see that it accounts for around 90%

  • of the variance, which in social science, is like a, you know,

  • law from above or something like that.

  • In other words, if you tell me the density

  • of the city and the transportation infrastructure,

  • and actually you just need to tell me the average commute time,

  • I can tell you the GDP almost perfectly.

  • Similarly, if you tell me the mobility or the call pattern

  • in a neighborhood, I can tell you

  • the GDP, the infant mortality rate, and the crime rate.

  • Again, r squared of about 0.85 is like a law.

  • It's amazing, OK?
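
The kind of fit being described can be sketched as a log-log regression of output on the interaction measure; the city numbers below are made-up placeholders, not the study's data.

```python
import numpy as np

mixing = np.array([0.8, 1.5, 2.3, 3.1, 4.0, 5.2])        # face-to-face mixing index
gdp_per_head = np.array([21, 28, 39, 47, 58, 71]) * 1e3  # hypothetical cities

x, y = np.log(mixing), np.log(gdp_per_head)
slope, intercept = np.polyfit(x, y, 1)
pred = slope * x + intercept
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(f"elasticity={slope:.2f}, R^2={r2:.2f}")  # an R^2 near 0.9 is the claim
```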

  • And that opens up the possibility

  • of doing cool things.

  • You could, for instance, change the infrastructure

  • to make more ideas bang together, right?

  • You could make it so it's easier to get around

  • in a place like New York or San Francisco,

  • as opposed to an incredible pain in the butt, all right?

  • And so, you know, we, at MIT, have done things

  • like this shared car, shared scooter.

  • I'm on the advisory board for Nissan

  • and helping them build what we hope

  • to be the first autonomous vehicle-- commercial autonomous

  • vehicle, all right?

  • Because we actually built one.

  • I helped to build one almost 20 years ago,

  • but it was never deployed commercially.

  • And so now we're going-- the CEO says we're going to do it.

  • So that's [INAUDIBLE].

  • But the thing I think is most-- so let me back up.

  • So this is a tool we built at MIT-- this

  • is Kent Larson and Ryan Chin primarily--

  • which allows you to simulate things.

  • So this is actually the area around MIT-- Media Lab.

  • That's the Media Lab there.

  • So they're built out of LEGOs.

  • And they have a laser range finder,

  • which scans the 3D thing.

  • And then you do some computation,

  • and you project back what ought to happen.

  • So, for instance, you can project back

  • how the wind will go, or traffic patterns.

  • But you could also project back things

  • like anticipated creative output.

  • How many different ideas are going to bang together?

  • Are all the communities little silos,

  • or are they actually mixing?

  • So this is something I'm sure people who

  • plan these buildings talked about all the time,

  • but you never measure it.

  • And yet, all the data say, that's

  • the source of real creative output.

  • It's also the source of getting everybody on the same page.

  • You have to mix those two things.

  • So where we're getting to is building the interactive tools to do

  • that.

  • So another way which probably resonates more with this group

  • is this, which is trading ideas with each other electronically,

  • rather than face to face.

  • Rich channels, like face to face, are important.

  • But you can supplement them, you can extend them in various ways

  • by better data sharing.

  • And to a certain degree, that's why

  • people call personal data the new oil of the internet

  • because all those personal experiences are not

  • only good for learning peer to peer,

  • they're also good for advertising and lots

  • of other things.

  • And as I think I've shown you, it also

  • is something that worries a lot of people

  • and could be used in very bad ways.

  • And so about five years ago, I helped start a group

  • at Davos, which included people like the Chairman

  • of the Federal Trade Commission, Justice Commissioner of the EU,

  • people from the Politburo in China, CEOs of things

  • like Vodafone, Microsoft, et cetera,

  • to be able to talk about these problems.

  • And so what we were looking for was a win-win-win solution,

  • where citizens would feel protected and be

  • able to see more value from their data,

  • from trading information, where companies would

  • be able to make money and where governments would

  • be able to provide public goods, be

  • able to make a more resilient society, cyber resilience,

  • and so forth.

  • And the nice thing-- this is the sort of diagram

  • they do at Davos.

  • While people speak, some incredibly talented artist

  • draws the discussion.

  • You can't really interpret it, but it's just

  • amazing to see them do it.

  • But the bottom line is that the ideas that came out

  • of that, which are now enshrined in the Consumer Privacy

  • Bill of Rights in this country and the privacy directives

  • in the EU, and are being considered in China,

  • have to do with changing the way we treat

  • data, personal data, the sort of personal stories,

  • in a very fundamental way.

  • And that's to put much more control

  • in the hands of individuals through notification

  • that people are collecting data about you, informed consent.

  • That means you show them what the data is, you describe

  • to them what the value they're going to get from sharing is,

  • and they opt in to that particular use.

  • And then they can opt out if they don't like it.

  • Auditing, to make sure that you did what you said

  • you were going to do.

  • And then there's retraction, which I've already talked about.

  • So that's where things are going.

  • In anticipation of that, I got DARPA to give us a lot of money

  • to build an infrastructure like this, because in this country

  • they wouldn't do much except small experiments.

  • But I got [INAUDIBLE] in Italy

  • to be a living lab, to try to live in the future,

  • by giving citizens more control over their data,

  • using this infrastructure in conjunction

  • with Telecom Italia, Telefonica, the local government, and so

  • on.

  • And the experiment is to be able to say,

  • if people have a repository, a copy of all the data

  • that's about them, then the risk reward ratio is different

  • because they can opt into things,

  • and they know what they're opting into, they can opt out,

  • they could audit it.

  • You've changed the risk reward ratio.

  • Will the sharing go up?

  • Will companies make more money?

  • Will people do more sharing?

  • Will you get greater innovation through that sharing

  • of ideas and information?

  • Will government be able to do a better

  • job at providing public goods?

  • And so we've deployed this for the last year and a half.

  • It has many sort of technical elements.

  • Some of the ones that I'm proud about: we've

  • talked MITRE, which is a government defense

  • contractor, into releasing something

  • called OpenID Connect as open source.

  • It's an identity mechanism that's really quite state of the art.

  • It's now being supported by MIT Kerberos Consortium.

  • Their intent is to make it a basic element,

  • of internet security.

  • If you are interested in this stuff,

  • you should take a look at it.

  • And also trust network technologies,

  • which are a combination of law and computer code

  • that give people more control and

  • auditability over personal data.

  • An example of this, which you might be familiar with,

  • is the SWIFT network, which is for interbank transfer.

  • So the SWIFT network handles $3 trillion of money a day.

  • And as far as we know, it's never been hacked,

  • despite operating, I think, in 164 different countries

  • and with a lot of dodgy banks.

  • And it has to do with a marriage between the sharing

  • protocols and the legal contract,

  • which is a consumer contract.

  • It's not special regulations.

  • It's just contract law, with a close match between the protocol and the contract.

  • So the communications begin with the offer

  • of a contract, the signing of a contract, it's auditable,

  • there's joint liability so everybody's watching out

  • for everybody else, and it seems to work.

  • The Visa network is a similar sort of thing.

  • Some of this is used in the federal government

  • for medical data sharing.

  • And one of the things that's particular about our solution--

  • and I know this is too quick, but I'm hoping that there's

  • people that are interested in this--

  • is that like the SWIFT network, we

  • don't provide sharing of data, except in extreme cases.

  • So generally, there's no reason to share data.

  • What you want to do is you want to share answers to questions,

  • and you want to share stories.

  • You want to say, oh, yes, I'm in San Francisco,

  • not, I'm at this lat long.

  • Or I'm in San Francisco today, not lat long at 3:15 PM.

  • So by providing the most, sort of,

  • general story elements possible, you

  • get the commercial opportunities,

  • the public good opportunities, with much, much less exposure

  • in terms of raw data sharing and unintended uses.

  • It's not a perfect answer, but it's a good answer.
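
A minimal sketch of the "share answers, not data" idea; the class and method names below are invented for illustration and are not any real system's API.

```python
from datetime import datetime, timezone

class PersonalDataStore:
    """Holds raw records privately; consented queries get coarse answers."""

    def __init__(self):
        self._locations = []  # raw (timestamp, lat, lon, city) stays inside

    def record(self, lat, lon, city):
        self._locations.append((datetime.now(timezone.utc), lat, lon, city))

    def answer(self, question):
        """Answer a whitelisted question without exposing raw records."""
        if question == "current_city" and self._locations:
            return self._locations[-1][3]  # "San Francisco", not a lat/long
        raise PermissionError("question not covered by a consent contract")

pds = PersonalDataStore()
pds.record(37.7749, -122.4194, "San Francisco")
print(pds.answer("current_city"))  # -> San Francisco
```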

  • So we've deployed this in [INAUDIBLE],

  • we're deploying it at MIT.

  • We could do it other places.

  • I'm going to tell you just a little bit

  • about the basic architecture, but then I

  • see I'm running too long, right?

  • I am, yeah.

  • So I won't tell you too much.

  • There is an architecture.

  • The cool thing about this is once people have personal data

  • stores, you can spin up applications trivially.

  • There's no collecting the data to get bootstrapped.

  • The data is already there.

  • So we're doing this with Mass General Hospital.

  • We're doing it with lots of other people where, you know,

  • when you opt in, day one, it has a whole history of you

  • because you've been storing all that data.

  • Anyway, so taking too long, sorry.

  • Big data, better life.

  • That's what we're about.

  • The book describes all this in detail.

  • Thank you.

  • [CLAPPING]

  • AUDIENCE: So you show these beautiful correlations

  • between some outcome for society and the number of interactions,

  • right?

  • And I'm wondering, is there strong evidence

  • of causality there?

  • But for instance, if we just tweak how much interaction

  • is going on in a given society, would that, in and of itself,

  • escalate it?

  • SANDY PENTLAND: So we know that it's causal in small groups

  • and in groups of 100 people because we've

  • done interventions.

  • We don't know that it's causal in big groups.

  • But you can look at, for instance,

  • the architecture of lots and lots of different cities,

  • and it makes a certain amount of sense.

  • You see the same pattern.

  • Unless, of course, [INAUDIBLE] fits so well.

  • Basically, what you're talking about

  • is sort of the Jane Jacobs solution, which

  • is small communities with very good transportation

  • infrastructure between them.

  • A small community where you could walk around

  • gives you the strong engagement and culture and social support.

  • And the very good transportation infrastructure

  • lets communities interact with each other.

  • That's the way the design shakes out, basically.

  • So we think it's causal.

  • We don't know.

  • We're trying to work with cities to show that it is, all right?

  • AUDIENCE: So I work in privacy, and I liked your remarks

  • on notification and informed consent, auditing,

  • and the rest.

  • What do you think about actual automatic expiration of such--

  • SANDY PENTLAND: I think that's a great idea--

  • AUDIENCE: Would it increase the value over a long time,

  • or would it have a negative effect

  • to the value of society over a long time?

  • SANDY PENTLAND: I think it's one of these things

  • that you have to experiment with,

  • but I would expect it would increase it.

  • I mean, you know, the fundamental thing

  • is risk reward, right?

  • You want to know what's going on,

  • so you don't want to be spied on.

  • You want to have control over it.

  • And you want to be able to share an expectation of a return

  • without a lot of downside.

  • So expiration means that it's less likely to spread.

  • Auditing means that it's less likely to get stolen.

  • It still will sometimes, but what it is really

  • is it's a framework that's a lot like our financial network,

  • our banking.

  • You know, you have these strings of numbers

  • that are the amount of money that you have,

  • and you give it to this thing called a bank.

  • And then you could look at it, and the federal government

  • comes and audits them, and you could

  • take it out if you don't like it.

  • And so we're talking about that with personal data, where

  • I put it in a bank, and I say, I will give it

  • to these people in return for these sorts of things.

  • And if I don't like what you do with it, I'll take it back.

  • And then the next objection is, isn't this too complicated?

  • And yes, it is too complicated.

  • That's why we have things like mutual funds and 401(k)s

  • and junk like that: because it's just way too complicated

  • for a human.

  • But you'd have the same sort of thing with personal data.

  • The AARP would have a standard way for elderly people

  • to share data that is deemed safe.

  • AUDIENCE: Specifically what I mean

  • is that when I opt in to something, that

  • opt-in is not treated as indefinite.

  • SANDY PENTLAND: It should be absolutely.

  • The opt in should be part of the contract

  • that it expires, right?

  • AUDIENCE: Yes.

  • SANDY PENTLAND: Yeah.

  • AUDIENCE: Thank you.

  • AUDIENCE: I had a quick question about the trying to break up

  • the investment trading circles.

  • Is there a reason you chose an individual incentive

  • to try to break up the social networks,

  • or was that just the easiest way to try to break those up?

  • SANDY PENTLAND: So we tried several different things.

  • One was just giving-- first of all,

  • they're not individual incentives.

  • What it is, is saying, here's a coupon

  • if you follow that person.

  • So it's saying, build a link in the social graph.

  • It's not like you think about it more or something like that.

  • So we tried several things.

  • One was to give people random coupons.

  • So, just pay attention to a random person-- that did nothing.

  • We gave people coupons to pay attention

  • to the highest performing people.

  • That did something.

  • That raised returns by about 2%.

  • And then we took people that were targeted to break up

  • the feedback loops, and that was the thing

  • that had this much larger effect, OK?

  • But notice that it wasn't an incentive

  • for any particular person to do well, all right?

  • Some of the people we gave coupons did less well, OK?

  • But I don't really care.

  • What it did is it broke up the loops,

  • and that the average performance went up higher.

  • [CLAPPING]
