If you remember that first decade of the web, it was really a static place. You could go online, you could look at pages, and they were put up either by organizations who had teams to do it or by individuals who were really tech-savvy for the time. And with the rise of social media and social networks in the early 2000s, the web was completely changed to a place where now the vast majority of content we interact with is put up by average users, either in YouTube videos or blog posts or product reviews or social media postings. And it's also become a much more interactive place, where people are interacting with others, they're commenting, they're sharing, they're not just reading.

So Facebook is not the only place you can do this, but it's the biggest, and it serves to illustrate the numbers. Facebook has 1.2 billion users per month. So half the Earth's Internet population is using Facebook. They are a site, along with others, that has allowed people to create an online persona with very little technical skill, and people responded by putting huge amounts of personal data online. So the result is that we have behavioral, preference, demographic data for hundreds of millions of people, which is unprecedented in history.

And as a computer scientist, what this means is that I've been able to build models that can predict all sorts of hidden attributes for all of you that you don't even know you're sharing information about. As scientists, we use that to help the way people interact online, but there are less altruistic applications, and there's a problem in that users don't really understand these techniques and how they work, and even if they did, they don't have a lot of control over it. So what I want to talk to you about today is some of these things that we're able to do, and then give us some ideas of how we might go forward to move some control back into the hands of users.

So this is Target, the company. I didn't just put that logo on this poor, pregnant woman's belly. You may have seen this anecdote that was printed in Forbes magazine, where Target sent a flyer to this 15-year-old girl with advertisements and coupons for baby bottles and diapers and cribs two weeks before she told her parents that she was pregnant. Yeah, the dad was really upset. He said, "How did Target figure out that this high school girl was pregnant before she told her parents?"

It turns out that they have the purchase history for hundreds of thousands of customers, and they compute what they call a pregnancy score, which is not just whether or not a woman's pregnant, but what her due date is. And they compute that not by looking at the obvious things, like, she's buying a crib or baby clothes, but things like, she bought more vitamins than she normally had, or she bought a handbag that's big enough to hold diapers. And by themselves, those purchases don't seem like they might reveal a lot, but it's a pattern of behavior that, when you take it in the context of thousands of other people, starts to actually reveal some insights.
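
The mechanics behind a score like this are not public, but the shape of it is standard supervised learning. Here is a minimal sketch under that assumption, with made-up purchase features, using scikit-learn; it illustrates the idea, not Target's actual model.

```python
# Minimal sketch of a "pregnancy score"-style model (hypothetical features,
# not the retailer's actual method): train a classifier on past customers
# whose outcome became known, then score current shoppers.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one customer; columns are subtle behavioral signals such as
# "extra vitamins bought", "unscented lotion bought", "oversized handbag bought".
X_train = np.array([
    [3, 1, 1],
    [0, 0, 0],
    [2, 1, 0],
    [0, 1, 0],
])
y_train = np.array([1, 0, 1, 0])  # 1 = later turned out to be pregnant

model = LogisticRegression()
model.fit(X_train, y_train)

# The "score" for a new shopper is a probability, not a certainty; a due-date
# estimate would be a separate regression on the same kind of features.
new_shopper = np.array([[2, 1, 1]])
print(model.predict_proba(new_shopper)[0, 1])
```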

So that's the kind of thing that we do when we're predicting stuff about you on social media. We're looking for little patterns of behavior that, when you detect them among millions of people, let us find out all kinds of things. So in my lab and with colleagues, we've developed mechanisms where we can quite accurately predict things like your political preference, your personality score, gender, sexual orientation, religion, age, intelligence, along with things like how much you trust the people you know and how strong those relationships are. We can do all of this really well. And again, it doesn't come from what you might think of as obvious information.

So my favorite example is from this study that was published this year in the Proceedings of the National Academies. If you Google this, you'll find it. It's four pages, easy to read. And they looked at just people's Facebook likes, so just the things you like on Facebook, and used that to predict all these attributes, along with some other ones. And in their paper they listed the five likes that were most indicative of high intelligence. And among those was liking a page for curly fries. Curly fries are delicious, but liking them does not necessarily mean that you're smarter than the average person. So how is it that one of the strongest indicators of your intelligence is liking this page when the content is totally irrelevant to the attribute that's being predicted?
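
The paper's pipeline is essentially a big user-by-like matrix, reduced to a lower dimension and fed to a simple regression model. Here is a sketch of that shape with random stand-in data; the component count and classifier choice are assumptions for illustration, not the paper's exact settings.

```python
# Sketch of like-based attribute prediction: compress a sparse user-by-like
# matrix, then fit a plain classifier on the compressed representation.
# The data below is random; only the pipeline structure is the point.
import numpy as np
from scipy.sparse import csr_matrix
from sklearn.decomposition import TruncatedSVD
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_users, n_pages = 1000, 500
likes = csr_matrix((rng.random((n_users, n_pages)) < 0.05).astype(np.float64))  # 1 = liked
trait = rng.integers(0, 2, n_users)  # stand-in label, e.g. high/low intelligence

model = make_pipeline(TruncatedSVD(n_components=50),
                      LogisticRegression(max_iter=1000))
model.fit(likes, trait)

# For a new user, a handful of likes is enough input to produce a probability.
print(model.predict_proba(likes[:1])[0])
```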

And it turns out that we have to look at a whole bunch of underlying theories to see why we're able to do this. One of them is a sociological theory called homophily, which basically says people are friends with people like them. So if you're smart, you tend to be friends with smart people, and if you're young, you tend to be friends with young people, and this is well established for hundreds of years. We also know a lot about how information spreads through networks. It turns out things like viral videos or Facebook likes or other information spreads in exactly the same way that diseases spread through social networks. So this is something we've studied for a long time. We have good models of it. And so you can put those things together and start seeing why things like this happen.

So if I were to give you a hypothesis, it would be that a smart guy started this page, or maybe one of the first people who liked it would have scored high on that test. And they liked it, and their friends saw it, and by homophily, we know that he probably had smart friends, and so it spread to them, and some of them liked it, and they had smart friends, and so it spread to them, and so it propagated through the network to kind of a host of smart people, so that by the end, the action of liking the curly fries page is indicative of high intelligence, not because of the content, but because the actual action of liking reflects back the common attributes of other people who have done it.
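
A toy simulation of exactly that story: a homophilous network, a like seeded at one high-trait person, and a simple contagion-style spread. All of the probabilities below are arbitrary assumptions; the point is that the set of likers ends up disproportionately high-trait even though the page content is irrelevant.

```python
# Toy simulation: homophily plus contagion makes an arbitrary like correlate
# with a hidden trait. Pure standard library; all parameters are made up.
import random

random.seed(1)
N = 2000
trait = [random.random() < 0.5 for _ in range(N)]  # e.g. "scores high on that test"

# Homophily: most friendships connect people who share the trait value.
neighbors = {i: [] for i in range(N)}
for _ in range(6 * N):
    a, b = random.randrange(N), random.randrange(N)
    if a != b and (trait[a] == trait[b] or random.random() < 0.2):
        neighbors[a].append(b)
        neighbors[b].append(a)

# Contagion, like a simple disease model: seed the like at one high-trait user,
# then each exposed friend likes the page with some probability per round.
liked = set()
frontier = [next(i for i in range(N) if trait[i])]
liked.update(frontier)
for _ in range(4):
    frontier = [f for u in frontier for f in neighbors[u]
                if f not in liked and random.random() < 0.3]
    liked.update(frontier)

rate_high = sum(trait[i] for i in liked) / len(liked)
print(f"{len(liked)} users liked the page; {rate_high:.0%} of them are high-trait")
```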

So this is pretty complicated stuff, right? It's a hard thing to sit down and explain to an average user, and even if you do, what can the average user do about it? How do you know that you've liked something that indicates a trait for you that's totally irrelevant to the content of what you've liked? There's a lot of power that users don't have to control how this data is used. And I see that as a real problem going forward. So I think there's a couple paths that we want to look at if we want to give users some control over how this data is used, because it's not always going to be used for their benefit.

An example I often give is that, if I ever get bored being a professor, I'm going to go start a company that predicts all of these attributes and things like how well you work in teams and if you're a drug user, if you're an alcoholic. We know how to predict all that. And I'm going to sell reports to H.R. companies and big businesses that want to hire you. We totally can do that now. I could start that business tomorrow, and you would have absolutely no control over me using your data like that. That seems to me to be a problem.

So one of the paths we can go down is the policy and law path. And in some respects, I think that that would be most effective, but the problem is we'd actually have to do it. Observing our political process in action makes me think it's highly unlikely that we're going to get a bunch of representatives to sit down, learn about this, and then enact sweeping changes to intellectual property law in the U.S., so users control their data.

We could go the policy route, where social media companies say, "You know what? You own your data. You have total control over how it's used." The problem is that the revenue models for most social media companies rely on sharing or exploiting users' data in some way. It's sometimes said of Facebook that the users aren't the customer, they're the product. And so how do you get a company to cede control of their main asset back to the users? It's possible, but I don't think it's something that we're going to see change quickly.

So I think the other path that we can go down that's going to be more effective is one of more science. It's doing science that allowed us to develop all these mechanisms for computing this personal data in the first place. And it's actually very similar research that we'd have to do if we want to develop mechanisms that can say to a user, "Here's the risk of that action you just took." By liking that Facebook page, or by sharing this piece of personal information, you've now improved my ability to predict whether or not you're using drugs or whether or not you get along well in the workplace. And that, I think, can affect whether or not people want to share something, keep it private, or just keep it offline altogether.
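
One way a warning like that could be built, sketched under the assumption that you already have a trained classifier over a user's vector of likes (as in the earlier sketches): score the profile before and after the new like and report the difference. The function name and interface here are hypothetical.

```python
# Hypothetical "here's the risk of that action" feedback: measure how much one
# additional like moves a sensitive prediction for this user.
import numpy as np

def risk_of_like(model, like_vector, page_index, label="drug use"):
    """Describe how liking one more page shifts a trained model's prediction."""
    before = model.predict_proba(like_vector.reshape(1, -1))[0, 1]
    after_vector = like_vector.copy()
    after_vector[page_index] = 1
    after = model.predict_proba(after_vector.reshape(1, -1))[0, 1]
    return (f"Liking this page moves the predicted probability of {label} "
            f"from {before:.2f} to {after:.2f}")

# Usage, with any scikit-learn-style classifier trained on like vectors:
# print(risk_of_like(model, user_likes, page_index=42))
```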

We can also look at things like allowing people to encrypt data that they upload, so it's kind of invisible and worthless to sites like Facebook or to third-party services that access it, but the selected people whom the poster wants to see it still have access to it. This is all super exciting research from an intellectual perspective, and so scientists are going to be willing to do it. So that gives us an advantage over the law side.
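
A minimal sketch of what client-side encryption before upload could look like, using symmetric encryption from the Python cryptography package; how the key reaches the selected readers is the hard part and is waved away here. This is an illustration, not any platform's real feature.

```python
# Encrypt a post before it ever reaches the platform: the site stores only
# ciphertext, and only friends who were given the key can read the content.
from cryptography.fernet import Fernet

key = Fernet.generate_key()           # kept by the poster, shared only with chosen friends
cipher = Fernet(key)

post = "A personal update I only want a few friends to read".encode()
uploaded_blob = cipher.encrypt(post)  # this opaque blob is all the platform ever sees

# A friend who holds `key` recovers the post on their own device.
print(Fernet(key).decrypt(uploaded_blob).decode())
```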

One of the problems that people bring up when I talk about this is, they say, "You know, if people start keeping all this data private, all those methods that you've been developing to predict their traits are going to fail." And I say, "Absolutely!", and for me, that's success, because as a scientist, my goal is not to infer information about users, it's to improve the way people interact online. And sometimes that involves inferring things about them, but if users don't want me to use that data, I think they should have the right to do that. I want users to be informed and consenting users of the tools that we develop.

And so I think encouraging this kind of science and supporting researchers who want to cede some of that control back to users and away from the social media companies means that going forward, as these tools evolve and advance, we're going to have an educated and empowered user base, and I think all of us can agree that that's a pretty ideal way to go forward.

Thank you.
