  • Peter Kafka: I'm not going to do a long wind-up here, because I have a lot of questions

  • for my next guest.

  • I'm delighted she's here.

  • Please welcome Susan Wojcicki, CEO of YouTube.

  • They gave you a good hip-hop theme for your way in.

  • Susan Wojcicki: Thank you.

  • Thank you for coming.

  • Sure.

  • Thank you for having me.

  • I'm really glad we get to have this conversation.

  • I'm glad we get to do it in public, on a stage, on the record.

  • That's great.

  • Let's start here.

  • There was a bunch of news last week.

  • Some of it involved you.

  • Some of it involved vox.com, where I work.

  • There was a policy change.

  • I think they all sort of happened at the same time.

  • Can we just walk through what happened, and if they're parallel tracks, or if they were

  • connected?

  • Sure.

  • So, first of all, thank you.

  • A lot of things happened last week, and it's great to be here and talk about what happened.

  • But I do want to start, because I know that the decision that we made was very hurtful

  • to the LGBTQ community, and that was not our intention at all.

  • Should we just set context, for anyone who was not following this?

  • What decision this was?

  • Yeah.

  • So, let me ... I'll go into that.

  • But I thought it was really important to be upfront about that, and to say that was not

  • our intention, and we were really sorry about that.

  • But, I do want to explain why we made the decision that we did, as well as give information

  • about the other launch that we had going on.

  • Really, there were two different things that happened at the same time.

  • The first one I'll talk about is, we made a really significant change involving hate

  • speech.

  • This is something we had been working on for months, and we launched it on Wednesday of

  • last week.

  • And this is a series of policy changes you've been rolling out for years now.

  • So, just to be clear ... Yeah.

  • So, we've been making lots of different policy changes on YouTube.

  • We have made about 30 changes in the last 12 months, and this past week, we made a change

  • in how we handle hate speech.

  • That took months and months of work, and we had hundreds of people working on it.

  • That was a very significant launch, and a really important one.

  • What we did with that launch is we made a couple big changes.

  • One of them was to make it so that if there's a video that alleges that some race or religion

  • or gender or other protected group is superior in some way, and uses that to justify discrimination

  • or exclusion, that would now no longer be allowed on our platform.

  • Similarly, if a video alleged that another group, another religion or race, was inferior,

  • and used that to justify discrimination in some way.

  • Those were changes that we made.

  • So, examples would be like, “Race X is superior to Y, and therefore Y should be segregated.”

  • Is it weird to you that you had to make a rule that said, “This shouldn't be allowed”?

  • That this wasn't covered either by an existing rule?

  • That you had to tell your community, “Look.

  • This is not acceptable”?

  • Well, actually, a lot of this ... We're a global company, of course.

  • And so, if you look at European law, there are a number of countries that have a really

  • strong hate speech law.

  • And so, a lot of this content had never been allowed in those countries, but had actually

  • been allowed in the US and many other countries.

  • And so, what we had done with it a few years ago is we had put it into limited features,

  • meaning that it wasn't in the recommendations.

  • It wasn't monetized.

  • It had an interstitial in front of it to say that this was content that we found offensive.

  • And when we did that, we actually reduced the views to it by 80 percent.

  • So, we found that it was effective, but we really wanted to take this additional step,

  • and we made this step on Wednesday.

  • We also added, which is really important, a few other categories to our protected groups.

  • So, we added caste, because YouTube has become so significant in India.

  • Then, we also added victims of verified violent events.

  • So, like saying the Holocaust didn't happen, or Sandy Hook didn't happen, also became

  • violations of our policies.

  • And so, this was happening on Wednesday, and we launched it on Wednesday.

  • There were thousands of sites that were affected.

  • And again, this is something that we had been working on ...

  • This was coming already.

  • It was coming already.

  • We had started briefing reporters about it in Europe over the weekend, because they're

  • ahead.

  • You know, the train had left the station.

  • And then, at the same time ... on Friday, there was a video.

  • We heard the allegations from Mr. Carlos Maza, who uploaded a video on Twitter with a compilation ...

  • Works at vox.com.

  • Who works at vox.com, yes.

  • With a compilation of different video pieces from Steven Crowder's channel, putting them

  • together, right?

  • And asked us to take action.

  • Each of these videos had harassment ...

  • Saying, “He's directing slurs at me, and the people who follow him are attacking me

  • outside of YouTube, as well.”

  • Yes.

  • So, he alleged that there was harassment associated with this, and we took a look at this.

  • You know, we tweeted back and we said, “We are looking at it.”

  • You know, Steven Crowder has a lot of videos, so it took some time for us to look at that

  • and to really understand what happened, and where these different snippets had come from

  • and see them in the context of the video.

  • Actually, one of the things I've learned, whenever people say, “There's this video

  • and it's violative.

  • Take it down or keep it up,” you have to actually see the video, because context really,

  • really matters.

  • And so, we looked through a large number of these videos, and in the end we decided that

  • it was not violative of our policies for harassment.

  • So, were you looking at this yourself, personally?

  • Vox is a relatively big site.

  • It's a big creator.

  • Were you involved in this directly?

  • I mean, I am involved whenever we make a really important decision, because I want to be looking

  • at it.

  • So, you were looking at the videos.

  • Well, so we have many, many different reviewers.

  • Mm-hmm.

  • They will do a review.

  • Again, there are lots of different videos produced by Steven Crowder.

  • He's been a longtime YouTuber.

  • But in this case, did you weigh in personally?

  • Did you look at the stuff?

  • I mean, yes.

  • I do look at the videos, and I do look at the reports and the analysis.

  • Again, I want to say there were many videos, and I looked certainly at the compilation

  • video.

  • So, when the team said, “We believe this is non-violative.

  • This doesn't violate our rules,” you agreed with that?

  • Well, let me explain to you why.

  • Mm-hmm.

  • Why we said that.

  • But you agreed?

  • I agreed that that was the right decision, and let me explain to you why I agreed that

  • was the right decision.

  • Okay?

  • So, you know, when we got ... first of all, when we look at harassment and we think about

  • harassment, there are a number of things that we look at.

  • First of all, we look at the context.

  • Of, you know, “Was this video dedicated to harassment, or was it a one-hour political

  • video that had, say, a racial slur in it?”

  • Those are very different kinds of videos.

  • One that's dedicated to harassment, and one that's an hour-long ... so, we certainly

  • looked at the context, and that's really important.

  • We also look and see, is this a public figure?

  • And then the third thing that we look at is, you know, is it malicious?

  • Right?

  • So, is it malicious with the intent to harass?

  • And, for right or for wrong, right now malicious is a high bar for us.

  • So the challenge is, like when we get an allegation like this, and we take it incredibly seriously,

  • and I can tell you lots of people looked at it and weighed in.

  • We need to enforce those policies consistently.

  • Because if we were not to enforce it consistently, what would happen is there would be literally

  • millions of other people saying, “Well, what about this video?

  • What about this video?

  • What about this video?

  • And why aren't all of these videos coming down?”

  • And if you look at the content on the internet, and you look at rap songs, you look at late-night

  • talk shows, you look at a lot of humor, you can find a lot of racial slurs that are in

  • there, or sexist comments.

  • And if we were to take down every single one, that would be a very significant ...

  • So, to stipulate that you take it seriously.

  • I want to come back to the idea that there's a ton of this stuff here.

  • Well, so what we did commit to, and really, this is I think really important, is we

  • committed, like, “We will take a look at this, and we will work to change the policies

  • here.”

  • We want to be able to ... when we change a policy, we don't want to be knee-jerk.

  • We don't want it to be like, “Hey, I don't like this video,” or, “This video is offensive.

  • Take it down.”

  • We need to have consistent policies.

  • They need to be enforced in a consistent way.

  • We have thousands of reviewers across the globe.

  • We need to make sure that we're providing consistency.

  • So, your team spends a bunch of time working on it.

  • They come to you at some point and they say, “We don't think this is violative.”

  • You say, “We agree.”

  • You announce that.

  • And then a day later you say, “Actually, we do have problems with this.”

  • Well, so what ... Okay.

  • So, we did announce it, and when we announced it, if you look carefully at the tweet, what

  • we actually said at the end is, “We're looking at other avenues.”

  • Mm-hmm.

  • That's because we actually have two separate processes.

  • One of which is like, “Is this content violative,” from just the purely community guidelines.

  • But then we also have monetization guidelines, and that's because we have a higher standard

  • for monetization.

  • We're doing business with this partner.

  • Our advertisers also have a certain expectation of what type of content they are running on.

  • And so, we had the first review.

  • We said, “It doesn't violate the community guidelines on harassment, but we'll take

  • a look at our harassment guidelines and commit to updating that.”

  • Which actually had been on our plan anyway.

  • I had actually put that in my creator letter that I had just done a few weeks ago, saying

  • we were going to take a hard look at it.

  • But we had been working so hard on the hate speech, and so our teams were caught up on

  • that.

  • But that really had been next on our list.

  • So, we have a higher standard for monetization, so then we did announce the monetization change.

  • That Steven Crowder's monetization was suspended.

  • So, was that in reaction to people reacting to you not reacting?

  • No.

  • Or was that something that you were already planning to do and just hadn't gotten around

  • to announcing?

  • No.

  • We were in the process of looking at that, and there were ... when we look at these accounts,

  • there are many different components that we look at, and that's actually why we put

  • the line, “There are other avenues that we're still looking at.”

  • And that might have been too subtle.

  • If I were to do it again, I would put it all into one ...

  • Do it in one go.

  • Yeah, I would do it all in one go.

  • But we were also ...

  • So you said, “We're not kicking you off, but we're not going to help you make money

  • on YouTube.

  • At least directly, through ads.”

  • We're suspending monetization.

  • Meaning, “We're not going to run ads against your stuff.

  • If you still want to sell racist coffee mugs or whatever you're selling, that's your

  • business, but we're not going to help you.

  • We're not going to put an ad in front of your stuff.”

  • Well, we said we're not going to put an ad in front of it, but the conditions by which

  • we will turn it on can be broader than just that.

  • So, for example, if they're selling merchandise and linking off of YouTube, and that is seen

  • as racist or causing other problems, that's something that we will discuss with the creator.

  • So, one more question specific to this.

  • Because again, we're putting advertising there, so we need to make sure that the advertisers

  • are going to be okay with it, and we have a higher standard.

  • And so, we can sort of look at all different parts of that creator and what they're doing,

  • and basically apply that higher standard there.

  • So, people I work with at Vox and other people are saying the one problem we've got with

  • all this, in addition to what seems like a back and forth, is that we don't understand

  • why you made the decision you made.

  • There's not enough transparency.

  • We can't figure out what rules he did or didn't break.

  • And also, by the way, it seems clear that he did break these rules.

  • But they're asking for transparency, they're asking for more understanding of what went

  • on here in this specific case.

  • Is that something that's reasonable for someone to expect out of you and out of YouTube?

  • To say, “Here's exactly what happened.

  • Here's exactly what broke the rule for us.

  • Here's exactly why we're demonetizing it”?

  • Which case are you talking about?

  • Well, in the case of the Crowder/Maza stuff.

  • But for anything, right?

  • So, we tried to be really transparent.

  • We communicated numerous times, including publishing a blog explaining some of the rationale

  • for our decision.

  • We try to be really transparent with our community, with our guidelines.

  • We get that request actually a lot from our creators, because they want to know what's

  • allowed on our platform, what's not allowed, from a monetization standpoint.

  • And so, we do