Why would you even put this thing out there when you know what a huge percentage of the world is going to do with it? I think the most likely impactful use of deepfakes is to harass individuals.

Deepfake videos simulate real people in fake situations, and they're growing at an alarming rate. We need to be more vigilant about what we trust from the internet. The vast majority of these videos are of female celebrities having their likeness swapped into sexually explicit videos without their knowledge or consent. But how far away are we from this technology being easily accessible to the general public?

So what are we looking at here?

Okay, so, right. If I smile, you can see that the model is actually moving with me. And as you can see, it's calculating the whole thing in real time. So this is basically Kim Jong Un's face overlaid onto mine.

Wow. But you're tracking my facial expressions, so I can, you know, look into the camera and smile.

This is an extreme form of deepfake because, first of all, it's real time, and second, I just need a single picture of that person.

Do you see it getting to a stage in the next couple of years, or the next six months, where I could have an app that creates, you know, this deepfake? Already in the hands of the public, where everyone can create whatever they want?

I think this is much harder, but I think it will just improve significantly, so that there will be examples of videos that will not be detected.

Hollywood actress Bella Thorne is one of the most deepfaked women in the world. One video of her crying over the death of her father has been edited into pornography.

And then I'm, like, saying something like, "But I miss him," because obviously my father is dead. And then they put that together with a girl masturbating, and this video is going around, and everyone really is thinking that it is actually me.

Earlier this year, a hacker threatened to publish topless photos of Bella.
Her response was to release the nude images herself, but she worries that deepfakes may be used to blackmail ordinary people in the future.

I don't know how we regulate apps and things like that, because it's not going to just be your favorite celebrity, or this person or that person, that you want to put in this app. You can do it to your best friend in school, if you decide you hate them so much, and send it around. You could do it to a famous politician of this and that.

Safeguarding people against the pressures of growing up online has become a point of advocacy for Bella Thorne. It's also an increasing worry and priority for people in the tech world.

What's happened over the last couple of years is the technological development that's been driven by video games. The capability to create deepfake videos and images like deep nudes is totally within the reach of somebody who just buys a reasonable gaming PC.

How much of a priority is it? I think it's actually a really big priority. The executives at these platforms have realized that the people who have been saying our tools are being used for harm are starting to win out: that they were right, and that there needs to be more investment. This is going to be another field where people are going to specialize and spend years of their career trying to figure out how to stop it.

With online abuse increasing and a rise in digital misinformation, deepfakes will only become more prevalent, more sophisticated and even harder to detect.