  • Mark, welcome to your first SIGGRAPH. How do you see the advances of generative AI at Meta today, and how do you apply it to either enhance your operations or introduce new capabilities that you're offering with generative AI? I think we're going to quickly move into a zone where not only is the majority of the content that you see today on Instagram just recommended to you from the kind of stuff that's out there in the world that matches your interests, whether or not you follow the people; I think in the future a lot of this stuff is going to be created with these tools too. Some of that is going to be creators using the tools to create new content, and some of it, I think, is eventually going to be content

  • that's either created on the fly for you, or pulled together and synthesized from different things that are out there. So I kind of dream of one day, like, you can almost imagine all of Facebook or Instagram being like a single AI model that has unified all these different content types and systems together, ones that actually have different objectives over different time frames, right?

  • Because some of it is just showing you, you know, what's the interesting content that you want to see today?

  • But some of it is helping you build out your network over the long term, right, people you may know or accounts. These multimodal models tend to be much better at recognizing patterns, weak signals, and such. And so, you know, it's so interesting that AI has been so deep in your company.
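
To make the "different objectives over different time frames" point concrete, here is a toy sketch in Python. This is not Meta's actual ranking stack; the candidate items, predicted probabilities, and weights are all invented, purely to illustrate blending a short-horizon engagement signal with a long-horizon network-value signal into one score.

```python
from dataclasses import dataclass

@dataclass
class CandidateScores:
    """Hypothetical per-item predictions from a unified ranking model."""
    p_engage_today: float     # short horizon: will the user enjoy this right now?
    p_long_term_value: float  # long horizon: does it help build their network/interests?

def rank_score(c: CandidateScores, w_short: float = 0.7, w_long: float = 0.3) -> float:
    """Blend objectives that live on different time frames into a single ranking score."""
    return w_short * c.p_engage_today + w_long * c.p_long_term_value

candidates = {
    "recommended_reel": CandidateScores(p_engage_today=0.82, p_long_term_value=0.10),
    "people_you_may_know_card": CandidateScores(p_engage_today=0.35, p_long_term_value=0.70),
}
ranked = sorted(candidates, key=lambda name: rank_score(candidates[name]), reverse=True)
print(ranked)  # the order depends on how the weights trade off the two horizons
```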

  • You've been building GPU infrastructure, running these large recommender systems, for a long time. Now, you were actually a little slow on getting to GPUs.

  • Yeah, I was trying to be nice, I know.

  • Well, tell everybody about Creator AI and AI Studio that's going to enable you to do that. Yeah, so this is actually something that we've talked about a bit, but we're rolling it out a lot wider today. You know, a lot of our vision is that I don't think there's just going to be, like, one AI model, right?

  • I mean, some of the other companies in the industry are building, like, one central agent. And yeah, we'll have the Meta AI assistant that you can use, but a lot of our vision is that we want to empower all the people who use our products to basically create agents for themselves. So whether that's all the many millions of creators that are on the platform, or the hundreds of millions of small businesses, we eventually want to just be able to pull in all your content and very quickly stand up a business agent that can interact with your customers and, you know, do sales and customer support and all that. The one that we're just starting to roll out more now, we call it AI Studio, and it's basically a set of tools that eventually is going to make it so that every creator can build sort of an AI version of themselves, as sort of an agent or an assistant that their community can interact with. There's kind of a fundamental issue here, where there's just not enough hours in the day, right?
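
The AI Studio tooling itself isn't spelled out in the conversation, so the snippet below is only a minimal sketch of the underlying idea: an agent that answers on a creator's behalf, grounded on their own posts, served from any OpenAI-compatible chat endpoint running a Llama-family model. The base_url, model id, and sample posts are placeholders, not a real API.

```python
from openai import OpenAI  # any OpenAI-compatible server hosting a Llama model will do

# Hypothetical grounding material: a few of the creator's own posts.
creator_posts = [
    "Just wrapped a three-week reef shoot; shot everything at 1/250s to freeze the surge.",
    "My rule for free diving: never push past 80 percent of yesterday's best time.",
]

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-for-local")

def ask_creator_agent(question: str) -> str:
    system = (
        "You are an AI assistant representing a creator. Answer in their voice, "
        "use only the posts below, and say so when you don't know.\n\n"
        + "\n".join(f"- {post}" for post in creator_posts)
    )
    response = client.chat.completions.create(
        model="meta/llama3-8b-instruct",  # whatever model the local server exposes
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_creator_agent("What shutter speed do you use underwater?"))
```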

  • It's like, if you're a creator, you want to engage more with your community, but you're constrained on time. And similarly, your community wants to engage with you, but it's tough.

  • I mean, there's just a limited amount of time to do that. So the next best thing is allowing people to basically create these artifacts, right?

  • It's sort of an agent, but you train it on your material to represent you in the way that you want. I think it's a very creative endeavor, almost like a piece of art or content that you're putting out there, and it's made very clear that it's not the creator themselves you're engaging with. But I think it'll be another interesting way, just like how creators put out content on these social systems, to be able to have agents that do that. One of the interesting use cases that we're seeing is people using these agents for support. One thing that was a little bit surprising to me is that one of the top use cases for Meta AI already is people basically using it to roleplay difficult social situations that they're going to be in. So whether it's a professional situation,

  • it's like, all right, I want to ask my manager how I get a promotion or a raise, or I'm having this fight with my friend, or I'm having this difficult situation with my girlfriend; like, how can this conversation go? And basically having a completely

  • judgment-free zone where you can basically roleplay that, see how the conversation would go, and get feedback on it. But a lot of people don't just want to interact with the same kind of agent, whether it's Meta AI or ChatGPT or whatever it is that everyone else is using; they want to create their own things. So Llama is genuinely important. We built this concept we call an AI foundry around it so that we can help everybody build. You know, a lot of people have a desire to build AI, and it's very important for them to own the AI, because once they put that into their flywheel, their data flywheel, that's how their company's institutional knowledge is encoded and embedded into an AI. They can't afford to have that AI flywheel, that data flywheel, that experience flywheel somewhere else, and open source allows them to do that. But they don't really know how to turn this whole thing into an AI, and so we created this thing called an AI foundry. We provide the tooling, we provide the expertise and the Llama technology, and we have the ability to help them turn this whole thing into an AI service. And then when we're done with that, they take it, they own it. The output of it is what we call a NIM, an NVIDIA inference microservice. They just download it, take it, and run it anywhere they like, including on-prem. And we have a whole ecosystem of partners, from OEMs that can run the NIMs to GSIs like Accenture that we've trained and work with to create Llama-based NIMs and pipelines. And now we're off helping enterprises all over the world do this.
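
As a rough sketch of the "download the NIM and run it anywhere" step: a NIM is shipped as a container that serves an OpenAI-compatible HTTP API, so once it's running you call it like any local chat endpoint. The image name, model id, and port below follow NVIDIA's published Llama 3 8B example and may differ for the NIM you actually pull.

```python
# Hypothetical launch of a downloaded NIM (run once, outside Python):
#   docker run --gpus all -p 8000:8000 -e NGC_API_KEY=$NGC_API_KEY \
#       nvcr.io/nim/meta/llama3-8b-instruct:latest
import requests

response = requests.post(
    "http://localhost:8000/v1/chat/completions",  # the microservice's OpenAI-style endpoint
    json={
        "model": "meta/llama3-8b-instruct",
        "messages": [{"role": "user", "content": "Draft a reply to a refund request."}],
    },
    timeout=60,
)
print(response.json()["choices"][0]["message"]["content"])
```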

  • I mean, it's really quite an exciting thing, and it's really all triggered off of the Llama open sourcing. The Ray-Ban Meta glasses, your vision for bringing AI into the virtual world, is really interesting; tell us about that. Yeah, well, okay, a lot to unpack in there. The Segment Anything model that you're talking about, we're actually presenting, I think, the next version of that here at SIGGRAPH, Segment Anything 2. And it now works.

  • It's faster.

  • It works with... here we go. It works in video now as well. I think these are actually cattle from my ranch in Kauai, by the way. These are, what are they called? Delicious. Delicious, Mark. There you go. Yeah, next time we do... Mark came over to my house and we made Philly cheesesteak together; next time you're bringing the... You did, I was more of a sous-chef. Fun effects will be able to be made with this, and because it'll be open, a lot of more serious applications across the industry, too. So, you know, scientists use this stuff to study things like coral reefs, natural habitats, and the evolution of landscapes. But being able to do this in video, having it be zero-shot, and being able to interact with it and tell it what you want to track, it's pretty cool research.
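
For a sense of what "zero-shot, just tell it what to track" looks like in code, here is a sketch based on the video predictor published in the facebookresearch/sam2 repository. Exact function names can shift between releases, and the checkpoint paths, frame directory, and click coordinates are placeholders.

```python
import numpy as np
import torch
from sam2.build_sam import build_sam2_video_predictor

# Placeholder config/checkpoint paths from the sam2 repo's release assets.
predictor = build_sam2_video_predictor(
    "configs/sam2.1/sam2.1_hiera_l.yaml", "checkpoints/sam2.1_hiera_large.pt"
)

with torch.inference_mode():
    # A directory of JPEG frames extracted from the video clip.
    state = predictor.init_state(video_path="ranch_clip_frames/")

    # One positive click on the animal in the first frame is the entire prompt.
    predictor.add_new_points_or_box(
        inference_state=state,
        frame_idx=0,
        obj_id=1,
        points=np.array([[420, 260]], dtype=np.float32),
        labels=np.array([1], dtype=np.int32),
    )

    # Propagate the mask through the rest of the video.
    for frame_idx, object_ids, mask_logits in predictor.propagate_in_video(state):
        masks = (mask_logits > 0.0).cpu().numpy()  # boolean mask per tracked object
```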

  • I think what you're going to end up with is just a whole series of different potential glasses products at different price points, with different levels of technology in them. So I kind of think, based on what we're seeing now with the Ray-Ban Metas,

  • I would guess that display-less AI glasses at, like, a $300 price point are going to be a really big product that tens of millions or hundreds of millions of people eventually are going to have. You're going to have super-interactive AI that you're talking to. Yeah, visual... you have the visual language understanding that you just showed. You have real-time translation.
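
The glasses do this on-device and in real time, which isn't reproduced here; the snippet below is only an offline sketch of the speech-in-one-language, text-in-another idea using open models through Hugging Face transformers. The model names and the audio file are placeholder choices.

```python
from transformers import pipeline

# Transcribe English speech, then translate the transcript to Spanish.
asr = pipeline("automatic-speech-recognition", model="openai/whisper-small")
translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-es")

def translate_clip(audio_path: str) -> str:
    text = asr(audio_path)["text"]                  # speech -> English text
    return translator(text)[0]["translation_text"]  # English text -> Spanish text

print(translate_clip("greeting.wav"))  # placeholder audio file
```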

  • You could talk to me in one language.

  • I hear it in another language. Yeah, and then the display is obviously going to be great too, but it's going to add a little bit of weight to the glasses and it's going to make them more expensive. So I think there will be a lot of people who want the kind of full holographic display, but there are also going to be a lot of people who want something that eventually is going to be really thin glasses. And so, you guys know, when Zuck talks about his data center of H100s, I think you're coming up on 600,000. And we're good customers. That's how you get the Jensen Q&A at SIGGRAPH. Ladies and gentlemen, Mark Zuckerberg. Thank you.
