[MUSIC PLAYING]

JASON MAYES: Hello, everyone. My name is Jason Mayes. I'm a developer advocate within the TensorFlow org here at Google. And today, I've got Gonzalo from the TensorFlow Enterprise team.

GONZALO GASCA MEZA: Hi, Jason. How are you?

JASON MAYES: Very good, thanks. So what's new in TensorFlow Enterprise?

GONZALO GASCA MEZA: TensorFlow Enterprise is a seamless, scalable, and supported distribution of TensorFlow, which is available in a variety of Google Cloud AI products. Today, TensorFlow Enterprise provides users with an optimized version of TensorFlow, which also includes long-term version support.

JASON MAYES: Awesome.

GONZALO GASCA MEZA: TensorFlow Enterprise contains custom-built TensorFlow-related packages, such as TensorFlow Datasets, TensorFlow I/O, TensorFlow Probability, TensorFlow Hub, and TensorFlow Estimator. Each TensorFlow Enterprise distribution is anchored to a particular version of TensorFlow, and all of the packages are included in the open source version. TensorFlow Enterprise is available in different Google Cloud AI products, such as Deep Learning Containers, AI Platform Notebooks, and the Deep Learning VM Image. TensorFlow Enterprise also provides optimizations when used with other Google Cloud services, such as Google Cloud Storage and BigQuery.

JASON MAYES: So what demos do you have in store for us today?

GONZALO GASCA MEZA: Today, we will be installing a Deep Learning VM Image. We're going to do it from the UI first, and then we also have the option to deploy a VM image from the CLI. The TensorFlow Enterprise distribution, when used with GCP, provides security fixes and selective bug patches for a period of three years. Users of open source TensorFlow receive only one year of security fixes; in contrast, TensorFlow Enterprise provides three years. Let me give you an example. For TensorFlow 1.15, the open source version will release security patches for its point releases for one year. In contrast, TensorFlow Enterprise will give you three years of security fixes and bug patches, and all of those will be available in the GitHub repository as open source.

JASON MAYES: It's very good to have that extra support there, right? [LAUGHTER] Good stuff.

GONZALO GASCA MEZA: And TensorFlow Enterprise is not a fork. While it's available in Google Cloud products, all the code is available in the TensorFlow open source GitHub repository.

JASON MAYES: Awesome.

GONZALO GASCA MEZA: And we also provide white glove service.

JASON MAYES: Brilliant.

GONZALO GASCA MEZA: So why don't I show you how to get started?

JASON MAYES: Definitely, let's see how we get started with this.

GONZALO GASCA MEZA: I'm going to do a quick-start demo. We will create a Deep Learning VM image in the Google Cloud console. We're also going to launch an AI Platform Notebooks instance and run a notebook downloaded from the TensorFlow website, so you can see how TensorFlow Enterprise is compatible with existing TensorFlow and its associated packages. In this case, we're using a transfer learning notebook, so we'll execute it. And finally, we will deploy a Deep Learning Container.

JASON MAYES: Excellent. I know a lot of people are interested in transfer learning these days, so this is very exciting.

GONZALO GASCA MEZA: Yes.

JASON MAYES: So tell me more.

GONZALO GASCA MEZA: If we go to the Google Cloud console, we're going to create a Deep Learning VM image. You go to Compute Engine, VM Instances, and click on Create. Then you go to the Marketplace and look for Deep Learning VM. Click on the first option and select Launch. In the Deep Learning VM, you can select whether you want a GPU or a CPU, and you can also select the TensorFlow Enterprise version. Today, we support TensorFlow 1.15 and 2.1. In this case, we're going to create a TensorFlow 1.15 version without GPUs, just a very simple image, and click Deploy.

JASON MAYES: Lovely. And then we wait for that to fire up, I guess. [LAUGHS] Good stuff.

GONZALO GASCA MEZA: There's a different way to create a TensorFlow Enterprise Deep Learning VM image, and that's via the command line.

JASON MAYES: Of course.

GONZALO GASCA MEZA: We created the CPU-only version. But what if we want to create a virtual machine with a GPU and a more recent version of TensorFlow? Let me show you how to do it. I'm using the gcloud command line here, and the only thing that you need to define is the image family. In this case, we're going to use the TensorFlow 2 image family with the latest GPU version. We want one GPU, so we define the accelerator type, in this case a Tesla T4. We also want to install the NVIDIA driver automatically; this installs the latest version of the NVIDIA driver. And if you want to reduce the cost, you can add the preemptible flag, which lets you create the instance at a lower cost. Then just press Enter.
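For reference, a command along these lines is what is being described here. This is a minimal sketch rather than the exact command from the video: the instance name and zone are placeholders, and the image family and accelerator names should be checked against the current Deep Learning VM documentation.

```bash
# Create a Deep Learning VM with a TensorFlow Enterprise 2.x image,
# one NVIDIA T4 GPU, automatic driver installation, and preemptible pricing.
# "my-tf-enterprise-vm" and the zone are placeholders.
gcloud compute instances create my-tf-enterprise-vm \
  --zone=us-west1-b \
  --image-family=tf2-latest-gpu \
  --image-project=deeplearning-platform-release \
  --maintenance-policy=TERMINATE \
  --accelerator="type=nvidia-tesla-t4,count=1" \
  --metadata="install-nvidia-driver=True" \
  --preemptible
```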
GONZALO GASCA MEZA: So we created a Deep Learning VM image from the UI, and we also had the option to do it from the CLI. We created TensorFlow Enterprise 1.15 from the UI, and a new instance from the CLI with a GPU and the latest version of TensorFlow, which is 2.1.

JASON MAYES: Awesome. It's great to have so many options to do the same thing, depending on what you prefer.

GONZALO GASCA MEZA: Now we're going to create a new AI Platform Notebooks instance with the latest version of TensorFlow Enterprise.

JASON MAYES: Excellent. Let's see that.

GONZALO GASCA MEZA: For that, you go to the AI Platform menu, select Notebooks, and create a new instance. You can use the default options, which is TensorFlow 2.1 with one NVIDIA Tesla K80, or you can customize the instance. In this case, I'm just going to go with the default one. I'm going to select the option to install the NVIDIA GPU driver automatically for me, which saves me a lot of time, and click Create. While the instance is being created, you can see that it has the NVIDIA Tesla K80, and you also have the option, if you want to change it in the future, to switch to a different GPU, like a V100 or a T4. Once the instance is available, you will see the Open JupyterLab link enabled; this takes a few seconds. Now that Open JupyterLab is enabled, you can just click on it, and you will see the JupyterLab interface. In this case, we're going to download a Jupyter notebook from the TensorFlow website and upload it here. We import TensorFlow as tf, and you can see it reports the same information as regular TensorFlow. Here, we have TensorFlow 2.1. So let's go to the notebook. This is a transfer learning notebook for image recognition, which uses tf.keras and TensorFlow Hub. You don't need to make any modifications to it; it will just run as-is. So I'm just going to run all cells. This notebook downloads some images from a web server and then uses TF Hub for transfer learning. This is an image that we're going to try to recognize. The example here uses the flowers dataset. We're using a TF Hub module, which helps us improve the quality of our results. Then we train the model. The model training is actually happening right now; you can watch the accuracy improve as it trains. We're running for two epochs, just for the sake of this demo, and then we will plot the results. Now we can see how the loss decreases and the accuracy increases over time. We run some predictions--

JASON MAYES: And it recognized the flowers.

GONZALO GASCA MEZA: --and we can see some of the results here. And this is all without any modification.
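For readers who want to see roughly what such a notebook contains, here is a minimal sketch in the shape of the public tf.keras plus TensorFlow Hub flowers tutorial. The TF Hub module URL, image size, and other parameters are illustrative assumptions taken from that tutorial, not necessarily the ones used in the video.

```python
import tensorflow as tf
import tensorflow_hub as hub

# Download the flowers dataset (a few thousand labeled images, five classes).
data_root = tf.keras.utils.get_file(
    "flower_photos",
    "https://storage.googleapis.com/download.tensorflow.org/example_images/flower_photos.tgz",
    untar=True)

IMAGE_SHAPE = (224, 224)

# Read the images from disk, rescaled to [0, 1].
image_generator = tf.keras.preprocessing.image.ImageDataGenerator(rescale=1 / 255)
image_data = image_generator.flow_from_directory(
    str(data_root), target_size=IMAGE_SHAPE)

# Reuse a pre-trained feature extractor from TF Hub; only the new
# classification head is trained (transfer learning).
feature_extractor = hub.KerasLayer(
    "https://tfhub.dev/google/tf2-preview/mobilenet_v2/feature_vector/4",
    input_shape=IMAGE_SHAPE + (3,),
    trainable=False)

model = tf.keras.Sequential([
    feature_extractor,
    tf.keras.layers.Dense(image_data.num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["acc"])

# Two epochs, just for the sake of the demo.
steps_per_epoch = image_data.samples // image_data.batch_size
model.fit(image_data, epochs=2, steps_per_epoch=steps_per_epoch)
```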
GONZALO GASCA MEZA: The last product that we have available for TensorFlow Enterprise is Deep Learning Containers. Deep Learning Containers provides Docker containers that come with TensorFlow preinstalled, and if you want to use TensorFlow with a GPU, the GPU drivers are essentially included as well.

JASON MAYES: Excellent.

GONZALO GASCA MEZA: Let me show you how to deploy one. This is my local environment; there's no Docker container running right now. You just need to run a single command: docker run, forwarding port 8080, and I'm using the TensorFlow 2 CPU version. Because the container also runs JupyterLab, I will be able to use JupyterLab on my local computer, very similar to an AI Platform Notebooks instance.
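A command along these lines launches the container being described. This is a sketch: the image tag shown (a TensorFlow 2 CPU image from the Deep Learning Containers registry) may differ from the one in the video, so check the Deep Learning Containers documentation for the current image names.

```bash
# Run a TensorFlow 2 (CPU) Deep Learning Container locally and expose
# its JupyterLab server on port 8080.
docker run -d -p 8080:8080 \
  gcr.io/deeplearning-platform-release/tf2-cpu.2-1
# Then open http://localhost:8080 to reach JupyterLab.
```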
GONZALO GASCA MEZA: With Deep Learning Containers, you also have the option to deploy containers with TensorFlow Enterprise to other products, such as Google Kubernetes Engine, for example. So let's take a look. I'm going to connect to localhost, and now you can see I'm in. Jason, you can see how easy it is to get started with TensorFlow Enterprise on Google Cloud.

JASON MAYES: Definitely. Thank you very much for the demo.

GONZALO GASCA MEZA: Thanks.

JASON MAYES: And where can I learn more?

GONZALO GASCA MEZA: You can go to the Google Cloud website and look for TensorFlow Enterprise. And if you already have an account, you can get started by going to the Google Cloud console and following the steps we just did.

JASON MAYES: Perfect. I shall check that out.

[MUSIC PLAYING]