  • [MUSIC PLAYING]

  • SARAH SIRAJUDDIN: I'm Sarah. I'm the engineering lead for TensorFlow Lite. And I'm really happy to be back at I/O again, talking about TensorFlow Lite again.

  • TIM DAVIS: Woo-hoo. Thank you to all of you for joining us here today. I'm Tim. I'm the product manager for TensorFlow Lite. And today we're here to tell you about doing machine learning on mobile and IoT devices.

  • SARAH SIRAJUDDIN: So I expect that most of you here are already familiar with what machine learning is, so I won't be going into that. But let's talk instead about what TensorFlow is. TensorFlow is Google's open-source, cross-platform machine learning framework. It allows you to build your models and deploy them to servers, browsers, and all the way to edge devices. It's a full end-to-end ecosystem which goes all the way from research to production, from training in the data center to deployment at the edge, like I said. TensorFlow has support for multiple languages: Swift, JavaScript, and Python.

  • TIM DAVIS: So what is TensorFlow Lite, and how does it fit in? TensorFlow Lite is TensorFlow's cross-platform framework for deploying ML on mobile devices and embedded systems. You can take your existing TensorFlow models and convert them over to TensorFlow Lite easily. But I wanted to walk you through why this is so important to TensorFlow.

  • There has been a global explosion in edge ML, driven by the need for user experiences that require low latency and closer-knit interactions. Further drivers, like poor network connectivity in many geographical regions around the world and user privacy requirements, have all fueled the need for ML on device. This has led to a whole revolution of machine learning and product innovation in nearly every single industry vertical. We see it all over the place, driven by on-device machine learning running in environments with limited compute, constrained memory, and low power consumption.

  • So that's why the TensorFlow team decided to invest heavily in making it easy to develop, build, and deploy ML that is cross-platform capable. TensorFlow Lite can be deployed on Android, iOS, Linux, and other platforms. It's easier than ever to use TensorFlow, convert your model to TensorFlow Lite, and deploy it anywhere.

  • SARAH SIRAJUDDIN: So at this point, you might be wondering what you can do with TensorFlow Lite. We really want you to be able to solve any kind of problem that you can imagine directly on the device. And later in this talk, we will be talking about the many ways developers are using TensorFlow Lite. But first, I want to show you a video of a fun demo that we built, which was featured in yesterday's developer keynote. It highlights the cutting edge of what is possible with TensorFlow Lite. The demo is called Dance Like. Let's roll the video.

  • [VIDEO PLAYBACK]

  • [MUSIC PLAYING]

  • - Dance Like enables you to learn how to dance on a mobile phone.

  • - TensorFlow can take our smartphone camera and turn it into a powerful tool for analyzing body pose.

  • - We had a team at Google that had developed an advanced model for doing pose segmentation. So we were able to take their implementation and convert it into TensorFlow Lite. Once we had it there, we could use it directly.

  • - Running all the AI and machine learning models to detect body parts is a very computationally expensive process, where we need to use the on-device GPU. The TensorFlow library made it possible for us to leverage all these resources, the compute on the device, and give a great user experience.

  • - Teaching people to dance is just the tip of the iceberg. Anything that involves movement would be a great candidate.

  • - So that means people who have skills can teach other people those skills. And AI is just this layer that interfaces between the two. When you empower people to teach people, I think that's really when you have something that is game-changing.

  • [END PLAYBACK]

  • TIM DAVIS: All right, cool. So to build Dance Like, as I talked about yesterday, we set ourselves the audacious goal of running five on-device tasks in parallel, in real time, without sacrificing performance. And I want to walk you through what they were. We're running two body-part segmentation models. We're matching the segmentation models in real time. We're running dynamic time warping. We're playing a video and encoding a video. And let me emphasize this again: this is all running on device.

  • And to show you, I'm actually going to do a live demo. I spend a lot of I/O dancing. If we just cut to the app, what you'll see is that there are a few dancers you can choose from, and you can go real time or slow mo. I'm going to go with slow mo because I'm a beginner dancer. So I can fire up some dance moves, and you can see the pose model running on me. Basically, what's happening now is it's segmenting me out from the background and identifying different parts of my body. Then, as I follow along with the dancer, a second segmentation model starts running, but this time on the dancer. So now there are two segmentation models running on the GPU. And that produces the matching score that you see up in the top right-hand corner, which gives me some feedback on how well I'm matching the dancer.

  • It's pretty cool, right? But we went further. Dancing is cool, but dancing in slow mo isn't that cool. So what we thought we would do is use dynamic time warping to sync my slow-mo moves with the real-time dancer. And what you get is an effect where the user can output this type of content, all running on device.

  • So you can come and try this in the AI Sandbox and see for yourself. You can get a video, you can share it, and it's really, really cool. And it's all because of TensorFlow Lite.

  • SARAH SIRAJUDDIN: How awesome was that? And props to Tim for agreeing to dance on stage at I/O, not once, but twice.

  • So besides dancing, what are some other use cases that developers are using TensorFlow Lite for? The major on-device use cases that we see are typically related to image and speech, so things like segmentation, object detection, image classification, or speech recognition. But we are also seeing a lot of new and emerging use cases come up in areas like content generation and text prediction.

  • TensorFlow Lite is now on more than 2 billion devices around the world, running in many different apps. Many of Google's own largest apps are using it, as are apps from many other external companies. This is a sampling of some of the apps which are using TensorFlow Lite: Google Photos, Gboard, YouTube, and the Assistant, along with apps from several global companies, like Uber and Airbnb.

  • TIM DAVIS: So TensorFlow Lite also powers ML Kit, which is our out-of-the-box solution for deploying Google's best proprietary models on device. You would have heard about this yesterday, too. We power that on the back end as well.

  • SARAH SIRAJUDDIN: So now let's move on to how you can get started with TensorFlow Lite yourself. It's fairly simple. I'm going to walk you through how you can use an off-the-shelf model, retrain a model, or use a custom model that you may have built for your own specific use case with TensorFlow Lite. And once you've done that, it's really about validating your model and optimizing its performance for latency, size, and accuracy.

  • So first, let's dive into how you can get started. As a new user, the simplest way is to download a pretrained model from our model repository on tensorflow.org. We have models for popular use cases like image classification, object detection, pose estimation, and smart reply. And this is an area where we plan to keep adding more and more models, so please check back often. The models hosted there are already in the TensorFlow Lite model format, so you can use them directly in your app.

  • Now, if you did not find a model which is a good fit for your use case, you can try retraining. This technique is also frequently called transfer learning. The idea here is that you can reuse a model that was trained for one task as the starting point for a model for a different task. The reason this is useful is that training a model from scratch can sometimes take days, but transfer learning can be done in short order.
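  • For illustration, retraining along these lines might look like the following Keras sketch, which freezes a pretrained MobileNetV2 base and trains only a new classification head. The base model choice and the five-class head are assumptions for the example, not the talk's exact recipe:

      import tensorflow as tf

      # Reuse an ImageNet-trained MobileNetV2 as a frozen feature extractor
      # and train only a small new classification head on the target data.
      base = tf.keras.applications.MobileNetV2(
          input_shape=(224, 224, 3), include_top=False, weights="imagenet")
      base.trainable = False  # keep the pretrained weights fixed

      model = tf.keras.Sequential([
          base,
          tf.keras.layers.GlobalAveragePooling2D(),
          tf.keras.layers.Dense(5, activation="softmax"),  # hypothetical 5 classes
      ])
      model.compile(optimizer="adam",
                    loss="sparse_categorical_crossentropy",
                    metrics=["accuracy"])
      # model.fit(train_images, train_labels, epochs=5)  # train on your own data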

  • Note that if you do retrain a model, you will still need to convert that retrained model into TensorFlow Lite's format before you use it in an app. And later in this talk, I will show you how you do that conversion.

  • OK, so once you have a model in TensorFlow Lite format, how do you use it in an app? First, you load the model. Then you preprocess your data into a format that your model will accept. Then you change your application code to invoke the TensorFlow Lite inference library. And finally, you use the result of the inference in your code.

  • So let's walk through some code which shows this. This is code taken from the image classifier example hosted on our website. It's in Java, written for the Android platform. You can see that the first thing we do here is load the model, and then we construct the TensorFlow Lite interpreter. We then load the image data and preprocess it. You'll notice that we're using a byte buffer; the reason we're doing that is to optimize for performance. The next step is to run inference and classify the images. And that's it. That's all you need to do to get an image classifier on Android.

  • I do want to highlight that the example I have run through is in Java, but TensorFlow Lite also has bindings for Objective-C, C++, Swift, as well as Python.
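  • For illustration, the same flow, load the model, preprocess the input, run inference, and use the result, looks roughly like this with the Python binding just mentioned. This is a minimal sketch; the model filename and the zero-filled input are stand-ins for a real image pipeline:

      import numpy as np
      import tensorflow as tf

      # Load the TensorFlow Lite model and allocate its tensors.
      interpreter = tf.lite.Interpreter(model_path="mobilenet_v1.tflite")
      interpreter.allocate_tensors()

      input_details = interpreter.get_input_details()
      output_details = interpreter.get_output_details()

      # Preprocess: build an input with the shape and dtype the model expects.
      # A real app would resize and normalize a camera frame here.
      input_shape = input_details[0]["shape"]  # e.g. [1, 224, 224, 3]
      image = np.zeros(input_shape, dtype=np.float32)

      # Run inference.
      interpreter.set_tensor(input_details[0]["index"], image)
      interpreter.invoke()

      # Use the result: per-class scores; take the top-1 label index.
      scores = interpreter.get_tensor(output_details[0]["index"])
      print("Predicted class:", int(np.argmax(scores)))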

  • So the next thing I want to move on to is how you can use your own custom model with TensorFlow Lite. The high-level steps here are that you train your model with TensorFlow, you write it out in the SavedModel format, and then you convert it into TensorFlow Lite format using TensorFlow Lite's converter. Then you make the changes in your app to use the model, like I walked you through just now.

  • So this is a code snippet showing how you can convert a SavedModel into the TensorFlow Lite model format.
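  • In outline, that conversion looks something like this, a minimal sketch assuming the tf.lite.TFLiteConverter Python API, with a placeholder SavedModel path:

      import tensorflow as tf

      # Convert a SavedModel directory into a TensorFlow Lite flatbuffer.
      converter = tf.lite.TFLiteConverter.from_saved_model("/tmp/my_saved_model")
      tflite_model = converter.convert()

      # Write the converted model to disk so an app can load it.
      with open("converted_model.tflite", "wb") as f:
          f.write(tflite_model)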