TensorFlow.js: Bringing Machine Learning to the Web and Beyond (Nick Kreeger & Nikhil Thorat)

  • My name is Nikhil Thorat.

  • So we're here to talk about machine learning in JavaScript with TensorFlow.js.

  • So if you're familiar at all with machine learning and data science, you'll know that most of it happens in Python.

  • So this is a poll that was run by a popular data science blog called KDnuggets.

  • And you can see that Python absolutely dominates.

  • And this is for a pretty good reason.

  • You know, there have been many, many years of tooling built in Python for data science, from NumPy to pandas to scikit-learn to TensorFlow.

  • And this field is going to continue to evolve in Python.

  • And I don't have to convince the folks in this room: JavaScript is a very, very popular language.

  • This is obviously the Octoverse survey by GitHub, and you can see that JavaScript absolutely dominates.

  • So we think that there's actually a lot that the JavaScript community, and the folks in this room, can bring to machine learning.

  • So, who here has seen this?

  • This is TensorFlow Playground.

  • Okay, a few people, but not a lot, so I definitely recommend checking this link out.

  • This is an in browser visualization of a neural network.

  • It was built by one of our colleagues at Google, and the idea here is that you can change the number of layers, neurons, the learning rate, and so forth, and immediately see how the neural network generalizes over a data set.

  • Now this thing was a huge educational success.

  • You know, it's being used in universities across the world now, and in Google's own Machine Learning Crash Course.

  • So again, I definitely recommend checking it out.

  • So we stepped back and kind of asked ourselves: why was this such a success?

  • Like, why do people care about this in-browser machine learning thing at all?

  • And we kind of distilled it down to a few points.

  • So the obvious thing is that you just click a link and you get going; if you've done any Python, you know it's a real pain in the butt to get your drivers installed, your Python libraries installed, and that kind of thing.

  • It's also super interactive.

  • You know, you have buttons, you have hyperparameters that you can play with.

  • You can immediately see how changing the knobs affects the learning of the model.

  • We didn't take advantage of this in the Playground, but in the JavaScript world, in the browser, you have cameras and microphones, and standardized access to these things.

  • But you don't really get as much of that in the Python world.

  • And, super important to us, you can actually make predictions locally, and data can stay on the client.

  • This is privacy-preserving.

  • So we took this and we launched a library called TensorFlow.js.

  • We released it last year, in March.

  • It's GPU-accelerated.

  • So we use WebGL to make everything fast; we actually do all the linear algebra in fragment shaders, and I'll talk a little bit about that in a second.

  • One of the things we do is let you make predictions with machine learning models, but we also let you train them directly in the browser, or even in Node.js, and we'll talk about that as well.

  • So when designing the library, you know, we had a couple of goals in mind.

  • One is that we wanted to empower a diverse group of JavaScript developers, like the folks in this room.

  • You know, there's a lot of really awesome people in this community, and we want to sort of marry these two worlds.

  • At the same time, we wanted the folks who are experienced in machine learning to be able to bring their work to the web.

  • Now, these goals are sometimes in conflict.

  • So we'll talk a little bit about how we resolve this.

  • Okay, so one of the principles that we had was that we wanted the library to be super easy to use, and we lean towards that over performance; at the same time, we didn't want to sacrifice any functionality for simplicity.

  • So, jumping into what that means: we decided to go with an eager-only approach.

  • I'm not gonna go into what that means, but it's a much simpler way of programming.

  • And actually, most of the machine learning world is moving towards this eager approach, versus a graph-based approach where you stitch together a computation graph and execute it later.

  • We really wanted to be easy, so we moved towards eager.
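
For a concrete sense of the eager style, here is a minimal sketch using the public TensorFlow.js API; ops execute immediately and return tensors you can inspect on the spot:

```js
import * as tf from '@tensorflow/tfjs';

// Eager execution: no graph to stitch together and run later.
const a = tf.tensor2d([[1, 2], [3, 4]]);
const b = tf.tensor2d([[5, 6], [7, 8]]);
const c = a.matMul(b); // computed right away (via WebGL when available)
c.print();             // [[19, 22], [43, 50]]
```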

  • We also provide a high-level Layers API, which encodes a set of best practices from the machine learning community.

  • So you don't have to think about all the details of your linear algebra when you're constructing a model.
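
As an illustration of the Layers API, here is a minimal sketch of a small classifier; the layer sizes are arbitrary:

```js
import * as tf from '@tensorflow/tfjs';

// Layers handle weight creation and the underlying linear algebra.
const model = tf.sequential();
model.add(tf.layers.dense({inputShape: [784], units: 64, activation: 'relu'}));
model.add(tf.layers.dense({units: 10, activation: 'softmax'}));
model.compile({
  optimizer: 'adam',
  loss: 'categoricalCrossentropy',
  metrics: ['accuracy'],
});
// model.fit(xs, ys, {epochs: 5}) would then train it on your tensors.
```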

  • And we also provide a whole repository of pre-trained models that require zero understanding of machine learning to get started.

  • And I'm gonna show you a couple of examples of those in a second.

  • And I want to highlight this: we focus on performance when and where it matters.

  • Obviously we want matrix multiplies to be fast.

  • What we did was take individual models and figure out how to make those individual models faster on a use-case basis.

  • So, as I said, we don't want to actually make the library less functional, so we support gradients.

  • This is fancy talk for the sensitivity of each of the weights, which means we can train through any operation that you use in the TensorFlow.js library.

  • We support a lot of the TensorFlow ops, about 130 of them.

  • And for any of the models that we're about to show you, you can actually dig down and get the machine learning constructs out of them if you want to.
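
As a small example of what gradient support means in practice, tf.grad turns a function of tensors into a function that computes its derivative:

```js
import * as tf from '@tensorflow/tfjs';

// f(x) = sum(x^2), so df/dx = 2x.
const f = x => x.square().sum();
const df = tf.grad(f);
df(tf.tensor1d([1, 2, 3])).print(); // [2, 4, 6]
```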

  • Okay, so quickly jumping into what the technical stack looks like.

  • At the very top of the abstraction APIs, we have our models repo.

  • So this is a set of pre-trained models.

  • I'll show you a couple of those in a second; they require very little understanding of machine learning.

  • Below that is our Layers API.

  • This is where you can construct a model.

  • You can train the model.

  • You can serialize the model for later.

  • And we'll show you some of that soon too. Below that, we also have our Core API, which is just our linear algebra kernels.

  • So these are matrix multiplies, convolutions, and their gradients, which are derivatives.

  • With all of these APIs, you can sort of poke in at any of these abstraction layers.

  • All of these sit on top of WebGL in the browser.

  • We use fragment shaders to run all our math in parallel.

  • In Node, we actually bind with N-API to the TensorFlow C++ library, and what that means is that if you use that same API for any of these things, you immediately get the hardware acceleration that TensorFlow has been working hard on, for the CPU and for GPUs with CUDA, and eventually we're going to have TPU support.

  • It was also very important for us not to silo ourselves in the JavaScript world.

  • There is a whole wealth of models trained in the Python ecosystem that we want to take advantage of.

  • So we have converter tools that let you take a Keras model or a TensorFlow SavedModel and bring it into the JavaScript world.
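
Roughly, that workflow looks like the following sketch; the file paths and input shape are illustrative:

```js
import * as tf from '@tensorflow/tfjs';

// Outside JavaScript, convert the Keras model once:
//   pip install tensorflowjs
//   tensorflowjs_converter --input_format=keras model.h5 web_model/
// Then load the converted artifacts back in the browser:
async function loadConverted() {
  const model = await tf.loadLayersModel('web_model/model.json');
  model.predict(tf.zeros([1, 224, 224, 3])).print();
}
```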

  • Okay, so let's quickly take a look at what the models repo looks like.

  • So if you check out this link, these are all our pre-trained models; they're hosted on GitHub and on npm.

  • So we host all the weights and all the JavaScript for you, and we have a wealth of models, from object recognition to human pose detection, to localization, to segmentation, to text classification.

  • And the list goes on.

  • Just go check this thing out.

  • But I want to show you one of the demos because it's fun.

  • So here we go.

  • Okay, so this model is called PoseNet.

  • Um, it's running completely in the browser, and nothing is being sent back to a server.

  • And the idea here is we take RGB images from the webcam.

  • We pass them through this pose detection model, which generates keypoints for some of my body parts.

  • And then it returns an object that we can just render on the screen.

  • And obviously it works with two people.

  • So this is a lot of fun, and we'll show you how to use one of these models in a minute.
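
For reference, using the published PoseNet package looks roughly like this sketch; `video` is assumed to be a live webcam element:

```js
import * as posenet from '@tensorflow-models/posenet';

async function detectPose(video) {
  const net = await posenet.load(); // downloads the hosted weights
  const pose = await net.estimateSinglePose(video);
  // pose.keypoints: [{part: 'nose', position: {x, y}, score}, ...]
  for (const kp of pose.keypoints) {
    console.log(kp.part, kp.position, kp.score);
  }
}
```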

  • The second model is very similar to PoseNet; it does person segmentation.

  • So, the background is a little funny here, but basically what it does is draw a mask: a one where it thinks there's a human, and a zero where it thinks there's not.

  • This one's a lot of fun, and I don't know if this is going to show well here, but one of the effects I really like is portrait mode; you can see this thing blurring the background.

  • So we have a software-based portrait mode that's running directly in the browser, pretty fast.

  • Okay, so let's go back to the slides, and I'm going to show you how that actually works and what the code looks like.

  • That model is called BodyPix.

  • It's a pre-trained person segmentation model that we've done a lot of work to make super fast.

  • It's pretty straightforward.

  • You import TensorFlow.js and BodyPix, two libraries.

  • We have them on npm, and we host them on CDNs for you.

  • We have a regular image tag.

  • That's it.

  • And this image is Frank.

  • Now, Frank is Nick's baby.

  • And he is doing a yoga pose for us.

  • So we're gonna try to figure out where Frank is in this image.

  • So first, we just load the model: we call await bodyPix.load(), and this is going to download all of our weights. These weights we host on our GCP buckets for you, so you don't have to pay for any of that.

  • And then you just call one line of code, estimatePersonSegmentation, on the image, and you get a JSON object out.

  • And inside of that JSON object is a binary mask of where it thinks the kid is. It is that simple.

  • You don't really have to understand the ML bits of this.

  • One of the other things this model gives you is parts as well.

  • So it'll tell you which pixels are face, which pixels are arms and legs, and so forth.

  • And we provide some fun utilities for drawing masks on top of those.
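
Putting the pieces just described together, here is a sketch of the BodyPix flow; the `image` and `canvas` elements are assumed to already exist on the page:

```js
import * as bodyPix from '@tensorflow-models/body-pix';

async function segmentPerson(image, canvas) {
  const net = await bodyPix.load(); // downloads the hosted weights once
  const segmentation = await net.estimatePersonSegmentation(image);
  // segmentation.data is the binary mask: 1 where a person is, 0 elsewhere.

  // One of the drawing utilities: the portrait-mode blur from the demo.
  bodyPix.drawBokehEffect(canvas, image, segmentation, 3 /* blur amount */);
}
```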

  • So you can imagine this being used for, like, a video game sprite.

  • You just jump around on screen and you immediately have a fun video game, right?

  • Okay, so I don't have to explain this to people in the room, but JavaScript runs in a ton of places, and we're working hard to get TensorFlow.js working in those places.

  • So we have the browser and Node, obviously, but we're working on Electron and React Native and WeChat.

  • So we'll talk about those in a second.

  • But I want to show you some of the cool examples that we like in these worlds.

  • So on the browser side, hopefully the links are here.

  • We have a project called Creatability.

  • This is one that's done by Google.

  • And it's a set of experiments around: can we make interacting with music and art more accessible?

  • So we're using that PoseNet model that I showed you, and we're actually able to play a synth with just our face.

  • This runs completely in the browser.

  • The link is there.

  • Go try this after the talk, please.

  • Cool.

  • So then we also have a project called Manifold, not by us, but by Uber.

  • And this project is a way to debug and understand machine learning models as they're training.

  • And they actually use TensorFlow.js just for the linear algebra.

  • So, fast matrix multiplications in the browser.

  • Airbnb is also using TensorFlow.js.

  • They ship a little model to the client.

  • So when you're about to upload a profile picture, if they see a license or a government-issued passport in that photo, they'll warn you before it's uploaded to their server, so they don't have to own that PII on the back end.

  • On the desktop and Node side, there's a project called Clinic Doctor, and Clinic Doctor is a project that monitors your Node application for CPU spikes; they actually use TensorFlow.js to disambiguate garbage collection spikes from spikes in your actual program.

  • One of my personal favorites is a project called Magenta Studio.

  • Magenta is a team at Google that does generative music and art.

  • And they actually have an Electron app that plugs directly into Ableton Live.

  • And it can generate MIDI notes on a track for you.

  • Or it can generate a drumbeat alongside, maybe, a guitar groove that you have.

  • So this is a ton of fun, and it augments an existing workflow. And, you know, JavaScript is awesome.

  • So of course we do it there.

  • Okay, so this other platform, called WeChat, is massive in China.

  • If people don't know about it, it's got a billion users, lots of mini programs, lots of developers, and they all run on JavaScript.

  • And we're working hard to get GPU acceleration stories working inside of that.

  • With that, I'm going to hand it off to Nick to talk about some other stuff.

  • Thanks, Nikhil.

  • So, as we kind of highlighted, JavaScript runs in a lot of places, and we're starting to think of areas where we can keep expanding where you can run TensorFlow.js. I want to step back and talk about our Node bindings first, before we dive into the next topic.

  • We launched these about a year ago, and the library is great because it's super fast.

  • It uses the TensorFlow C library, like Nikhil mentioned, and it's great for deploying on servers.

  • Or for doing local workflows on your desktop or workstation.
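
The upside is that the bindings keep the same API as the browser; a sketch, assuming the published @tensorflow/tfjs-node package:

```js
// Swapping the import is all it takes to run on the TensorFlow C library;
// '@tensorflow/tfjs-node-gpu' is the CUDA-backed variant.
const tf = require('@tensorflow/tfjs-node');

const x = tf.randomNormal([1000, 1000]);
x.matMul(x).mean().print(); // executed by the native backend
```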

  • But there are a few downsides to this particular library; one of them is that GPU acceleration requires NVIDIA's CUDA library.

  • It's a really fast library, but it's very large, and we at TensorFlow don't currently support macOS with it, so there's no GPU acceleration of math there.

  • And the other thing is that the Node package itself is a native module, built on N-API.

  • And it links to the TensorFlow C library, which can be really large depending on which build you're using.

  • CUDA can be around 250 megabytes, and that's just on Linux.

  • So it's a very large package to ship, and we started to think: is there something in between we could do in Node?

  • We started working really hard, and earlier this year we launched a new headless graphics stack for Node; the package is called node-gles.

  • We worked hard to build that headless graphics stack; we wanted to take it and accelerate our existing WebGL stack, all headless, in Node.

  • And this library runs via ANGLE; ANGLE is the driver we ship in Chrome today, and it translates WebGL calls to your native system graphics stack.

  • So on Windows it's Direct3D, OpenGL on Linux, and your native macOS graphics stack implementation.

  • So we think this is going to be great for desktop apps like Electron, for the mobile and embedded space, and for IoT devices.

  • Plus, this is going to bring GPU acceleration to macOS.

  • We're working hard to finish up a couple of things.

  • So we're hoping to launch later in June, or sometime this summer, and I want to show a demo of this actually running.

  • We built a really quick Electron app. So if I go ahead and just run my app: this app uses MobileNet, which is one of our out-of-the-box models that does basic image classification, so it can see an image and tell you what it is.

  • So as I pull up my app: hello. Not the most exciting UI, but it shows the GL stack that's running.

  • You can see it's running ANGLE, with an OpenGL 4.1 core profile and the latest OpenGL ES stack through ANGLE.

  • When I click Run Demo, what happens is that it goes out, fetches our model, and loads it.

  • And it predicted that that's a Labrador retriever.

  • And we're running, I'm sorry, 150 predictions on an image, and averaging about 23 milliseconds.

  • So that's very close to 30 frames a second, in real time.
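
The call driving this demo is roughly the following sketch, using the published MobileNet package; `img` can be an image, canvas, or video element:

```js
import * as mobilenet from '@tensorflow-models/mobilenet';

async function classify(img) {
  const model = await mobilenet.load();          // fetch the model once
  const predictions = await model.classify(img); // then predict repeatedly
  // e.g. [{className: 'Labrador retriever', probability: 0.92}, ...]
  console.log(predictions);
}
```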

  • So we think this is gonna be really great on the electron side.

  • It doesn't block your UI thread that's doing all the display; you're dispatching all these ML calls through the Node process, all with headless GL, and that package is like 5 to 10 megabytes.

  • It's very small, and I also want to show one other thing.

  • This is one of the latest types of IoT boards.

  • This is an NVIDIA Jetson Nano; it basically just has a big GPU stapled to the top of it, and last week we were able to get this running with this headless stack as well, running that same model.

  • A console dump isn't the most exciting demo, but we're doing around 76 milliseconds of inference time, just with a very thin ARM64 build of our Node backend.

  • Now I want to talk about another library we've been working really hard on; it's in-browser visualization for our TensorFlow.js library.

  • The package is called tfjs-vis, and you can think of it as like the Chrome DevTools for ML models.

  • We have this thing called the visor; it slides out, and it's a canvas for painting a bunch of elements that the library provides.

  • We have a bunch of built-in charts, such as loss and accuracy for ML training.

  • We also have what we call high level visualization methods.

  • This basically allows you to look at those complicated ops, like convolutions, which run a bunch of filters over your image while you're training, and see what's happening in between each of those convolutions.

  • Model evaluation utilities are another set of drawing libraries.

  • They sort of show you where your model might be over-biased to a particular class, in ways that let you see how you might alter your data set to make sure you have a nicely trained model.
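
Wiring those built-in charts into training looks roughly like this sketch; the surface name and metric names are illustrative:

```js
import * as tfvis from '@tensorflow/tfjs-vis';

// Renders live loss/accuracy charts in the visor while model.fit runs.
const callbacks = tfvis.show.fitCallbacks(
    {name: 'Training Performance', tab: 'Model'},
    ['loss', 'acc'],
);
// await model.fit(xs, ys, {epochs: 10, callbacks});
```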

  • All right, we've been talking about a lot of stuff, but we want to show you some of the things that Nikhil, myself, and the team have been thinking about for where we're going forward with the project.

  • One thing we're really excited about is the near future, with all the new specs coming to the browser, especially for JavaScript on the web.

  • There are two new standards we've been looking at really hard over the last couple of months.

  • One of them is WebGPU.

  • WebGPU is the next-generation graphics stack that's coming to the browser.

  • We've been working really hard with the Chrome team to try to get that implementation up and rolling.

  • Another one we've been looking at is WebAssembly, or WASM.

  • Now, in the ML world, we really need SIMD to make WASM a really effective accelerator for CPUs.

  • So we've been working really hard on that, again with the Chrome team, and we're hoping to have something for devices where the GPU isn't all that great.

  • We can fall back to WASM with SIMD.
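
A sketch of what that fallback could look like; the WASM backend was still in progress at the time of this talk, so the 'wasm' backend name here is an assumption:

```js
import * as tf from '@tensorflow/tfjs';

async function chooseBackend() {
  // Prefer WebGL; setBackend resolves to false if it can't initialize.
  if (!(await tf.setBackend('webgl'))) {
    await tf.setBackend('wasm'); // assumes a registered WASM backend
  }
  console.log('active backend:', tf.getBackend());
}
```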

  • One of the great parts about the ML space is the sheer amount of research; we're finding that about every year, all the state-of-the-art models get faster, from reductions in architecture or new hardware acceleration stories.

  • So every year, the models that we keep showing continue to get faster, especially on edge and browser devices.

  • Another great product we have at Google is AutoML, and AutoML solves the whole training part.

  • If you want to do an image classification problem, you can give it a set of images, upload them to the cloud, and it automatically finds the right architecture for your model.

  • And it spits out a model that you can deploy on your device.

  • We're looking at some integration with that team as well, to make it just a really seamless experience.

  • And another thing that our team has been focusing on is just optimizing our existing backends.

  • Take our WebGL implementation, for example: we worked on texture packing, which is a fancy term for using as little memory as possible in our acceleration library, and that sped up a bunch of things, including on iOS, up to 10 times faster than what we were seeing before.

  • Looking at the things we're going to launch this summer: visualization has already launched.

  • Another package that we didn't really highlight is the data library.

  • And the data library is a really easy-to-use package for getting data out of the browser.

  • Microphone data, webcam data: you don't have to worry about converting to tensors; you can just sort of stream these things straight into your model.
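
A sketch of the webcam helper in the data API; `videoElement` is an HTML video tag, and the helper name assumes the tf.data.webcam API from the 1.x data library:

```js
import * as tf from '@tensorflow/tfjs';

async function captureFrame(videoElement) {
  const webcam = await tf.data.webcam(videoElement); // wraps getUserMedia
  const frame = await webcam.capture();              // a ready-to-use tf.Tensor
  // frame can go straight into model.predict(...); dispose when done.
  frame.dispose();
}
```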

  • On the platform side, we're expanding where we run TensorFlow.js.

  • As mentioned: WeChat, the headless WebGL stuff, and then we're really starting to dive into how we can provide a nice React Native experience.

  • On the out-of-the-box model front, we're going to continue focusing on audio and text models, as well as improving the accuracy and performance of our existing offerings.

  • With that, I want to thank you for attending our talk.

  • Everything we've shown is work we do purely in open source, and all of our stuff can be found at js.tensorflow.org.

  • One of the other things we wanted to acknowledge is that, while Nikhil and myself work at Google and get to work on this project, this project would not be where it's at without the large number of open source contributors we've had.

  • And we want to extend a thank you to them for all the hard work they've done.

  • And one last plug: we're actually hiring a developer advocate for our team.

  • And if anyone is interested, please follow that link or come see us at the booth here at JSConf.

  • That's all.
