
  • [MUSIC PLAYING]

  • BILL LUAN: Good afternoon, everyone.

  • Welcome to the Google I/O session of Introducing Google

  • Coral.

  • My name is Bill Luan.

  • I'm a program manager at Google's Coral team.

  • On behalf of all my team members,

  • warm welcome to all of you and many

  • of you watching online around the world.

  • Thank you for joining us.

  • Before I start to talk about the Coral product,

  • let's take a quick look at the industry

  • trend, which demonstrates to us why

  • Coral will be important to us.

  • Smart devices and the so-called IoT--

  • the internet of things--

  • have grown tremendously over the past several years.

  • And they represent one of the biggest growth opportunities

  • in the years to come.

  • Per many industry trend forecasts on the internet,

  • such as this one, the growth of non-IoT devices--

  • PCs, laptops, mobile phones, tablets, and so on--

  • will go from about 11 billion installed units globally

  • to around 12 billion over the next five to six years,

  • representing about a 14% growth rate.

  • IoT smart devices, however, will grow

  • from the current installed base of about 8 billion units

  • worldwide to 21 billion in the same period,

  • representing a much larger growth rate of more than 150%.

  • So this really tells us where the growth

  • and the opportunities for innovation lie ahead

  • for developers around the world.

  • Namely, they are in smart devices.

  • User interest in smart devices continues to grow,

  • and innovation and development around them

  • will continue to grow as well.

  • There are several key factors which

  • will drive this trend forward.

  • Number one, due to the increasing interest

  • around the world in AI and machine learning,

  • the advancement of key research in AI

  • has continued to grow over the last several years.

  • In fact, the number of research papers

  • published in machine learning in the last few years

  • is more than the total number of such papers

  • in the last decade combined.

  • More AI capability makes the application

  • of machine learning on devices more practical and feasible.

  • As machine learning models become more accurate,

  • faster, and better performing,

  • industries will have an interest in continuing to use them.

  • So as more devices at the edge require machine learning,

  • we really need a solution

  • to bring machine learning onto the device at the edge.

  • We need a technology, especially hardware,

  • to bring machine learning acceleration

  • right onto the device.

  • In a nutshell, the growth of smart devices

  • demands bringing machine learning to the device at the edge.

  • And we need a solution for that.

  • So let me introduce what Google has made for you--

  • for developers around the world--

  • to answer this industry demand and make that possible.

  • Introducing Coral from Google.

  • It is an exciting new technology platform

  • to enable developers all around the world

  • to build on-device, hardware-accelerated AI

  • applications with ease and with high efficiency.

  • In this introduction session, you will learn,

  • first, what the Coral product line offers;

  • second, what machine learning capabilities

  • you can build with the Coral platform and technology;

  • and third, what the use cases are for deploying

  • machine learning applications across industries.

  • So I'm going to go through all of these in this session.

  • Coral is designed to make it easier

  • to bring machine learning onto the device,

  • and easier for you to develop applications.

  • It is a platform to go from prototyping to production

  • with high speed and efficiency.

  • It is a platform to develop on-device machine learning.

  • It offers a group of hardware components

  • to bring unique, high-performance machine learning capability

  • right onto edge devices.

  • It also offers a complete software toolchain

  • to enable you to develop applications

  • in AI and machine learning with ease.

  • In addition to that, Coral also offers a set

  • of ready-to-be-used machine learning models for you

  • to quickly deploy onto devices.

  • So a lot of you may be curious.

  • Why is it called Coral?

  • What does that have to do with AI?

  • Well, if you look at coral in the natural world,

  • coral really does represent a vibrant community,

  • teeming with life-- an inclusive, very open community.

  • Right?

  • The living organisms work together

  • to contribute to a common good.

  • And that's exactly the kind of platform we aspire

  • to make for the industry, for all of you.

  • Coral's inspiration and mission is

  • to provide a vibrant platform for all of you

  • to collaborate and to bring

  • AI applications to the device.

  • We want to enable developers everywhere

  • to turn AI ideas into business solutions, from idea

  • to prototyping to large-scale production deployment,

  • with ease and simplicity.

  • And finally, we want to encourage everybody joining

  • a community-wide effort to contribute and to learn

  • and to share machine learning models together.

  • So that's where the name of the Coral came from.

  • So why do we talk about the benefits

  • of machine learning on device?

  • Let's take a quick look at the key benefits

  • of machine learning on device.

  • Number one is high performance.

  • Because everything is computed right on the device, locally,

  • you do not need to send the data back to the cloud.

  • Right?

  • With everything local,

  • you can do things much more efficiently.

  • It's also very important-- very key-- that in many applications,

  • you want the data to stay on the device,

  • especially in business application scenarios.

  • Some private user data cannot be sent to a server,

  • cannot be sent to the cloud.

  • We want a technology that enables you

  • to make that solution available to your customers.

  • And also, because data stays local,

  • all the information coming from the sensors right on the device

  • can be accessed and used to compute

  • your machine learning results right there, rather

  • than having to send the data to the cloud, to the server,

  • to compute and send the results back.

  • Right?

  • So this is another way to look at it:

  • it's much better performance,

  • because all the data is locally available.

  • Next: it works offline.

  • There are many scenarios-- the internet of things,

  • the smart devices--

  • where they may or may not have an internet connection.

  • In fact, in most cases they don't have a cloud connection.

  • You still want machine learning

  • capability in that scenario.

  • And offline, on-device machine learning

  • makes that happen for you.

  • And lastly, it is much more power efficient.

  • A lot of edge devices are small.

  • They don't have a large power supply.

  • And they require highly efficient

  • computation on the device.

  • Because you don't have to send the data back to the cloud,

  • you save the bandwidth,

  • and you save the power of sending data over Wi-Fi or the network.

  • So the benefits of on-device machine learning

  • are very strong.

  • So let's take a quick look at the Coral product line.

  • What do we offer to the industry?

  • Coral product line offers both hardware components

  • and software components.

  • On the hardware side, it offers a development

  • board, which is a single-board computer that allows you

  • to prototype and develop applications.

  • Coral also offers a number of sensors

  • to let you build applications using

  • sensory data-- imaging, video, environmental readings--

  • making those available as part of the data input

  • to your application.

  • And on the software side, we

  • provide a complete software toolchain,

  • from the operating system to the SDK to machine

  • learning models, that lets you easily and quickly

  • build your application.

  • In addition to that, we provide a comprehensive set

  • of content-- documentation, examples,

  • online guides, data sheets, et cetera.

  • All of those were made available online at the Coral website

  • to all of you.

  • So now we know what the Coral product suite contains.

  • Let's take a look in a little more detail

  • at the hardware components.

  • So, the Coral hardware products.

  • We offer a suite of hardware to you

  • for prototyping and developing applications.

  • The first one is called Coral Dev Board.

  • As you can see on the picture here,

  • it retails for about $150.

  • It's a single-board computer with operating system

  • on board and machine learning capability right on the device.

  • The second one is a USB key, which we call the USB Accelerator.

  • It has the Edge TPU machine learning acceleration chip

  • right on the device.

  • You can plug this USB key into any Linux machine

  • to bring machine learning capability right

  • onto those devices.

  • And I have those with me.

  • And I want to show you their

  • relative size.

  • So this is the Coral Dev Board.

  • It's very small, as you can see.

  • It is a single board computer with all

  • the necessary input/output of the connectors on that.

  • This is the USB key; it's even smaller.

  • It's just like any typical USB key you would use.

  • So these are the two current computer platforms

  • you use to develop applications.

  • In addition to that, we also offer a couple of sensors,

  • as I said, to take sensing data from the field for your machine

  • learning application to consume.

  • Number one is a five-megapixel autofocus camera,

  • which we call the Coral Camera.

  • Second, just a few days ago we released

  • a new environmental sensor, which is this one here.

  • As you can see, it's very small.

  • And it has a digital display on it.

  • It allows you to take input from temperature,

  • humidity, light, and so on.

  • You can use these input sensors

  • and build them into your applications.

  • Now, with these you do the prototype.

  • But when you deploy these devices

  • into a large-scale production environment,

  • we enable you to take the SOM

  • module off the Dev Board.

  • In other words, this piece of circuit board can snap off,

  • and you can embed it into your product.

  • This is for large-scale volume deployment.

  • The individual unit price, as you can see, is about $115.

  • And we offer discounts for volume deployment.

  • Very soon, in the next quarter, we will offer a PCIe

  • connector-- a base connector-- which you can plug into any PC,

  • industrial PC, or industrial device that accepts a PCIe

  • connector, to bring machine learning capability

  • into those devices.

  • So those on the left are what you

  • need to do prototyping for your machine learning application.

  • The one on the right--

  • the one on the right is for large-scale deployment.

  • Of course, the cameras and environmental sensors

  • can be used for both prototyping and larger-

  • scale deployment, so they sit in the middle.

  • All right, let's talk about the Google Edge TPU.

  • This is the center-- the core of this platform--

  • bringing machine learning capabilities onto the device.

  • So the Edge TPU is a small, application-specific

  • integrated circuit that Google designed specifically

  • to optimize machine learning on the device.

  • It is designed to take TensorFlow Lite

  • machine learning models.

  • It supports 8-bit quantized TensorFlow Lite models, running

  • on-device with high efficiency.

  • It consumes only two watts of power.

  • And it runs very fast.

  • The picture you see in the middle

  • represents its relative size to a penny.

  • So it's a tiny chip.

  • And with a module of this size,

  • you can easily embed it into many, many devices.

  • So how fast does this Edge TPU go?

  • Well, per the numbers we publish online,

  • the Edge TPU

  • runs computation at about four trillion operations per second.

  • So in general terms, it's four TOPS, if you will.
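As a quick back-of-the-envelope check on what these figures imply (the four TOPS here and the roughly two watts mentioned earlier are both from the talk; the arithmetic is just illustrative):

```python
# Back-of-the-envelope efficiency of the Edge TPU, using the
# figures from the talk: ~4 trillion ops/sec at ~2 W.
ops_per_second = 4e12   # 4 TOPS
power_watts = 2.0       # ~2 W, as stated earlier

tops = ops_per_second / 1e12
tops_per_watt = tops / power_watts

print(f"{tops:.0f} TOPS at {power_watts:.0f} W -> {tops_per_watt:.0f} TOPS/W")
# -> 4 TOPS at 2 W -> 2 TOPS/W
```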

  • You may ask, well, how fast does it actually

  • run machine learning models?

  • Take a look at this comparison table.

  • We benchmarked a couple of very common, widely used

  • vision machine learning models, such as MobileNet and Inception,

  • running on the Dev Board and the Edge TPU USB Accelerator

  • against a powerful desktop CPU, such as a 64-bit Xeon,

  • and an embedded ARM CPU.

  • As you can see in this table, in comparison,

  • the Edge TPU runs these models

  • in only a fraction of the time needed

  • by the desktop CPU and the embedded CPU.

  • So it's much, much faster.

  • Now some of you say, well, OK.

  • These are just benchmark numbers.

  • Do you have an example--

  • real world example-- you can show me?

  • And I say yes.

  • Let me borrow an example from one of our beta users

  • who posted this in our Coral beta online forum.

  • He said, I'm building an app monitoring traffic

  • in real time--

  • you know, watching 10, 20 cars in real time.

  • I just took MobileNet out of the box

  • without much tweaking.

  • And using the Coral product, I was

  • able to achieve about 48 frames per second.

  • That is very comparable to 30 frames per second

  • using a GTX 980 GPU plus a CPU in similar gear.

  • Now, for those of you who build gaming machines,

  • you know that to build a GTX 980 GPU plus CPU,

  • you're talking about-- what? $500 to $1,000 worth of gear.

  • But with the Coral Dev Board, it's only $150.

  • You'd be able to achieve the same results.

  • So this really shows you the cost and performance

  • advantage of the Coral product.

  • So now let's talk about the Dev Board a little bit more,

  • since most of you are engineers, you'll

  • want to see the technical spec of it.

  • Let me go through a little bit.

  • So first of all, the Dev Board, as you can see in the picture

  • or in my hand, is a prototype development board

  • that lets you develop on-device machine

  • learning capability in applications directly on the board.

  • It's a full computer.

  • It has a CPU.

  • It has a GPU.

  • It has onboard memory, and it runs a Linux operating

  • system on the device.

  • It uses a modular design--

  • what we call a SOM--

  • S-O-M-- a system-on-module design, meaning, again,

  • you can snap off this SOM circuit board

  • and deploy it into your own product.

  • This modular design makes it easier to go from prototype to deployment.

  • It has many I/O connectors that allow you

  • to connect to many accessories during development,

  • making everything really, really easy.

  • So here's a picture of the Dev Board.

  • As you can see, it contains all the necessary I/O ports for you

  • to connect to devices, such as HDMI

  • to connect to a display; USB connectors to connect

  • cameras, keyboards, monitors, and so on;

  • as well as an Ethernet connector.

  • It also, of course, has Wi-Fi and Bluetooth to connect

  • to the internet over Wi-Fi.

  • OK.

  • Let's talk about the technical spec in a little detail.

  • For the CPU, it uses an NXP quad-core chip.

  • As you can see, the product is very fast--

  • the Cortex-A53 is a very high-speed CPU core.

  • It also has an integrated GPU.

  • And it has an onboard cryptographic chip,

  • allowing you to securely connect to Google Cloud.

  • It has one gig of onboard RAM, as well as 8 gigs of flash memory--

  • enough space for you to deploy your application.

  • It supports Wi-Fi and Bluetooth, and takes a standard

  • five-volt power supply.

  • In terms of connectors, it supports both USB 2.0

  • and USB 3.0 speed connections, with support of both USB Type C

  • and Type A connectors.

  • Under the audio and video category, as you can see,

  • it supports all the necessary AV connections, especially

  • the full-size HDMI 2.0 connector for full 1080p video display.

  • It has a microSD card slot that lets you bring more software onboard.

  • It has gigabit Ethernet support.

  • And it has a 40-pin GPIO header for I/O connections.

  • It runs a special version of Debian Linux, which

  • we call Mendel, because it is specialized to support Edge TPU

  • functions.

  • And for machine learning models, it

  • supports most of the common vision machine learning models,

  • such as MobileNet and Inception, which

  • work great on small, mobile devices.

  • I especially want to talk about this GPIO connection.

  • Many of you who are makers are

  • used to making projects using a Raspberry Pi, right?

  • The Raspberry Pi has this 40-pin GPIO connector.

  • With the Developer Board--

  • the Dev Board-- you can do exactly the same thing.

  • It's compatible with the Raspberry Pi's 40-pin GPIO.

  • So for all the things you have done in the past--

  • connecting to external lights and switches using GPIO ports,

  • or using pulse-width modulation to control stepper

  • motors-- everything you've done in the past with a Raspberry Pi,

  • you can do with the Dev Board very, very easily.
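Actual pin control needs a GPIO library for the specific board, so as a minimal, hardware-free sketch, here is just the duty-cycle arithmetic behind the kind of pulse-width modulation mentioned above. The 50 Hz frequency and the 1-2 ms pulse range are common hobby-servo conventions, assumed for illustration and not anything Coral-specific:

```python
# Map a servo angle to a PWM duty cycle. Typical hobby servos expect
# a 50 Hz signal whose pulse width varies between about 1 ms
# (0 degrees) and 2 ms (180 degrees); these numbers are common
# conventions, not Coral specifics.

def servo_duty_cycle(angle_deg, freq_hz=50, min_pulse_ms=1.0, max_pulse_ms=2.0):
    """Return the duty cycle (0..1) for a given servo angle in degrees."""
    if not 0 <= angle_deg <= 180:
        raise ValueError("angle must be within 0..180 degrees")
    period_ms = 1000.0 / freq_hz                      # 20 ms at 50 Hz
    pulse_ms = min_pulse_ms + (max_pulse_ms - min_pulse_ms) * angle_deg / 180.0
    return pulse_ms / period_ms

print(servo_duty_cycle(0))     # 0.05 (1 ms pulse in a 20 ms period)
print(servo_duty_cycle(180))   # 0.1  (2 ms pulse in a 20 ms period)
```

You would feed the returned fraction to whatever PWM interface your GPIO library exposes.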

  • So how do you use the Dev Board

  • for development and deployment?

  • Conceptually it's very easy, as I explained.

  • I'm sure you've already seen this.

  • During prototyping, you use the Dev Board

  • with all these connectors enabled.

  • You can connect switches, sensors, temperature gauges,

  • whatever you want.

  • You can connect a monitor.

  • You can connect a keyboard.

  • You do development right there, because the operating system

  • is running on the device.

  • When you're done, you take the SOM module

  • off and unplug it from the Dev Board.

  • And you can buy many, many SOM modules, as I said.

  • Right?

  • You can deploy the SOM module into whatever product

  • you are making-- say, a smart refrigerator

  • or a smart washing machine,

  • depending on the application you develop.

  • So this is really easy.

  • It's prototyping and deployment in one product package.

  • OK?

  • All right.

  • We talked about the Dev Board.

  • Let me also briefly touch on the second product,

  • the Coral USB Accelerator, which

  • is this little thing here.

  • Again, it's a small USB key you can

  • plug into any USB slot on any Linux machine.

  • It has an onboard Edge TPU, bringing machine learning capability

  • right onto any device you plug it into.

  • And it supports not only Debian Linux, but also

  • Raspbian Linux, which is

  • what is used on the Raspberry Pi.

  • So you can take this key

  • and work with a Raspberry Pi.

  • That opens up many more opportunities

  • for you to do development.

  • So the Coral Accelerator's advantages: number one,

  • as you would imagine, it brings on-device machine learning

  • to many more machines.

  • You can plug it into a laptop if you want,

  • if the laptop runs a supported version of Linux.

  • And you can plug it into a Raspberry Pi.

  • It's compatible with much of the hardware

  • that the Raspberry Pi has supported in the past.

  • So not only PCs, but laptops, Raspberry Pis,

  • and industrial systems.

  • If the [INAUDIBLE] box supports a USB plug,

  • you just plug in this key, and you will

  • be able to deploy your machine learning applications.

  • So the Coral USB Accelerator is a low-cost, convenient way

  • to experiment and build prototypes in AI.

  • OK.

  • So we talked about hardware.

  • Let me also talk about the software side--

  • the complete toolchain that Coral

  • provides to enable you to build machine learning

  • applications.

  • So for the software components, let's take a look

  • at the component level at how the Coral software pieces

  • and hardware work together.

  • The Coral Dev Board is essentially a Linux

  • machine, if you will: at the bottom layer

  • you have the hardware,

  • and you have Mendel Linux running on top of that.

  • Applications talk to the operating system, which talks to the hardware.

  • And at Coral, we developed a C++ library API with direct access

  • to the operating system and the hardware.

  • This allows you to have direct access

  • to everything the operating system can

  • control for you.

  • Let's say you have a machine learning model--

  • a TensorFlow Lite model.

  • We provide an Edge TPU compiler that lets you

  • take a TensorFlow Lite model and compile it

  • into a binary format that is compatible with the Edge TPU.

  • So if you have an application, you can access the C++ API,

  • run the machine learning model right on the device,

  • and get access to the hardware layers.

  • However, we realized that many, many machine learning

  • programmers, like you, have been using Python.

  • So the Coral software also provides a Python library, or Python SDK.

  • It is a high-level wrapper that lets you

  • use the Python programming language to easily access

  • all the features I was just talking about--

  • to access the machine learning model,

  • to access the hardware.

  • And you can write Python code for I/O control and so on.

  • So this is a complete environment,

  • with multiple components working together,

  • that we put in the product

  • to enable you to do AI development.

  • The Python API is very simple.

  • For those of you who have programmed in Python--

  • we publish this on the Coral website, by the way--

  • these are the basic classes of the Python API

  • that you would use for developing machine learning

  • applications.

  • I would say pay attention to the middle two.

  • Those are probably the ones you would use the most.

  • One is object detection.

  • One is object classification.

  • And the base classes for them

  • are called ClassificationEngine

  • and DetectionEngine.

  • So it's very simple.

  • The last one you see here, what we call the ImprintingEngine,

  • is for transfer learning,

  • which I'm going to talk about in a few minutes.

  • It is something that allows you to efficiently develop

  • a customized machine learning model; the Python API library

  • we provide

  • also supports that.

  • So let's take a quick look at an example.

  • How would you actually use Python code

  • to interact with the machine learning models

  • that we supply?

  • Say I want to develop a program using the object

  • detection model.

  • In Python code, you

  • would simply initialize the engine

  • using the DetectionEngine base class.

  • That base class has member functions

  • you use to feed data in and to talk

  • to the machine learning model.

  • So you initialize the engine here.

  • You also need to load the so-called label file,

  • because if you want to detect a bunch of objects,

  • you want to identify them with labels,

  • so you load the label file.

  • And then, say you want to feed the machine learning

  • model-- the object detection model-- an image: you load the image

  • file.

  • And of course, for most of you who

  • have played with vision machine learning models,

  • you know you can identify objects in a photo,

  • or you can identify objects in a video stream.

  • In the latter case, it will be a vector of tensors.

  • But here I'm using a simple example of just an image.

  • And then the code to interact with the machine learning model

  • is very simple.

  • A single line.

  • On the engine, you just call the detect-with-image member function

  • and pass the image as a parameter.

  • And with the results that come back

  • from this call, you can do things such as draw a bounding box--

  • specify a color, draw your bounding box-- around the objects

  • that you are detecting.

  • So it's very simple.
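The flow just described can be sketched in Python. The engine call itself (the legacy edgetpu library's DetectionEngine and its detect-with-image call, per the talk) needs a Coral device and a compiled model, so in this illustrative sketch the engine output is stubbed with hypothetical results; the label-file parsing and confidence filtering around it are runnable as-is:

```python
# Sketch of the detection flow from the talk. On real hardware you
# would do roughly:
#   from edgetpu.detection.engine import DetectionEngine
#   engine = DetectionEngine("model_edgetpu.tflite")
#   results = engine.detect_with_image(image, threshold=0.3, top_k=10)
# Those calls need a Coral device, so below the results are stubbed.

def load_labels(lines):
    """Parse label-file lines of the form '<id> <name>' into a dict."""
    labels = {}
    for line in lines:
        idx, name = line.strip().split(maxsplit=1)
        labels[int(idx)] = name
    return labels

def filter_detections(results, labels, threshold=0.5):
    """Keep detections above the confidence threshold, with readable labels."""
    return [(labels.get(r["label_id"], "unknown"), r["score"], r["box"])
            for r in results if r["score"] >= threshold]

# Hypothetical engine output: label id, confidence score, bounding box.
stub_results = [
    {"label_id": 0, "score": 0.92, "box": (10, 20, 110, 220)},
    {"label_id": 1, "score": 0.31, "box": (50, 60, 90, 160)},
]
labels = load_labels(["0 car", "1 person"])

for name, score, box in filter_detections(stub_results, labels):
    # In a real app you would draw `box` on the frame in a chosen color.
    print(f"{name}: {score:.2f} at {box}")
```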

  • We supply a group of machine learning models for you to use.

  • Coral includes them in the product suite.

  • We put them on the website.

  • They are free for you to download.

  • They are pre-compiled TensorFlow Lite models,

  • ready for you to use.

  • They can run without any further compiling.

  • You simply download them onto the hardware.

  • The Edge TPU Python module

  • is already

  • installed on the Dev Board.

  • So you don't need to do anything.

  • You're ready to use your Python programming code.

  • The example I just showed you, you can do that.

  • However, if you use the USB Accelerator on a Linux machine,

  • you will need to manually install this Python module.

  • On the Dev Board, the Python API is pre-installed, as I mentioned.

  • Right.

  • So that's what you need to do.

  • I do want to mention, though, that a lot of the models

  • we supply for free online for you to use--

  • those are for noncommercial use only.

  • That means if you want to build a model

  • that, let's say, you want to sell for money,

  • then you would need to make your own model

  • rather than use an open-source free model that

  • is for noncommercial use.

  • OK?

  • The supplied models include the categories of, as I mentioned,

  • image classification, object detection,

  • as well as one called weight imprinting.

  • That, again, is used for transfer learning.

  • And I'm going to talk about it in a minute.

  • So here are some examples of the models that we

  • make available for you online.

  • And here are the image classification models,

  • as you can see.

  • We support pretty much all the popular image classification

  • models, from MobileNet to Inception,

  • and the different versions of MobileNet

  • and the different versions of Inception.

  • The difference between them is the type

  • of objects they are able to identify.

  • For example, if you want to develop an application

  • to tell the difference between different birds

  • or different plants, you would pick the corresponding model

  • to use.

  • OK.

  • With that, let me run a couple of demos for you.

  • The first demo I'm going to show you is an object detection model.

  • And the second demo I'm going to show you

  • is an object classification model.

  • So with that, videographer, please switch the display

  • to the camera here.

  • So on the table here, as you can see,

  • I have this demo built with this conveyor belt.

  • I'm simulating real-time traffic.

  • Over here there's a camera pointing at it.

  • As you can see on the screen, the camera

  • feeds into this Coral Dev Board.

  • In real time it identifies a number of objects.

  • It shows it's a car.

  • It also shows a so-called confidence score--

  • how confident the model is that it is an automobile.

  • On the conveyor belt, as you can see,

  • I also have people, or pedestrians.

  • What I'm going to do now is

  • make this move.

  • OK.

  • So I'm going to turn on the power.

  • And I'm going to crank up the speed to get it running.

  • Now, in real time, as this is moving,

  • take a look at the screen.

  • The machine learning object detection

  • is continually happening.

  • It continuously identifies the correct car or pedestrian--

  • or there's a traffic light as well.

  • Right?

  • So you can see the Coral performance

  • is high enough to do that as the scene goes by.

  • Now, I want you to pay attention to the lower left corner

  • of the screen, or upper left corner of the screen,

  • which shows the frames-per-second speed.

  • Last time I saw it, it was running at about 50

  • to 70 frames per second.

  • That is very high-speed performance.

  • Now, if I crank up the speed to make it go a little bit faster,

  • you can see the machine learning

  • object identification is still going on.

  • Right?

  • It continues to be able to identify automobiles

  • in this fast-moving environment.

  • So this really demonstrates the power

  • of object detection running right on this Coral device.

  • OK?

  • So that's the first demo.

  • [APPLAUSE]

  • Thank you.

  • Yes, you can clap.

  • I hope you will be able to make a lot more interesting

  • applications just like that.

  • Bring the power of Coral into your imagination,

  • into your innovation.

  • All right.

  • Next one is showing object classification.

  • So over here I have another Dev Board.

  • And the output of that--

  • display, please switch to the output of this camera.

  • So what I'm going to do: I have several food items

  • on the table.

  • And I'm going to let this camera identify the different types

  • of objects.

  • So let's say I put a hamburger over here.

  • And as you can see in the upper left corner,

  • it tries to identify the object with some confidence score.

  • Depending on the lighting conditions,

  • hopefully you will see hamburger.

  • Yeah.

  • You need to aim at the right angle and with the light.

  • Let's try this donut.

  • Does it say donut?

  • It showed up as donut?

  • OK.

  • We can also try a sandwich.

  • OK?

  • Lastly, I'm going to try something exotic.

  • Let's say sushi.

  • OK.

  • So this is how you could make object classification work,

  • by simply running one of the object classification models

  • right on the device.

  • Again, none of these are connected to the internet.

  • None of the data is being sent to the cloud or a server.

  • All the computation is happening right on the device.
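The classification demo follows the same pattern as detection: on hardware, the ClassificationEngine returns (label, score) pairs, and the app displays the best guess with its confidence. Here is a hedged sketch with the engine output stubbed, showing only the surrounding runnable logic:

```python
# The classification demo's readout comes from (label, score) pairs;
# on hardware these would come from the legacy edgetpu library's
# ClassificationEngine (classify_with_image). Here the scores are
# stubbed to show how the "best guess with confidence" is produced.

def top_k(predictions, k=3):
    """Return the k highest-confidence (label, score) pairs."""
    return sorted(predictions, key=lambda p: p[1], reverse=True)[:k]

# Hypothetical scores for the food items shown in the demo.
stub_predictions = [("hamburger", 0.81), ("donut", 0.07),
                    ("sandwich", 0.05), ("sushi", 0.02)]

best_label, best_score = top_k(stub_predictions, k=1)[0]
print(f"{best_label} ({best_score:.0%} confidence)")
# -> hamburger (81% confidence)
```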

  • OK?

  • Great.

  • Thank you.

  • [APPLAUSE]

  • All right.

  • Let's switch back to the slides.

  • So now we've talked about how to use

  • a Coral-supplied pre-compiled model and deploy it.

  • But what if you want to build something of your own?

  • You want to customize your model.

  • Well, this is where transfer learning comes in.

  • Transfer learning helps you save time

  • when building your own model.

  • Basically, it takes a pre-trained model that

  • is compatible with the Edge TPU,

  • and you retrain it for your related task

  • using your own customized data.

  • Conceptually, a neural network is

  • many layers of neurons deep.

  • OK?

  • If you want to train a whole model--

  • in fact, I heard from one of my colleagues

  • who developed a model from the ground up that it

  • takes more than 4,000 GPUs to train a vision model.

  • And it takes days.

  • However, instead of training everything,

  • you only need to modify the top layer.

  • That is the transfer learning concept.

  • Because in the lower layers, the neurons

  • are detecting, say, different colors,

  • different shapes, different lighting conditions,

  • they can be reused to help you identify the things

  • that you care about.

  • Let's say you want to build a model to identify

  • different apples.

  • You don't need to train the whole model.

  • You take a classification model and modify only the top layer,

  • training it with your own customized data:

  • images of many different apples.

  • So this is what transfer learning does.

  • The code to do transfer learning

  • in a Python environment on Coral is also very simple.

  • Basically, to prepare for transfer learning,

  • you set up a Docker container.

  • You specify what model you want to use.

  • In the example I'm showing,

  • I'm using MobileNet version one.

  • And if you only want to train the top few layers,

  • there is a single command you use: simply

  • start training.

  • And again, you give the parameter

  • as the name of the model.

  • But if you want to train the entire model,

  • you can do that too: you add one

  • additional flag that sets train whole model

  • to true.

  • Once you run that in a console on your system,

  • the console will show you

  • the progress of training, in terms of the steps it takes

  • and how much time it takes.

  • So it's very simple to do

  • in a Linux environment.
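As a rough illustration of what that retraining does, here is a toy sketch of the transfer-learning idea in plain Python: a stand-in "frozen base" plays the role of the pre-trained lower layers, and only a single logistic top layer is trained on your own data. Everything here (the stand-in features, the two classes, the learning rate) is hypothetical, not the actual Coral training pipeline.

```python
import math

# Transfer-learning sketch: the lower layers (a feature extractor) stay
# frozen, and only the top layer's weights are trained on your own data.

def frozen_base(x):
    # Stand-in for the pre-trained lower layers: a fixed transformation
    # that is never updated during training.
    return [x[0] + x[1], x[0] - x[1]]

def train_top_layer(samples, lr=0.5, epochs=200):
    """Train a single logistic unit on top of the frozen base."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in samples:
            f = frozen_base(x)                    # frozen features
            z = w[0] * f[0] + w[1] * f[1] + b
            p = 1.0 / (1.0 + math.exp(-z))        # sigmoid output
            g = p - y                             # gradient of the log loss
            w = [w[0] - lr * g * f[0], w[1] - lr * g * f[1]]
            b -= lr * g
    return w, b

def predict(w, b, x):
    f = frozen_base(x)
    return 1 if w[0] * f[0] + w[1] * f[1] + b > 0 else 0

# Hypothetical two-class data, e.g. "red apples" (1) vs. "green apples" (0).
data = [([1.0, 0.0], 1), ([0.9, 0.1], 1), ([0.0, 1.0], 0), ([0.1, 0.9], 0)]
w, b = train_top_layer(data)
```

The real retraining works on a full MobileNet, but the structure is the same: gradients flow only through the replaced top layer, which is why it finishes in minutes rather than days.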

  • So with that, let me do another demo for you.

  • It is called the teachable machine.

  • We are going to publish this as open source for you

  • to do the same in the near future.

  • Basically, what I'm going to show you here is

  • that I'm going to teach a machine to remember things.

  • Videographer, we

  • need to have an image of this.

  • So what you see on the desk here

  • was actually built with

  • the USB Accelerator and a Raspberry Pi.

  • So beyond the Dev Board, you can use a Raspberry Pi

  • to build the application.

  • So this little demo has a camera

  • pointing up.

  • And I have some different objects.

  • So every time I hit a button,

  • it takes an image, let's say,

  • of this hamburger, and it remembers several images

  • of this hamburger.

  • Now if I take a different object and take

  • a different group of photos of the second object,

  • it remembers the images.

  • And I'm going to take this ice cream with the green button.

  • And lastly, maybe I'll take this donut with the red button.

  • So what happens in the background here is

  • the program is doing transfer learning,

  • taking an existing object classification model

  • but replacing the last layer with the images just taken.

  • Now watch this.

  • If I put this hamburger back, the yellow light turns on.

  • It remembers.

  • If I put the ice cream, the green light turns on.

  • I hope you can see on the video.

  • Yes.

  • And if I take this donut, the blue light turns on.

  • Now, more than that.

  • A moment ago, I trained with this green ice cream.

  • Right?

  • The green light.

  • If I put a yellow ice cream here, it remembers that too.

  • Because the machine learning model looks at more than just the color;

  • it also identifies the shape.

  • Even though this shape and this shape are different,

  • the model is smart enough

  • to know the difference between the objects.

  • So again, this is one example of how

  • you can build things with classification

  • capability right on the device, without the internet,

  • even with a small USB key like that.

  • Right?

  • Very powerful stuff.
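One simple way to implement this "remember on a button press" behavior is nearest-neighbor matching on embeddings: each button press stores an embedding for its class, and each new frame gets the label of the closest remembered example. This is a toy sketch with made-up 3-number embeddings; the real demo gets its embeddings from the on-device model.

```python
# Toy "teachable machine" sketch: each button press stores an embedding
# under a label, and a new frame is labeled by the nearest stored example.

def nearest_class(memory, embedding):
    """Return the remembered label whose closest example matches best."""
    def distance(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    def class_distance(examples):
        return min(distance(e, embedding) for e in examples)
    return min(memory, key=lambda label: class_distance(memory[label]))

# Made-up embeddings captured on each button press.
memory = {
    "hamburger": [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]],   # yellow button
    "ice cream": [[0.1, 0.9, 0.1], [0.0, 0.8, 0.2]],   # green button
    "donut":     [[0.1, 0.1, 0.9], [0.2, 0.0, 0.8]],   # red button
}
print(nearest_class(memory, [0.85, 0.15, 0.05]))  # -> hamburger
```

Because matching happens against stored embeddings rather than retrained weights, a new object can be "learned" from a handful of frames, on the fly, entirely on the device.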

  • [APPLAUSE]

  • Thank you.

  • OK.

  • Let's switch back to the slides.

  • So the ramifications of this are huge.

  • Right?

  • Imagine in an industrial environment,

  • you want to identify things.

  • You want to tell good widgets from bad widgets on an assembly

  • line, for example.

  • You don't have time to retrain the assembly line's

  • auto-sorting machine from scratch.

  • You could use transfer learning and learn the different objects

  • on the fly, just like that.

  • So it's very powerful.

  • You can build just about endless applications

  • using such a capability.

  • All right.

  • So we've talked about transfer learning.

  • We've talked about building your own customized model.

  • Let me get into a bit more detail on how

  • you use Coral models.

  • OK.

  • So we've talked about the set of pre-compiled models

  • we supply to you.

  • This is use case number one.

  • It's very simple.

  • You simply download the Coral model we supply to you.

  • You don't need to compile again.

  • You simply download it and put it on the device.

  • OK.

  • The second scenario is, you take an existing

  • pre-trained model,

  • and you use transfer learning to customize it

  • with your own data.

  • However, after you've done that, it's not yet compatible

  • with the Coral board.

  • You need to compile.

  • So you use the Coral-supplied compiler,

  • and you compile it.

  • The net result is a TensorFlow Lite file.

  • You download it to the Edge TPU Coral hardware,

  • and you'll be able to run it there.

  • Now I want to say that right now, the Coral compiler

  • only runs on Google Cloud Platform.

  • But very soon we will make this compiler

  • a standalone executable, make it downloadable

  • on the internet for you guys to use.

  • The third use case is when you want to build

  • the entire model by yourself.

  • This is like you really want the customization.

  • The existing model doesn't satisfy you,

  • you can do that too.

  • So in that case, you start

  • with TensorFlow and build a model from there.

  • Let me talk about the steps involved.

  • The workflow of creating your own customized model

  • is the following.

  • A TensorFlow model, as you all know,

  • is a 32-bit floating-point model.

  • Right?

  • And that is not usable for Coral,

  • because Coral devices require TensorFlow Lite, which

  • runs on the edge and needs very little memory.

  • So a plain TensorFlow model will not work as-is.

  • Step number one: you take the TensorFlow model

  • and convert it into a quantized version

  • through a training process called

  • quantization-aware training.

  • You convert your TensorFlow model

  • into a quantized TensorFlow model.

  • So basically you convert a 32-bit floating-point-based

  • model to an 8-bit integer-based model.

  • After that, you export this model

  • as a TensorFlow frozen graph, which is typically a .pb

  • file.

  • But this file is not usable either.

  • It's not quite ready to be deployed on Coral.

  • The next step is that you

  • need to convert this into a TensorFlow Lite model

  • with the TensorFlow Lite converter.

  • And after that you compile it using the Edge TPU

  • compiler, making a binary that's

  • compatible with the Edge TPU.

  • And then after you've done that, you deploy.

  • So this is the workflow you

  • will use in your environment.
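The quantization step in this workflow, going from 32-bit floats to 8-bit integers, can be illustrated with the standard scale and zero-point mapping. The scale and zero point below are made-up values for a roughly [-1, 1] range; in practice, quantization-aware training determines good ranges for each layer.

```python
# Sketch of 8-bit quantization: map a float to a uint8 code with a scale
# and zero point, and map it back with a small rounding error.

def quantize(x, scale, zero_point):
    q = round(x / scale) + zero_point
    return max(0, min(255, q))        # clamp to the uint8 range

def dequantize(q, scale, zero_point):
    return (q - zero_point) * scale

scale, zero_point = 2.0 / 255, 128    # covers roughly [-1.0, 1.0]
q = quantize(0.5, scale, zero_point)
x = dequantize(q, scale, zero_point)  # close to 0.5, small rounding error
```

This rounding error is why the model is trained quantization-aware: the network learns weights that still work well after every activation and weight is snapped to one of 256 values.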

  • So we've talked about how this platform enables you

  • to build applications.

  • We said at the very beginning that we want

  • Coral to be a platform and an ecosystem

  • for everybody together.

  • So really this is a platform for you

  • to use to innovate and to share with the community globally

  • altogether.

  • With that, I want to show you an example of one

  • of our retail partners called Gravity Link.

  • They built this app--

  • a very cool app.

  • You can use your mobile phone to download the app directly

  • into the Coral Dev Board.

  • And you can find more details at this link below.

  • Or you just simply search Model Play at Google Play.

  • You can download and try.

  • So the idea is that we want all developers to contribute

  • to this ecosystem, building tools, building

  • models, building applications, and sharing them with the industry.

  • And this is what the Coral ecosystem is for.

  • With that, let me end by asking:

  • what are the potential areas where you could develop AI?

  • Look at this.

  • There's consumer electronics, of course, and appliances;

  • there are a lot of opportunities for you to develop there.

  • Industrial warehousing, monitoring the devices,

  • monitoring the assembly line.

  • This is another area.

  • Robots.

  • Robotics.

  • Both industrial and consumer robotics is a field.

  • Automobiles.

  • Automotive industry is also a field.

  • And as all of you heard at the Google I/O keynote,

  • medical applications and medical devices are another area.

  • And finally, education.

  • Education aids and research.

  • You can use machine learning-- on-device machine learning

  • using Coral-- to innovate.

  • So there's a lot of things you could do.

  • Right?

  • And all the information I've talked about today

  • is summarized on our Coral website.

  • If you don't remember anything, remember this.

  • It's called Coral.withGoogle.com.

  • Our documentation, our samples, models, everything's there.

  • There are more references in my slides

  • that you can look at later.

  • There's a reference to Mendel Linux, a reference

  • to TensorFlow Lite, and to how you

  • do quantization-aware training.

  • All of this information is very important.

  • I do want to call out, on Stack Overflow, we have a tag.

  • You can join the online community,

  • participate in discussions, answer

  • other developers' questions, or look at the answers there.

  • I want you to monitor it and help each other

  • through this online community.

  • And I want to give a shout out.

  • We have a Coral codelab for those of you who

  • would like to experiment with Coral at I/O;

  • you can go there today.

  • Our developer relations team colleagues are helping everybody

  • go through the codelab.

  • So with that, a quick summary and a call to action.

  • After you leave here today: number one,

  • review our Coral products.

  • Learn more about TensorFlow Lite.

  • Use a Coral board to experiment.

  • Then build your own customized models.

  • And finally, build a model from the ground up.

  • We want all of you to take the Coral platform,

  • put in your imagination, put in your innovation,

  • and bring AI to industry and to consumers everywhere.

  • So with that, on behalf of our Coral team,

  • thank you all very much for coming to this session.

  • Thank you.

  • [MUSIC PLAYING]
