[MUSIC PLAYING]

SAM BEDER: Hi, everyone. My name is Sam Beder, and I'm a product manager on Android Things. Today, I'm going to talk to you about Google services on Android Things, and how adding these services to your device can unlock your device's potential. What I really want to convince you of today is that integrating Google services on Android Things is not only really easy and seamless, but can also make a huge difference in the use cases you can put on your device, as well as for your end users.

And I know this year, we have many sessions on Android Things, as well as demos in the sandbox area and code labs, to learn more about what's possible on Android Things. I also know that many of you are coming to this session already with ideas of devices that you want to make on Android Things, or for IoT devices in general. And I want to show you today all the compelling use cases that you can get when you integrate some of these Google services.

So I'm going to go through a number of services today. First, I'm going to talk about Google Play services, which includes a whole suite of tools such as the Mobile Vision APIs, location services, and Firebase. After that, I'm going to dive into Firebase in a little more detail to show you how the Realtime Database that Firebase provides can allow you to publish and persist data and events in interesting ways. After that, I'm going to go into TensorFlow, and how TensorFlow, we think, is the perfect application of the powerful on-device processing of your Android Things device to really add intelligence to that device. Next, I'm going to talk about Google Cloud Platform, and how, using Google Cloud Platform, you can train, visualize, and take action on your devices in the field. Finally, I'm going to touch on the Google Assistant and all the amazing use cases that you can get when you integrate the Google Assistant on Android Things.

Before I dive into these services, I want to quickly go over Android Things. Android Things is based on a system-on-module design. This means that we work really closely with our silicon partners to bring you modules that you can place directly into your IoT devices. These modules are economical to put in devices whether you're making millions of devices, doing a very small run, or just prototyping a device. Earlier today, we actually had a session specifically on going from prototype to production on Android Things, which can give you more detail about how it's feasible to do all the hardware design and bring your device to production on Android Things.

The Android Things operating system is then placed on top of these modules. Android Things is a new vertical of Android built for IoT devices. Since we work so closely with our silicon partners, we're able to maintain these modules in new ways, which allows these devices to be more secure and updatable. Also, since it's an Android vertical, you get all the Android APIs you're used to for Android development, as well as the developer tools and the Android ecosystem.

In addition, on Android Things we've added some new APIs, such as Peripheral I/O and user drivers, that allow you to control the hardware on your device in new ways. We've also added support for a zero-display build, for IoT devices without a screen.
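
To make the Peripheral I/O idea concrete, here is a minimal sketch of an Android Things activity that opens a GPIO pin and listens for button presses. It uses the PeripheralManagerService API from the Android Things developer preview; the pin name "BCM6" is a Raspberry Pi 3 assumption and will differ on other boards.

    import android.app.Activity;
    import android.os.Bundle;
    import android.util.Log;
    import com.google.android.things.pio.Gpio;
    import com.google.android.things.pio.GpioCallback;
    import com.google.android.things.pio.PeripheralManagerService;
    import java.io.IOException;

    public class ButtonActivity extends Activity {
        private static final String TAG = "ButtonActivity";
        // Pin names are board-specific; "BCM6" is a Raspberry Pi 3 GPIO (assumption).
        private static final String BUTTON_PIN = "BCM6";
        private Gpio buttonGpio;

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            PeripheralManagerService pio = new PeripheralManagerService();
            try {
                buttonGpio = pio.openGpio(BUTTON_PIN);
                buttonGpio.setDirection(Gpio.DIRECTION_IN);
                buttonGpio.setEdgeTriggerType(Gpio.EDGE_FALLING);
                buttonGpio.registerGpioCallback(new GpioCallback() {
                    @Override
                    public boolean onGpioEdge(Gpio gpio) {
                        Log.i(TAG, "Button pressed");
                        return true; // keep the callback registered
                    }
                });
            } catch (IOException e) {
                Log.e(TAG, "Error opening GPIO", e);
            }
        }

        @Override
        protected void onDestroy() {
            super.onDestroy();
            if (buttonGpio != null) {
                try {
                    buttonGpio.close();
                } catch (IOException e) {
                    Log.e(TAG, "Error closing GPIO", e);
                }
            }
        }
    }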

But really, the key piece of Android Things, I believe, is the services on top. Because of the API surface that Android Things provides, it's much easier for Google to put our services on top of Android Things. I say endless possibilities here because not only does Google already support all the services I'm going to walk you through today, but any services that Google makes in the future will be much more portable to Android Things because of this API surface.

So now, let's start diving into some of these services. Let's talk about Google Play services and all the useful tools that it provides. Google Play services gives you access to a suite of tools, some of which you see here. You get things like the Mobile Vision APIs, which let you add intelligence to your device's camera to identify people in an image, as well as faces and their expressions (there's a small sketch of this after this overview). You also get the Nearby APIs, which, when you have two devices near each other, allow those devices to interact with each other in interesting ways. You get all the Cast APIs, which let you cast from your Android device to a Cast-enabled device somewhere else. Next, you get all the location services, which let you query things like: what are the cafes near me, and what are their hours? You also get the Google Fit APIs, which allow you to attach sensors and accelerometers to your device and then visualize that data as steps or other activities in interesting ways. Finally, you get Firebase, which we'll talk about more in a minute.
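
As a rough illustration of the Mobile Vision face APIs mentioned above, here is a hedged sketch of detecting faces, and their smile probability, in a captured bitmap. The detector builder options are real Mobile Vision APIs; where the bitmap comes from is left as an assumption.

    import android.content.Context;
    import android.graphics.Bitmap;
    import android.util.SparseArray;
    import com.google.android.gms.vision.Frame;
    import com.google.android.gms.vision.face.Face;
    import com.google.android.gms.vision.face.FaceDetector;

    public class FaceAnalyzer {
        /** Returns the smile probability of each face found in the bitmap. */
        public static float[] smileProbabilities(Context context, Bitmap bitmap) {
            FaceDetector detector = new FaceDetector.Builder(context)
                    .setTrackingEnabled(false) // single still image, no tracking
                    .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
                    .build();
            Frame frame = new Frame.Builder().setBitmap(bitmap).build();
            SparseArray<Face> faces = detector.detect(frame);
            float[] probabilities = new float[faces.size()];
            for (int i = 0; i < faces.size(); i++) {
                probabilities[i] = faces.valueAt(i).getIsSmilingProbability();
            }
            detector.release(); // free the native detector resources
            return probabilities;
        }
    }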

Some of you might know about CTS certification, and how CTS certification is a necessary step in order to get these Google Play services. With Android Things, because of the hardware model that I just talked about, these modules actually come pre-certified. They're all pre-CTS-certified, meaning Google Play services will work right out of the box. You have to do absolutely no work to get these Google Play services on your Android Things device.

We also have, for Android Things, a custom IoT variant of Google Play services. I actually think this is a pretty big deal. It allows us to make Google Play services more lightweight by taking out things like phone-specific UI elements and game libraries that we don't think are relevant for IoT devices. We also give you a signed-out experience of Google Play services, so no authenticated APIs, because those just aren't relevant for many IoT devices.

So now, let's dive into Firebase in a little more detail. I'm going to walk you through one of our code samples: the code sample for a smart doorbell using Firebase. It involves one of our supported boards, as well as a button and a camera. Let me walk you through this diagram. On the left, you see a user interacting with the smart doorbell. They press the button on the smart doorbell, and the camera takes a picture of them. On the right, there's another user who can use an app on their Android phone to connect to a Firebase database and retrieve that image in real time.

So how does this work? When you press the button on the smart doorbell, the camera takes a picture of you. Then, using the Android Firebase SDK, which uses the Google Play services APIs, all on the device, it sends this image to the Firebase database in the cloud. The user on the other end can then use the exact same Google Play services and Android Firebase SDK on their phone to connect to this Firebase database and retrieve that image.
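
Here is a hedged sketch of what that phone-side retrieval might look like with the Firebase Realtime Database SDK. The database path "doorbell" and the child field names are assumptions modeled on the walkthrough below, not necessarily the sample's exact schema.

    import com.google.firebase.database.ChildEventListener;
    import com.google.firebase.database.DataSnapshot;
    import com.google.firebase.database.DatabaseError;
    import com.google.firebase.database.DatabaseReference;
    import com.google.firebase.database.FirebaseDatabase;

    public class DoorbellWatcher {
        public static void watch() {
            DatabaseReference ref =
                    FirebaseDatabase.getInstance().getReference("doorbell");
            ref.addChildEventListener(new ChildEventListener() {
                @Override
                public void onChildAdded(DataSnapshot snapshot, String previousChildName) {
                    // Each child is one doorbell ring; the field names mirror
                    // what the device writes (see the walkthrough below).
                    String imageUrl = snapshot.child("image").getValue(String.class);
                    Long timestamp = snapshot.child("timestamp").getValue(Long.class);
                    // Load imageUrl into the UI with your image library of choice.
                }

                @Override public void onChildChanged(DataSnapshot snapshot, String previousChildName) {}
                @Override public void onChildRemoved(DataSnapshot snapshot) {}
                @Override public void onChildMoved(DataSnapshot snapshot, String previousChildName) {}
                @Override public void onCancelled(DatabaseError error) {}
            });
        }
    }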

In our code sample, we also send this image to the Cloud Vision APIs to get additional annotations about what's in the image. These annotations could be something like: in this image, there is a person holding a package. That can give you additional context about what's going on. It's pretty cool. If you actually go and build this demo, you can see that when you press the button and it takes a picture, the picture will appear in less than a second. And then a few seconds later, after the image has propagated through the Cloud Vision APIs, the annotations will appear as well.

So to really show you how this works, I'm going to walk through some of the code that pushes this data to Firebase. The first line you see here is just creating a new door-ring entry that we're going to use in our Firebase database. Then, all we need to do to make this data appear in our Firebase database is set the appropriate fields of that entry. In the highlighted portion, you can see we're setting the timestamp field with the server timestamp and the image field with the image URL, so that the image, as well as the timestamp, will appear in our Firebase database to be retrieved by the user on the other side.
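
In code, that push might look something like the following minimal sketch. The node name "doorbell", the field names, and the imageUrl variable are assumptions standing in for the sample's actual schema; ServerValue.TIMESTAMP is the real Firebase placeholder for a server-side timestamp.

    import com.google.firebase.database.DatabaseReference;
    import com.google.firebase.database.FirebaseDatabase;
    import com.google.firebase.database.ServerValue;

    public class DoorbellLogger {
        /** Pushes one door-ring event; imageUrl points at the uploaded picture. */
        public static DatabaseReference logRing(String imageUrl) {
            // push() creates a new uniquely-keyed child: our door-ring entry.
            DatabaseReference ring = FirebaseDatabase.getInstance()
                    .getReference("doorbell").push();
            // The server fills in the timestamp, so device clock skew doesn't matter.
            ring.child("timestamp").setValue(ServerValue.TIMESTAMP);
            ring.child("image").setValue(imageUrl);
            return ring;
        }
    }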

As I mentioned, in our code sample we also send our images to the Cloud Vision APIs to get those annotations. We do that by calling the Cloud Vision APIs and then simply setting the appropriate field for those annotations, so that the additional context will appear as well for the user on the other end.
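
Continuing the sketch above, the annotation step could look like this. CloudVisionUtils.annotateImage is a hypothetical helper wrapping a Cloud Vision REST call, and the "annotations" field name is an assumption.

    import java.util.Map;
    import com.google.firebase.database.DatabaseReference;

    public class DoorbellAnnotator {
        /** Attaches Cloud Vision labels to an existing door-ring entry. */
        public static void annotate(DatabaseReference ring, byte[] imageBytes) {
            // Hypothetical helper: sends the image bytes to the Cloud Vision
            // API and returns a map of label -> confidence score.
            Map<String, Float> annotations = CloudVisionUtils.annotateImage(imageBytes);
            // Writing the field makes the labels show up for the phone app
            // a few seconds after the image itself.
            ring.child("annotations").setValue(annotations);
        }
    }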

So, Firebase is one of the many Google Play services that you get with Android Things. But in the interest of time, I can't talk about all the Google Play services. Instead, I want to move on to TensorFlow. We really think that TensorFlow is the perfect application for the on-device processing of your Android Things device.

As you've heard from some of the previous talks on Android Things, Android Things is not really well suited if you're just making a simple sensor. To fully utilize the Android Things platform, your device should be doing more; there should be some intelligence on the device. You might wonder, though, if you're making an internet-connected device, an IoT device, why do you actually need this on-device processing? There are actually several reasons why it can be really important.

One reason has to do with bandwidth. If, for example, you're making a camera that's counting the number of people in a line, and you just care about that number, then by only propagating that number you save huge amounts of bandwidth, because you never need to send the image anywhere.

The second reason for on-device processing has to do with intermittent connectivity. If your device is only sometimes connected to the internet, then for it to be really functional, it needs on-device processing for when it's offline.

The next reason for on-device processing has to do with the principle of least privilege. If you, again, had that camera where all you care about is the number of people standing in a line, then by the principle of least privilege you should only be propagating that number, even if you trust the other end where you're sending it. There are also some regulatory reasons why this could be important for your use case.

The final reason for on-device processing has to do with real-time applications. If you're, for example, making a robot that has to navigate through an environment, you want on-device processing so that if something comes in front of that robot, you'll be able to react to the situation immediately.
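
To ground this, here is a minimal sketch of running an image classifier entirely on-device with the TensorFlow Android inference library of that era. The model asset path, the input/output node names, and the tensor sizes are all assumptions; they depend on the graph you bundle with your app.

    import android.content.res.AssetManager;
    import org.tensorflow.contrib.android.TensorFlowInferenceInterface;

    public class ImageClassifier {
        // Node names and asset path are assumptions tied to your model.
        private static final String MODEL = "file:///android_asset/graph.pb";
        private static final String INPUT = "input";
        private static final String OUTPUT = "output";
        private static final int SIZE = 224;        // input is SIZE x SIZE x 3 floats
        private static final int NUM_CLASSES = 1001;

        private final TensorFlowInferenceInterface inference;

        public ImageClassifier(AssetManager assets) {
            inference = new TensorFlowInferenceInterface(assets, MODEL);
        }

        /** Runs the model on preprocessed pixels, entirely on-device. */
        public float[] classify(float[] pixels) {
            inference.feed(INPUT, pixels, 1, SIZE, SIZE, 3);
            inference.run(new String[] {OUTPUT});
            float[] scores = new float[NUM_CLASSES];
            inference.fetch(OUTPUT, scores);
            return scores; // scores[i] is the model's confidence for class i
        }
    }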

Again, I want to mention that we have a code lab for TensorFlow and Android Things, so you can try it out in the code lab area or at home. But to really show you TensorFlow in action, I actually want to do a live demo so we can really see that it works. What I have here is a pretty simple setup. We have one of our supported boards, which is a Raspberry Pi in this case, as well as a button, a camera, and a speaker.