What's New in Android (Google I/O '18)

  • [MUSIC PLAYING]

  • CHET HAASE: Hey.

  • [CLAPPING]

  • DAN SANDLER: Hey everybody.

  • Welcome back to What's New in Android.

  • I'm Dan Sandler from the System UI team.

  • CHET HAASE: I'm Chet Haase from the Android Toolkit team.

  • ROMAIN GUY: And I'm Romain Guy from--

  • I still don't have a name for my team.

  • So from the Android team.

  • [LAUGHTER]

  • And you may remember us from other talks,

  • such as What's New in Android 2017,

  • What's New in Android 2016, What's New in Android 2015,

  • What's New in Android 2014.

  • Not What's New in Android 2013.

  • CHET HAASE: We don't talk about that one.

  • ROMAIN GUY: We don't talk about that one.

  • DAN SANDLER: You know, that was the time

  • that we had Jelly Bean two years in a row.

  • It was brilliant.

  • We didn't have to redo the logo or anything.

  • [LAUGHTER]

  • ROMAIN GUY: But now--

  • CHET HAASE: What's New in Android,

  • what I like to call the Android keynote.

  • Nobody else does.

  • But I like to call it that.

  • Because this is where we talk to you about all of the developer

  • stuff going on in the Android platform.

  • In particular, let's talk about Android P. Specifically,

  • let's talk today about--

  • oh, hang on.

  • Android APIs.

  • [LAUGHTER]

  • ROMAIN GUY: All right, first we'll start with distribution.

  • You saw in the keynote we introduced the dynamic app

  • bundles.

  • Tor's demo was pretty clear.

  • It's pretty easy for you.

  • All you have to do is click a different menu option

  • when you build your application.

  • And we're going to save you some space.

  • It's going to be faster and easier for your users

  • to download your app.

  • And I'm sure you have a lot of questions about it.

  • So we have a couple of talks this afternoon,

  • at 5:00 PM and 6:00 PM.

  • Go there if you want answers, because we don't have them.

  • CHET HAASE: So you're going to see a slide a whole lot

  • like this through the rest of the talk.

  • I feel like the main function we serve

  • in this talk is to tell you the other talks to go to.

  • We're like the appendix.

  • We're like the index for the rest of the content.

  • DAN SANDLER: I mean, we're like obsolete, like vestigial.

  • Is that what it is?

  • CHET HAASE: I don't like to think about that.

  • ROMAIN GUY: Yeah.

  • Let's be clear.

  • A lot of people back at work have done all the hard work.

  • We just get to go on stage and talk about their hard work.

  • CHET HAASE: So speaking of that, let's talk

  • about Android Jetpack.

  • We heard Steph talk about this in the developer keynote.

  • This is a set of components, as well as

  • guidance on how to build better Android applications.

  • All of you are familiar with most of what

  • is in Android Jetpack already.

  • What we're doing is adding to it over time

  • with stuff that's going to make it even better.

  • And we're also improving it over time.

  • One of the major steps that we're taking

  • is what I like to think of as a refactor, because it's

  • a refactor.

  • My favorite thing about the support library

  • is how the package names embed the release number in them.

  • So, for example, we support things like v4.

  • Actually we don't support v4 anymore.

  • We have a min SDK of at least 14 now.

  • But it's in the package name.

  • Isn't that a little bit silly?

  • So we're doing away with that.

  • We're doing a whole lot of tedious renaming.

  • And we're also providing tools to make it easier

  • for you to do the similar refactoring that you're

  • going to need to do in your application,

  • as well as in Android Studio.

  • Everything is being renamed to something more appropriate

  • called AndroidX.

  • If you want to know more about the details of that,

  • the renaming, as well as doing more modular, more fine-grained

  • splits to make sure that you don't drag in too much stuff,

  • go to the talk.

  • Learn what's new in Android Support Library.

  • Also, there was an article that was

  • posted on the Android Developers blog about a half hour ago.

  • Check that out for more of the details there.

  • Let's talk about Android Test, which

  • is part of this new Jetpack thing going on.

  • Android Test is the ATSL stuff, the Espresso

  • stuff that hopefully you were already using as really good ways

  • to test your application.

  • Now, they provide first class Kotlin support, as well as

  • more elegant APIs for reducing a lot of the boilerplate.

  • Here's a simple example.

  • We used to have a way of asserting,

  • which, A, was not necessarily obvious

  • in the parameters you were passing.

  • Also, unobvious in the order of the parameters

  • that you were passing.

  • And then it would give you an error message that also

  • didn't really help very much.

  • So we have something a little more sensible now.

  • You can assert that it's actually working

  • on the visible property.

  • And the error message gives you something more

  • that you can work with a little bit better.
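
A minimal sketch of that before/after flavor, not the exact slide code; it assumes an Espresso dependency (androidx coordinates post-rename) and a hypothetical R.id.greeting view:

```kotlin
import android.view.View
import androidx.test.espresso.Espresso.onView
import androidx.test.espresso.assertion.ViewAssertions.matches
import androidx.test.espresso.matcher.ViewMatchers.isDisplayed
import androidx.test.espresso.matcher.ViewMatchers.withId
import org.junit.Assert.assertEquals

// Old style: raw JUnit assert; easy to swap the expected/actual order,
// and the failure reads like "expected:<0> but was:<8>".
assertEquals(View.VISIBLE, greetingView.visibility)

// Newer style: assert on the view's visible property directly;
// the failure message describes the view and what didn't match.
onView(withId(R.id.greeting)).check(matches(isDisplayed()))
```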

  • Go to the Frictionless Android Testing

  • talk for more information about that stuff.

  • Jetpack architecture is about the architecture components

  • that were announced last year at I/O

  • and then iterated with feedback from the community,

  • and finally went 1.0 in the fall.

  • So we have the release parts of those,

  • which includes all the Lifecycle stuff, and the ViewModel stuff,

  • as well as the Room, the persistent data model

  • stuff, and LiveData.

  • So hopefully you are using that stuff already,

  • at least in your new applications.

  • And also, recently, we

  • released the paging library for doing asynchronous data paging

  • into RecyclerView.

  • That was alpha, then beta, because that's

  • how those things work.

  • And now it's 1.0 this week.

  • So please start using that.

  • And we also talked in the developer keynote

  • about a couple of new things that you should check out soon.

  • WorkManager is currently in preview.

  • There's going to be a talk about it.

  • It's about job scheduling, but job

  • scheduling in a way where we handle all the cases back

  • in previous releases, instead of you having

  • to use specific approaches, depending

  • on what version and device that you're on.
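
WorkManager was still in preview at the time, so the names below may have shifted since; this is a minimal sketch of the idea, with UploadWorker as a hypothetical worker:

```kotlin
import android.content.Context
import androidx.work.Constraints
import androidx.work.OneTimeWorkRequest
import androidx.work.WorkManager
import androidx.work.Worker
import androidx.work.WorkerParameters

// One Worker class; WorkManager decides whether JobScheduler or an
// AlarmManager-based fallback runs it on a given device and OS version.
class UploadWorker(ctx: Context, params: WorkerParameters) : Worker(ctx, params) {
    override fun doWork(): Result {
        // ... upload the logs ...
        return Result.success()
    }
}

// Enqueue it with a constraint: only run while charging.
val request = OneTimeWorkRequest.Builder(UploadWorker::class.java)
    .setConstraints(Constraints.Builder().setRequiresCharging(true).build())
    .build()
WorkManager.getInstance().enqueue(request)
```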

  • Also, navigation.

  • It turns out that up versus back is a hard problem

  • for applications to solve.

  • We are making that much easier.

  • And we're integrating with the tool

  • to make it even easier yet.

  • So go to all these talks.

  • There's an overview talk, as well as

  • specific talks on Navigation Controller and WorkManager,

  • and also a talk on RecyclerView and Paging.

  • Me?

  • Again?

  • ROMAIN GUY: It says your name on the slide.

  • CHET HAASE: I keep building suspense into this thing.

  • What's going to happen next?

  • Who's he going to hand the clicker to?

  • It's still mine.

  • DAN SANDLER: It's still you.

  • CHET HAASE: OK.

  • Let's talk about battery.

  • This is one of the ongoing efforts in Android

  • to help the users, because it turns out

  • battery is really important.

  • We're all power users.

  • Unfortunately, we just keep using the power.

  • So what can we do about it?

  • We can create these app standby buckets.

  • We're going to monitor the usage of applications,

  • and see how actively the user is using it,

  • and then make determinations about how much access

  • that application has to ongoing things

  • in the system which are going to take up more battery.

  • We also had background restrictions

  • that the user has the ability to kick in in settings.

  • So if an application is behaving badly,

  • let's say holding wake locks for long periods of time,

  • or waking up constantly, or accessing services

  • way more than they should when it's not on a charger,

  • then we'll note that and expose that in settings.

  • And then the user can take some action on that

  • if they deem that necessary.
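
If you want to see which bucket the system has placed you in, P adds a query for it; a minimal sketch, assuming you have a Context handy:

```kotlin
import android.app.usage.UsageStatsManager
import android.content.Context
import android.util.Log

val usm = context.getSystemService(Context.USAGE_STATS_SERVICE) as UsageStatsManager
// One of STANDBY_BUCKET_ACTIVE, _WORKING_SET, _FREQUENT, or _RARE.
when (usm.appStandbyBucket) {
    UsageStatsManager.STANDBY_BUCKET_RARE ->
        Log.w("Standby", "Jobs and alarms will be heavily deferred")
    else ->
        Log.i("Standby", "Bucket: ${usm.appStandbyBucket}")
}
```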

  • Go to the battery session on Thursday morning

  • to learn more of the details there.

  • ROMAIN GUY: So one of the things that we've been focusing on

  • with Android P is privacy.

  • Maybe that's what it stands for.

  • So one of the things we've done is

  • that when your app is in the background,

  • it doesn't have access to the microphone anymore.

  • It doesn't have access to the camera anymore.

  • And it doesn't have access to the sensors, kind of.

  • So you won't receive the data from the sensors automatically.

  • You can manually pull from the sensors,

  • and you'll get batch updates, but the best thing

  • to do if you want to get access to the sensor data

  • is to keep a foreground service running instead.

  • So no more microphone.

  • No more camera for you.

  • I think I've heard in the past that some apps were trying

  • to stay alive in memory by playing a white noise

  • or keeping the microphone on.

  • Don't do that anymore.

  • It's not OK.
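
A minimal sketch of the sanctioned route: promote your service to the foreground with a visible notification before relying on sensor delivery (the channel id and icon name here are made up):

```kotlin
import android.app.Notification
import android.app.NotificationChannel
import android.app.NotificationManager

// Inside your Service, before registering sensor listeners:
val channel = NotificationChannel(
    "sensors", "Sensor capture", NotificationManager.IMPORTANCE_LOW)
getSystemService(NotificationManager::class.java).createNotificationChannel(channel)

val notification = Notification.Builder(this, "sensors")
    .setContentTitle("Recording sensor data")
    .setSmallIcon(R.drawable.ic_stat_sensor)  // hypothetical icon
    .build()
startForeground(1, notification)  // the user can now see why you're alive
```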

  • Kotlin-- it's this little thing we announced last year.

  • So we are busy.

  • We want to make it better for all the Kotlin developers

  • out there.

  • I'm sure there's a lot of you here today.

  • So some of the things we've been doing,

  • the ART team has been really busy with D8, R8, and ART itself.

  • They've been looking at the byte code generated by the Kotlin

  • compiler.

  • They've analyzed some of the byte code patterns that

  • were different from the ones generated by the Java

  • programming language compiler.

  • And they've been optimizing for those patterns.

  • We've also been adding a lot of nullability annotations

  • to our Java APIs, both in the core libraries,

  • so libcore, and our support libraries

  • to make it easier for you to use the platform APIs when you're

  • in Kotlin.

  • And finally, we launched on GitHub

  • a new library called Android KTX.

  • It's a set of Kotlin extensions for existing platform APIs.

  • And the goal here is to try to take advantage

  • of some of the Kotlin language features

  • to make existing APIs easier to use.

  • They're already easier to use just by using Kotlin,

  • but with the extension they get even better.

  • And I want to thank the community,

  • because we've received dozens of pull requests,

  • and also bugs and feature requests from you.

  • And we've accepted a bunch of them.

  • So if you have ideas, if you have

  • things that you would like to see in Android KTX,

  • please go to GitHub, and we'll take a look at your PR.

  • And this is an example of the kind of code

  • you can write with KTX.

  • If you want to create a bitmap, you

  • don't have to specify that it's ARGB_8888 anymore.

  • You can call applyCanvas, which automatically

  • creates the canvas for you.

  • It becomes this.

  • And at the bottom you can see, for instance, the destructuring

  • assignment for color integer.

  • So you don't have to do any shifting or masking

  • of the int into bytes.

  • We'll take care of that for you.
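
Reconstructed from the description above (the import paths moved around during the AndroidX rename, so treat them as indicative):

```kotlin
import android.graphics.Color
import androidx.core.graphics.applyCanvas
import androidx.core.graphics.component1
import androidx.core.graphics.component2
import androidx.core.graphics.component3
import androidx.core.graphics.component4
import androidx.core.graphics.createBitmap

// createBitmap defaults to ARGB_8888, and applyCanvas hands you a Canvas
// already bound to the bitmap.
val bitmap = createBitmap(200, 200).applyCanvas {
    drawColor(Color.RED)
}

// Destructure a packed @ColorInt: no shifting or masking by hand.
val (alpha, red, green, blue) = bitmap.getPixel(100, 100)
```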

  • There is a talk by Jake Wharton on Thursday at 10:30 AM.

  • He's going to go through most of the extensions.

  • He's going to talk about the philosophy

  • behind the extensions, how we write them, what

  • are the kind of extensions that we want to see in that library

  • that we're not looking for in that library.

  • So before you do all the work and send a PR,

  • go attend that talk to understand

  • what we're looking for.

  • CHET HAASE: We already talked about the Android Test

  • stuff that is part of Jetpack earlier.

  • That's probably a better, more holistic way

  • to test your application.

  • But if there is a specific situation

  • in which you find it necessary or helpful

  • to mock the framework--

  • and I don't mean ridicule it, because that wouldn't be nice--

  • then it is possible to do that in easier ways now in Mockito.

  • We are not changing the framework,

  • but we're actually integrating changes into Mockito itself.

  • So you can now mock final methods.

  • And soon you should be able to mock static methods.

  • [CLAPPING]

  • [CHEERING]

  • ROMAIN GUY: And Chet is making that face

  • because he doesn't understand why

  • that is so interesting to you.

  • [LAUGHTER]

  • CHET HAASE: And system-created objects like Activity,

  • eventually.

  • Working on that internally, but it should be on the way

  • eventually.
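
A minimal sketch of mocking a final method, assuming you've opted into Mockito's inline mock maker via the mockito-inline test artifact (the Clock class here is made up):

```kotlin
import org.mockito.Mockito.mock
import org.mockito.Mockito.`when`

// Kotlin classes and methods are final by default, which is exactly
// what Mockito historically refused to mock.
class Clock {
    fun now(): Long = System.currentTimeMillis()
}

val clock = mock(Clock::class.java)
`when`(clock.now()).thenReturn(42L)
check(clock.now() == 42L)
```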

  • Background text measurement.

  • This is part of a bunch of smaller changes

  • that we made in the text area.

  • It turns out that measurement is really expensive.

  • So most applications do text.

  • And I would bet that the text operations in your application

  • are some of the most expensive going on in the UI thread,

  • which can contribute to jank.

  • Wouldn't it be nicer if you could offload that

  • to a background thread so that by the time you actually

  • need to render the text, or perform

  • those operations on the UI thread, most of the hard work

  • was done for you?

  • So the observation is that 80% to 90%

  • of the operations that are necessary for actually

  • displaying text happen in text measurement.

  • Well, we've made it possible and much easier to actually perform

  • this as a background operation.

  • So we have a class called PrecomputedText.

  • And you can query that.

  • And you can say, I want you to pre-measure this.

  • And then you can set that text that's spannable on the text

  • view later whenever you want.

  • So you do it in background threads, like this.

  • You say create this thing.

  • And then you can set that on the text view

  • later when you actually need it.

  • Should be much faster.
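
A minimal sketch of that flow (PrecomputedTextCompat in Jetpack is the backwards-compatible flavor; longArticleText and textView are stand-ins):

```kotlin
import android.text.PrecomputedText
import java.util.concurrent.Executors

// Grab the TextView's measurement parameters on the UI thread...
val params: PrecomputedText.Params = textView.textMetricsParams

Executors.newSingleThreadExecutor().execute {
    // ...do the expensive measurement off the UI thread...
    val precomputed = PrecomputedText.create(longArticleText, params)
    // ...then set the already-measured text when you actually need it.
    textView.post { textView.text = precomputed }
}
```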

  • Magnifier is something that if you're

  • using the preview releases you might see,

  • if you select some text, it pops up this little bar up above it.

  • It makes it easier to manipulate the cursor.

  • That's really great for text, but the other cool thing

  • about it is that it's also available for your applications

  • for any other use case.

  • So there's an API that allows you

  • to pop up this magnifier for whatever

  • happens to be in your view.

  • So you can show this, and dismiss it, and use it

  • for your stuff as well.

  • So core functionality that we want in the system user

  • interface.

  • But also useful APIs for developers

  • to use for their own purposes.
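
The API surface is small; a minimal sketch that tracks a finger and shows the built-in magnifier over any view (myView is a stand-in):

```kotlin
import android.view.MotionEvent
import android.widget.Magnifier

val magnifier = Magnifier(myView)
myView.setOnTouchListener { _, event ->
    when (event.actionMasked) {
        MotionEvent.ACTION_DOWN, MotionEvent.ACTION_MOVE ->
            magnifier.show(event.x, event.y)  // view-local coordinates
        MotionEvent.ACTION_UP, MotionEvent.ACTION_CANCEL ->
            magnifier.dismiss()
    }
    true
}
```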

  • I don't know if you've worked with your design department,

  • and they've specified something about text in your UI,

  • and, OK, well, I want this aligned this many dips away

  • from the top.

  • And I want the baseline on the bottom

  • this many dips away from the bottom.

  • And then you have this interspersed vertical alignment

  • stuff going on.

  • And then you sort of puzzle with this for a while.

  • And you basically futz with padding

  • in all kinds of different configurations

  • to sort of get it to where they wanted to get it.

  • We have worked with that and created

  • some new attributes, and methods, and properties for you

  • to use that make that much easier.

  • So we allow you to just pass us the information

  • about the baseline alignment calculations

  • that you would like to perform.

  • And then we'll futz with padding on your behalf.
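
The new TextView attributes are android:firstBaselineToTopHeight and android:lastBaselineToBottomHeight; here's a minimal sketch of the programmatic equivalents, with the dp values standing in for whatever your design spec says:

```kotlin
// Spec: first baseline 28dp from the top edge, last baseline 20dp
// from the bottom edge. TextView does the padding math internally.
val density = textView.resources.displayMetrics.density
textView.setFirstBaselineToTopHeight((28 * density).toInt())
textView.setLastBaselineToBottomHeight((20 * density).toInt())
```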

  • Smart Linkify.

  • I think of this as being Linkify, but smarter.

  • So we already have the ability to ask

  • for links in a block of text, and it'll detect things

  • like phone numbers and addresses.

  • But we also have the ability, through machine learning

  • models, and stuff that you've seen through smart text

  • selection, to detect other entities.

  • We can do the same thing with Linkify.

  • It takes a little bit potentially longer.

  • So you do this off thread.

  • So you would basically generate the links off thread,

  • and then set it on your text view

  • later, using code similar to this.
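
A minimal sketch of that off-thread flow using the TextClassifier APIs in P (the executor, spannable, and textView are stand-ins):

```kotlin
import android.view.textclassifier.TextClassificationManager
import android.view.textclassifier.TextLinks
import java.util.concurrent.Executors

val classifier = context
    .getSystemService(TextClassificationManager::class.java)
    .textClassifier

Executors.newSingleThreadExecutor().execute {
    // Generating links may run an ML model, so keep it off the UI thread.
    val links = classifier.generateLinks(
        TextLinks.Request.Builder(spannable).build())
    textView.post {
        links.apply(spannable, TextLinks.APPLY_STRATEGY_REPLACE, null)
        textView.text = spannable
    }
}
```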

  • There's a text talk on Wednesday evening.

  • So please go to that for more details about all

  • of this, as well as more.

  • ROMAIN GUY: So location.

  • Now you can take advantage of a new package,

  • android.net.wifi.rtt.

  • It's the WiFi round trip time API.

  • It requires compatible hardware on your phone.

  • It also requires a compatible access point.

  • And it allows you to find a precise location indoors

  • for the user's device.

  • You need to request the fine location permission.

  • And you don't need to connect to the access point.

  • So if you're building an application that

  • requires locating the user inside a big building,

  • you can take advantage of this API in Android P.
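
A minimal sketch, assuming ACCESS_FINE_LOCATION is already granted and you have ScanResults from a normal WiFi scan:

```kotlin
import android.net.wifi.rtt.RangingRequest
import android.net.wifi.rtt.RangingResult
import android.net.wifi.rtt.RangingResultCallback
import android.net.wifi.rtt.WifiRttManager

val rtt = context.getSystemService(WifiRttManager::class.java)
val request = RangingRequest.Builder()
    // Only APs that advertise 802.11mc can answer a ranging request.
    .addAccessPoints(scanResults.filter { it.is80211mcResponder })
    .build()

rtt.startRanging(request, context.mainExecutor, object : RangingResultCallback() {
    override fun onRangingResults(results: List<RangingResult>) {
        // results[i].distanceMm: measured distance to each responder.
    }
    override fun onRangingFailure(code: Int) { /* retry or give up */ }
})
```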

  • CHET HAASE: Accessibility has some improvements

  • for navigation through the app.

  • So it's easier for you to declare

  • these functional blocks.

  • It makes it easier for accessible users

  • to understand how things are being grouped on the screen.
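
The two relevant View additions in P are small; a minimal sketch (the view names are made up):

```kotlin
// Screen readers can jump between headings and announce pane titles,
// which is how these "functional blocks" get communicated.
sectionHeader.isAccessibilityHeading = true
playbackPane.accessibilityPaneTitle = "Playback controls"
```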

  • There's an important talk on accessibility right now?

  • No, that's tomorrow.

  • OK.

  • Hopefully it's tomorrow.

  • I'm not sure that's correct.

  • ROMAIN GUY: It's tomorrow or 40 minutes ago.

  • CHET HAASE: OK.

  • All right.

  • DAN SANDLER: It's on YouTube.

  • CHET HAASE: Yes, it is.

  • Eventually.

  • Like now.

  • Yeah.

  • I think I got the day wrong.

  • Sorry about that.

  • It's now.

  • If you're in the wrong talk, I invite

  • you to go to the other one.

  • [LAUGHTER]

  • ROMAIN GUY: Oh, my turn.

  • Security.

  • New API in Android P, the Unified BiometricDialog.

  • So we deprecated the FingerprintManager

  • because there are more ways to authenticate yourself

  • with your body than just with a fingerprint.

  • Could be your eyes.

  • Could be whatever else that device manufacturers will

  • think of next.

  • So now we have a single UI for all devices

  • and all means of authentication.

  • We also have stronger protections for private keys.

  • And very important in your application,

  • if you're using the API, Build.SERIAL,

  • it doesn't work anymore.

  • The API is still there.

  • But it basically returns bogus data.

  • So you cannot rely on it at all anymore.
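
Concretely, Build.SERIAL now reports Build.UNKNOWN; the replacement, Build.getSerial(), only works if the user has granted READ_PHONE_STATE. A minimal sketch:

```kotlin
import android.os.Build

val serial: String = try {
    Build.getSerial()  // requires the READ_PHONE_STATE permission on P
} catch (e: SecurityException) {
    Build.UNKNOWN      // what Build.SERIAL itself now returns
}
```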

  • CHET HAASE: Various changes in Enterprise.

  • Just a couple of them that are interesting.

  • We made it easier to work with work profile

  • apps or different profile apps by having these different tabs

  • that you can associate with them so they're not all mixed

  • together, but you can actually have these whole sections

  • of these different profiles.

  • Also, you're allowed to lock packages to a specific task.

  • You could have a launcher with just a minimum set of a group

  • or a single application.

  • That works in combination with the ability

  • to have ephemeral users, which now gives you kiosk mode.
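
A minimal sketch of wiring that up as a device owner (the admin receiver and package name are made up):

```kotlin
import android.app.admin.DevicePolicyManager
import android.content.ComponentName

val admin = ComponentName(context, KioskAdminReceiver::class.java)
val dpm = context.getSystemService(DevicePolicyManager::class.java)
// Only this package may pin itself to the screen.
dpm.setLockTaskPackages(admin, arrayOf("com.example.kiosk"))

// Then, inside the kiosk app's activity:
activity.startLockTask()
```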

  • So you will no longer have experiences

  • like I had on a recent flight, as you

  • can see from my blurry picture, where you see a movie

  • and you wonder what operating system is running under that.

  • So you swipe from the bottom of the screen.

  • You see the ICS navigation bar.

  • And then you press on the recent tasks,

  • and you swipe the movie away, and you confuse

  • the heck out of the system.

  • ROMAIN GUY: I have to admit, that's what

  • I try every time I'm on a plane these days.

  • And it works a surprising number of times.

  • DAN SANDLER: Homework for all of you.

  • CHET HAASE: This is like fun for Android engineers.

  • This is what we do.

  • ROMAIN GUY: Hopefully this is not

  • linked to the plane navigation.

  • DAN SANDLER: Now?

  • Is it OK?

  • Can I talk now?

  • Is that all right?

  • CHET HAASE: Very briefly.

  • DAN SANDLER: Very briefly.

  • All right.

  • Let's talk about actually a lot of the system UI stuff

  • typically gets shown at one of the keynotes that

  • precedes What's New in Android.

  • So you've all seen a lot of the great stuff

  • that users are getting.

  • I'm going to talk to you about some of the stuff

  • that as developers you might be interested in.

  • The first one is display cutouts, a.k.a.--

  • well, there are other names for it.

  • These are coming to the ecosystem all over the place.

  • And so as a developer, you need to know where it's safe to draw

  • and where it isn't.

  • When you get your window insets, in View.onApplyWindowInsets or something

  • like that, you get a DisplayCutout object,

  • which gives you all kinds of interesting data

  • about the cutout.

  • But you're probably going to want

  • to use something called windowLayoutInDisplayCutoutMode

  • on your window.

  • So there's the basic way, which is, I never

  • want to overlap the cutout.

  • Just leave a black bar at the top or the bottom.

  • Whatever.

  • I'm not all that interested.

  • A little more advanced would be display

  • cutout mode default, which is, if you were already

  • going to clear the status bar just fine,

  • we'll let the app window draw underneath the cutout as well.

  • So you'll get the nice action bar

  • color extending through the status bar

  • and extending underneath the cutout.

  • Better still, or more advanced still

  • is shortEdges cutout mode, which means

  • essentially any time there's a cutout on the short edges

  • of the device, and we're in portrait,

  • I will just draw underneath.

  • You don't have to do anything special.

  • And in that situation you do need

  • to look at the display cutout and ask it for the safe insets.

  • Essentially, well, OK, I'm drawing everywhere.

  • But you tell me what single rectangle of the screen

  • is safest to draw in.

  • And then finally, the cosmic version of this.

  • You can do shortEdges, but you can actually

  • get the bounds of the cutout as a region.

  • So you will get the exact set of rectangles

  • that are unavailable to you on the screen

  • so you can display UI in the corners

  • if the corners are available.

  • Or if there's a corner cutout, you

  • can move things out of the way so

  • that it's visible in the center of the display.

  • This is the most advanced version of it.
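
Putting those pieces together, a minimal sketch: opt the window into shortEdges, then read the safe insets and the bounding rectangles from the insets callback:

```kotlin
import android.view.WindowManager

window.attributes = window.attributes.apply {
    layoutInDisplayCutoutMode =
        WindowManager.LayoutParams.LAYOUT_IN_DISPLAY_CUTOUT_MODE_SHORT_EDGES
}

window.decorView.setOnApplyWindowInsetsListener { view, insets ->
    insets.displayCutout?.let { cutout ->
        // The single safe rectangle to keep critical UI inside...
        view.setPadding(cutout.safeInsetLeft, cutout.safeInsetTop,
            cutout.safeInsetRight, cutout.safeInsetBottom)
        // ...and the exact unusable regions, for the "cosmic" version.
        val unusable = cutout.boundingRects
    }
    insets
}
```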

  • And you can put the electrical tape away,

  • because you can actually simulate notches now

  • in developer options on your device.

  • This is really, really exciting.

  • Slices actually was what we were calling it internally.

  • And we liked it so much we just kept it.

  • You've seen Slices now in a couple of keynotes.

  • It's essentially something that we've discovered and learned

  • about on system UI and in the toolkit over many years

  • of dealing with RemoteViews for app widgets and dealing

  • with notifications.

  • Essentially the problem of getting content from your app

  • into some other place.

  • So Slices is our new approach to remote content

  • that you can actually use to project UI into your own app,

  • or into other apps that support it.

  • It's very structured.

  • This is not sort of, here's a canvas, or an absolute layout.

  • Go nuts with it.

  • We give you a structure to fill out,

  • and a whole bunch of very flexible templates in which

  • to populate that data with some display

  • hints so that the receiving end of the slice, the slice host,

  • kind of knows what to do with it.

  • These are interactive.

  • These are updateable.

  • This is meant to be something that holds rich UI.

  • Sliders, controls, live information, possibly

  • videos, things that actually feel like real UI,

  • as opposed to a snapshot of something happening

  • in a distant process somewhere.

  • Slices are addressable by content URI.

  • And this is how they're passed around the system,

  • and how they're passed along to app indexing

  • to be shown in context like search.

  • And then finally, Slices is entirely

  • inside the support library.

  • It's entirely in Jetpack.

  • So it's backwards compatible.

  • You can use Slices all the way back to API 19.
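
The builder API was still settling at the time, so take these names as indicative; a minimal sketch of a SliceProvider returning a one-row slice:

```kotlin
import android.net.Uri
import androidx.slice.Slice
import androidx.slice.builders.ListBuilder

// Inside a SliceProvider subclass:
override fun onBindSlice(sliceUri: Uri): Slice {
    val row = ListBuilder.RowBuilder()
        .setTitle("Ride home")
        .setSubtitle("14 minutes away")
    return ListBuilder(context!!, sliceUri, ListBuilder.INFINITY)
        .addRow(row)
        .build()
}
```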

  • There's a great talk about Slices tomorrow

  • bright and early, building interactive results

  • for Google Search.

  • Come to find out more about how all this technology works

  • and how you can build your own.

  • Related to Slices is Actions.

  • You can think of these as shortcuts with parameters.

  • Romain likes to think of them as visible intent.

  • This is essentially a deep link into your app

  • with some additional payload.

  • It's not just a link to music.

  • It's linked to the particular album, or something like that.

  • And you saw these as well in the keynotes,

  • showing up in a predictive space inside our app launching

  • experience, actions you define in an actions XML

  • file that goes into your APK or app bundle.

  • And that too can get registered with app indexing

  • so that search results and predictive features

  • can show those actions.

  • And there's a talk about this too,

  • Thursday, slightly less early in the morning.

  • Integrating your Android apps with the Google Assistant.

  • Notifications.

  • There was a lot of great stuff about digital wellness

  • and controlling notifications that you saw in the keynote.

  • And I'm very excited about it in P. I'm going to talk about some

  • of the developer stuff, though, that we have in here.

  • We asked users what notifications

  • are most important to them.

  • Users love messages.

  • So we focused our energy in enhancing the messaging style

  • API.

  • You can do inline images now.

  • You can do participant images, and attach other metadata

  • about the participant.

  • And we finally now have UI for smart reply,

  • which we've had on Android Wear for years.

  • So when you use RemoteInput.setChoices,

  • those will now appear as chips right in the notification

  • so you can respond instantly in the middle of a chat

  • without leaving the notification shade.
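
A minimal sketch pulling those together with the platform APIs in P (the icons, pending intent, channel id, and Person objects are stand-ins):

```kotlin
import android.app.Notification
import android.app.Person
import android.app.RemoteInput

val sender = Person.Builder().setName("Ali").setIcon(avatarIcon).build()
val style = Notification.MessagingStyle(me)  // "me" is the device user
    .addMessage("Lunch at noon?", System.currentTimeMillis(), sender)

val remoteInput = RemoteInput.Builder("key_text_reply")
    .setChoices(arrayOf("Sure", "On my way", "Can't today"))  // reply chips
    .build()
val reply = Notification.Action.Builder(replyIcon, "Reply", replyPendingIntent)
    .addRemoteInput(remoteInput)
    .build()

val notification = Notification.Builder(context, channelId)
    .setSmallIcon(R.drawable.ic_chat)  // hypothetical icon
    .setStyle(style)
    .addAction(reply)
    .build()
```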

  • And there's tons of other stuff as usual.

  • I had one other slide that we added

  • about 10 minutes ago to this deck about notifications.

  • And I'm just going to let that sit

  • on the screen for a little while.

  • [CLAPPING]

  • So if you're doing something in the background,

  • the user still needs to know.

  • But Android P does a much better job of allowing notifications

  • that you may already have running

  • to testify to that background activity,

  • including things like overlaying windows.

  • So with that, talk about the runtime, Romain.

  • ROMAIN GUY: So probably one of the most important things

  • for you as an Android developer is to understand

  • our deprecation policy.

  • It was announced a few weeks ago.

  • Soon we will require all the applications to target

  • some of the newest API levels.

  • And we're doing that to make sure

  • that we can keep the security level of Android

  • as high as possible, as well as performance,

  • and a lot of other nice things.

  • So what does it mean for you?

  • As of August this year, new applications

  • published on the Play Store will have to target API level 26.

  • And as of November this year, any update

  • to an existing application that you publish on the Play Store

  • will have to target API 26.

  • And you can expect those numbers to go up over time.

  • If you have native code in your application,

  • we've been supporting 32-bit and 64-bit for a few years now.

  • We will make 64-bit APIs required

  • as of August of next year.

  • You'll still be able to ship 32-bit support

  • in your application, but we will ask you to ship 64-bit as well.
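
In Gradle terms (Kotlin DSL here as a sketch; the numbers come straight from the policy above):

```kotlin
// build.gradle.kts
android {
    defaultConfig {
        targetSdkVersion(26)  // new apps from August 2018, updates from November 2018
        ndk {
            // Keep shipping 32-bit, add 64-bit ahead of the August 2019 requirement.
            abiFilters.addAll(listOf("armeabi-v7a", "arm64-v8a"))
        }
    }
}
```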

  • One of the reasons to ship 64-bit on 64-bit devices

  • is that you get better performance and much

  • better code out of it.

  • If you want to know more about the deprecation policy,

  • there is a talk on Thursday afternoon.

  • And I'm sure you'll have a lot of questions for the folks

  • there.

  • App compatibility.

  • If you've tried the developer preview in P,

  • you might have noticed something different

  • if you're one of those naughty applications that

  • uses some of our private APIs.

  • On Android we have two types of private APIs, the APIs that

  • are actually marked private.

  • And then there's this weird @hide thing

  • that we use in our source code.

  • It's a Javadoc tag that we process specially

  • to indicate that this is a public API for us.

  • Not for you.

  • Just for us.

  • And we're a bit jealous because a lot of you are using them.

  • So from now on, a lot of those APIs

  • will trigger warnings in one form or another.

  • It might be a toast, or it might be logs

  • when you make an illegal call.

  • For some of these APIs we need to hear from you

  • that you need those APIs for your application

  • to keep working.

  • Sometimes it's just an oversight.

  • We didn't make the API public.

  • It's just because we didn't think about it.

  • So please go to this URL, and let

  • us know if there's an API that you

  • think should be made public.

  • We might say yes.

  • We might also say no.

  • We have three types of lists.

  • I won't go into too much detail here.

  • But basically, if an API falls into the blacklist,

  • you will not be able to call it ever.

  • I'm not sure if we have anything in the blacklist right now.

  • But those will evolve over time.

  • So, again, let us know what APIs you need.

  • CHET HAASE: This is one of the important reasons.

  • This is why we ship previews.

  • We need you to go out there and try your applications

  • on the previews, because this is the time

  • to find out when there's problems that either you

  • can fix before the real release is out there,

  • or you can let us know if it's a problem that we

  • need to work on instead.

  • ROMAIN GUY: And that's Chet trying

  • to increase his engagement with his podcast.

  • So there's this podcast called ADB, Android Developer

  • Backstage.

  • And in episode 89, they had-- who

  • did you have in the episode?

  • CHET HAASE: Brian Carlstrom.

  • ROMAIN GUY: Brian Carlstrom.

  • And they talked about this app compatibility issue,

  • and what it means for you.

  • So go listen to it.

  • NDK.

  • The release 17 of the NDK brings a lot

  • of very interesting things.

  • So, first of all, the neural networks

  • APIs that were part of API level 27.

  • We also have a new shared memory API if you do a lot of JNI.

  • More importantly, we have finally ASAN,

  • the address sanitizer to make sure your code doesn't

  • scribble all over the memory.

  • Now you don't need a rooted device anymore to use it.

  • And we also have an undefined behavior sanitizer.

  • It can be very difficult to detect undefined behaviors

  • in your C or C++ code.

  • So now there's a tool for that.

  • We finally removed support for the deprecated APIs.

  • So if you still use ARMv5 or MIPS, MIPS 32-bit or 64-bit,

  • support is gone.

  • You should not be shipping those anymore.

  • In the upcoming release, release 18, we will remove GCC.

  • So the GCC compiler was deprecated last year.

  • Now everything is compiled with Clang in the NDK.

  • We think we gave enough time.

  • So GCC is going away.

  • So if you're still using it, maybe you

  • should not be in this talk.

  • You should go fix your code.

  • [LAUGHTER]

  • And finally, we added support for the simpleperf CPU

  • profiler.

  • And we also have support in Android Studio

  • for native profiling.

  • So you don't even have to type anything in the terminal.

  • Graphics and media.

  • The camera APIs are getting better and better.

  • We've added a lot of things that we were using ourselves

  • in the camera applications.

  • For instance, we give you access to the timestamps

  • of the optical image stabilization.

  • So if you want to build the kind of stabilization

  • we've built in the video recording part of the camera

  • application, now you can.

  • If your app is doing a lot of selfies

  • and using the display as a flash,

  • you can tell the camera that you're

  • doing that so you can adapt the exposure accordingly.

  • We have support for USB cameras.

  • I haven't seen any use for it, but I've heard some of you

  • ask about it.

  • So now it's available.

  • Multi-camera support.

  • There are some phones coming out out there

  • with multiple cameras in the back, or in the front

  • I suppose.

  • And now we can expose them as a logical camera that contains

  • more than one stream of data.
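
A minimal sketch of discovering a logical multi-camera with camera2 in P:

```kotlin
import android.hardware.camera2.CameraCharacteristics
import android.hardware.camera2.CameraManager

val manager = context.getSystemService(CameraManager::class.java)
for (id in manager.cameraIdList) {
    val chars = manager.getCameraCharacteristics(id)
    val caps = chars.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES)
    if (caps?.contains(
            CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES_LOGICAL_MULTI_CAMERA) == true) {
        // The underlying physical streams behind this logical camera:
        val physicalIds = chars.physicalCameraIds
    }
}
```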

  • ImageDecoder.

  • I'm sure a lot of you are familiar with BitmapFactory.

  • And if you're not fond of that API, trust me,

  • we're not either.

  • So there's a new one called ImageDecoder.

  • It's part of Android P. The idea is to make it not only easier

  • to decode images, but also to make it possible to decode

  • animated images.

  • So ImageDecoder can decode bitmaps, but also drawables,

  • including animated drawables.

  • CHET HAASE: There's this new thing called animated GIFs.

  • I don't know.

  • All the kids are using them.

  • It's possible to actually load those now.

  • ROMAIN GUY: So there are a few concepts

  • that you have to learn when you learn ImageDecoder.

  • And we're going to go through them in an example.

  • So we have the concept of the source.

  • We have the concept of the post processor,

  • and finally, the header listener.

  • So this is what the API looks like.

  • First, you have to call createSource

  • on the ImageDecoder.

  • The source can be an asset.

  • It could be a file descriptor.

  • It can be many different things.

  • And the goal here is that once you create a source,

  • you can decode multiple images from the same source.

  • This is particularly useful if you want to build thumbnails.

  • You can decode the same source once at high resolution,

  • once at lower resolution, or even intermediate resolutions.

  • And that can be done from multiple worker threads.

  • Then you can call decodeBitmap.

  • You pass the source.

  • And you can optionally pass a header listener.

  • So for the header listener, we have a lambda.

  • It gives you back the decoder itself,

  • the metadata about the image and the source.

  • And it's inside the header listener

  • that you can set the options.

  • So in BitmapFactory you had BitmapFactory.Options

  • that you had to pass to decode.

  • Here you have to wait for the header listener to be invoked.

  • CHET HAASE: We should point out too,

  • that set target size, that's kind

  • of a fundamental difference where the old bitmap--

  • DAN SANDLER: It's up there for them.

  • CHET HAASE: No.

  • I know.

  • I see that.

  • It's right there.

  • [INTERPOSING VOICES]

  • CHET HAASE: That's a huge difference.

  • Like before, if you wanted the right target size,

  • you needed to work with, what?

  • inDensity?

  • As well as?

  • ROMAIN GUY: inSampleSize, inDensity,

  • and do a lot of trickery.

  • CHET HAASE: We didn't know how they worked either.

  • ROMAIN GUY: Yeah.

  • [LAUGHTER]

  • CHET HAASE: So a lot easier now.

  • ROMAIN GUY: Yeah.

  • Usually our answer was, figure it out.

  • Anyways, so now you can just tell us

  • what size you want the bitmap to be in.

  • And we'll take care of it, finally.

  • And you can also set the post processor.

  • So the post processor is something

  • with a simple interface.

  • It gives you a canvas so you can draw on the bitmap right

  • after it's decoded so you can add a header, like your title,

  • or some metadata on it.
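
Collecting the pieces (source, header listener with setTargetSize, post processor) into one sketch; the asset name and paint are made up:

```kotlin
import android.graphics.ImageDecoder
import android.graphics.Paint
import android.graphics.PixelFormat

val source = ImageDecoder.createSource(context.assets, "photo.heic")
val bitmap = ImageDecoder.decodeBitmap(source) { decoder, info, _ ->
    // Header listener: runs once the size and format are known.
    decoder.setTargetSize(info.size.width / 2, info.size.height / 2)
    decoder.setPostProcessor { canvas ->
        // Draw on the freshly decoded bitmap, e.g. a caption.
        canvas.drawText("preview", 16f, 48f, Paint().apply { textSize = 32f })
        PixelFormat.UNKNOWN  // transparency unchanged
    }
}
// decodeDrawable(source) would give a Drawable, animated GIFs included.
```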

  • Media.

  • We are adding support in the platform for the HDR

  • profile of VP9 so you can play HDR videos

  • in your applications.

  • YouTube was doing it, but they had their own way

  • of decoding videos.

  • If you own a device that doesn't support HDR,

  • the playback will be in low dynamic range instead.

  • But on a device that's capable, like a Pixel 2,

  • you will be able to see the HDR stream in all its glory.

  • We're also adding support, and bear with me,

  • cause it's a little bit confusing,

  • a format called HEIF.

  • Yeah.

  • It rolls off the tongue.

  • The High Efficiency Image Format.

  • So it's based on HEVC, which was also called H.265.

  • And the filename extension is commonly HEIC.

  • I don't know why they used different letters.

  • But that's what they did.

  • So we have to deal with it.

  • So it is a container.

  • And it can store multiple images.

  • So you can use it to store a single image with higher

  • quality and higher compression ratios than JPEG, for instance.

  • JPEG is a lot easier to say.

  • I like JPEG.

  • But you can also store multiple images

  • if you want animated images or short movies.

  • This is part of the support library,

  • or, I guess, Jetpack now.

  • And this is not part of the compress API

  • that you find on bitmap.

  • Because it's a container, it works differently.

  • So this is what it looks like.

  • You have to create a builder.

  • You have to tell us the path where

  • you want to output the file.

  • You have to give us in advance the width

  • and the height of the image.

  • And the source of the image can come from a bitmap.

  • But it can also come from a surface.

  • So you don't necessarily need a bitmap; for instance,

  • if you're doing GL rendering, or if you're doing video playback,

  • you can include that directly as an image.

  • You don't have to go through an intermediate bitmap.

  • And then, once you start the writer,

  • you can add multiple bitmaps.

  • In this case, we're adding only one.

  • And when you call stop, we write it out.
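
That maps to the HeifWriter class in the support library (androidx.heifwriter after the rename); a minimal sketch, with outputPath and bitmap as stand-ins:

```kotlin
import androidx.heifwriter.HeifWriter

val writer = HeifWriter.Builder(
        outputPath, bitmap.width, bitmap.height,
        HeifWriter.INPUT_MODE_BITMAP)  // input could also be a Surface
    .setQuality(90)
    .build()

writer.start()
writer.addBitmap(bitmap)  // call repeatedly for multi-image files
writer.stop(0)            // 0 waits for writing to finish
writer.close()
```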

  • CHET HAASE: I should point out too

  • that even though it's spelled HEIC, it's pronounced GIF.

  • [LAUGHTER]

  • All right.

  • DAN SANDLER: Go.

  • ROMAIN GUY: What are you still doing here?

  • Vulkan.

  • So this is the slide that I'm excited about.

  • Very few of you will be.

  • And I'm sure you will be super excited by things like subgroup

  • ops and YCbCr formats.

  • More seriously, anybody who's building

  • middleware, for instance, the Unity engine, the Unreal engine,

  • or if you're building games, Vulkan

  • is a low level graphics API that gives you

  • a lot more control over the GPU so you get higher performance

  • code.

  • And Vulkan 1.1 adds new capabilities that

  • were not possible in OpenGL.

  • But it also closes the gap with OpenGL.

  • So, for instance, the support of protected content.

  • We were not able to play back protected content in Vulkan

  • before.

  • So that will unlock a lot of things, like video players,

  • for instance.

  • Neural networks APIs.

  • They are not technically part of P. I mean, they are part of P,

  • but we announced them in API level 27.

  • You might have missed that.

  • It's a C API that's designed for machine learning.

  • It's also a fairly low level API.

  • It's meant for basically playback of trained models.

  • So things like TensorFlow are built on top

  • of the neural networks API.

  • So you would use TensorFlow to do all the learning.

  • And the neural network APIs can do basically

  • the inference on device.

  • The interesting thing is that we're also unlocking the access

  • to the DSP that we use on Pixel 2

  • so you can get hardware accelerated machine learning

  • effectively on select devices.

  • ARCore you just saw in the developer keynote.

  • A couple months ago we introduced ARCore 1.0.

  • And they just announced ARCore 1.2.

  • This takes care of tracking things in the real world.

  • The problem is that you had to write a lot of OpenGL code.

  • And it's not always pleasant.

  • So to make your life easier, we're

  • introducing support in the emulator.

  • So when you create an AVD, you can specify a virtual camera

  • stream for the back camera, and you get the full 3D scene.

  • And it works in any view that displays the camera stream.

  • And using the keyboard and the mouse,

  • you can just navigate around the UI.

  • You even get patterns on that fake TV screen

  • so you can do image recognition and pattern

  • tracking, that kind of stuff.

  • It's available today.

  • Also we have Sceneform.

  • So they already talked about it in the dev keynote.

  • I won't talk about it too much.

  • I will only say, this is what I've been working on

  • for the past few months.

  • So I still don't have a name, because we did that

  • to Daydream.

  • But anyway, that's what I was up to.

  • And if you want to know more about Sceneform,

  • there is a talk tomorrow afternoon at 5:30 PM.

  • And we're going to talk about the underlying tech

  • and how to use the API.

  • And finally, Chrome OS on Pixelbook,

  • you can now run Linux applications

  • on your Chromebook.

  • And in particular, you can run Android Studio.

  • I believe there are some limitations.

  • But one of the things you can do is run your Android apps

  • as apps on Chrome OS.

  • So you can use Chrome OS as a full blown Android development

  • platform, I guess whenever it's available.

  • [CLAPPING]

  • Many, many more sessions today, tomorrow, and Thursday.

  • You know, I said earlier, there's

  • a lot of folks back home who have

  • been doing all the hard work.

  • We're just talking about it.

  • So thank you very much to all the engineering

  • teams, and the PMs, and the designers, and tech writers,

  • everyone who made this possible.

  • We didn't do anything.

  • We're just here to hear you be happy.

  • CHET HAASE: Some of us did even less.

  • ROMAIN GUY: Yes.

  • And that's it for today.

  • So thank you very much.

  • DAN SANDLER: See you at the Sandbox office hours.

  • Bring your questions.

  • CHET HAASE: Yes.

  • We have Framework office hours.

  • There is Android Sandbox running all week.

  • DAN SANDLER: We've done our best to bring

  • as many people from our teams here

  • to be able to talk to you about these new features,

  • these new APIs as possible, because we

  • know you can't get that anywhere else besides Google I/O.

  • CHET HAASE: And also there's an overflow area.

  • I can't remember where it is exactly.

  • Most of the speakers from Android sessions

  • will make their way afterwards.

  • So if you can't get your question answered

  • at the session, you should be able to find them

  • after that, like us there now.

  • Thank you.

  • [CLAPPING]

  • [MUSIC PLAYING]
