[MUSIC PLAYING]
PATRICK BRADY: I'm Patrick Brady from the Android Auto team.
And I'm here with Lauren, Rasekh, and Dean
to give you an update on Android in cars.
We've got a lot to cover today and a lot of automotive puns.
So let's buckle up and get driving.
We're incredibly excited about the automotive space
right now because we see it going
through huge transformations, from
electrification to connectivity, sensors and input
devices, sharing and autonomy.
Cars are rapidly turning into full-blown computers on wheels,
with high-speed mobile connections, microphones,
cameras, and screens everywhere of all shapes and sizes.
It's a computing transformation unlike many we've seen before.
And it opens up new opportunities
for innovation and for developers.
Android Auto is an effort from Google and our partners
to bring these advances together and create
a safe and seamless connected experience for drivers
everywhere.
Of course, that's a lot easier said than done.
There are dozens of different car platforms
today, many different input types,
from touch screens to touch pads and rotary controllers,
many different screen shapes, and sizes, and resolutions.
Layer on top of that the myriad driver distraction
guidelines and regulations across the globe.
If every software developer had to deal
with this level of complexity, we
would never be able to realize the potential that automotive
holds.
Well, at Google, we believe that platforms
can reduce that complexity and allow developers to just focus
on innovation.
We saw that happen on the web and in mobile.
And we're working hard to do the same thing here,
making Android the platform for the car.
Today, you can see that vision at work in any Android
Auto-compatible car.
Drivers have access to their favorite apps
right from the car's display.
And developers build their app once for every car,
without having to worry about the make and model, the screen
size, the input type.
We started this journey back in 2014,
partnering with leading car brands
to bring Android Auto to their cars.
We've made a lot of progress since then,
and I'd like to give you just a quick update on how
things are going.
We continue to see incredible growth in compatible cars
and aftermarket systems, with over 500 models
available today from 50 different brands.
Since last I/O, we've seen first launches from Fiat,
Alfa Romeo, Tata Motors, and Smart, with 2018
bringing exciting new launches from Jaguar, Land Rover, Mazda,
and many more.
Just last month, JVC and Kenwood launched the first head unit
supporting Android Auto wireless.
We're really excited about wireless,
and we look forward to more partner launches in the months
ahead.
With these great partnerships, we're well on our way to
making Android the platform, the developer platform,
for every new car shipped.
But what about the 1 billion cars on the road that
predate Android Auto?
What do we do for those cars?
Well, of course, we know not everyone
has a compatible vehicle.
So we made the Android Auto app available to all Android users
as a standalone experience.
And this opens up the platform and the ecosystem
to many millions of drivers, no matter what kind of car
they drive.
And as we've made Android Auto available to more cars and more
drivers, we're happy to say that the strong user
growth continues.
Here's where we were in I/O 2016,
I/O 2017, and today, at I/O 2018.
We've seen over 300% user growth in the last year alone.
In fact, as you can see on the chart,
our growth is accelerating as we make
Android Auto available to more cars and more drivers.
And that consistent platform and large install base
is a great opportunity for developers.
So we continue to see strong growth on that front, as well.
In the last year, with some popular apps
like Waze, Google Play Books, TextMe, Sirius XM, and Plex
launching Android Auto, Google Play
now has thousands of Android Auto apps available.
And that's a 200% growth from last year.
In January, we also launched the Google Assistant
in Android Auto, opening up a new way for drivers
to stay safely connected in the car and for developers
to build car-friendly experiences.
And the great thing is actions built for the Google Assistant
work across the Assistant services,
from smart speakers to TVs, wearables, and mobile.
We're excited about some of the early favorites in automotive,
such as Spot Hero, which helps you find and reserve
parking spots; busuu, bringing personalized language learning
right into your car; and CNBC, bringing you
the most up-to-date business news and market
insights on the go.
The Actions on Google ecosystem is exploding,
and we're super excited to see what you all build next.
But of course, that's just the beginning for Android in cars.
As cars advance, we have the opportunity
to create an even more seamless experience by building Android
right into the car itself.
And that's why we started an effort back in 2016
to build automotive support right into the platform.
The first demo of that work was right here at I/O in 2016
with this concept Maserati.
With a lot of help from the industry,
we're investing in building automotive features
into each Android release, making Android a turnkey
platform for infotainment, just as it's been
a turnkey solution for phones.
And the rest of the industry is taking notice.
Last year, two of the world's most innovative car makers,
Audi and Volvo, announced that Android will power
their future in-car experiences.
If you had a chance to see the in-car demos last year,
we think you'll understand our excitement for the future
and why we look forward to many more car makers
following in their tracks.
Earlier this week, Volvo and Google
announced a partnership to bring Google services
to the next-generation, Android-powered Volvo cars.
With the Google Assistant, Google Maps, and the Play Store
seamlessly integrated into the Sensus system,
we're creating a new benchmark for the connected driving
experience and a great opportunity
for all of you, our developers.
And of course, this all works no matter what kind of phone
the driver has, or even if they leave their phone at home.
To get a preview of that experience today,
swing by the Android Auto Sandbox,
where you can see it live and in action in the new XC40.
So we've made a lot of progress since Android Auto first
hit the road.
And we're well on our way to establishing Android
as the platform for automotive in old cars, new cars,
and the next generation of cars coming up the road ahead.
But of course, it's not enough to just make Android available
everywhere.
We're also working hard to make sure it's
the best possible experience for our users and our developers.
We've listened to them a lot over the last year,
talking with great developers, like Spotify and others,
to hear what they want to see from Android Auto next.
And today, I'd like to give you a sneak peek at some
of the exciting changes we're making
and what it means for the developer community.
So now, I'd like to invite up Lauren
to tell us where Android is going next.
[APPLAUSE]
LAUREN WUNDERLICH: Good morning, and hello, everyone.
My name is Lauren.
I'm going to be speaking about some of the new challenges
we are facing in the auto space and how we're addressing them.
We've spoken before about why we're building Android Auto.
And at the core of our experience
is our overarching commitment to bringing a safe and connected
experience to every vehicle.
This is particularly important, as the primary responsibility
of the driver is to drive.
Everything else we do is secondary.
To really highlight this, we showed a photo
of a driver playing Pokemon Go during one of our research
sessions.
Don't worry, everyone was safe.
We were parked.
But this extreme example showcases the struggle
we face in balancing our digital and our physical lives.
By developing strategies to manage attention,
we can provide a safer in-car experience
to keep drivers connected.
And as you heard from Patrick, screens
are getting bigger, wider, and taller, taking on more shapes
and supporting more inputs, such as touchpads.
We are starting to think about how to evolve our UI to better
scale and adapt for the future.
The best part is, as developers, you
don't have to worry about these layout changes, display size
variants, or even supporting those input mechanisms,
because the Android Auto platform takes
on these challenges for you.
Furthermore, we're looking to see where
Android Auto is going next.
As part of our commitment to developing the best possible
in-vehicle experience, we are developing
a built-in experience that feels more at home in the cockpit.
So I'd like to introduce our new Google In-Car Concept.
From a design point of view, the goal of this concept
is to adapt Android Auto's design
to a vehicle-specific theme.
This includes additional ergonomic details
and nods to the vehicle's interior design.
We can leverage these vehicle concepts to prove out
new features and designs, working with our partners
to bring them to life.
And besides abstracting hardware complexity,
we're also thinking about how to enhance our core experiences
and introduce new ways to promote your content.
Today, we'll be highlighting some of the updates
to messaging and media experiences.
But be sure to stay tuned as we roll out
all of these updates this year.
For a live demo of these vehicles,
including a preview of this on-screen experience,
stop by the Android Auto Sandbox.
As part of helping drivers stay connected,
with user interfaces that are safer and more predictable,
we have developed multiple ways for you
to integrate with Android Auto, depending
on the nature of your service.
For navigation apps, we work directly
with developers to ensure they follow our strict distraction
guidelines.
And you can also integrate with the Google Assistant,
which is voice-centric and ideal for in-car use.
Be sure to check out their sessions this year at I/O.
Messaging and media make up the largest number of apps.
And today, we'll be speaking about the updates we've made
to both of those experiences.
So let's dive in.
Beginning with messaging updates, communication apps,
like WhatsApp, Telegram, and Messages,
integrate with a messaging template
and are displayed via notifications.
Notifications require less effort
and are highly visible throughout the experience.
We've improved the notification layer this year,
including more, unified actions.
Unified actions across surfaces
help drivers react to incoming content
without ever leaving their current experience.
And group messaging, a feature developers and drivers
have been asking for--
now you can stay connected with multiple people
at the same time.
Media apps, like Spotify, Pandora, Pocket Casts,
and iHeartRadio, integrate with the media template.
The template is designed to remove
those complexities, such as input,
screen sizes, and orientations.
And they're also designed with the industry's best practices,
promoting predictability for drivers
across all of our systems, including
the phone screen, car display, and built-in experiences.
We are thrilled to showcase some of the biggest redesigns we've
made to the media experience since launching Android Auto.
By focusing on three key areas of the media template,
we have created a modern, content-forward experience
for all of our drivers.
New architecture, more flexible UI components, and new search
capabilities make up the majority of our updates.
So let's dive in--
new architecture.
Before, when drivers got going, they
would have to navigate to the media section, open the drawer,
and find that very special commute playlist.
Now, when a driver gets going, they no longer
have to seek out the content they wish to play,
but the content is brought forward in a very visual way.
We've re-architected the media template in order
to pull the content out of the drawer.
Drivers can easily see content, saving additional taps
and getting on the road faster.
Our new foundation has created many more opportunities for you
to showcase content, for which we've
introduced more components to promote a new flexibility
and ability to express your content in a more familiar way
that is native to the app.
Now, you can create more customized layouts with list
views and subheadings, as well as
grid views with really large graphics, plus richer
metadata hints, such as explicit-content tags, downloaded tags,
and new and in-progress tags.
With the added benefit of the template approach,
each of these new additions is available regardless of theme.
From Android on the phone screen
to built-in experiences customized by the OEM,
your branding and layouts will be respected.
See firsthand how we're doing that with the Google In-Car
Concept at the Android Sandbox.
Our last major update showcases a new search and pivot
functionality.
Search is still hands free.
And when a driver asks for a specific song or artist,
it just begins playing.
In addition to playing the best-matched song or artist
automatically, drivers can now pivot
into the app for app-provided content and related
suggestions.
With the new search functionality,
we've further optimized the search results
into a more digestible format.
Things like subheadings and subgroups
help drivers find the content they're
looking for even faster.
So that's what we've been working on
to enable a healthy in-car ecosystem.
I've showcased the new messaging updates and media updates
by focusing on the new structure, new components,
and more search capabilities.
So I'd like to turn it over to Rasekh
to teach you how to get started with these new updates.
Thanks.
[APPLAUSE]
RASEKH RIFAAT: Thanks, Lauren.
Hi, I'm Rasekh, and--
oh, there we go--
I'm here to talk to you about what developers
need to do to support our new messaging and media app
features.
Today, I'll go through some code examples that
will let messaging apps support group notifications
using MessagingStyle.
I'll also go over our media architecture
and show you how to provide those in-app search results
that Lauren just mentioned.
And then finally, I'll look at code changes
for media apps to enable our content browsing visuals.
Let's start with an example.
So I live in Seattle.
And one of the best things to do on the weekend--
grab a group of friends and head out to the hills
to hit the slopes.
Let's use that as our example today.
Luckily, all of us are using Android Auto in the car.
And we can start to-- we can interact with it by voice,
keeping our hands on the wheel and our eyes on the road.
But we need to coordinate things, like breakfast, gas,
and just our general progress.
When Android Auto started supporting messaging,
there wasn't a standard way to do messaging notifications
on Android.
We introduced the CarExtender helper class
for one-on-one conversations.
But Android notifications have come a long way since then.
MessagingStyle is a newer, standard helper class
for generating large-format notifications that
include multiple back-and-forth messages of varying types
between any number of people.
You should already be using it to write rich notifications
on mobile devices today.
And enabling Android Auto support
is a few short changes away from that.
So let's take a look at some code.
So here's Stephan asking, which run are we going to hit first?
All right, we set that up in our MessagingStyle as the message.
We set our conversation title, Ski Trip.
And since it's a group conversation,
we set that to be true.
Next, we set up our reply action.
We use a semantic action, which is new to MessagingStyle.
This allows Android Auto and the Google Assistant
to know what they should be sending to the pending intent.
After the Google Assistant has recognized your voice
and composed your reply, it can launch the messaging app
to send that reply.
Next, we'll create a Mark As Read action.
You can imagine a scenario where I'm in the car,
and I listen to my message, but I don't really--
I'm not really ready to reply yet.
This gives the messaging app a callback
to say, oh, the user read this.
Mark it as read and let other participants
in the conversation know.
Finally, we bring all that together
in the standard way with Android notifications.
We add our reply action, our MessagingStyle,
and our Mark As Read action.
The Mark As Read action won't show up
in the standard Android UI.
And we can see the final result. Our conversation title
is set up.
The group messaging icons are all
set up from the notification.
And then Assistant and Android Auto
know that you can reply by voice.
And even if you just listen to the message,
we can tell the other participants
in the conversation that they've seen it.
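Condensed into one sketch, the flow above might look like the fragment below. This is a hypothetical sketch, not the presenters' exact slide code: the Context, PendingIntents, icons, and channel id are placeholders you would supply, and the builder calls are the androidx.core ones.

```java
// Hypothetical MessagingStyle sketch; context, CHANNEL_ID, icons, and
// the two PendingIntents are placeholders.
import android.app.Notification;
import androidx.core.app.NotificationCompat;
import androidx.core.app.Person;
import androidx.core.app.RemoteInput;

Person me = new Person.Builder().setName("Me").build();
Person stephan = new Person.Builder().setName("Stephan").build();

// The incoming message, the conversation title, and the group flag.
NotificationCompat.MessagingStyle style =
        new NotificationCompat.MessagingStyle(me)
                .addMessage("Which run are we going to hit first?",
                        System.currentTimeMillis(), stephan)
                .setConversationTitle("Ski Trip")
                .setGroupConversation(true);

// Reply action: the semantic action tells Android Auto and the
// Assistant where to deliver the transcribed reply.
NotificationCompat.Action replyAction =
        new NotificationCompat.Action.Builder(
                R.drawable.ic_reply, "Reply", replyPendingIntent)
                .setSemanticAction(NotificationCompat.Action.SEMANTIC_ACTION_REPLY)
                .addRemoteInput(new RemoteInput.Builder("key_text_reply").build())
                .setShowsUserInterface(false)
                .build();

// Mark As Read action: fired after the message is read aloud, so the
// app can tell the other participants it was seen.
NotificationCompat.Action markAsReadAction =
        new NotificationCompat.Action.Builder(
                R.drawable.ic_read, "Mark as read", markAsReadPendingIntent)
                .setSemanticAction(NotificationCompat.Action.SEMANTIC_ACTION_MARK_AS_READ)
                .setShowsUserInterface(false)
                .build();

Notification notification = new NotificationCompat.Builder(context, CHANNEL_ID)
        .setSmallIcon(R.drawable.ic_message)
        .setStyle(style)
        .addAction(replyAction)
        .addInvisibleAction(markAsReadAction) // hidden in the phone UI
        .build();
```

Because both actions carry `setShowsUserInterface(false)` and a semantic action, the Assistant can send the reply or fire the read receipt without ever launching the app's UI.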
So now that we've coordinated with everyone,
we've figured out where we're going,
we want to listen to something on the way
out to the mountains, all right?
Media in the car is one of the core user experiences.
And getting drivers access to their content
should be front and center.
Now, for media apps to integrate with Android Auto,
they have to implement two services.
The MediaBrowserService provides a tree
of playable and browsable items.
And browsable items are basically
folders to help apps organize their content so they're not
returning a giant list of songs or podcasts.
Once a user has picked something playable from the browse tree,
the media session, which the app implements,
is used to start music and to provide metadata and controls
for what's currently playing.
You can see, in this example, the app
supports play, pause, skip to next, skip previous.
And there's also a mechanism for setting up custom actions--
things like a 30-second skip.
Once a media app has implemented these two services,
the Google Assistant just works out of the box.
You just say, hey, Google, play my ski jams.
The Google Assistant will recognize your voice,
issue the query, and music starts playing right away.
What we're introducing today for in-app searching is
the ability to implement an extra function
on the MediaBrowserService, onSearch,
to get those in-app results.
In this case, the app has surfaced my ski jams from 2018,
as well as my playlist from my skiing trip last year.
Awesome.
Now, let's look at the code to make this happen.
So for apps which already support Android Auto,
this is actually really familiar.
You have a function which is given
the query, an extras bundle, and a destination
for the result. Let's break it down.
First off, apps should just return an empty list
if they can't support the query, OK?
If the thing is like an empty query,
just return an empty list.
Normally, queries actually require
a network call or a database fetch.
And if that's the case, and you
want to do the work on a different thread,
detach the result, do the work on that other thread,
and then, when you're ready, send the results
back to Android Auto.
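A sketch of that override, following the pattern described above. This goes inside your MediaBrowserServiceCompat subclass; `musicSource.searchAsync` is a hypothetical asynchronous catalog standing in for your own network or database layer.

```java
// Sketch of onSearch inside a MediaBrowserServiceCompat subclass.
// musicSource is a hypothetical async catalog, not a real API.
import android.os.Bundle;
import android.support.v4.media.MediaBrowserCompat.MediaItem;
import java.util.Collections;
import java.util.List;

@Override
public void onSearch(String query, Bundle extras,
                     Result<List<MediaItem>> result) {
    // Return an empty list for queries the app can't handle.
    if (query == null || query.trim().isEmpty()) {
        result.sendResult(Collections.emptyList());
        return;
    }
    // Real queries usually need a network call or a database fetch, so
    // detach now and deliver the results from another thread when ready.
    result.detach();
    musicSource.searchAsync(query, matches -> result.sendResult(matches));
}
```

The detach-then-send pattern is the same one the browse callbacks use, so apps that already load children asynchronously can reuse their plumbing here.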
I want to mention that these code snippets actually
come from the Universal Music Player, an open-source media
app published by Google on GitHub.
It can easily be cloned, compiled,
and it's a great reference for creating a media
app across Android Auto and other surfaces.
So now, we've got our search results--
our Ski Trip 2018, 2017, and Ski Jams.
Notice that it returned two playlists, as well
as some albums.
It would be really nice if Android Auto could group those.
We have a way to do that, though.
In this function, we're creating media items
for sending back to Android Auto.
And it's very easy to annotate a media item
with a group heading.
Android Auto will look at successive media items,
and if they're in the same category,
will group them together.
So if we annotate our Ski Jams with playlist,
and our albums with albums, Android Auto groups them
and shows that for you, regardless
of screen size and surface.
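The grouping rule Rasekh describes, where successive items sharing a category annotation fold under one heading, can be sketched in plain Java. The `Item` record and `groupSuccessive` below are illustrative stand-ins for the behavior, not Android APIs.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Plain-Java sketch of the grouping rule: successive media items that
// carry the same category annotation are folded under one heading.
public class SearchGrouping {
    record Item(String title, String category) {}

    static List<Map.Entry<String, List<Item>>> groupSuccessive(List<Item> items) {
        List<Map.Entry<String, List<Item>>> groups = new ArrayList<>();
        for (Item item : items) {
            if (!groups.isEmpty()
                    && groups.get(groups.size() - 1).getKey().equals(item.category())) {
                // Same category as the previous item: extend that group.
                groups.get(groups.size() - 1).getValue().add(item);
            } else {
                // Category changed: start a new heading.
                List<Item> bucket = new ArrayList<>();
                bucket.add(item);
                groups.add(Map.entry(item.category(), bucket));
            }
        }
        return groups;
    }

    public static void main(String[] args) {
        List<Item> results = List.of(
                new Item("Ski Jams", "Playlists"),
                new Item("Ski Trip 2018", "Playlists"),
                new Item("Ski Trip 2017", "Albums"));
        // Prints two headings: "Playlists: 2 item(s)" then "Albums: 1 item(s)".
        for (var g : groupSuccessive(results)) {
            System.out.println(g.getKey() + ": " + g.getValue().size() + " item(s)");
        }
    }
}
```

In the real API the category is carried as an annotation on each media item, and Android Auto applies this folding for you regardless of screen size and surface.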
We're also adding a couple more annotations,
as Lauren mentioned.
Here, I'm showing the explicit annotation.
It's great.
I'm in the car.
I got my family.
I don't want that explicit song to come up.
Also, we're adding support for a downloaded annotation.
That way, when I'm in the mountains and don't have good cell
coverage, or don't want to burn my data plan on music,
I have a chance to pick the right playlist.
Android Auto shows them right with the content.
Looks like Ski Trip 2018 is the right playlist for my trip.
There's one more function that needs
updating to take advantage of these new features.
When the MediaBrowserService starts,
it provides Android Auto with a media root and extras.
In order for search and bringing content out of a drawer
to be enabled, you'll need to add a couple of extras
to let Android Auto know that you support these features.
Now, Lauren introduced the concept
of bringing content out and being able to show things
as grids or lists.
And by default, Android Auto will
show browsable items, the folders,
as lists, because they generally have an icon.
And they'll show content, or playable items,
as grids, because they're usually much richer
and have a much more detailed artwork.
But there are times when it's better to switch that around.
So for instance, let's say you have a podcast app.
In that case, the podcasts, which are browsable items,
have much richer artwork.
But the episodes, which share the same artwork,
are better shown with more detail in a list format.
We're giving you the ability to choose that
by adding an extra hint in your onGetRoot extras, so that you
can switch the order around and choose grids or lists depending
on the content in your media app.
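Putting those pieces together, the opt-in might look like the fragment below, inside your MediaBrowserServiceCompat subclass. The extras key strings are the ones later documented for Android Auto's content-style and search support; treat the exact names as assumptions rather than the speakers' slide code.

```java
// Sketch of root extras enabling search and the content-forward browse
// templates. Key strings are assumptions based on later documentation.
import android.os.Bundle;
import androidx.media.MediaBrowserServiceCompat;

@Override
public MediaBrowserServiceCompat.BrowserRoot onGetRoot(
        String clientPackageName, int clientUid, Bundle rootHints) {
    Bundle extras = new Bundle();
    // Opt in to in-app search and to the new content-style templates.
    extras.putBoolean("android.media.browse.SEARCH_SUPPORTED", true);
    extras.putBoolean("android.media.browse.CONTENT_STYLE_SUPPORTED", true);
    // Hint how items render: 1 = list, 2 = grid. A podcast app might show
    // its browsable shows as a grid and its playable episodes as a list.
    extras.putInt("android.media.browse.CONTENT_STYLE_BROWSABLE_HINT", 2);
    extras.putInt("android.media.browse.CONTENT_STYLE_PLAYABLE_HINT", 1);
    return new MediaBrowserServiceCompat.BrowserRoot("root", extras);
}
```

Without these extras Android Auto falls back to its defaults: browsable items as lists and playable items as grids.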
So to sum up, I've shown code samples
for MessagingStyle and MediaBrowserService.onSearch,
for specifying additional item metadata,
and for adding content browsing and search hints.
I'm excited to see developers access
these features in the future.
And look for our documentation later in the summer.
Now, please welcome Dean up to talk about tools
for Android Auto development.
[APPLAUSE]
DEAN HARDING: Thanks, Rasekh.
Hi, everyone.
So as Rasekh mentioned, I'm here to talk to you
about the tools we provide for developing and testing
your media and messaging apps, and also
to give you some hints and tips to help you
make the most of the platform.
The first thing I want to talk about is the Desktop Head Unit.
This tool is ideal for testing both your messaging and media
apps, as it will render your apps exactly the same
as you'd see them in production.
And just as importantly, it allows
you to interact with the Assistant,
just like in a real car.
You get the Desktop Head Unit as part of the Android SDK.
Just select it from the SDK Manager.
The instructions for running the Desktop Head Unit
are on the developer site, but it's fairly easy.
Just start the Head Unit Server from Android Auto's developer
menu on the phone.
You get that by going to About, then tapping 10 times
on the About Android Auto header to enable developer mode.
Then, choose Start Head Unit Server from the overflow menu
that you can see here.
Plug your phone into a USB port on the computer
and launch the Desktop Head Unit.
From there, you can see how Android Auto will
respond to your app's notifications,
and also, how your media app will appear in an actual car.
But if you're developing a media app,
I encourage you to check out the Android Media Controller, which
is an open-source app that we have hosted on GitHub.
It will connect to your app's media session and media browser
service and show the information
your app is presenting to Android
Auto in a nice semantic format.
If you use a whitelist to block apps other than Android Auto
from accessing your browse tree, it
would probably be a good idea to either add the Media
Controller to the whitelist or disable the whitelist while
testing.
Rasekh also briefly touched on the Universal Media Player.
But I wanted to reiterate that we
have this great, comprehensive sample media
app available on GitHub.
It gives you a canonical implementation
of a media app, not only on Android Auto,
but also on all our other services, such as Wear and TV.
Let's get into some code now.
The first bit of code I wanted to talk about
was the standard actions that your app returns
as part of the playback state.
You should try to make sure your app supports
the standard actions defined in PlaybackStateCompat,
such as ACTION_SKIP_TO_NEXT and ACTION_REWIND.
By using the standard actions, you not only
have a great integration with Android Auto,
but you also get Assistant integration
with those actions for free--
actions like, OK, Google, skip tracks.
And as we bring Android Auto to more surfaces,
it may not be the case that those custom actions appear
in the same place they do today.
But you can be sure the standard actions are kept in a place
where it makes the most sense.
You can use the media controller app
to see what standard actions you've implemented,
and also to see what's available.
You can see here that we have an app that
does not implement the standard fast-forward and rewind
actions, but has instead chosen to implement them
as custom actions.
Now, if you were to look at this app in the Desktop Head Unit,
you might not necessarily know that there's anything wrong.
Because of the way we render the custom actions,
everything looks OK.
But by implementing those actions as standard actions,
like this, we can leave the custom actions
for truly custom functionality.
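As a sketch, keeping the standard transport controls in the playback state and reserving custom actions for genuinely custom features might look like this. The action id, icon, `positionMs`, and `mediaSession` variables are illustrative placeholders.

```java
// Sketch: advertise standard transport controls, and keep custom
// actions for truly custom features such as a 30-second skip.
// positionMs, mediaSession, and the icon are placeholders.
import android.support.v4.media.session.PlaybackStateCompat;

PlaybackStateCompat state = new PlaybackStateCompat.Builder()
        .setActions(PlaybackStateCompat.ACTION_PLAY
                | PlaybackStateCompat.ACTION_PAUSE
                | PlaybackStateCompat.ACTION_SKIP_TO_NEXT
                | PlaybackStateCompat.ACTION_SKIP_TO_PREVIOUS
                | PlaybackStateCompat.ACTION_FAST_FORWARD
                | PlaybackStateCompat.ACTION_REWIND)
        .addCustomAction(new PlaybackStateCompat.CustomAction.Builder(
                "com.example.action.SKIP_30",   // hypothetical action id
                "Skip 30 seconds",
                R.drawable.ic_skip_30)
                .build())
        .setState(PlaybackStateCompat.STATE_PLAYING, positionMs, 1.0f)
        .build();
mediaSession.setPlaybackState(state);
```

Declaring the standard actions this way is also what lets the Assistant handle voice commands like "OK, Google, skip tracks" for free.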
We've done a lot of research with users
and found that users do not want to have
to browse while driving.
Here are a few quotes from some of our research sessions,
where users talk about some of the issues
that they have with custom search and browse.
Users typically want to get in and go without much fuss,
and this is a great opportunity for you
to provide some additional value to your users
by giving them easy access to the playlists they like
or to recommend new content that they may not
have known about otherwise.
So I would suggest that you prioritize
surfacing things in the browse tree with items most relevant
to driving at the top.
Rasekh showed you earlier some of the things
we're doing to bring content out of the drawer
and put it front and center.
And you'll definitely want that first page of content
to be the most useful.
Now, that's not to say you can't include
all the user's playlists, or whatever, in the browse tree.
Just keep in mind that the first few items on your home page
should be the most relevant to the task of driving.
A corollary to that would be to keep your browse tree
shallow, no more than two or three levels deep.
In fact, to make the most of the new design we've shown you,
a two-level-deep browse tree is ideal,
because then, your users will be able to get
to their favorite music or podcasts with a single tap.
You can use the onSearch method that Rasekh described earlier
to allow users to search for specific things,
like a particular song or a particular artist.
One of the challenges we have in Android Auto
is that it is used on devices with many different screen
sizes and resolutions.
Your app isn't necessarily going to know,
until the moment it's plugged into a car,
the resolution of the screen it's displayed on,
which makes it difficult to predict
what resolution your icons need to be to look right
on any particular screen.
You can see here, this star icon is too low resolution
for this display, and it really looks out
of place with the rest of the icons.
So package your icons in the vector graphics format,
rather than a bitmap format.
Vector drawables have been supported natively in Android
since API level 21, which, coincidentally,
is the version of Android in which Android Auto support was
introduced.
So there's no reason not to use the native vector format.
You can see here an example of the star icon
from before as a vector graphic.
And no matter how big I make the icon,
it still has nice, smooth edges.
As an added bonus, the file size of this icon
is really small, especially compared
to having multiple bitmaps at different resolutions.
Don't forget, you can use Android Studio's Vector Asset
tool to convert your SVGs to the native Android format.
While we're on the topic of icons,
you should prefer using the URI-based methods
for passing us your media, rather than
the drawable-based methods.
This example shows returning one of your built-in resources
as a URI rather than a drawable.
Using URIs allows us to cache your resources
and to only allocate memory for the items
that we are actually displaying.
If the user never scrolls down to the fourth page,
there's no need to allocate memory for the items
down there.
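A minimal sketch of the URI-based approach, where the package name, resource, and media id are illustrative:

```java
// Sketch: point at a built-in resource by URI instead of handing over
// a Bitmap, so artwork is only fetched for items actually displayed.
// Package, resource, and media id are placeholders.
import android.net.Uri;
import android.support.v4.media.MediaDescriptionCompat;

Uri iconUri = Uri.parse(
        "android.resource://com.example.music/drawable/album_art");

MediaDescriptionCompat description = new MediaDescriptionCompat.Builder()
        .setMediaId("album_1")
        .setTitle("Ski Jams")
        .setIconUri(iconUri)   // rather than setIconBitmap(bitmap)
        .build();
```

The same builder also accepts `setIconBitmap`, but a bitmap must be allocated up front for every item, which is exactly the cost the URI form avoids.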
I also wanted to briefly touch on
something we've seen with ad-supported apps.
Typically, the ads will include some call
to action that may not make sense when running in the car.
For instance, they might say, tap the screen to learn more,
or click here, or something along those lines.
If you're running inside Android Auto,
I recommend adjusting your messaging
so that it's not confusing for people
who aren't actually using your main app's UI.
And lastly, the process of actually getting your app
onto the Play Store--
once you've added this metadata snippet to your manifest,
and you've uploaded your app to the Play Store,
it will need to be reviewed against our Driver Distraction
Guidelines.
This is a fairly complicated process,
but the guidelines for ensuring a smooth review
are documented in the developer center at this link.
If our reviewers find issues, you'll be notified by email
and will be able to resubmit an updated
application for re-review.
So I've gone over a few things we've
learned over the years developing for Android Auto.
We've got a great resource in the Media Controller
for testing media apps.
And don't forget to check out the Universal Media
Player for a comprehensive example implementation.
Use the standard actions where it makes sense,
and leave custom actions for things
that are actually custom.
Surface the important actions at the top in browse.
And use search for finding specific content,
like a particular song or a particular album.
Keep the browse tree shallow, no more than two
or three levels deep.
Use vector graphics for icons.
And use the URI-based methods rather
than sending raw bitmaps.
And finally, be aware of the surface you're
playing on for ads.
And now, I'd like to hand it back over
to Patrick for our wrap-up.
[APPLAUSE]
PATRICK BRADY: Yeah, thanks.
Thanks, Dean.
So today, we showed you how we're
working to make Android the platform for developers
in automotive and how you can develop
your apps on that platform.
We also showcased a new design for Android Auto in your apps
across surfaces, from old cars to new, and new developer
APIs that enable your apps to take advantage
of these new designs and features,
including MessagingStyle for group messaging and custom
actions, media browse enhancements
to bring your content forward and annotate it with rich media
metadata, and media in-app search to help your users
find the content they're looking for across your catalog.
Of course, that's just a sneak preview
of all the developer resources.
If you'd like more information to help you update your apps,
please visit the Android Auto Developer site
at developer.android.com/auto.
And I'd also encourage you to take a look at the Universal
Media Player and Android Media Controller examples
that Rasekh and Dean mentioned, which
are freely available as open-source software on GitHub.
And while you're here, you can also
stop by the Android Team office hours, which
are happening again on Thursday at 2:30 PM,
to chat with the Android engineers and business
development teams and ask them questions directly.
But if you really want to see these things in action,
please come and check out the Android Auto Sandbox,
which is just around the corner, where you can see
the new Android Auto design.
And I promise you, it's not as janky as it seemed earlier;
we were having some AV issues
with the animations.
But you can see the new design operating fluidly
in a new Range Rover Velar, on a beautiful, wide screen,
showing maps and media on the UI at the same time.
It's really, we think, a fantastic experience.
You can also find the Google In-Car Concept
that Lauren talked about, which is showing the default
experience in Android that we're making
available as open-source software in Android
P, and of course, the latest in-car experience
from our collaboration with Volvo
in the new Android-powered Sensus Connect system,
running the Google Assistant, Google Maps, and the Google
Play Store.
Last but not least, as with every session at I/O,
we value your feedback, and we want
to hear from you so we can make our sessions better every year.
So please grab a browser and swing by google.com/io/schedule
to fill out a short survey when you have time and leave
your feedback.
Thanks very much, and have a great I/O.
[MUSIC PLAYING]
What's new in automotive (Google I/O '18)