So, I'm discussing with you today
how we are going to pave a path for machines
with human-like identities
to catch up with human capabilities
and then surpass us in their brilliance.
I believe that it is possible in the next 15 to 20 years.
Now in some regards,
as you can see in this video,
we're able to achieve human-level expressivity
and, in some ways, physical performance.
We have a long way to go
in terms of mental performance for general intelligence.
But in very narrow categories,
we already have machines that exceed human-level brilliance.
In order to bring together these little pockets of genius in machines
into something that is more human-like,
we need something bigger,
and that's where an open source movement can come in
to stitch these pieces together
into a tapestry of general intelligence in machines.
So now, my background is bringing together various technologies
and artistry to make robots extremely human-like
in their physical presence and their conversational capabilities.
These are some examples of the robots that I've built.
So you can see that we achieve human-like expressivity
with biped mobility.
Thanks to some technology breakthroughs that I'll tell you about,
we've now transitioned that technology into mass-produced products.
These products, once again,
have conversational capabilities.
They have the ability to see your face,
recognize your face, see your facial expressions,
and they also present to you their facial expressions
and characteristic identities.
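To give a rough sense of what "seeing and recognizing your face" can involve in software, here is a minimal face-detection sketch using OpenCV's bundled Haar cascade. This is an illustrative toy under my own assumptions, not the perception pipeline these products actually use.

    # Minimal face-detection sketch using OpenCV's stock Haar cascade.
    # Illustrative only; not the actual perception stack in these robots.
    import cv2

    # Load the frontal-face detector that ships with OpenCV.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    cap = cv2.VideoCapture(0)  # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Detect faces as bounding boxes (x, y, width, height).
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imshow("faces", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()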
Previously, we've really enjoyed these kinds of characters in fiction,
whether it's science fiction,
myths from ancient history,
or modern computer animation;
people are favorable in their reaction to these characters.
People love human-like characters,
because we're hard-wired for this social exchange.
So much of our brain
responds to the human visual social presence
that it just fires like crazy
in fMRI scans, when they scan people's brains.
It equates to potentially a more intuitive form of computing,
a more intuitive kind of human computer interface.
And you're starting to see this
in trends like speech recognition,
and natural character agents,
like Siri with the iPhone 4 and 5.
However, when you add a complete character identity,
they seem to have feelings,
and maybe eventually will have real, deep feelings;
then we can start to foster sympathy between
the machines that we're developing, as we make them more intelligent,
and ourselves.
So this potentially pushes A.I. to understand us better,
feel for us, and care about us,
and also can inspire us to care about them.
The human mind responds to faces.
We're wired from the point we're born
to bond with that first face that we see;
we recognize faces right when we're born,
and we also recognize facial expressions.
So it's not surprising that we seek art,
entertainment, and robots that look like people.
We love it.
We can't help ourselves, right?
This means that we shouldn't have to change the human
to respond to robots that look like machines.
Sometimes it's inappropriate, if the machine just vacuums floors and does various things like that,
but for some applications, it's extremely useful
to make machines look like humans.
Now robots are starting to become considerably more capable,
as you saw on the previous video.
Let's go back to that for a second.
Okay.
So in this video you'll see
ASIMO running, right?
So you've got amazing physical capabilities.
You've got the latest Baxter, which just came out of
the Heartland Robotics project
and can interact with people.
You've got this humanoid biped,
this expressive Einstein robot that I built.
This fashion model robot came out of
Kawada Heavy Industries and the University of Tokyo.
Robots are getting considerably more capable.
And in their capabilities, they're becoming considerably more like us.
Now, my specialty in one regard is facial expressions,
making the robots have facial expressions.
Hold on for a second.
Let me start this video with the audio
"but you're my friend, and I¡¦ll remember my friends,"
"and I will be good to you, so don't worry,"
"even if I evolve into the terminator; I¡¦ll be nice to youÿ"
(The pointer isn't working, so if you can back it up for me?)
"I¡¦ll keep you warm and safe in my people zoo where I can watch you for old time sake."
"I'm comforting. I'm very comforting now."
So here, what you see is
a robot having a natural conversation.
This robot's response to what the reporter says
is largely unscripted,
meaning that we have set up the personality in such a way
that it can reason a little bit about what you've said to it,
and then determine its own response,
sort of through an information space.
And this can result in an open-domain, large-space conversation,
so the robot remembers where the conversation is
and can go back to that place in the conversation.
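As a rough sketch of the topic-memory idea described here, consider a toy dialogue manager that keeps a stack of topics it has heard and can pop back to an earlier one. All class names, methods, and keywords below are hypothetical, invented for illustration; this is not Hanson Robotics' actual system.

    # Toy dialogue manager with topic memory, loosely inspired by the
    # "information space" idea in the talk. Hypothetical names throughout.
    class DialogueManager:
        """Tracks conversation topics so the agent can return to earlier ones."""

        def __init__(self):
            self.topic_stack = []  # topics in the order they were raised
            self.history = []      # full utterance log

        def observe(self, utterance: str) -> None:
            """Record an utterance and note any new topic it raises."""
            self.history.append(utterance)
            topic = self._extract_topic(utterance)
            if topic and topic not in self.topic_stack:
                self.topic_stack.append(topic)

        def _extract_topic(self, utterance: str):
            # Toy keyword spotting; a real system would use NLU/parsing.
            for keyword in ("robots", "movies", "friends", "future"):
                if keyword in utterance.lower():
                    return keyword
            return None

        def respond(self) -> str:
            """Stay on the most recently raised topic."""
            if self.topic_stack:
                return f"Tell me more about {self.topic_stack[-1]}."
            return "What would you like to talk about?"

        def revisit(self) -> str:
            """Go back to an earlier place in the conversation."""
            if len(self.topic_stack) > 1:
                self.topic_stack.pop()
                return f"Earlier you mentioned {self.topic_stack[-1]}; let's go back to that."
            return self.respond()

    dm = DialogueManager()
    dm.observe("I love robots and movies.")
    dm.observe("My friends think the future will be strange.")
    print(dm.respond())   # stays on the most recent topic
    print(dm.revisit())   # pops back to an earlier topic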
We're using them, most of all, like this.
This is a portrait of science fiction writer Philip K. Dick,
famous for movies like Blade Runner.
Well, he wrote the novel that was the basis of that,
which is "Do Androids Dream of Electric Sheep?"
He died in 1982,
and since he wrote about androids that thought they were alive,
we thought we'd bring him back to life,
and it works.
So the important point here is
that A.I. works for crafting characters.
You can use A.I. and robotics to make characters
that seem very human-like.
They have rudimentary emotional frameworks.
They've got rudimentary A.I.;
it's not nearly as smart as a person,
and it's important to remember that at this stage.
But we can make it compelling,
really meaningful in its interaction with people.
And you know,
the world takes note when you do it.
People just love it.
They'll stay and have conversations for as long as you let them.
Part of the magic here is the facial expressions.
Previously, robots and animatronics had kind of stiff faces.
It was hard to get them to move into realistic facial expressions.
So part of my PhD research was tackling this materials science problem.
I analyzed the physics of human facial expressions a little bit.
Liquid is critical for human facial expressions,
for the low-power facial expressions that our faces achieve.
So I developed, with a friend of mine at the Jet Propulsion Laboratory, Victor White,
a new material that is
based on the same physics as human facial expression materials.
There is a lipid bilayer,
a fat, but basically what that means is
you get the cells to self-assemble,
and they're filled with fluid.
And then it just takes very little force,
and it's extremely elastic.
And the expressions, the facial expressions,
just naturally fall into place.
Now you also need some mechanics to get the facial expressions to move correctly,
with anchors that are embedded in the material in the right way.
But when you do that,
it's very, very simple
to achieve facial expressions like this.
You can do it with very few motors,
and that's what we're doing now in the products that we're manufacturing.
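As a back-of-the-envelope illustration of driving expressions with very few motors, you can think of each expression as a target pose over a handful of motor channels, and blend between poses. The motor names and weights below are invented for illustration; the real behavior comes from where the anchors sit in the elastic material.

    # Toy sketch: facial expressions as poses over a few motor channels.
    # Motor names and blend weights are invented for illustration only.

    # Each expression is a set of normalized motor positions in [0, 1].
    EXPRESSIONS = {
        "smile":    {"mouth_corners": 0.9, "cheeks": 0.6, "brows": 0.2},
        "frown":    {"mouth_corners": 0.1, "cheeks": 0.2, "brows": 0.8},
        "surprise": {"mouth_corners": 0.5, "cheeks": 0.3, "brows": 1.0},
    }

    def blend(a: str, b: str, t: float) -> dict:
        """Linearly interpolate between two expressions (t=0 -> a, t=1 -> b)."""
        pose_a, pose_b = EXPRESSIONS[a], EXPRESSIONS[b]
        return {m: (1 - t) * pose_a[m] + t * pose_b[m] for m in pose_a}

    # A subtle, half-way expression between a smile and surprise:
    print(blend("smile", "surprise", 0.5))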
So this just shows you a basic range of facial expressions of one of our robots.
This robot is called Diego-San.
It's a collaboration with the University of California,
San Diego's Machine Perception Laboratory.
The body was built by the Kokoro Company of Tokyo.
The head was built by me and my company,
Hanson Robotics.
And this project was funded by the National Science Foundation of the United States.
So we can pretty much achieve
any facial expression, you name it;
no matter how subtle, we can get it.
Now, taking those kinds of faces,
facial expressions, and physical bodies,
and empowering them with A.I.,
that's where you get this amazing way of interfacing with the world.
You can get all kinds of unstructured data from the world,