For the past 15 years I've been trying to change your mind.
In my work I harness pop culture and emerging technology
to shift cultural norms.
I've made video games to promote human rights,
I've made animations to raise awareness about unfair immigration laws
and I've even made location-based augmented reality apps
to change perceptions around homelessness
well before Pokémon Go.
(Laughter)
But then I began to wonder whether a game or an app
can really change attitudes and behaviors,
and if so, can I measure that change?
What's the science behind that process?
So I shifted my focus from making media and technology
to measuring their neurobiological effects.
Here's what I discovered.
The web, mobile devices, virtual and augmented reality
were rescripting our nervous systems.
And they were literally changing the structure of our brains.
The very technologies I had been using to positively influence hearts and minds
were actually eroding functions in the brain necessary for empathy
and decision-making.
In fact, our dependence upon the web and mobile devices
might be taking over our cognitive and affective faculties,
rendering us socially and emotionally incompetent,
and I felt complicit in this dehumanization.
I realized that before I could continue making media about social issues,
I needed to reverse engineer the harmful effects of technology.
To tackle this I asked myself,
"How can I translate the mechanisms of empathy,
the cognitive, affective and motivational aspects,
into an engine that simulates the narrative ingredients
that move us to act?"
To answer this, I had to build a machine.
(Laughter)
I've been developing an open-source biometric lab,
an AI system which I call the Limbic Lab.
The lab not only captures
the brain and body's unconscious response to media and technology
but also uses machine learning to adapt content
based on these biological responses.
My goal is to find out what combination of narrative ingredients
is the most appealing and galvanizing
to specific target audiences
to enable social justice, cultural and educational organizations
to create more effective media.
The Limbic Lab consists of two components:
a narrative engine and a media machine.
While a subject is viewing or interacting with media content,
the narrative engine takes in and syncs real-time data from brain waves,
biophysical data like heart rate, blood flow, body temperature
and muscle contraction,
as well as eye-tracking and facial expressions.
Data is captured at key moments where critical plot points,
character interactions or unusual camera angles occur.
Like the "Red Wedding" scene in "Game of Thrones,"
when shockingly,
everybody dies.
(Laughter)
Survey data on that person's political beliefs,
along with their psychographic and demographic data,
are integrated into the system
to gain a deeper understanding of the individual.
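To make that concrete, here is a rough sketch of what syncing those streams and attaching the survey data might look like in code. Every name, data structure and time window here is hypothetical, illustrating the general idea rather than the Limbic Lab's actual implementation.

```python
# Hypothetical sketch: time-align biometric streams with annotated plot points
# and attach a viewer's survey data. All names are illustrative only.
import pandas as pd

def build_subject_record(streams: dict[str, pd.DataFrame],
                         plot_points: pd.DataFrame,
                         survey: dict) -> pd.DataFrame:
    """streams: e.g. {"eeg": ..., "heart_rate": ...}, each indexed by timestamp.
    plot_points: timestamps and labels of critical scenes or camera cuts."""
    # Resample every signal onto a common 100 ms grid so they can be compared.
    aligned = pd.concat(
        {name: df.resample("100ms").mean() for name, df in streams.items()},
        axis=1,
    )
    # Keep only short windows around the annotated plot points.
    windows = [aligned.loc[t - pd.Timedelta("5s"): t + pd.Timedelta("5s")]
               for t in plot_points["timestamp"]]
    record = pd.concat(windows, keys=list(plot_points["label"]))
    # Attach the viewer's survey answers (political views, demographics) as metadata.
    record.attrs["survey"] = survey
    return record
```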
Let me give you an example.
Matching people's TV preferences with their views on social justice issues
reveals that Americans who rank immigration among their top three concerns
are more likely to be fans of "The Walking Dead,"
and they often watch for the adrenaline boost,
which is measurable.
A person's biological signature and their survey responses
are combined in a database to create their unique media imprint.
Then our predictive model finds patterns between media imprints
and tells me which narrative ingredients
are more likely to lead to engagement in altruistic behavior
rather than distress and apathy.
The more imprints added to the database
across media, from episodic television to games,
the better the predictive models become.
In short, I am mapping the first media genome.
(Applause and cheers)
Whereas sequencing the human genome identifies all the genes in human DNA,
the growing database of media imprints will eventually allow me
to determine the media DNA for a specific person.
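As a toy illustration of the kind of predictive model described above, the sketch below learns which narrative ingredients tend to precede prosocial action. The feature names, the CSV file and the outcome label are all invented for the example; they are not the Limbic Lab's real data.

```python
# Hypothetical sketch: learn which narrative ingredients in media imprints
# predict altruistic engagement rather than distress or apathy.
from sklearn.ensemble import RandomForestClassifier
import pandas as pd

imprints = pd.read_csv("media_imprints.csv")   # one row per viewer-and-scene
features = imprints[["heart_rate_delta", "gaze_on_protagonist",
                     "facial_valence", "plot_point_type"]]
features = pd.get_dummies(features, columns=["plot_point_type"])
labels = imprints["followed_up_with_action"]   # e.g. signed petition, donated

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(features, labels)

# Feature importances hint at which ingredients move people to act.
for name, weight in sorted(zip(features.columns, model.feature_importances_),
                           key=lambda p: -p[1]):
    print(f"{name}: {weight:.2f}")
```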
Already the Limbic Lab's narrative engine
helps content creators refine their storytelling,
so that it resonates with their target audiences on an individual level.
The Limbic Lab's other component,
the media machine,
will assess how media elicits an emotional and physiological response,
then pull scenes from a content library
targeted to person-specific media DNA.
Applying artificial intelligence to biometric data
creates a truly personalized experience,
one that adapts content based on real-time unconscious responses.
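A minimal sketch of that adaptive loop is below: score candidate scenes against a viewer's media DNA and their latest biometric reading, then choose what plays next. Every function, field and threshold here is assumed for illustration, not taken from the actual media machine.

```python
# Hypothetical sketch of an adaptive "media machine" step: pick the next scene
# from a library using the viewer's media DNA and current arousal level.
from dataclasses import dataclass

@dataclass
class Scene:
    scene_id: str
    ingredients: dict[str, float]   # e.g. {"suspense": 0.8, "hope": 0.3}

def next_scene(library: list[Scene],
               media_dna: dict[str, float],
               arousal: float) -> Scene:
    """Pick the scene whose ingredients best match the viewer's media DNA,
    steering away from high-suspense scenes when arousal is already elevated."""
    def score(scene: Scene) -> float:
        s = sum(media_dna.get(k, 0.0) * v for k, v in scene.ingredients.items())
        if arousal > 0.7:   # viewer already stressed: avoid pushing toward distress
            s -= scene.ingredients.get("suspense", 0.0)
        return s
    return max(library, key=score)

library = [Scene("rescue", {"hope": 0.9, "suspense": 0.2}),
           Scene("betrayal", {"suspense": 0.9, "hope": 0.1})]
print(next_scene(library, {"hope": 0.7, "suspense": 0.4}, arousal=0.8).scene_id)
```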
Imagine if nonprofits and media makers were able to measure how audiences feel
as they experience their content
and alter it on the fly.
I believe this is the future of media.
To date, most media and social-change strategies
have attempted to appeal to mass audiences,
but the future is media customized for each person.
As real-time measurement of media consumption
and automated media production become the norm,
we will soon be consuming media tailored directly to our cravings
using a blend of psychographics, biometrics and AI.
It's like personalized medicine based on our DNA.
I call it "biomedia."
I am currently testing the Limbic Lab in a pilot study
with the Norman Lear Center,
which looks at the top 50 episodic television shows.
But I am grappling with an ethical dilemma.
If I design a tool that can be turned into a weapon,
should I build it?
By open-sourcing the lab to encourage access and inclusivity,
I also run the risk of enabling powerful governments
and profit-driven companies to appropriate the platform
for fake news, marketing or other forms of mass persuasion.
For me, therefore, it is critical to make my research
as transparent to lay audiences as GMO labels.
However, this is not enough.
As creative technologists,
we have a responsibility
not only to reflect upon how present technology shapes our cultural values
and social behavior,
but also to actively challenge the trajectory of future technology.
It is my hope that we make an ethical commitment
to harvesting the body's intelligence
for the creation of authentic and just stories
that transform media and technology
from harmful weapons into narrative medicine.
Thank you.
(Applause and cheers)