A few years ago I came across these commercially available EEG headsets made by NeuroSky.
As part of the company I was running at the time, we started experimenting with different games and different things to do with them, but the main thing we started doing was taking video edits and dropping them into a program that lets the viewer do the cutting, basically. We were just using stuff off the cutting room floor, really.
This was a different way of making a film than I usually do. Usually you script it, storyboard it, and go out and shoot it.
This one was made specifically with the interactivity of the film in mind.
We had to do like four different treatments from the script. And then in editing, usually I take the viewer on a journey through the edit, and you choose very specifically what they see and when they see it. With this, the creativity you would normally put into the edit gets passed off onto the viewer.
Interviewer: How does it work from a technical point of view?
This sits on your head. It picks up electrical signals from your brain, and also from your skin; that gets filtered and sent via Bluetooth. We've got a Python program here.
Interviewer: So that's live data now?
Yeah, it's just getting like fuzz at the minute
(Quietly to self) Just stick this on.
So, as I put it on, the signal quality should drop to zero, and that means I've got a good connection with the data coming out.
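[A hedged Python sketch of reading live headset values like the ones shown here, assuming NeuroSky's ThinkGear Connector utility is running and serving JSON on localhost port 13854; the video doesn't show how their own program reads the Bluetooth stream, so the transport and field names are assumptions.]

```python
# Sketch: read live NeuroSky values over the ThinkGear Connector's local JSON socket.
# Field names (poorSignalLevel, eSense, blinkStrength) are the usual ThinkGear ones,
# but the program in the video may read the headset differently.
import json
import socket

sock = socket.create_connection(("127.0.0.1", 13854))
# Ask for eSense values as JSON rather than raw samples.
sock.sendall(json.dumps({"enableRawOutput": False, "format": "Json"}).encode())

buffer = b""
while True:
    buffer += sock.recv(4096)
    while b"\r" in buffer:                      # one JSON object per line
        line, buffer = buffer.split(b"\r", 1)
        if not line.strip():
            continue
        try:
            packet = json.loads(line)
        except ValueError:
            continue                            # skip partial or malformed lines
        if "poorSignalLevel" in packet:
            # 0 means a good connection -- "the signal quality should drop to zero"
            print("signal quality:", packet["poorSignalLevel"])
        if "eSense" in packet:
            print("attention:", packet["eSense"]["attention"],
                  "meditation:", packet["eSense"]["meditation"])
        if "blinkStrength" in packet:
            print("blink!", packet["blinkStrength"])
```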
So you can see in the background the film's playing, and as I blink it's cutting.
As you can see, it's blending between two layers; that's my attention going from high to low. And if I blink, you'll see the blink coming up.
When I press play it fires off four videos at once, they're all playing in sync.
And when I blink it chooses a set of two videos to play.
When it's playing those two videos, the attention is changing the blending between those two videos.
When I blink it will jump to the other two videos.
Blending between those.
At the same time, meditation is controlling the mix of the music, so the higher my meditation level, the more layers of the music come in.
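[A rough Python sketch of the control logic just described, for illustration only — the real logic lives in their patch, and the helper functions here are made-up stand-ins: a blink toggles which pair of the four synced videos is shown, and attention (0-100 from the headset) drives the blend between the two videos in the current pair.]

```python
# Hypothetical sketch of the blink/attention mapping; print-only stubs stand in
# for whatever actually switches video layers and sets the blend.
current_pair = 0  # 0 -> videos (1, 2), 1 -> videos (3, 4)

def show_pair(pair):
    # Stand-in for switching which pair of video layers is visible.
    print("now showing video pair", pair)

def set_crossfade(pair, amount):
    # Stand-in for the blend control: 0.0 = first video, 1.0 = second video.
    print("pair", pair, "crossfade", round(amount, 2))

def on_blink():
    """Each blink jumps to the other pair of videos."""
    global current_pair
    current_pair = 1 - current_pair
    show_pair(current_pair)

def on_attention(value):
    """Attention 0-100 becomes a 0.0-1.0 blend between the pair's two videos."""
    set_crossfade(current_pair, value / 100.0)

on_blink()          # jump to the other two videos
on_attention(72)    # high attention -> blend towards one layer
```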
So the data is coming from the headset via Bluetooth to the computer, then our Python program is picking that data up and sending it via OSC (Open Sound Control) to MaxMSP, which is a program a lot of artists use to code.
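[A minimal sketch of that Python-to-MaxMSP hop, assuming the python-osc package and a Max patch listening on UDP port 7400; the actual port and OSC address names used in the installation aren't shown in the video.]

```python
# Sketch: forward headset values from Python to MaxMSP as OSC messages.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 7400)   # MaxMSP receiving on the same machine

def forward(attention, meditation, blink_strength=None):
    # One OSC message per value; the Max patch routes these by address.
    client.send_message("/headset/attention", attention)
    client.send_message("/headset/meditation", meditation)
    if blink_strength is not None:
        client.send_message("/headset/blink", blink_strength)

forward(attention=53, meditation=61, blink_strength=80)
```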
MaxMSP is a visual programming language; it looks a lot like this.
So the data comes in here from the headset. It's receiving the blink, the attention, and the meditation. The blink comes down there to cut between scenes.
And then over here, the meditation has been scaled to the four audio levels, so we can see... So my meditation's about 50% here.
That one's going up a bit.
You just connect these patch cords together, so there's my meditation, point four; it's between zero and one, so I've got that number.
This number here, you can see it's 35, which is round about here. And it's going up... 66.
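[A Python sketch of that scaling step, for illustration — in the video it's done inside the Max patch. The raw meditation value (0-100) is normalised to 0-1 and spread across four music layers so more layers fade in as meditation rises; the exact layer boundaries here are an assumption.]

```python
# Sketch: map a 0-100 meditation reading onto four music layer levels.
def music_layer_levels(meditation, layers=4):
    m = meditation / 100.0                 # e.g. 35 -> 0.35, 66 -> 0.66
    levels = []
    for i in range(layers):
        # Each layer occupies one quarter of the 0-1 range and fades in across it.
        lo = i / layers
        level = min(max((m - lo) * layers, 0.0), 1.0)
        levels.append(round(level, 2))
    return levels

print(music_layer_levels(35))   # [1.0, 0.4, 0.0, 0.0]
print(music_layer_levels(66))   # [1.0, 1.0, 0.64, 0.0]
```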
Interviewer: Do you find some people are more in control of their brains than others?
Yeah, totally, and I've had a few people who do a lot of meditation in the morning; they're really able to control the meditation aspect of it. Most people can't control that at all.
The biggest difference between people is how they want to control the film. Some people want to experience a film; some people want to control it. And in a few group experiences there was one person who just wanted to see their own feedback, and they made this crazy film that really hurt other people's heads.
So this was initially designed as a one-person experience. Initially we had a little tent, and you went in and just watched it by yourself. When we took it on tour we found that people would have a little bit.... (fades out)