What's going on, everybody? And welcome to part two of the Carla self-driving car with reinforcement learning series. In this video, we're going to just carry on with what we were doing. In the last video, we just made sure Carla was working.
Connections were being made.
Scripts were running.
And now what we want to do is actually start working on our own Python code.
So my goal for the series is to use reinforcement learning to train a self-driving car. Now, in order to do that, like any big problem, we have to break the problem down into little itty-bitty chunks. So the next small chunk is that we need to figure out how to spawn this car in the environment, how to control this car in the environment, and then how to get sensory data back from that car. To start, we basically need at least two types of sensor data. The first is a hood camera, basically just some sort of forward-facing camera, and then we also need to sense whenever the car has been in an accident. So to do reinforcement learning, pretty much all we want to do is drive around: if we get a collision, that's bad; if we drive around at a decent pace, that's good.
Cool.
Done.
So this is kind of gonna be like the self-driving car in Grand Theft Auto. I'm not actually gonna try to honor road rules or speed limits or anything; I just find it much more entertaining to watch a car drive really, really fast. So take note: if you want, later you can add more rules and adjust how you determine that reward function, and you can get the car to do whatever you want it to do.
So anyway, what we're gonna do here is figure out: how do we actually interface with Carla using Python? So if you go to the Getting Started section and scroll down to the Python API tutorial... we've been on this page briefly in the previous video.
If I recall correctly, it's been a while for me.
It's probably just a few moments ago for you, but so be it.
And don't forget, we've basically got actor, blueprint, and world. Actors are your vehicle, other vehicles, and pedestrians, as well as your sensors; these are all actors. Then blueprints: these are the attributes of those actors, of the vehicles, pedestrians, and sensors. And the world: this is your environment, basically the server. So, we need to connect and set a timeout; this way we don't end up hanging forever and ever and ever. We grab the world, and once we have that information, basically, these will be your first few lines of code every time, as well as this one: you're pretty much always gonna reference the blueprint library.
And then here, right away, we have at least this one sensor, right? This is your collision sensor, which we will definitely be using. And then this basically is going to show you, maybe, how to attach that sensor... I don't really see it here. Transform, spawn_actor(blueprint, transform): this is probably doing it. This little bit is kind of confusing to me as far as how we actually attach things to things, but I'll show you guys; hopefully it will make sense when I get through it.
Uh, okay, so I think we're ready to rumble. What I'm gonna do is leave this up for reference, I think, and I'd like to write everything in that examples directory. If you want, you can move (what would it be, carla, then dist) this into your site-packages for Python 3.7 and import it; you could do that. I'm just gonna keep it nice and simple and just keep it in the same package here. So what I'm gonna do is just take, like, any one of these files and copy-paste it, just so I don't have to write a few lines, and we'll call this tutorial_video2. What a great name. And then we will open it in Sublime Text, move this aside, and now we are ready.
Cool.
So what I'm gonna do is basically delete everything from there to here, down to the import carla; that's fine. And then everything else I'm just gonna delete from this script; I just didn't want to have to write that bit. This bit just changes the system path so we can import that Carla .egg, easy enough. So now what I want to do is begin talking about some fundamentals of what's going on here.
So, first of all, any time you've got a client and server, we probably also need to clean up after what we're doing. So the first order of business that we're going to attend to is we're gonna have this actor_list, and that'll be an empty list. And then what we're gonna do is attempt to basically do our connection to the server and all that fun stuff, and then when we're all done, we need to clean up. And the way we clean up is: for actor in actor_list: actor.destroy(), and then print('all cleaned up!'). Okay, so this will probably be the starting script for pretty much any Carla code that you write, honestly.
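The cleanup pattern just described (append every spawned actor to a list, then destroy them all at the end) can be sketched like this. Since the carla package isn't assumed to be importable here, a stand-in DummyActor class plays the role of a Carla actor; real Carla actors expose the same destroy() method.

```python
# Sketch of the actor-cleanup pattern from this section.
# DummyActor is a stand-in for carla actors (vehicles, sensors),
# all of which expose a destroy() method.
class DummyActor:
    def __init__(self, name):
        self.name = name
        self.destroyed = False

    def destroy(self):
        self.destroyed = True

actor_list = []
try:
    # ... connect to the server, spawn vehicles and sensors,
    # appending each one to actor_list ...
    actor_list.append(DummyActor("vehicle"))
    actor_list.append(DummyActor("camera"))
finally:
    # Clean up no matter what happened above, so actors don't
    # linger in the simulator after the script exits.
    for actor in actor_list:
        actor.destroy()
    print("all cleaned up!")
```

The try/finally shape matters: if your code crashes mid-episode, the actors still get destroyed instead of piling up in the server.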
Well, actually, we would first need to connect, so let's do that as well. So we're gonna say client = carla.Client, and then we're connecting to localhost. Again, that could be some sort of remote public IP, which is super awesome. In the case of reinforcement learning, though, one of the hardships of doing it that way would be the latency. Reinforcement learning is a particularly challenging case, because each step needs to know what the result of that step was. So if you have a server and you've got, like, 100-millisecond ping, it's not like everything's just kind of shifted over 100 milliseconds. Instead, you have to take a step, then send that out, and then wait for it to come back, so you would get 200 milliseconds per step. That's five frames per second, which probably isn't good enough.
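As a quick sanity check on that latency arithmetic (assuming one full round trip per environment step):

```python
# Back-of-the-envelope check: a 100 ms one-way ping means an action
# and its resulting observation together cost ~200 ms per step.
one_way_ping_ms = 100
round_trip_ms = 2 * one_way_ping_ms        # send action out, wait for result back
steps_per_second = 1000 / round_trip_ms    # 200 ms per step -> 5 steps/sec
print(round_trip_ms, steps_per_second)
```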
So we're gonna keep everything running locally. In theory, this would all have to run inside the car. Luckily, you would only need to run the model, and the model would actually predict pretty quickly; you probably wouldn't be doing exploration in reality.
Anyways, I digress.
So, continuing on: client.set_timeout. We're gonna set this to two seconds; in the docs, they set a timeout, I think, of 10. I hope we don't take that long: if we take more than two seconds to do something, there's probably something wrong. So anyway, we'll keep two there. Then we're going to say world = client.get_world(), and then we're gonna do that blueprint library: blueprint_library = world.get_blueprint_library(). Okay, this is probably more along the lines of the code that will start every Carla endeavor.
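Putting those first few lines together, the connection boilerplate looks roughly like this. It's a sketch: it assumes the Carla egg is on your path and a server is running locally (port 2000 is the Carla default), so the import is guarded here to keep the file loadable without the simulator.

```python
# Sketch of the standard Carla connection boilerplate from this section.
# Guarded import so this file still loads on a machine without the simulator.
try:
    import carla
except ImportError:
    carla = None  # simulator not installed; connect() below won't be callable usefully

def connect(host="localhost", port=2000, timeout=2.0):
    """Connect to a running CARLA server; return client, world, blueprint library."""
    client = carla.Client(host, port)
    client.set_timeout(timeout)  # don't hang forever if the server is down
    world = client.get_world()
    blueprint_library = world.get_blueprint_library()
    return client, world, blueprint_library

# Usage (with a server running):
# client, world, blueprint_library = connect()
```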
Then we're gonna pick a car. This might also be pretty common, but we're gonna say bp, for, basically, blueprint, is going to be equal to blueprint_library.filter, and we're gonna filter by things that say 'model3'. So we're gonna grab a Tesla Model 3 here, and I'm gonna say the item at the zeroth index; that's what we want. There's only one Model 3, so this will load us in a blueprint that is the default blueprint for the Tesla Model 3.
Now that I'm thinking about it, I probably should just fire up Carla, and then, while that's firing up, I'll continue. Hopefully nothing glitches out on me. Well, it's doing it, but we'll find out. So while we wait on that... once we've got the blueprint for the vehicle, the next thing... I can hear the GPU, so it must have it already. Yep.
Okay, so we've got our world, and what I'm gonna do is just look down and scroll up again, kind of center over here. This thing looks like a strange smiley face with a goofy nose, huh? I wish I could scroll out a little quicker, but anyway, this way, when we do spawn a car, we'll be able to actually see it in the environment. That's good enough. The only place the car could spawn where we don't see it is if it spawns in this, like, tunnel area. So, cool.
All right, now, coming over here, basically, we've got blueprints. Okay, so what we want to do is, if we spawn a car, we actually need to spawn it somewhere. So we're gonna say spawn_point equals... we're gonna use random.choice, and we have not yet imported random, so let's go ahead and import random and come back down here. spawn_point = random.choice(world.get_map().get_spawn_points()). So get_spawn_points has, I believe, like, 200 spawn points that we can choose from, so we're just choosing one of those spawn points randomly, and at least one is in the tunnel, so just take note: sometimes you won't see it, but that's okay. Again, we're only gonna look down upon our world here a handful of times before we're done looking at it this way, except for debugging purposes.
So, okay, once we've got the spawn point, what we want to do is say our vehicle is equal to world.spawn_actor. The actor that we want to spawn is the blueprint that we just set up here. Again, we can change attributes of that blueprint before we spawn it, if we so choose; in this case, we're just gonna go with whatever the default is. So we spawn the blueprint; where do we want to spawn it? We spawn the blueprint at the spawn point. Okay, so that will spawn our car.
Now, one thing that we can say... there are kind of two options here. If you want to spawn NPCs, or just computerized vehicles, what you could do is vehicle.set_autopilot(True). Now, this is funny, because even in their docs they stress that this is not self-driving-car code; this is just rule-based, because it's connected to the environment. And to me, it's just crazy that we have to explain this, but it was super common in the self-driving-car-in-Grand-Theft-Auto series, where people were like, "The taxis drive so much better than this," and, "The other cars in Grand Theft Auto drive better than this; why don't you just use that code?" Well, that's hard-coded; it's connected to the game engine, so it knows where things like lanes are, it knows where other drivers are. It's all rule-based, because it's connected to the game engine, whereas in reality we're not connected to some game engine; otherwise, self-driving cars would be so simple. So anyway, I wish I didn't have to make that clear, and I wish they didn't either, but you do. So if you just want the car to drive around and act sort of like a self-driving car (not a self-driving car, just a car that's driving on the road), you can do this. I don't think they take erratic actions or do anything unexpected, though, so I wouldn't use this to replace human drivers or, like, simulate human drivers; I suspect it's nothing like a human driver.
Anyway, we don't want to set autopilot; we actually want to control the vehicle ourselves. Although, one thing I would say is, you probably could set autopilot and really quickly train a model to drive like autopilot drives. We could do that. Again, I don't want to do that, because it just happens to be the case that we have access to the game engine, and I don't really necessarily want to train that way, but you probably could. Anyway, no vehicle autopilot. Okay.
So rather than setting the autopilot, instead we want to control the car. And the way that we do that is just vehicle.apply_control, and then it's carla.VehicleControl, and we're gonna set throttle equal to full throttle, 1.0, and steer, for now, we'll just say 0.0. We're just gonna make this car go straight. Now, every time we spawn anything, we want to do actor_list.append, and we want to append that vehicle.
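The spawn-and-drive steps above sketch out like this. It's a sketch under the assumption that a server is running, so the carla import is guarded and the function is only defined here, not called; the rule-based autopilot alternative is shown as a comment.

```python
# Sketch of the spawn-a-Model-3-and-floor-it step from this section.
import random

try:
    import carla
except ImportError:
    carla = None  # no simulator on this machine; sketch only

def spawn_and_drive(world, blueprint_library, actor_list):
    """Spawn a Tesla Model 3 at a random spawn point and drive it straight, full throttle."""
    bp = blueprint_library.filter("model3")[0]
    spawn_point = random.choice(world.get_map().get_spawn_points())
    vehicle = world.spawn_actor(bp, spawn_point)
    # vehicle.set_autopilot(True)  # the rule-based NPC alternative discussed above
    vehicle.apply_control(carla.VehicleControl(throttle=1.0, steer=0.0))
    actor_list.append(vehicle)    # so it gets destroyed during cleanup
    return vehicle
```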
Now we're gonna say blueprint is going to be equal to... well, I think what I'll do for now... let's just make sure that runs, actually; that's probably a good idea. So we need a couple of other things: one, we need the camera, and then, two, we need the collision sensor. But for now, let me just run this; let's just see where we are right now. Where is Carla? This?
Yeah. Okay, what did we call it? Tutorial video 2. So: py -3.7 tutorial_video2.py. And what I'll do is just put this over here and pull up the environment. You can't see the whole thing; I know that. I just know the car's gonna spawn where you, like, can't see it in your view. Really, you're just missing the bottom, though. I'm gonna laugh if that's where it spawns, but let's just run it and see what happens.
Uh, wow. Okay, it cleaned up immediately, because we just spawned the vehicle and then got rid of the vehicle; everything happened so fast, we don't even see it. Let's sleep for five seconds and try that again.
Okay, let's go over here. Oh, darn it. What an amateur. Okay: import time, then time.sleep. Come back over here, rerun, and pop over here.
I actually did not see the car.
I believe it was there.
Um, let's try again.
Oh, there it is.
I saw it.
It was over in this area.
It was really hard to see; it was, like, so far zoomed out.
Maybe we should do more than five seconds.
I'll see if I can spot it quickly enough this time.
Here it is.
It's gonna crash into the fountain.
Yep.
Okay.
So hopefully you can see yours; sorry if not. You could run for more than five seconds or something like that, but basically I just wanted to make sure things run.
So now what I want to do is attach a camera sensor to the car. I think we'll do that, and we'll deal with the collision sensor probably in another video or something, as the collision sensor really only has to do with when we actually go to do reinforcement learning.
Okay, so, going to sensors, we can see... hopefully... don't they have a... oh, no, we don't want collision; we want the RGB camera, right?
And so in this code, basically, we could pretty much copy and paste this code. The thing is, I don't actually want to do it this way; I don't know why they're saving the image to disk. I don't understand when that would actually be useful; you would really want that image in memory, at best. I'm not sure why you'd want to deal with I/O to disk, but whatever. So we could take this code, but I also want to change a few things about this code anyway, so I'll just write it out manually, because there are a couple of different things that we want to do. So, after actor_list.append(vehicle), and we'll keep time.sleep at the very end.
And now what we want to do is have another blueprint; let's call it cam_bp or something like that. In hindsight, this one up here should have been vehicle_bp, but, I don't know, whatever; it doesn't really matter. We're gonna say that is blueprint_library.find, and we want to find the sensor that is a camera, that is an RGB camera: 'sensor.camera.rgb'. And don't be fooled: it's actually an RGBA camera. So, Carla people, get on it. It also returns a flattened array, for some reason, that needs to be reshaped, and if you reshape it to whatever your dimensions are by three, you are going to be a sad panda.
So anyway, continuing on: cam_bp.set_attribute. I actually don't know what the defaults for this camera are, but if you're gonna be feeding it through a neural network, you definitely want to be very specific. So, 'image_size_x', and we're going to set that to be... we're gonna make it an f-string, actually. Oh, well, we don't have to... okay, definitely making it an f-string instead. It has to be a string, I guess, so I will use an f-string, because it needs to be in string form: f'{IM_WIDTH}'. I might be wrong there. Anyway, copy, paste, and then 'image_size_y' with IM_HEIGHT. Let's make these all capitals. Okay, now we need to set IM_WIDTH and IM_HEIGHT, so we'll just go to the tippy top here, and I will say IM_WIDTH = 640 and IM_HEIGHT = 480. Fabulous.
And the next thing that we want to do, which is kind of cool, is we can also set the field of view, so you could make kind of like a fisheye lens. So cam_bp.set_attribute, and the attribute, in quotes, is 'fov', and then we could set it to be, like, 110. I'm sure the default is, like, I don't know, 70, 90; I don't know. Anyways, I don't know what any of the defaults are. You could query the actual default blueprint to find out all of those things; I don't really care at the moment. If you want to know more things that you could do, you could always just dir() over the blueprint if you wanted, but, I guess, that would only give you set_attribute; I guess you'd have to print out the string form to see all the attributes. I'm sure it's also somewhere in the documentation; I don't know where exactly you have to go to find all that, really. I just read through the documentation, saw some of the examples of them tweaking certain things, and went from there. I didn't actually read everything on that.
So, okay, we set the field of view. Okay, so we've got the blueprint set up. Now, what we have to do is, one, we have to, yes, spawn that blueprint. But where do we spawn that blueprint? And there are kind of two things. One is, quite literally, we want to attach the sensor to the vehicle, and there is a little flag when we go to spawn the actor to attach it to something, so it will always follow. But not only do we want to attach it to the car; we also want to say where on the car.
So what I'm going to say here is spawn_point = carla.Transform, and that's gonna take a carla.Location, and this is a relative location to the car. So I've just kind of trial-and-errored to find this. I want to say x is forward... I'm probably wrong... I think x is forward, y is left and right, and then z is up and down. Because I think the camera wants to start, like, at the center point of the car, I moved the camera forward and then up a little bit. So anyway, depending on what car you use, this relative location is gonna vary. Like, if you're in a truck and it sits up higher or something like that, you're gonna want to change that spawn location.
I wish (and they might totally have this, and I'm just not aware of it) that they had just, like, a hood camera, or, like, a center-windshield spot or something like that, just by default, because part of self-driving will be seeing from different perspectives. So, you know, again, we're seriously simplifying this problem, because we're just gonna do it with a Tesla Model 3 to start, but there are other cars that we might drive, and we want this model to drive in all kinds of different conditions, in different cars, different heights of cars, and so on.
So anyway, that's gonna be that for now. What we're gonna say now is our sensor, or camera, is world.spawn_actor, and what we want to do is spawn the cam blueprint. We want to spawn it at that spawn point, which is relative when we attach it to our vehicle, so now it will follow that vehicle around. Now we've spawned an actor; what should we do? Well, we need to do actor_list.append on that sensor; otherwise, we're going to be sad.
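Putting the camera blueprint and attachment steps together, a sketch (guarded import again, since it needs the simulator; the x/z offsets here are example values of the kind found by trial and error, so tune them per vehicle):

```python
# Sketch of the configure-and-attach-camera step from this section.
try:
    import carla
except ImportError:
    carla = None  # sketch only without the simulator

IM_WIDTH, IM_HEIGHT = 640, 480

def attach_camera(world, blueprint_library, vehicle, actor_list):
    """Configure an RGB(A) camera blueprint and attach it near the vehicle's hood."""
    cam_bp = blueprint_library.find("sensor.camera.rgb")
    cam_bp.set_attribute("image_size_x", f"{IM_WIDTH}")   # attribute values are strings
    cam_bp.set_attribute("image_size_y", f"{IM_HEIGHT}")
    cam_bp.set_attribute("fov", "110")
    # Relative transform: forward (+x) and up (+z) from the car's center point.
    # Example offsets; a taller vehicle needs different values.
    spawn_point = carla.Transform(carla.Location(x=2.5, z=0.7))
    sensor = world.spawn_actor(cam_bp, spawn_point, attach_to=vehicle)
    actor_list.append(sensor)  # so it gets destroyed during cleanup
    return sensor
```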
Okay, so we've got the sensor; how do we get the data? So, as you can see if I pull up... um, what was it? I think it's .listen. Yeah, you can see here they're using these, like, little lambda functions to interact. Okay, copy that.
So, basically, lambda and then the image information. So we could do anything with that image that we want to do, and, apparently, image.frame... well, that's the frame number. So image is, apparently, an object that has a method save_to_disk. Okay. I don't totally know what this, like, Carla image object is; I honestly just attempted an array conversion, and it worked. So it's some sort of object that allows itself to be converted to an array. It must have, I guess, an array method? I have absolutely no idea how you would do that, because the numpy array is something else entirely, but, anyway, they have that built in, so, awesome.
So, um, hello... I want to talk to you. Anyway, okay, so let's move this over. So, in order to actually get information from the sensor, we're going to say sensor.listen, and then we're gonna listen using a lambda, and then you're getting data; it's, like, some sort of image object, from what I can tell. And then what we're gonna say is process_img(data). Okay, so we want to do that.
Now, that process_img function just doesn't exist yet; that's why it's getting mad. So what we're gonna do is just come up to the top here, and after our constants, let's just add def process_img, which gets passed image (or it'd probably be better called data, but...). And from here, what we want to do is say i, for image, is equal to a numpy array of image dot... actually, reading the... no, raw_data. So the raw data is just, like, a flattened array; if you tried to display it, you would find that it's just long, and it doesn't actually have the dimensions of an image. Also, I need to import numpy as np; I totally forgot it had a raw_data attribute. So yeah, it's some sort of image object, and, again, what I did was pass that to the function, and then I just did a print of dir(image) and read out my options. So print(dir(image)), and we can even do that. So I can actually just leave that and then come over to our script here; let's just run that real quick. image_size... oh, capital Y; we need a lowercase y there. Save, and run again. Cool.
So, I almost don't want to... it exited; vehicle cleaned up. So, yeah, as you can see, it has these methods that are built in. All these dunder methods you can just ignore, but there's a convert, a field of view... I have no idea what field of view would do here, because this is after the fact that we took the image. We have frame number, height, width, and the raw data. I figured, hey, I'll look into raw_data; that's probably what I want. There's save_to_disk, which, again, I didn't really want. That was one thing that was kind of hard in their docs. So, Carla people, if you're listening: honestly, I would think most people's goal is actually just to access that data live, not save it to an image and then access that image; that doesn't make any sense to me. I don't know why that would be the default that you showed. Otherwise, I like all the other stuff; you guys did a great job. So, yeah, but anyways, raw_data is what you really want. I don't know what transform is, or timestamp; maybe transform would help us transform the data, I don't know. Actually, transform probably has to do with the actual Carla transform. I don't know; that doesn't matter. Not important. So anyway, that's how I found out that raw_data existed.
So once we have the raw data, and now it's converted to a numpy array, we want to reshape it, because, again, print(i.shape)... you can probably just take my word for it, but, yeah, it's just, like, one big flat array, not the shape of a photo. So we need to get this into the shape of a photo. So what we're gonna say is i2 = i.reshape, and the tuple that we're gonna use is (IM_HEIGHT, IM_WIDTH, 4); don't forget that one, height then width. So, again, images are, like, width by height, but in array terms we first have the height of the array, then we have the width of the array, because it's like a length of an array, and then you've got, like, the contents within. So anyway, that'll probably trip you up lots of times if you do work with imagery in programming. And then it's by four because, again, it's RGBA, for the alpha information.
I mean, I guess the alphas nice toe, have, um I just wish they didn't call the sensor rgb.
I was having a hard time with that one because it's called RGB, but it's returning rgb alfa.
So anyway, cool.
I actually really, truly want just rgb I don't care about the Alfa values.
Maybe later we could grab Alfa values and care about them, but right now I don't.
So I'm gonna say i3 now is equal to whatever i2 was, and then this is useful numpy stuff to just get the first three elements: we're going to use this colon syntax. So: everything, comma, everything, comma, up to three. So, for the entire height, the entire width, and then, inside, after the height and the width, what do we have? We've got our RGBA values. I just want the first three, so the first three are the RGB values. It's kind of a nifty way to quickly just grab RGB, and it's very fast. There is an OpenCV method to do this exact thing, which reminds me: we actually need OpenCV. So I'm gonna import cv2. If you don't have numpy or OpenCV, by the way, just do a pip install (you can't see that): pip install opencv-python, and numpy. Cool.
Easy. And then you just import cv2. So, coming back down here, the next thing that we want to do is... anyway, what I started to say, and never finished: there is functionality in OpenCV that converts (what's your problem? okay) RGBA to just straight-up RGB. Interestingly enough, it is slower than the numpy method. You would have expected it to be identical; like, you would have expected it to literally just do this, but it does not. I don't know what it does or why it's slower, but it is fascinating, riveting information. The more you know.
Now we're just gonna show it. So we're gonna give the title nothing, just empty, and we're just gonna show i3 with cv2.imshow; then cv2.waitKey (camel-cased, don't mess that up) with one millisecond; and then we're gonna return i3 divided by 255. So here we'll actually get a representation of what the camera sees, and then, in returning it, we're just gonna normalize this data before I forget to do it later on. Any time we're working with imagery data and passing it through a neural network... generally, neural networks prefer information to be between zero and one, or negative one and positive one, or something like that. So values of 0 to 255 are kind of silly to a neural network, so we can normalize them, or scale them, basically, to be between zero and one.
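The whole process_img pipeline described above (take the flat RGBA buffer, reshape it height-first, drop alpha, normalize) can be sketched with a fake raw_data buffer standing in for the Carla image object; the cv2.imshow preview from the video is left as a comment so this runs headless.

```python
import numpy as np

IM_WIDTH, IM_HEIGHT = 640, 480

def process_img(image):
    """Turn a Carla camera image into a normalized IM_HEIGHT x IM_WIDTH x 3 float array."""
    # raw_data is a flat RGBA byte buffer; frombuffer reads it without copying
    i = np.frombuffer(image.raw_data, dtype=np.uint8)
    i2 = i.reshape((IM_HEIGHT, IM_WIDTH, 4))  # height first, then width, then RGBA
    i3 = i2[:, :, :3]                         # drop the alpha channel, keep RGB
    # cv2.imshow("", i3); cv2.waitKey(1)      # live preview from the video (needs cv2)
    return i3 / 255.0                         # scale to [0, 1] for the neural network

# Stand-in for the Carla image object, just to exercise the function:
class FakeImage:
    raw_data = bytes(IM_WIDTH * IM_HEIGHT * 4)

out = process_img(FakeImage())
print(out.shape)  # (480, 640, 3)
```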
Okay, whew. I think we're all right, and I think we're ready to see a camera popping out from our car. We're still printing the shape; we don't want to do that anymore, as it's going to slow things down. Let's run this bad boy.
So I should be able to just go up... sure can: tutorial_video2. Good, that's right. Immediately, we do get the front camera; the car drops down and it goes forward. Awesome. Let's just do that one more time to make sure everything's looking good. Drops down and drives forward; does not stop. Awesome.
Okay, long tutorial, but we actually made a lot of progress here. We basically have all the information we need to control the car via other logic, too: if you don't want to do reinforcement learning with us, that's totally fine; if you want to go a more rule-based approach, you can totally do that. There are also lidar sensors (I don't know if I'll find one really quickly... well, it exists); you've got all kinds of other sensors that you can work with, so if you want to go that route, have at it. We're gonna go off of pretty much a camera-based model to start and see how far we can get with reinforcement learning there. We also need to add a collision sensor, and then convert everything to more of, like, a reinforcement learning environment setting. So that's what we'll be doing in the next tutorial.
If you've got questions, comments, or concerns, feel free to leave them below. You can also join us in Discord: discord.gg/sentdex. And finally, a shout-out to recent new and long-term channel members.
For over one year, we've got Kevin C, Alexander, Sir Bay, Robert Diamond, Mojo, War of Gupte (I think that was Gupta; I might have trimmed your name, and if I did, I'm sorry), Cyber Carp, Cigar Jarvis, and Suzanne Williams. Thank you guys so much for one or more years; we've got people up to, I think, 14 months now, which is crazy. At six months: Chris Oland, Frederick Fail, Joey Santana, and progeny 1605. And for brand-new members: video shot Auto Copus, Kopecks Key, Jeffrey Miguel Ski... you guys are killing me with this next name; I wouldn't know how to pronounce it. If you know how to pronounce that name, or it's your name, tell me in the comments and I'll do my best to actually pronounce it next time. Von Schau Games, Michael Schwartz, Patrick Carroll, Srinivas Ada, Jatin Nand Wadhwani, Mental Al, See Harv, Demand (I recognize your name; did you re-sub? What's going on?), and Kevin Higgins. Lots of people to thank.
But seriously, guys, I love what I do. I'm really enjoying focusing much more on doing YouTube than I have, really, ever. So, yeah, awesome. It's thanks to you guys who are supporting the channel directly. So, really, from the bottom of my heart, thank you so much for becoming a member and sponsoring, basically. So, okay, I will see everybody in the next video.
Controlling the Car and getting Camera Sensor Data - Self-driving cars with Carla and Python p.2

Published by 林宜悉 on April 1, 2020