IoT Developer Show - Intel Movidius Neural Compute Stick | Intel Software

  • [MUSIC PLAYING]

  • Hello.

  • I'm Martin Crawford.

  • And this is the IoT Developers Show

  • where we look at IoT technology, share learning

  • opportunities, and showcase cool demos and the creators

  • behind them.

  • In this episode, we talk with Stuart Christie, an IoT

  • evangelist here at Intel, as he walks us

  • through two demos showcasing the Movidius Neural Compute Stick.

  • Stuart, thank you so much for joining us today.

  • Do you want to tell us a little bit more about yourself?

  • Sure.

  • I'm an IoT evangelist, like yourself, Martin.

  • And I'm the hardware guy in this software group.

  • On a personal side, I'm interested in robotics

  • and photography.

  • And this is sort of a photography example

  • here we're going to be showing.

  • Fantastic.

  • Well, I'm really excited about this demo.

  • Do you want to tell us a bit more about it?

  • Sure.

  • It's a follow-on from last month's example,

  • where we were doing the unboxing of the UP Squared board.

  • This time, we are showing some demos

  • that you can run on this board using these Movidius Compute

  • Sticks.

  • These are also development devices from Intel, much

  • smaller, as you can see here.

  • They're USB devices.

  • There's an SDK available if you want

  • to write your own programs.

  • But today, we're just going to demo

  • some of the prepackaged applications

  • that you can find on GitHub and the Movidius Application Zoo.
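
For readers who want to go beyond the prepackaged demos, the flow of a hand-written Neural Compute Stick program is roughly: enumerate the attached sticks, open one, load a compiled graph, push a tensor in, and read the result back. The sketch below assumes the v1 NCSDK's mvnc Python module and a network already compiled to a "graph" file with the SDK tools; the input preprocessing here is a placeholder, not the demo's actual code.

```python
# Minimal single-inference sketch against the (assumed) NCSDK v1 Python API.
import numpy as np
from mvnc import mvncapi as mvnc

devices = mvnc.EnumerateDevices()           # list the attached Compute Sticks
if not devices:
    raise RuntimeError("No Movidius device found")

device = mvnc.Device(devices[0])
device.OpenDevice()

with open("graph", "rb") as f:              # network compiled to a Movidius graph blob
    graph = device.AllocateGraph(f.read())

# Placeholder input: real code would load and preprocess an image to the
# size and value range the network expects.
image = np.random.rand(224, 224, 3).astype(np.float16)

graph.LoadTensor(image, "user object")      # queue the inference on the stick
output, user_obj = graph.GetResult()        # blocking read of the result
print("top class index:", int(np.argmax(output)))

graph.DeallocateGraph()
device.CloseDevice()
```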

  • Sure.

  • So kind of like monkeys?

  • Yes, zoo as in animals in a zoo.

  • That's correct.

  • In fact, they call it a park-like area

  • in which deep learning models are housed for exhibition.

  • I'm not sure why they call it a zoo.

  • But a quick search on GitHub for "zoo" in Python

  • shows a lot of deep learning examples.

  • Excellent.

  • So what kind of applications are housed in the zoo then?

  • We don't have any monkeys.

  • But we do have some birds.

  • But today, we're going to demo two of them.

  • We'll feature you and me in the zoo

  • using a USB camera to do object recognition.

  • And then the second example shows the scalability

  • of these USB sticks.

  • And we'll run the same demo side by side,

  • one running on one stick and one on the second,

  • so you can see the speed difference.

  • And so all of these demos are running on pretrained models?

  • Exactly.

  • The first one using the webcam has been trained

  • on a selection of objects.

  • It's fun to get it confused.

  • There are Windsor ties, lab coats,

  • and other seemingly random objects that show up.

  • This demo doesn't do any background removal;

  • it processes the whole video frame.

  • So we'll likely see some odd guesses.

  • One of my favorites I've been trying to get it

  • to do is to recognize my face.

  • And then I cover it up so it looks like a ski mask.

  • It thinks it's a ski mask.

  • Well, let's see what it recognizes here right now.

  • I've got a bulletproof vest on.

  • It sees me as a person, that's good.

  • Oh, I'm there as a person.

  • Interesting.

  • So I'm assuming that you can also

  • tweak the threshold levels to kind of get

  • it to be more exacting.

  • Exactly.

  • This one's actually running a threaded example.

  • So it's running TinyYOLO on one of these USB sticks.

  • And then the second one is running GoogLeNet.

  • And you can tweak the thresholds for both of those

  • using the keyboard.
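
The threshold tweak comes down to discarding detections whose confidence falls below a cutoff. A minimal, generic sketch (the data layout and the 0.3 default are illustrative, not the demo's actual code):

```python
# Generic confidence-threshold filter for detector output.
# Each detection is assumed to be a (label, confidence, box) tuple.
def filter_detections(detections, threshold=0.3):
    """Keep only detections whose confidence meets the threshold."""
    return [det for det in detections if det[1] >= threshold]

# Raising the threshold makes the demo more "exacting" (fewer, more confident
# boxes); lowering it lets more of the odd guesses through.
```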

  • OK.

  • And YOLO is "You Only Look Once."

  • And that is being used for object detection.

  • And then GoogLeNet is being used for object

  • classification, essentially.

  • Exactly.

  • It refines the models, yes.

  • There are 20 things that the YOLO network can detect.
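
For reference, the Tiny YOLO model in the App Zoo is typically the PASCAL VOC variant, so the 20 detectable categories are most likely the standard VOC classes listed below; this is an assumption about the particular model in the demo.

```python
# The 20 PASCAL VOC categories, assumed to be what this Tiny YOLO model detects.
VOC_CLASSES = [
    "aeroplane", "bicycle", "bird", "boat", "bottle",
    "bus", "car", "cat", "chair", "cow",
    "diningtable", "dog", "horse", "motorbike", "person",
    "pottedplant", "sheep", "sofa", "train", "tvmonitor",
]
```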

  • OK.

  • Well, do you want to take a look at the next demo?

  • Yes.

  • We'll close this down.

  • And then we'll start it up and see it

  • with these two sticks versus the one stick.

  • So this is the parallel compute demo.

  • And you're saying that it's using

  • three of these Movidius sticks?

  • Exactly, two sticks here versus one stick.

  • And it shows how the system is scalable.

  • This demo processes a directory of images.

  • There are about 65 images.

  • And it scans an image, uses OpenCV to print the object name

  • that it finds in the image.
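
In outline, that loop just walks the directory, classifies each file, and uses OpenCV to draw the winning label on the image. A simplified sketch, with a stand-in classify_image() in place of the Compute Stick inference:

```python
# Sketch of the directory-scanning loop: classify each image and overlay the
# predicted name with OpenCV. classify_image() stands in for the NCS inference.
import glob
import cv2

def classify_image(image):
    # Placeholder for the Neural Compute Stick inference call.
    return "example_label"

for path in sorted(glob.glob("images/*.jpg")):
    frame = cv2.imread(path)
    if frame is None:
        continue
    label = classify_image(frame)
    cv2.putText(frame, label, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("NCS demo", frame)
    if cv2.waitKey(500) & 0xFF == ord("q"):   # ~2 images per second, q to quit
        break

cv2.destroyAllWindows()
```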

  • And then it also says running one stick, running two sticks.

  • And you can see this one, the frame rate's a lot faster.

  • The scrolling here, the text, says stick one and stick two

  • there.

  • Again, one stick, two.

  • And then this one's running as stick zero.

  • So this shows the scalability.

  • You can add as many sticks as you really

  • need for your application.

  • OK.

  • And can I add any photos to this database,

  • or is it only trained to recognize these 65?

  • No.

  • You can add as many photographs as you want.

  • I've run it on my own directory of photographs.

  • So that works fine.

  • But obviously, they're trained to recognize certain objects.

  • So if I've got a cathedral photograph

  • and they haven't been trained to find a cathedral,

  • it's not going to find it.

  • It may say that it's a porcupine or something.

  • Porcupine or a dog, yes, because it does do probabilities.

  • This is what it thinks it is.

  • So how is it actually doing the parallel processing?

  • It's simply doing a ping pong, alternating between the sticks,

  • as all three sticks are loaded with the same detection

  • algorithms.

  • I've seen other demos where the controlling program

  • polls the sticks and looks for a free stick to use.

  • That's a better algorithm to use, for example, if the time

  • to recognize something is indeterminate,

  • from, say, a live video feed.
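
Both dispatch strategies are easy to picture in code. The sketch below is a generic illustration, not the App Zoo's implementation: run_inference() stands in for the per-stick LoadTensor/GetResult calls, and the stick handles are placeholders.

```python
# Two ways to spread inferences across multiple Compute Sticks.
import itertools
import queue

def run_inference(stick, image):
    # Placeholder for the per-stick NCS inference call.
    return "label"

sticks = ["stick0", "stick1"]                 # handles for the opened devices

# 1) Ping-pong / round-robin: alternate the sticks in a fixed order.
#    Fine when every inference takes roughly the same time.
round_robin = itertools.cycle(sticks)

def classify_round_robin(image):
    return run_inference(next(round_robin), image)

# 2) Free-stick pool: take whichever stick is idle, return it when done.
#    Better when inference time is unpredictable, e.g. a live video feed.
free_sticks = queue.Queue()
for s in sticks:
    free_sticks.put(s)

def classify_from_pool(image):
    stick = free_sticks.get()                 # blocks until a stick is free
    try:
        return run_inference(stick, image)
    finally:
        free_sticks.put(stick)
```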

  • OK.

  • Is there a limit to how many sticks you can use in parallel?

  • Not really.

  • A USB 3 port can support four.

  • I believe that's the bandwidth limit on the port.

  • We've got a photograph we can show you on the screen

  • right now; it's got eight.

  • It's a bit of a problem to move.

  • It has an external PCI to USB card running on an extender.

  • So it's unwieldy, but you can use it.

  • OK.

  • So you mentioned that this is a development platform.

  • Now, are there any kind of production versions

  • of this thing out there?

  • There are.

  • It's shipping in the Google Clips camera and the DJI drone.

  • So there's the chip itself.

  • There's also a hardware developer kit,

  • which is a bit more like a regular PCB for doing

  • some software development.

  • I think this is a great platform for developers that want

  • to get into computer vision.

  • And thank you so much for coming down here and showing it

  • off to us.

  • Well, I've enjoyed showing it off.

  • Thanks very much for inviting me.

  • In February, Intel Technologies will

  • help fans experience the Olympic Winter Games PyeongChang

  • 2018 with the latest innovations in virtual reality,

  • 5G, drones, and gaming.

  • Check out the links to learn about Intel

  • at the Olympic games.

  • Thanks for watching the IoT Developers Show.

  • Don't forget to like and subscribe.

  • And remember to check out the links provided to learn more.

  • Thanks, guys.

  • [INTEL THEME SOUND]

  • [MUSIC PLAYING]
