
  • Hi, everybody.

  • And welcome to this episode of TensorFlow Meets.

  • On this episode, it's my privilege to have Chris Gottbrath from NVIDIA, and a guest from right here at Google, to talk about TensorRT.

  • Chris, tell us all about it.

  • So, TensorRT is our programmable inference accelerator.

  • It's basically software that we created to allow people who are building neural networks and artificial intelligence to run those networks in production, or in devices, with the full performance that GPUs can offer.

  • A lot of the software that's developed for artificial intelligence is designed for training, and there are optimizations we can apply for inference that get even more performance than would be appropriate for training.

  • Again, we created TensorRT to accomplish this goal of performance, because that's what people come to GPUs for.

  • And because it's going to be used in inference, we also wanted to focus on robustness, so we created it as a shared library.

  • It's a very compact, modular thing that you can build on top of, which is what we've done with the TensorFlow integration.

  • It can also be used in things like cars and other sorts of devices.

  • So it's not just been good for TensorRT and NVIDIA, right?

  • It's also been great for TensorFlow for us to be able to work with you.

  • Could you tell us all about it?

  • Sure. TensorFlow is a machine learning system, and as we improve our inference-time performance, we really want to give our users the best experience with TensorRT.

  • And so we really want users, without stepping outside of TensorFlow, within their inner development cycles, to be able to enjoy the benefit of TensorRT and just automatically get the performance benefit without much trouble.

  • It's easy for them to try it, and easy for them to fine-tune their models for better performance.

  • So we came to this design that basically gives the best of both worlds to developers: they can enjoy the full diversity and flexibility of TensorFlow while still enjoying the performance benefit of TensorRT.

  • So we came up with this design, started discussing it, and started the collaboration with NVIDIA.

  • It's actually really cool that you had that idea, because the original thought that we had for TensorRT was to think of it as a post-processing step.

  • I think it's a cool idea to really bring it into the development environment.

  • And as we've talked to some of the early folks who are looking at this, they really liked that change.

  • It's a very positive change for users.

  • So how did this all get started?

  • Well, I mean, users have been coming to us who are starting with TensorFlow, really from the very beginning with TensorRT.

  • So we knew that it was an important community to engage with, and NVIDIA and Google had been working together for a long time on GPU optimization in TensorFlow.

  • There was already an existing working relationship, and I think, actually, you suggested it early on, with a quick little architecture white paper: hey, wouldn't it be cool if... and we noodled on it.

  • We thought it actually sounded like a really good idea and got engaged.

  • Do you still have that white paper? Have you framed it somewhere?

  • It's on disk somewhere.

  • It's in Google Drive, I think, actually.

  • Okay, so it's kept secure then. Exactly.

  • So I'm a developer, and I want to start using this; I want to take advantage of it. What do I do? How do I get started?

  • So you'll start in TensorFlow itself.

  • So you've created a graph in TensorFlow.

  • And then, with the capabilities that have been added through this integration (I think starting with TensorFlow 1.7), you'll have your graph.

  • You'll be tuning it.

  • As was said, you get to the point: okay, now, what's my performance like on a V100?

  • So you go to a machine that has a V100, and you'll save the graph, which I think means freezing it.

  • That's right: freezing is the TensorFlow idiom for saving it.

  • You're still in TensorFlow, but now you have the graph set up to do inference.
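The "freezing" step mentioned here can be sketched with a toy graph: freezing replaces the trainable variable nodes with constant nodes holding their trained values, so the graph becomes self-contained for inference. This is a minimal illustration in plain Python with invented names, not the real TensorFlow API (in TF 1.x the actual call was `tf.graph_util.convert_variables_to_constants`):

```python
# Toy illustration of graph "freezing". The dict-based graph and the
# `freeze` helper are invented for this sketch; real TensorFlow freezing
# operates on GraphDef protos.

def freeze(graph, variable_values):
    """Return a copy of `graph` with every Variable node turned into a Const."""
    frozen = {}
    for name, node in graph.items():
        if node["op"] == "Variable":
            # Bake the trained value into the graph as a constant.
            frozen[name] = {"op": "Const", "value": variable_values[name], "inputs": []}
        else:
            frozen[name] = dict(node)
    return frozen

# A tiny "y = w * x" graph with one trainable weight.
graph = {
    "x": {"op": "Placeholder", "inputs": []},
    "w": {"op": "Variable", "inputs": []},
    "y": {"op": "Mul", "inputs": ["w", "x"]},
}

frozen = freeze(graph, {"w": 3.0})
print(frozen["w"])  # → {'op': 'Const', 'value': 3.0, 'inputs': []}
```

After this step no checkpoint is needed: the single frozen graph carries both structure and weights, which is what makes it ready to hand to an inference optimizer.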

  • Then there's a command that was added that allows you to do a graph transform.

  • And the graph transform will take the network and, as was said, it will find the places, the subgraphs of the whole graph, that TensorRT can handle, and it will actually convert each of those into, I think, a single kind of TensorRT op.

  • You can still look at it with TensorBoard or something like that, and you'll just see that the data input will go into the TensorRT op, and then you get whatever post-processing you might do.

  • And the cool thing is, that's a graph, right? So you're still in TensorFlow, and all the things you would normally do with graphs you can do. In particular, you can run inference on it right there, or you can save it and then run it in TensorFlow Serving or something like that later on.
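The graph transform described above can be sketched as follows: find each maximal run of TensorRT-supported ops and collapse it into a single engine op, leaving unsupported pre- and post-processing ops in TensorFlow. This is only a toy illustration with invented names (a list of dicts rather than a real GraphDef); the actual TF 1.7 entry point lived in the contrib TensorRT module and worked quite differently under the hood:

```python
# Toy sketch of the TF-TRT subgraph replacement. SUPPORTED, the node
# format, and `convert` are all invented for this illustration.

SUPPORTED = {"Conv2D", "BiasAdd", "Relu", "MatMul"}

def convert(nodes):
    """Replace each maximal run of supported ops with a single TRTEngineOp."""
    out, run = [], []
    for node in nodes:
        if node["op"] in SUPPORTED:
            run.append(node)
        else:
            if run:
                out.append({"op": "TRTEngineOp", "fused": [n["name"] for n in run]})
                run = []
            out.append(node)
    if run:
        out.append({"op": "TRTEngineOp", "fused": [n["name"] for n in run]})
    return out

pipeline = [
    {"name": "input", "op": "Placeholder"},   # data input stays in TensorFlow
    {"name": "conv",  "op": "Conv2D"},
    {"name": "bias",  "op": "BiasAdd"},
    {"name": "relu",  "op": "Relu"},
    {"name": "top_k", "op": "TopKV2"},        # post-processing stays in TensorFlow
]

print([n["op"] for n in convert(pipeline)])
# → ['Placeholder', 'TRTEngineOp', 'TopKV2']
```

This mirrors what you would see in TensorBoard after the transform: the input feeds one fused TensorRT op, followed by whatever post-processing the original graph had, and the result is still an ordinary graph you can run or serve.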

  • Cool.

  • And we're only really getting started, right?

  • Yes.

  • So this is a nice thing about this approach: it's the first step toward the grand vision and the design that we have in mind.

  • In its current form, it's very interesting that everything still happens within TensorFlow, and users can use the full flexibility and generality of the ops offered by TensorFlow in whatever way they want.

  • And still, for the subgraphs that TensorRT can support, we will give them the performance benefit.

  • In the current form you still need to do a graph transformation, but in the future our teams are going to work very closely together, so you won't even have to create a new graph. You'll just keep your graph, say that you want TensorRT enabled, and it will magically get the performance improvement, while you still enjoy the natural development experience of TensorFlow.

  • And that was one of the things, actually, wasn't it: at NVIDIA, your products run in multiple environments, right? There's automotive, there's embedded, there's cloud, and this just works across all of them.

  • Yes, it does, and it also simplifies and streamlines the data flow.

  • So, one of the things is that previous to this work, we had customers we would work with and help through this challenge: taking the data, they could export the TensorFlow graph into TensorRT, but in TensorFlow they were probably doing some pre-processing and some post-processing using the tools that the TensorFlow environment brings to them, and they would have to replace all of that with C code.

  • And if they can take advantage of the integration, then they can just use the existing code they've already written in TensorFlow, so it reduces a lot of friction while they're still able to apply the graph, the artificial intelligence model, that they created.

  • They can put it in a drone, or they can put it in a car, or in the cloud.

  • Cool.

  • So again, as a developer, there's a site at NVIDIA, right, for developers?

  • Yeah, there's a URL; it's basically our developer site, and it has a TensorRT page.

  • Okay, cool.

  • So we'll put the link in the description below.

  • So if you're a developer, go visit that link, and you can learn all about TensorRT and work through some scenarios.

  • You can get information there, and it's also just available in TensorFlow itself. So when you get TensorFlow 1.7...

  • Yeah, it'll be there.

  • Okay, I'm on that site every couple of weeks downloading my CUDA drivers anyway, so I'll check it out.

  • Thank you so much, Chris.

  • And thank you so much as well.

  • And thanks, everybody, for watching this episode of TensorFlow Meets.

  • If you've got any questions for me or any questions for my guests, please leave them in the comments below. We hope to see you in a future episode.

  • Thank you.


NVidia TensorRT: high-performance deep learning inference accelerator (TensorFlow Meets)

Published by 林宜悉 on January 14, 2021