TensorFlow 2.0: An Overview (#AskTensorFlow)

  • ♪ (music) ♪

  • And we're here for Ask TensorFlow.

  • I'm Paige Bailey, a TensorFlow Developer Advocate,

  • and I'm here today with Alex Passos.

  • I'm an engineer on the TensorFlow team.

  • Excellent. And we're very excited to answer

  • all of your TensorFlow questions.

  • So our first question is from Prince Canuma who asked,

  • "What are the major improvements to TensorFlow Core?"

  • as part of TensorFlow 2.0.

  • Do you want to take that one, Alex?

  • Yeah, thanks for the question, Prince.

  • I'm very happy to talk about this.

  • We're changing a lot of things in TensorFlow with TensorFlow 2.0

  • to make it much, much easier to use.

  • We're improving the high-level APIs at launch,

  • we're integrating them with Keras,

  • we have a new high-level API for distributed TensorFlow

  • that we call Distribution Strategy

  • that lets you write your code once

  • and then change how it runs on multiple devices,

  • multiple CPUs, multiple GPUs,

  • even multiple TPUs, if you want.
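
A minimal sketch of the "write your code once, change how it runs" idea just described, assuming the TF 2.0 distribution strategy API (tf.distribute.MirroredStrategy, which replicates a model across the local GPUs); the tiny model is purely illustrative:

```python
import tensorflow as tf

# MirroredStrategy mirrors the model across all local GPUs (falling back
# to CPU if none are available). Swapping in a different strategy, e.g.
# one for TPUs, changes where the code runs, not the model code itself.
strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Any model-building code goes here; this one is a throwaway example.
    model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
    model.compile(optimizer='sgd', loss='mse')
```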

  • Eager execution is turned on by default,

  • which means that interacting with TensorFlow

  • line by line in an IPython notebook

  • or a Colab, or Jupyter is a lot friendlier.

  • Also, debugging is much easier.
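
With eager execution on by default, operations run the moment you write them, which is what makes line-by-line work in a notebook (and debugging) friendlier. A quick illustration:

```python
import tensorflow as tf

x = tf.constant([[1.0, 2.0], [3.0, 4.0]])
y = tf.matmul(x, x)

print(y)          # values print immediately; no Session or graph needed
print(y.numpy())  # tensors convert straight to NumPy for inspection
```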

  • And we're also planning on building out a lot of additional documentation

  • and tutorials and resources for TensorFlow 2.0,

  • so very much focusing on making it easier than ever to get started.

  • Yeah, if you haven't seen the tensorflow.org website,

  • we completely changed it for this 2.0 process.

  • It's much cleaner.

  • All sorts of cool, useful content.

  • There are places you can just click and run the code.

  • You should check it out.

  • Absolutely. Awesome!

  • So our second question is from Katayoun who asked,

  • "Is there XGBoost-like functionality within TensorFlow?"

  • and I am certain that there is.

  • That's a great question.

  • We have gradient boosted trees, right?

  • Yeah, and the gradient boosted trees that were in TensorFlow

  • contrib in 1.x are now in Core TensorFlow.

  • Excellent, and they're part of the Estimators API, I believe?

  • - Yes. - Yes, awesome.

  • So if you want to use boosted trees for all of your machine learning projects,

  • you're absolutely able to.
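
For the curious, here is a minimal sketch of the boosted trees estimator now in core TensorFlow (tf.estimator.BoostedTreesClassifier); the toy feature, labels, and hyperparameters below are invented for illustration:

```python
import numpy as np
import tensorflow as tf

# Boosted trees estimators consume bucketized (or categorical) features.
feature = tf.feature_column.bucketized_column(
    tf.feature_column.numeric_column('x'),
    boundaries=[-1.0, 0.0, 1.0])

def train_input_fn():
    # Toy data: the label is simply the sign of x.
    x = np.random.randn(256).astype(np.float32)
    y = (x > 0).astype(np.int32)
    ds = tf.data.Dataset.from_tensor_slices(({'x': x}, y))
    return ds.repeat().batch(256)  # whole toy dataset as one batch

classifier = tf.estimator.BoostedTreesClassifier(
    feature_columns=[feature],
    n_batches_per_layer=1,  # batches examined when growing each tree layer
    n_trees=50,
    max_depth=4)
classifier.train(train_input_fn, max_steps=100)
```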

  • And the nice thing too about TensorFlow

  • is that it isn't just about deep learning,

  • you can use traditional methods as well.

  • So, while other frameworks might be focused just on creating neural networks,

  • we give you the ability to do a lot of other things as well.

  • Yeah, we also have k-means, for example, as an estimator there.

  • Excellent.

  • So our next question is from amir who asked-- oh man--

  • he asked what's going to happen to tf.contrib.slim.

  • Do you want to take that one?

  • That is a tough question.

  • So, tf.contrib.slim was great when it came out,

  • but I think we have a better higher-level API in Core TensorFlow now.

  • So slim has been deprecated.

  • We'd really encourage you to take code that's using slim

  • and see if you can do it better and cleaner

  • using our new Keras higher level APIs.
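
As a rough sketch of that migration, here is a slim-style layer call next to an approximate tf.keras equivalent; the shapes and names are hypothetical:

```python
import tensorflow as tf

# TF 1.x with tf.contrib.slim (shown only for comparison):
#   import tensorflow.contrib.slim as slim
#   net = slim.conv2d(images, 64, [3, 3], scope='conv1')

# TF 2.0 with tf.keras (slim.conv2d defaulted to a ReLU activation):
conv1 = tf.keras.layers.Conv2D(64, (3, 3), activation='relu', name='conv1')
net = conv1(tf.zeros([1, 28, 28, 3]))  # dummy image batch to show the call
```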

  • If you want to keep using slim, it's open source software,

  • so you can fork it.

  • And slim relies on a lot of things that are core to TensorFlow 1.0

  • that we think should not move forward to TensorFlow 2.0.

  • However, we're not deleting the components that slim depends on,

  • we're putting them in the tf.compat.v1 package,

  • so you can still use slim, if you fork it onto your own project,

  • but we think there are better higher-level APIs

  • for you to use now, right?
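
A small sketch of what tf.compat.v1 looks like in practice; the variable name here is arbitrary:

```python
import tensorflow as tf

# 1.x-only behavior and symbols stay reachable under tf.compat.v1:
tf.compat.v1.disable_eager_execution()         # restore 1.x graph mode
w = tf.compat.v1.get_variable('w', shape=[3])  # 1.x-style variable creation
sess = tf.compat.v1.Session()                  # Sessions still work this way
```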

  • Absolutely, and another thing that we want to mention is,

  • as part of TensorFlow 2.0,

  • tf.contrib is being cut out completely,

  • and this is really for the betterment of 2.0 itself.

  • There was a lot of code that was placed in contrib

  • that was not supported, that was maybe working

  • in an earlier version of TensorFlow and then stopped working later.

  • So really, getting rid of contrib is a good idea.

  • But it's not that we're getting rid of--

  • contrib used to be this very useful channel

  • where you could easily get code into TensorFlow

  • with less care and overhead

  • than it takes to get code into the Core TensorFlow package.

  • This is a really important use case and we're not getting rid of it.

  • Now we have fully open source community-driven special interest groups

  • to which we contribute but we do not own

  • and there's SIG Addons, we have SIG IO,

  • we have many others where if you feel like

  • you still need a home for a TensorFlow related project,

  • odds are we can find a home for it somewhere in our special interest groups.
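
For example, ops that once lived in tf.contrib can often be found in the separately installed SIG Addons package (pip install tensorflow-addons); the image batch below is a dummy:

```python
import tensorflow as tf
import tensorflow_addons as tfa  # the SIG Addons package, installed separately

images = tf.zeros([1, 64, 64, 3])               # dummy batch of images
rotated = tfa.image.rotate(images, angles=0.5)  # was tf.contrib.image.rotate
```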

  • Absolutely, and there's also an RFC that details the fate of every endpoint

  • that was created as part of tf.contrib.

  • So a lot of it has been migrated to the Core API.

  • Some of it's been migrated to other repos.

  • So, for example, the tf.contrib.gan

  • is being migrated to its own repo.

  • And then also, some functionality,

  • like seq2seq, landed in /addons, and some image operations as well.

  • Yeah, a good rule of thumb

  • is that if you just run the TF 2.0 upgrade script on your code,

  • it will pick up any contrib symbols that have moved to Core

  • and will just rename them for you.

  • So if any contrib symbols are left in your code after that,

  • then you should probably search and see what the new home for that package is,

  • or in the case of slim, you might want to copy it into your code.
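
The upgrade script ships with the TF 2.0 pip package and runs from the command line, e.g. tf_upgrade_v2 --infile old_model.py --outfile new_model.py. A hypothetical before-and-after showing the kind of rename it performs:

```python
import tensorflow as tf

# Before (TF 1.x):
#   x = tf.random_uniform([2, 2])
#   sess = tf.Session()

# After running tf_upgrade_v2:
x = tf.random.uniform([2, 2])  # moved symbols get their new core names
sess = tf.compat.v1.Session()  # 1.x-only APIs are rewritten to tf.compat.v1
```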

  • So if you're a person who loves using contrib,

  • there are a lot of different ways to contribute.

  • That's an excellent question.

  • Thanks so much for it, amir.

  • So the next question is from Donstan, who asked,

  • "Is it possible to use TF 2.0 with Python 3.7?"

  • Yes! (laughs)

  • Excellent answer.

  • We have released pip packages, finally, for Python 3.7.

  • Finally. (laughs)

  • This is true facts.

  • And also, TensorFlow has signed the Python 3 statement,

  • which if you're not familiar, means that we will not be supporting

  • Python 2 as of January 1st, 2020,

  • and it's been a long time coming.

  • So definitely have Python 3 support,

  • definitely have Python 3.7 support, in particular, Donstan,

  • and we will not be supporting Python 2 as of January 1st, 2020.

  • Yeah, I'm so excited about getting to delete all the Python 2-related code

  • and being able to use all the cool new stuff with Python 3,

  • so thanks for the question, Donstan.

  • Excellent.

  • And our next one is from Siby, who asked,

  • "Is it better to use TF 2.0 alpha, or TF 1.13 in Google Colab?"

  • Wow, this is a good question.

  • I think you should really use both and probably by the time you see this,

  • we might have even released 1.14.

  • I encourage you to try 2.0 alpha

  • because we're trying really hard to make it much, much easier to use,

  • especially in an interactive environment like Colab.

  • But if you're importing modules

  • or using libraries that rely on TensorFlow 1.x

  • and have not been updated to TensorFlow 2.x,

  • then 1.13 will keep working.

  • And you know, in Colab, you can just,

  • !pip install whatever version of TensorFlow you want.

  • Yep, and I believe 1.13 might be the default right now,

  • but, as Alex mentioned, pip installing 2.0,

  • whether it's the alpha or an earlier release--

  • though we recommend the alpha (laughs)--

  • any package that you choose

  • is perfectly capable of working in Google Colab.
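
For example, a Colab cell like the following pins a specific version; the version strings are the ones current around the time of this video, so treat them as placeholders:

```python
# Run in a Colab cell; the leading "!" executes a shell command.
!pip install tensorflow==2.0.0-alpha0  # or, e.g., tensorflow==1.13.1

import tensorflow as tf
print(tf.__version__)  # confirm which version was installed
```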

  • So, for the next question, Kris asked,

  • "How does TF 2.0 work with Keras?"

  • That's an excellent question, Kris.

  • So we talked about this a little bit before,

  • but Keras is really a first-class citizen in TensorFlow 2.0.

  • If you're familiar with the standalone version of Keras,

  • we've integrated much of that functionality

  • into something called tf.keras,

  • so it ships directly with TensorFlow itself.

  • There's not a one-to-one correspondence,

  • but most of the stuff that you love to use

  • is included as part of the TensorFlow package.
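
A minimal tf.keras sketch to make that concrete; the layer sizes and loss are arbitrary:

```python
import tensorflow as tf

# Keras ships inside TensorFlow itself as tf.keras.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation='relu', input_shape=(10,)),
    tf.keras.layers.Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])
# model.fit(features, labels, epochs=5) would then train as usual.
```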

  • Yeah, we have all the useful layers, metrics, losses, optimizers,

  • really all the tools you need

  • to build your models and train your models,

  • and the cool thing about tf.keras is that it's integrated deeply

  • with distribution strategy, eager execution,

  • and other features that are new to TF 2.0,

  • so, it's a very nice, well-rounded experience

  • that we really encourage you to try.

  • In the future, I think, we'll probably externalize this

  • and move it to the original keras-team/keras,

  • but for now, the version that's integrated with TensorFlow 2.0

  • is higher performance, and easier to use.

  • Excellent, and we also know that the Keras community,

  • the standalone Keras community, is very strong and very enthusiastic.

  • So if you'd like to get involved with tf.keras development,

  • or if you have suggestions, we're starting a special interest group.

  • So take a look at the community section on the TensorFlow website,

  • sign up, and send us your questions.

  • Thanks for the question, Kris.

  • Thanks so much to everyone for your questions,

  • and if you have additional ones, we would love to hear them.

  • Just use the hashtag #AskTensorFlow on social media,

  • and we'll probably answer your question in the next video.

  • See you next time.

  • ♪ (music) ♪
