
  • Hi there, everybody!

  • My name is Magnus, and you're watching Coding TensorFlow, the show where you learn how to code in TensorFlow.

  • All right. In this episode, we'll talk about model overfitting and underfitting.

  • A common challenge when doing machine learning. And since there is quite a lot to cover, this material will be a two-episode video. In this first episode, we'll focus on getting the input data ready.

  • Remember the text classification episode of Coding TensorFlow, where we classified IMDB movie reviews?

  • If not, you should check it out now, using the links below.

  • Don't worry, it's okay, I'll wait for you here.

  • This is a diagram from that video.

  • The loss initially decreased on both the training and the validation data sets, but after a bit, the training loss would continue to go down while the validation loss would start to increase.

  • This is called overfitting.

  • The model becomes too specialized in solving for the training data and starts to perform worse when validated on the test data. You can see the model memorizes the answers in the training data set and does not generalize to the test data set.

  • The opposite situation is called underfitting, where the model does not have enough variables to solve the training data set.

  • And a more advanced model with more parameters and variables would perform better.

  • In this video, we'll use the IMDB data set, as we did in the text classification video, to classify whether movie reviews are good or bad.

  • But the important thing in this video is to explore what overfitting and underfitting mean.

  • All right, that's enough words.

  • Let's get started executing the code.

  • You'll find it below.

  • We start by checking out the licenses here at the top, then importing everything we need and printing the TensorFlow version.
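For reference, a setup cell along these lines would do what's described here; this is only a minimal sketch, and the exact imports in the notebook may differ:

```python
# Minimal setup sketch: import TensorFlow/Keras plus the helpers used later
# in this episode, then print the installed TensorFlow version.
import tensorflow as tf
from tensorflow import keras

import numpy as np
import matplotlib.pyplot as plt

print(tf.__version__)
```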

  • And it's totally okay if you get a later version printed here.

  • Of course. Now we load the data set into the train and test tuples. The num_words parameter specifies the maximum number of words we should consider.
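As a rough sketch (the variable names here are just illustrative), loading the data with the Keras IMDB helper looks something like this:

```python
# Keep only the 10,000 most frequent words across all reviews.
NUM_WORDS = 10000

# load_data returns two (reviews, labels) tuples: one for training, one for testing.
(train_data, train_labels), (test_data, test_labels) = keras.datasets.imdb.load_data(
    num_words=NUM_WORDS)
```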

  • Remember, from the text classification video, we don't actually get the movie reviews as English words, but rather as a set of numbers, where each number is the ID of a word.

  • Actually, ID number 1 denotes the word "the", and 2 is the word "and". The IMDB data set is also sorted by frequency, meaning that "the" is the most common word in the reviews.

  • And since we're loading 10,000 words, that means we're loading the 10,000 most common words across all the reviews for our model.

  • We want the input to be a multi-hot encoded array, which is achieved by this code snippet.

  • So what is this multi-hot encoded array, anyway?

  • Well, let's say we want to feed the model the sentence "the small cat".

  • Then we feed the model an array which has the value one for each word index present in the review, and where all other indexes are set to zero.
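A minimal sketch of such an encoding step, assuming NumPy and the train_data/test_data names from the loading sketch above, might look like this:

```python
def multi_hot_encode(sequences, dimension):
    """Turn each review (a list of word IDs) into a vector of 0s and 1s."""
    results = np.zeros((len(sequences), dimension))
    for i, word_indices in enumerate(sequences):
        results[i, word_indices] = 1.0  # set the position of every word ID in the review to 1
    return results

train_data = multi_hot_encode(train_data, dimension=NUM_WORDS)
test_data = multi_hot_encode(test_data, dimension=NUM_WORDS)
```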

  • So now you know why it's called a multi-hot encoding. Here we plot the first training data example, where the Y axis indicates the hot encoding and the X axis is the word ID.
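A plot like the one described can be produced with a couple of lines of matplotlib (again just a sketch, assuming the multi-hot encoded train_data from above):

```python
# Plot the multi-hot vector of the first training example:
# word ID on the x axis, 0/1 encoding on the y axis.
plt.plot(train_data[0])
plt.xlabel("word ID")
plt.ylabel("multi-hot encoding")
plt.show()
```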

  • You can see we mostly have words with low IDs, which makes sense, since these are the most common words used in movie reviews. And that's almost it for this episode.

  • But I will give you a little bit of homework while you wait for the next part.

  • Remember I said that number 1 maps to the word "the" and 2 to "and"?

  • How did I find out that these numbers map to these words?

  • As a hint, you should check out the following function, and as a second hint, you should check out the code in the text classification video as well.
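If you want a starting point, the Keras IMDB helper exposes a word-to-ID dictionary via get_word_index(); a small sketch of how you might begin:

```python
# Dictionary mapping words to their integer IDs, e.g. word_index["the"] == 1.
word_index = keras.datasets.imdb.get_word_index()

# Inverting it lets you look up which word a given ID stands for.
reverse_word_index = {value: key for (key, value) in word_index.items()}
```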

  • All right, that's it for this episode.

  • Now it's your turn to go out there and create some great models, and I will see you in the next episode.
