DA-CHENG JUAN: Neural networks have emerged as an effective and promising approach to many machine learning tasks, including computer vision, language understanding, and classification in general.
In this video series, we are going to introduce you to a new learning framework called Neural Structured Learning, which enables neural nets to learn with structured signals to improve model quality and robustness.
I am Da-Cheng, and I'm going to be your guide.
You do not need to know a lot to get started, and we will be coding in Python. Don't worry if you have never used it. It's simple to understand, and you will be up and running in no time.
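If you would like to code along, a minimal setup sketch looks like this; Neural Structured Learning is distributed as the neural-structured-learning pip package on top of TensorFlow:

```python
# Minimal setup sketch for following along with this series.
# Install the framework first:
#   pip install neural-structured-learning
import tensorflow as tf
import neural_structured_learning as nsl
```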
So let's get started with a simple example.
Suppose you are creating a neural network to classify an image as a cat or a dog.
As this figure shows, the image is fed into the neural net, activating neurons layer by layer and forming several activation paths that determine whether this image is classified as a cat or a dog. Seems pretty straightforward, doesn't it?
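As a concrete reference, a minimal version of such a classifier might look like this in Keras; the input size and layer widths here are illustrative assumptions, not values from the video:

```python
# A minimal sketch of the cat-vs-dog classifier described above,
# assuming 32x32 RGB inputs; layer sizes are illustrative only.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(32, 32, 3)),
    tf.keras.layers.Conv2D(16, 3, activation='relu'),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(64, activation='relu'),    # second-to-last (embedding) layer
    tf.keras.layers.Dense(2, activation='softmax'),  # two classes: cat, dog
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
```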
What if I told you there are other similar images related to this input image? That is, there's actually a structure, for example, a graph representing the similarity among all these images.
And as you can see, all these images are English bulldogs.
So is it possible to make a neural net learn better with the whole structure in addition to just one image? And the answer is yes, through the Neural Structured Learning framework.
Neural Structured Learning jointly optimizes sample features and the structured signals that exist among samples in order to learn a better neural net.
Specifically, we now have two types
of input for a neural net.
The first input is the features of a training sample,
for example, the pixels of an image.
And the second input is the structure,
for example, the graph representing
the similarity among samples.
Both the features and the structure
will be fed into a neural net for training.
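As a hedged sketch of how these two inputs come together in the framework's API, you can wrap the base Keras model from before with NSL's graph regularization; the config values below are illustrative assumptions:

```python
# Hedged sketch: wrapping a base Keras model with NSL's graph regularization.
import neural_structured_learning as nsl

graph_reg_config = nsl.configs.make_graph_reg_config(
    max_neighbors=2,   # use up to 2 graph neighbors per training sample
    multiplier=0.1,    # weight of the graph regularization term in the loss
)
graph_model = nsl.keras.GraphRegularization(model, graph_reg_config)
graph_model.compile(optimizer='adam',
                    loss='sparse_categorical_crossentropy',
                    metrics=['accuracy'])
```

In practice, the training data fed to graph_model is a dictionary of named features that also contains the packed neighbor features, which is what the augmentation step described below produces.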
You may now have a question.
We know in a neural net the input features
are used to activate the neurons layer by layer
for making a classification.
But how do we use the structure to help a neural net learn?
The structure is used to regularize the training
of a neural network.
Don't worry if you are not familiar with this concept.
We are going to provide more details to you
about this whole training process.
Are you ready?
First, each training sample is augmented to include neighbor information from the given structure. Specifically, the neighbor information here refers to the features of a neighbor.
So we get a new training batch where
both the original training samples and their neighbors
are included.
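In the NSL toolkit, this packing can be done as a preprocessing step; here is a hedged sketch, where every file name is a hypothetical placeholder:

```python
# Hedged sketch: augmenting each labeled example with the features of its
# graph neighbors. All file paths are hypothetical placeholders.
import neural_structured_learning as nsl

nsl.tools.pack_nbrs(
    'train_examples.tfr',   # labeled training samples (TFRecord of tf.train.Example)
    '',                     # no separate unlabeled examples in this sketch
    'graph.tsv',            # similarity graph as tab-separated source/target/weight edges
    'nsl_train_data.tfr',   # output: samples packed together with neighbor features
    max_nbrs=2,             # keep at most 2 neighbors per sample
)
```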
Next, both the training sample and its neighbors
are fed into the neural net.
After the training sample is fed into the neural net, its features activate different neurons layer by layer, forming an embedding representation for this sample.
If you're not familiar with the concept of embedding layers, just think of it as a new representation formed by the second-to-last layer of a neural net.
The neighbor is processed in the same way.
So we will have an embedding representation for the neighbor
as well.
Then the difference between the sample's embedding and its neighbor's embedding is calculated and added to the final loss as a regularization term.
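To make that concrete, here is a minimal hand-rolled sketch of the loss; this is not the framework's internal implementation, and embedding_model (which exposes the second-to-last layer) and alpha are assumed names:

```python
# Minimal sketch of the graph-regularized loss described above, written by
# hand for clarity; NSL computes an equivalent term internally.
import tensorflow as tf

def graph_regularized_loss(model, embedding_model, x, x_nbr, y, alpha=0.1):
    # Supervised loss on the original training sample.
    predictions = model(x)
    supervised = tf.keras.losses.sparse_categorical_crossentropy(y, predictions)
    # Embeddings (second-to-last layer outputs) of the sample and its neighbor.
    emb = embedding_model(x)
    emb_nbr = embedding_model(x_nbr)
    # Regularization term: squared L2 distance between the two embeddings.
    neighbor_distance = tf.reduce_sum(tf.square(emb - emb_nbr), axis=-1)
    return tf.reduce_mean(supervised + alpha * neighbor_distance)
```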
So what is the intuition here?
By adding this regularization term, the neural network learns to preserve the similarity between a sample and its neighbor.
In other words, the neural net learns
to maintain the local structure of a sample
and its neighborhood.
By leveraging these structured signals, neural nets can learn with less labeled data and also be more robust.
We also provide several hands-on tutorials to guide you step by step through using the Neural Structured Learning framework.
In the next video of this series, we will take what you have learned and apply it to a language understanding problem: classifying the topic of a document.
You will find a tutorial for that in the description below
as well as more information on getting
started with neural structured learning.