## Cost Function

Two types of classification problems:

- Binary classification: the labels y are either zero or one.
- Multiclass classification: there may be K distinct classes.
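For both cases, the cost function is a regularized cross-entropy summed over the K output units (with K = 1 in the binary case):

\[
J(\Theta) = -\frac{1}{m}\sum_{i=1}^{m}\sum_{k=1}^{K}\left[y_k^{(i)}\log\big(h_\Theta(x^{(i)})\big)_k + \big(1-y_k^{(i)}\big)\log\Big(1-\big(h_\Theta(x^{(i)})\big)_k\Big)\right] + \frac{\lambda}{2m}\sum_{l=1}^{L-1}\sum_{i=1}^{s_l}\sum_{j=1}^{s_{l+1}}\big(\Theta_{ji}^{(l)}\big)^2
\]

Here \(m\) is the number of training examples, \(L\) the number of layers, and \(s_l\) the number of units in layer \(l\); the bias weights are excluded from the regularization term.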

## Backpropagation Algorithm

Backpropagation computes the partial derivatives of the cost function \(J(\theta )\) with respect to every weight, by propagating error terms \(\delta\) backward from the output layer toward the input.
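As a rough sketch (my own minimal NumPy illustration, not the course's Octave code), backpropagation for a single-hidden-layer sigmoid network computes error terms \(\delta\) layer by layer and averages the accumulated gradients over the training set:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(Theta1, Theta2, X, Y):
    """Gradients of the (unregularized) cross-entropy cost for a
    3-layer network: input -> hidden (sigmoid) -> output (sigmoid).
    X is (m, n), Y is (m, K); Theta1, Theta2 include bias columns."""
    m = X.shape[0]
    # forward propagation
    a1 = np.hstack([np.ones((m, 1)), X])           # add bias unit
    z2 = a1 @ Theta1.T
    a2 = np.hstack([np.ones((m, 1)), sigmoid(z2)])
    a3 = sigmoid(a2 @ Theta2.T)                    # network output h
    # backward pass: error terms
    d3 = a3 - Y                                    # output-layer error
    d2 = (d3 @ Theta2[:, 1:]) * sigmoid(z2) * (1 - sigmoid(z2))
    # accumulate and average gradients
    Theta1_grad = d2.T @ a1 / m
    Theta2_grad = d3.T @ a2 / m
    return Theta1_grad, Theta2_grad
```

A regularized version would additionally add \(\frac{\lambda}{m}\Theta\) to the non-bias columns of each gradient.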

## Backpropagation Intuition

The error term \(\delta^{(l)}_j\) can be thought of as the "error" of unit \(j\) in layer \(l\): formally, it is the partial derivative of the cost with respect to that unit's weighted input \(z^{(l)}_j\).

## Implementation Note: Unrolling Parameters

Advanced optimizers expect the parameters as a single vector, so the weight matrices \(\Theta^{(1)}, \Theta^{(2)}, \dots\) are unrolled into one long vector before optimization and reshaped back into matrices inside the cost function.
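A minimal sketch of the unroll/reshape round trip, using hypothetical layer sizes chosen only for illustration (3 inputs, 5 hidden units, 2 outputs):

```python
import numpy as np

# Hypothetical layer sizes for illustration: 3 inputs, 5 hidden units, 2 outputs.
Theta1 = np.arange(20.0).reshape(5, 4)   # (hidden, inputs + bias)
Theta2 = np.arange(12.0).reshape(2, 6)   # (outputs, hidden + bias)

# Unroll: concatenate all weights into one long vector for the optimizer.
theta_vec = np.concatenate([Theta1.ravel(), Theta2.ravel()])

# Reshape back inside the cost function to run forward/back propagation.
T1 = theta_vec[:5 * 4].reshape(5, 4)
T2 = theta_vec[5 * 4:].reshape(2, 6)
```

The same pattern is applied to the gradient matrices on the way out of the cost function, so the optimizer only ever sees flat vectors.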

## Gradient Checking

Backpropagation has one unfortunate property: there are many ways to introduce subtle bugs into it, such that the implementation still appears to work when run with gradient descent or another optimization algorithm. The cost function \(J(\theta )\) may decrease on every iteration even though the implementation is buggy. You would end up with a neural network that has a higher level of error than a bug-free implementation would give, without ever knowing the subtle bug was there.

Gradient checking eliminates almost all of these problems: approximate each partial derivative numerically and compare it against the value computed by backpropagation.
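A minimal sketch of the two-sided finite-difference check (my own NumPy version; the lectures suggest \(\epsilon \approx 10^{-4}\)):

```python
import numpy as np

def numerical_gradient(J, theta, eps=1e-4):
    """Two-sided finite-difference approximation of dJ/dtheta,
    used to verify a backprop implementation:
    grad_i ~= (J(theta + eps*e_i) - J(theta - eps*e_i)) / (2*eps)."""
    grad = np.zeros_like(theta)
    for i in range(theta.size):
        perturb = np.zeros_like(theta)
        perturb.flat[i] = eps
        grad.flat[i] = (J(theta + perturb) - J(theta - perturb)) / (2 * eps)
    return grad
```

Compare this against the backprop gradient; if they agree to several decimal places, the implementation is likely correct. Then turn gradient checking off before training, since looping over every parameter is very slow.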

## Random Initialization

To train a neural network, randomly initialize the weights to small values close to 0, between \(-\epsilon\) and \(+\epsilon\). Initializing them all to zero would not work: every hidden unit would compute the same function of the input, and backpropagation would never break that symmetry.
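A minimal sketch of this initialization (the value \(\epsilon = 0.12\) is an assumption on my part, a commonly used default; any small positive \(\epsilon\) works):

```python
import numpy as np

def rand_initialize_weights(L_in, L_out, epsilon_init=0.12):
    """Initialize a weight matrix for a layer with L_in inputs (plus a
    bias column) and L_out outputs, drawn uniformly from
    [-epsilon_init, +epsilon_init] to break symmetry between units."""
    return np.random.rand(L_out, L_in + 1) * 2 * epsilon_init - epsilon_init
```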

## Putting It Together

How do you implement a neural network learning algorithm?

- Pick a network architecture, i.e., the connectivity pattern between the neurons.
- Once you decide on a fixed set of features x, the number of input units is determined by the dimension of your features x(i).
- The number of output units is determined by the number of classes in your classification problem.
- If you use more than one hidden layer, a reasonable default is to have the same number of hidden units in every layer.
- (*As for the number of hidden units: usually, the more hidden units the better.*)

What do we need to implement in order to train a neural network?


- Set up the neural network and randomly initialize the weights.
- Forward propagation.
- Compute the cost function \(J(\theta )\).
- Backpropagation.
- Gradient checking.
- Use an optimization algorithm.
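The steps above can be sketched end to end as one small NumPy program (my own illustration: plain batch gradient descent stands in for an advanced optimizer, and regularization and gradient checking are omitted to keep it short):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, Y, n_hidden=5, alpha=2.0, iters=5000, eps_init=0.12, seed=0):
    """Minimal one-hidden-layer sigmoid network trained with batch
    gradient descent; a sketch of the training steps listed above."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    K = Y.shape[1]
    # 1. randomly initialize weights in [-eps, +eps]
    T1 = rng.uniform(-eps_init, eps_init, (n_hidden, n + 1))
    T2 = rng.uniform(-eps_init, eps_init, (K, n_hidden + 1))
    for _ in range(iters):
        # 2. forward propagation
        a1 = np.hstack([np.ones((m, 1)), X])
        z2 = a1 @ T1.T
        a2 = np.hstack([np.ones((m, 1)), sigmoid(z2)])
        h = sigmoid(a2 @ T2.T)
        # 3. the cost J(theta) would be monitored here in practice
        # 4. backpropagation
        d3 = h - Y
        d2 = (d3 @ T2[:, 1:]) * sigmoid(z2) * (1 - sigmoid(z2))
        # 5./6. gradient checking is done once offline; then a
        # gradient-descent step updates the weights
        T1 -= alpha * (d2.T @ a1) / m
        T2 -= alpha * (d3.T @ a2) / m
    return T1, T2

def predict(T1, T2, X):
    m = X.shape[0]
    a1 = np.hstack([np.ones((m, 1)), X])
    a2 = np.hstack([np.ones((m, 1)), sigmoid(a1 @ T1.T)])
    return sigmoid(a2 @ T2.T)
```

In practice you would pass the unrolled gradients to an optimizer such as `scipy.optimize.minimize` instead of hand-rolling the update loop.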

## Autonomous Driving

A fun and historically important example of neural network learning.