Udacity: Intro to Deep Learning with PyTorch:
Lesson 2: Introduction to Neural Networks
– Why “Neural Networks”?

– Perceptron Algorithm
— Coding the Perceptron Algorithm
Recall that the perceptron step works as follows (see the code sketch after this list). For a point with coordinates (p, q), label y, and prediction given by the equation ŷ = step(w1*p + w2*q + b):
- If the point is correctly classified, do nothing.
- If the point is classified positive, but it has a negative label, subtract αp and αq from w1 and w2 respectively, and subtract α from b (where α is the learning rate).
- If the point is classified negative, but it has a positive label, add αp and αq to w1 and w2 respectively, and add α to b.
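A minimal sketch of that update step in Python/NumPy; the function and variable names are my own, not the course notebook's:

import numpy as np

def perceptron_step(X, y, W, b, learn_rate=0.01):
    """One pass of the perceptron trick over all points.

    X: (n, 2) array of points, y: (n,) array of 0/1 labels,
    W: (2,) weight vector, b: scalar bias.
    """
    for i in range(len(X)):
        y_hat = 1 if np.dot(W, X[i]) + b >= 0 else 0  # step(w1*p + w2*q + b)
        if y[i] - y_hat == 1:
            # classified negative, but the label is positive: add
            W = W + learn_rate * X[i]
            b = b + learn_rate
        elif y[i] - y_hat == -1:
            # classified positive, but the label is negative: subtract
            W = W - learn_rate * X[i]
            b = b - learn_rate
        # correctly classified: do nothing
    return W, b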
– Error Function
An error function is simply a function that tells us how far we are from the solution.
– Log-loss Error Function
– Discrete vs Continuous Predictions


Day 22 of #60daysofudacity:
– The Softmax Function
The softmax function is the multi-class equivalent of the sigmoid activation function; it is used when the problem has 3 or more classes.
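A possible NumPy implementation (my own sketch, not the course's exact solution):

import numpy as np

def softmax(scores):
    """Turn a list of class scores into probabilities that sum to 1."""
    exp = np.exp(scores - np.max(scores))  # subtract the max for numerical stability
    return exp / exp.sum()

print(softmax([2.0, 1.0, 0.1]))  # roughly [0.66, 0.24, 0.10]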



– One-Hot Encoding
Day 23 of #60daysofudacity:
– Maximum Likelihood
– Cross-Entropy
There’s definitely a connection between probabilities and error functions, and it’s called Cross-Entropy. This concept is tremendously popular in many fields, including Machine Learning.
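A small sketch of the binary cross-entropy formula from the lesson, written in NumPy (my own code, assuming labels Y in {0, 1} and predicted probabilities P for the positive class):

import numpy as np

def cross_entropy(Y, P):
    """Binary cross-entropy: -sum( y*ln(p) + (1-y)*ln(1-p) ).

    Lower values mean the predicted probabilities agree better with the labels.
    """
    Y, P = np.asarray(Y, dtype=float), np.asarray(P, dtype=float)
    return -np.sum(Y * np.log(P) + (1 - Y) * np.log(1 - P))

print(cross_entropy([1, 1, 0], [0.8, 0.7, 0.1]))  # approximately 0.69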