### Udacity: Intro to Deep Learning with PyTorch

### Lesson 2: Introduction to Neural Networks

#### – Why “Neural Networks”?

#### – Perceptron Algorithm

#### – Coding the Perceptron Algorithm

Recall that the perceptron step works as follows. For a point with coordinates $(p,q)$, label $y$, and prediction given by the equation $\hat{y} = \mathrm{step}(w_{1}x_{1} + w_{2}x_{2} + b)$:

- If the point is correctly classified, do nothing.
- If the point is classified positive, but it has a negative label, subtract $\alpha p$, $\alpha q$, and $\alpha$ from $w_{1}$, $w_{2}$, and $b$ respectively.
- If the point is classified negative, but it has a positive label, add $\alpha p$, $\alpha q$, and $\alpha$ to $w_{1}$, $w_{2}$, and $b$ respectively.
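The update rule above can be sketched in NumPy. This is a minimal illustration, not the course's exact solution code; the function names and the learning rate `learn_rate=0.01` are assumptions chosen for clarity.

```python
import numpy as np

def step(t):
    # Step activation: 1 if the weighted sum is non-negative, else 0.
    return 1 if t >= 0 else 0

def perceptron_step(X, y, W, b, learn_rate=0.01):
    """One pass of the perceptron update over all points.

    X: array of points (each row is one point), y: labels (0 or 1),
    W: weight vector, b: bias. Returns the updated (W, b).
    """
    for i in range(len(X)):
        y_hat = step(np.dot(X[i], W) + b)
        if y[i] - y_hat == 1:
            # Classified negative but label is positive: add alpha*point and alpha.
            W = W + learn_rate * X[i]
            b = b + learn_rate
        elif y[i] - y_hat == -1:
            # Classified positive but label is negative: subtract alpha*point and alpha.
            W = W - learn_rate * X[i]
            b = b - learn_rate
    return W, b
```

Repeating `perceptron_step` for several epochs moves the boundary line closer to misclassified points until (for linearly separable data) all points are classified correctly.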

#### – Error Function

An error function is simply something that tells us how far we are from the solution, so that minimizing it moves the model toward a better fit.

#### – Log-loss Error Function

#### – Discrete vs Continuous Predictions

#### – One-Hot Encoding

## Day 23 of #60daysofudacity:

#### – Maximum Likelihood

**Cross-Entropy**

There’s definitely a connection between probabilities and error functions, and it’s called **Cross-Entropy**. This concept is tremendously popular in many fields, including Machine Learning.
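The connection can be made concrete: for binary labels $y_i$ and predicted probabilities $p_i$ of the positive class, the cross-entropy is $-\sum_i \big(y_i \ln p_i + (1-y_i)\ln(1-p_i)\big)$, so assigning high probability to the correct labels yields a low error. A minimal NumPy sketch (the function name is illustrative):

```python
import numpy as np

def cross_entropy(y, p):
    """Binary cross-entropy between labels y (0/1) and predicted probabilities p."""
    y = np.asarray(y, dtype=float)
    p = np.asarray(p, dtype=float)
    # Sum the negative log-likelihood of each point's true label.
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
```

For example, `cross_entropy([1, 1, 0], [0.8, 0.7, 0.1])` is smaller than `cross_entropy([1, 1, 0], [0.5, 0.5, 0.5])`, reflecting that the first model is more confident about the correct labels.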