Neural Network Backpropagation Derivation

I have spent a few days hand-rolling neural networks such as CNNs and RNNs. This post collects my notes on deriving backpropagation. Backpropagation is one of the more intricate algorithms in machine learning to derive. There are many resources that explain how to compute gradients with backpropagation, but in my opinion most of them lack a simple example that demonstrates the problem and walks through the algorithm.
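
As a taste of what the full derivation covers, here is a minimal sketch of my own (a toy example, not the network derived in the post): backpropagation through a one-hidden-layer network with a sigmoid activation and squared-error loss, written in plain NumPy.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data: 4 samples, 3 input features, 1 output.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# One hidden layer with 5 units, linear output layer.
W1 = rng.normal(size=(3, 5))
W2 = rng.normal(size=(5, 1))

# Forward pass.
z1 = X @ W1          # hidden pre-activation
a1 = sigmoid(z1)     # hidden activation
y_hat = a1 @ W2      # prediction
loss = 0.5 * np.mean((y_hat - y) ** 2)

# Backward pass: apply the chain rule layer by layer.
n = X.shape[0]
d_yhat = (y_hat - y) / n          # dL/dy_hat
dW2 = a1.T @ d_yhat               # dL/dW2
d_a1 = d_yhat @ W2.T              # dL/da1
d_z1 = d_a1 * a1 * (1.0 - a1)     # sigmoid'(z1) = a1 * (1 - a1)
dW1 = X.T @ d_z1                  # dL/dW1

print(loss, dW1.shape, dW2.shape)
```

A finite-difference check on a single weight is a good sanity test before scaling the same pattern up to CNNs or RNNs.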

Read more

Logistic Regression vs. SVM

Logistic Regression and SVM often give similar results. An SVM also takes longer to train than logistic regression, so it seems there is no obvious reason to use an SVM. In fact, logistic regression is the most frequently used algorithm in industry.

The reason logistic regression and an SVM often perform similarly is that the training data is frequently (close to) linearly separable. In that case, there is no need for a kernel to project the data into a higher-dimensional space in order to separate the classes.
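
As a quick illustration (my own sketch, assuming scikit-learn is available, not code from the full post), the snippet below fits both models on synthetic, linearly separable data; their test accuracies come out essentially the same.

```python
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

# Two well-separated Gaussian blobs: (nearly) linearly separable data.
X, y = make_blobs(n_samples=1000, centers=2, cluster_std=1.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Both models fit a linear decision boundary; they differ mainly in the loss
# (log loss vs. hinge loss), so their accuracies end up nearly identical here.
log_reg = LogisticRegression().fit(X_train, y_train)
svm = LinearSVC().fit(X_train, y_train)

print("logistic regression:", log_reg.score(X_test, y_test))
print("linear SVM:         ", svm.score(X_test, y_test))
```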

Read more