Ridge regression is an extension of least squares in which we impose a penalty on the size of the regression coefficients. This results in the shrinkage of the estimated coefficients towards 0 (biasing the estimators), and a reduction in their variance. Adding the squared $l_2$ norm also has a nice...
[Read More]
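As a quick taste of the shrinkage described above, here is a minimal sketch (with made-up toy data) comparing the closed-form ordinary least squares and ridge solutions; with a positive penalty `lam`, the ridge coefficients always have a smaller $l_2$ norm:

```python
import numpy as np

# Hypothetical toy data: 5 observations, 2 features
X = np.array([[1.0, 2.0],
              [2.0, 0.5],
              [3.0, 1.5],
              [4.0, 3.0],
              [5.0, 2.5]])
y = np.array([2.0, 2.5, 4.0, 6.0, 7.0])

lam = 10.0  # penalty strength (assumed value for illustration)

# Ordinary least squares: solve (X^T X) w = X^T y
w_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge: solve (X^T X + lam * I) w = X^T y -- coefficients shrink toward 0
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

print(np.linalg.norm(w_ridge) < np.linalg.norm(w_ols))  # True
```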

## Scalar-by-vector derivatives

### The vector calculus needed for deriving the least squares solution

Here we are taking a break from the last post and going to go through, step by step, the solution to:
[Read More]

## Linear regression for dummies

### Deriving the least squares solution step-by-step

In this post I’m going to assume you know the basics of linear algebra. If you are unfamiliar with matrix-vector multiplication, this video is all you need. (Also check out the other videos in that Linear Algebra review playlist for bonus points.) Stage setting: Okay, so last post we...
[Read More]
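For readers who want the punchline before the step-by-step derivation in the full post: minimizing the squared error leads to the standard closed-form least squares solution,

$$\hat{\beta} = \arg\min_{\beta} \|y - X\beta\|_2^2 = (X^T X)^{-1} X^T y,$$

which holds whenever $X^T X$ is invertible.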

## Lines of best fit!

### Plotting lines of best-fit using python

In this post we are going to go through fitting a line of best fit using python. If you just want the python code, feel free to read only the first section.
[Read More]
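If you just want a one-liner to start from, a minimal sketch of fitting a line of best fit in python (with hypothetical noisy data along $y = 2x + 1$) uses `numpy.polyfit` with degree 1:

```python
import numpy as np

# Hypothetical noisy data roughly following y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Fit a degree-1 polynomial (a line); returns [slope, intercept]
slope, intercept = np.polyfit(x, y, 1)
print(slope, intercept)  # roughly 2 and 1
```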