Johns Hopkins University

Advanced Linear Models for Data Science 1: Least Squares

Johns Hopkins University via Coursera

Overview

Welcome to Advanced Linear Models for Data Science Class 1: Least Squares. This class is an introduction to least squares from a linear algebraic and mathematical perspective. Before beginning the class, make sure that you have the following:

- A basic understanding of linear algebra and multivariate calculus.
- A basic understanding of statistics and regression models.
- At least a little familiarity with proof-based mathematics.
- Basic knowledge of the R programming language.

After taking this course, students will have a firm foundation in a linear algebraic treatment of regression modeling. This will greatly augment applied data scientists' general understanding of regression models.

Syllabus

  • Background
    • We cover some basic matrix algebra results that we will need throughout the class, including basic vector derivatives. In addition, we cover basic uses of matrices to create summary statistics from data, such as calculating and subtracting means from observations (centering) and calculating the variance; a short R sketch of these matrix operations follows the syllabus.
  • One and two parameter regression
    • In this module, we cover the basics of regression through the origin and linear regression. Regression through the origin is an interesting case, as one can build up all of multivariate regression with it (see the first regression sketch after the syllabus).
  • Linear regression
    • In this lecture, we focus on linear regression, the standard technique for investigating unconfounded linear relationships.
  • General least squares
    • We now move on to general least squares, where an arbitrary full rank design matrix is fit to a vector outcome (a worked sketch follows the syllabus).
  • Least squares examples
    • Here we give some canonical examples of linear models to relate them to techniques that you may already be using.
  • Bases and residuals
    • Here we cover a very useful kind of linear model: decomposing a signal into a basis expansion (illustrated in the last sketch below).
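
To make the matrix view of summary statistics in the Background module concrete, here is a minimal R sketch; the data vector and variable names are illustrative and not taken from the course:

    # Centering and variance via matrix operations (illustrative data).
    n <- 5
    x <- c(2, 4, 6, 8, 10)

    ones <- rep(1, n)
    H <- diag(n) - ones %*% t(ones) / n      # centering matrix I - 11'/n

    x_centered <- H %*% x                    # subtracts the mean from every observation
    s2 <- t(x) %*% H %*% x / (n - 1)         # sample variance as a quadratic form

    # Both agree with the usual R functions:
    all.equal(as.vector(x_centered), as.vector(scale(x, scale = FALSE)))
    all.equal(as.numeric(s2), var(x))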
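
For regression through the origin, the least squares slope has the closed form sum(x * y) / sum(x^2); the sketch below, on simulated data of our own rather than course material, checks the formula against R's lm():

    # Regression through the origin: beta_hat = <y, x> / <x, x> (simulated data).
    set.seed(1)
    x <- rnorm(100)
    y <- 2 * x + rnorm(100)

    beta_hat <- sum(y * x) / sum(x * x)      # closed-form slope, no intercept

    beta_hat
    coef(lm(y ~ x - 1))                      # same estimate from lm() without an intercept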
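
For general least squares with a full rank design matrix X, the estimate solves the normal equations, beta_hat = (X'X)^{-1} X'y; again a rough sketch on simulated data, compared with lm():

    # General least squares via the normal equations (simulated data).
    set.seed(2)
    n <- 100
    x1 <- rnorm(n)
    x2 <- rnorm(n)
    y  <- 1 + 2 * x1 - 0.5 * x2 + rnorm(n)

    X <- cbind(1, x1, x2)                        # design matrix with an intercept column
    beta_hat <- solve(t(X) %*% X, t(X) %*% y)    # (X'X)^{-1} X'y

    beta_hat
    coef(lm(y ~ x1 + x2))                        # matches the matrix solution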
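
Finally, a rough illustration of the bases and residuals idea: project a noisy signal onto a few sine/cosine basis columns by least squares and look at what is left over; the signal and basis here are assumptions made purely for illustration:

    # Decomposing a signal with a small Fourier-style basis (illustrative signal).
    set.seed(3)
    n <- 128
    tt <- seq(0, 1, length.out = n)
    y  <- sin(2 * pi * 3 * tt) + rnorm(n, sd = 0.3)

    B <- cbind(sin(2 * pi * 1 * tt), cos(2 * pi * 1 * tt),
               sin(2 * pi * 3 * tt), cos(2 * pi * 3 * tt))   # basis matrix

    fit <- lm(y ~ B - 1)     # least squares projection of y onto the basis
    res <- y - fitted(fit)   # residual: the part of the signal outside the basis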

Taught by

Brian Caffo

Reviews

5.0 rating, based on 1 Class Central review

4.5 rating at Coursera based on 184 ratings

  • Really nice course. Everything was explained clearly through linear algebra. Of course, there are some prerequisites that are essential for understanding the course. Also, the R coding examples were really interesting. Dr. Brian Caffo did a great job!
