From the course: Complete Guide to AI and Data Science for SQL Developers: From Beginner to Advanced
Dropping insignificant variables and re-creating the model - SQL Tutorial
- [Instructor] Welcome back! In this step, step 16, you'll drop insignificant variables and then re-create your model. But before we jump into the details, let's recall why you're here. In step 15, you created your linear regression model and thoroughly examined its performance. You used the ordinary least squares (OLS) method to build the model, analyzed metrics like R-squared and the F-statistic, and even broke down the coefficients and diagnostic statistics. Now, in step 16, you're taking a closer look at your model's coefficients to ensure that it's as accurate and reliable as possible. But why are you doing this? When you build a model to predict something like home prices, you want to be sure that each piece of information you put into the model actually matters. So you check whether each piece of information, like crime rate or air quality, is really important in predicting home prices. If something doesn't make a big…
Contents
- Creating the linear regression model and model summary: Part 1 (9m 33s)
- Creating the linear regression model and model summary: Part 2 (7m 16s)
- Creating the linear regression model and model summary: Part 3 (5m 33s)
- Dropping insignificant variables and re-creating the model (7m 57s)
- Checking assumptions for linear regression (3m 18s)
- Assumption 1: Checking for mean residuals (2m 47s)
- Assumption 2: Checking homoscedasticity (3m 13s)
- Assumption 3: Checking linearity (2m 12s)
- Assumption 4: Checking normality of error terms (3m 24s)
- Q-Q plot for checking the normality of error terms (3m 14s)
- Model performance comparison on train and test data (6m 7s)
- Applying cross-validation and evaluation (4m 40s)
- Challenge: Model building (48s)
- Solution: Model building (1m 16s)