There are usually several factors that influence an outcome, and we need to consider all of them when trying to predict that outcome using regression. This course explores the techniques and risks involved in using multiple features for regression.
Linear Regression Models: Multiple and Parsimonious Linear Regression
- Course Overview
- identify the reasons to use multiple features when performing a regression, and the technique involved in creating such a multiple regression model
- prepare a dataset containing multiple features to be used for training and evaluating a linear regression model
- configure, train, and evaluate a linear regression model that makes predictions from multiple input features
- create a dataset with multiple features in a form that can be fed to a neural network for training and validation
- define the architecture for a Keras sequential model and set the training parameters such as loss function and optimizer
- make predictions on the test data and examine the metrics to gauge the quality of the neural network model
- use Pandas and Seaborn to visualize correlations in a dataset and identify features that convey similar information
- identify the risks involved with multiple regression and the need to select features carefully
- apply the principle of parsimonious regression to rebuild the linear regression model and compare the results with the "kitchen sink" approach
- build a Keras model after selecting only the important features from a dataset
- encode categorical data as integers for ML algorithms, use Pandas and Seaborn to view correlations, and enumerate the risks involved
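The multiple-regression workflow in the objectives above (prepare a dataset with several features, train a linear regression model, and evaluate its predictions) can be sketched as follows. This is a minimal illustration using scikit-learn and synthetic data, not material from the course itself; the feature values and coefficients are invented for the example.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# Synthetic dataset: three input features with known (invented) coefficients.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Hold out a test set so the evaluation reflects unseen data.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Fit a regression model on all three features at once.
model = LinearRegression()
model.fit(X_train, y_train)

# Evaluate on the held-out data.
y_pred = model.predict(X_test)
print("R^2:", r2_score(y_test, y_pred))
print("MSE:", mean_squared_error(y_test, y_pred))
print("Learned coefficients:", model.coef_)
```

Because the synthetic target is nearly linear in the features, the learned coefficients should land close to the generating values (3.0, -2.0, 0.5).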
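The correlation-inspection objective can be illustrated with Pandas alone: compute the correlation matrix, then look for feature pairs with coefficients near 1, since they convey similar information. The feature names below (area, rooms, age) are hypothetical, and the data is synthetic; with Seaborn installed, the same matrix is typically drawn as a heatmap.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 300
area = rng.uniform(50, 200, n)              # hypothetical feature
rooms = area / 25 + rng.normal(0, 0.3, n)   # deliberately tracks area closely
age = rng.uniform(0, 50, n)                 # independent of the other two
df = pd.DataFrame({"area": area, "rooms": rooms, "age": age})

# Pairwise Pearson correlations; values near +/-1 flag redundant features.
corr = df.corr()
print(corr.round(2))

# With Seaborn installed, the matrix can be visualized as:
#   import seaborn as sns
#   sns.heatmap(corr, annot=True)
```

Here area and rooms are strongly correlated by construction, so a parsimonious model would keep only one of them.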
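The contrast between the "kitchen sink" approach and parsimonious regression can be sketched on synthetic data: one model uses every available column, including a near-duplicate feature and a pure-noise feature, while the other keeps only the informative one. This is an illustration of the principle, not the course's own dataset or code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(7)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)    # near-duplicate of x1 (collinear)
x3 = rng.normal(size=n)                     # irrelevant noise feature
y = 2.0 * x1 + rng.normal(scale=0.2, size=n)

X_all = np.column_stack([x1, x2, x3])       # "kitchen sink": every column
X_lean = x1.reshape(-1, 1)                  # parsimonious: informative feature only

for name, X in [("kitchen sink", X_all), ("parsimonious", X_lean)]:
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
    m = LinearRegression().fit(X_tr, y_tr)
    print(name, "test R^2:", round(r2_score(y_te, m.predict(X_te)), 3))
```

The parsimonious model matches the kitchen-sink fit while being simpler and avoiding the unstable coefficients that collinear features like x1 and x2 produce.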
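Finally, the categorical-encoding objective can be shown with a one-hot encoding in Pandas, which turns each category into its own 0/1 column so the model does not infer a spurious numeric ordering. The column names and values below are hypothetical.

```python
import pandas as pd

# Hypothetical dataset with one categorical column.
df = pd.DataFrame({
    "neighborhood": ["north", "south", "east", "north"],
    "price": [250, 180, 210, 260],
})

# One-hot encode the categorical column; each category becomes a 0/1 indicator.
encoded = pd.get_dummies(df, columns=["neighborhood"])
print(encoded)
```

An ordinal encoding (mapping categories to integers directly) is an alternative when the categories have a natural order; for unordered categories like these, one-hot encoding is usually safer.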
If you would like to provide feedback for this course, please e-mail the NICCS SO at NICCS@hq.dhs.gov.