Syllabus week! Tips for success.
Machine learning tasks. Probability and statistics review. Computing tips, tricks, and resources.
Introduction to supervised learning. Using linear regression for prediction. Estimating conditional means. Using lm(). Metrics for regression tasks. Data splits for model evaluation.
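The course fits models with R's lm(); purely as an illustrative sketch (not course code — the simulated data and all numbers below are invented for the example), the same fit-then-evaluate workflow (least-squares fit of a conditional mean, scored by RMSE on a held-out split) looks like this in Python with numpy:

```python
import numpy as np

rng = np.random.default_rng(42)

# simulate data where the true conditional mean is 2 + 3x
n = 200
x = rng.uniform(0, 1, n)
y = 2 + 3 * x + rng.normal(0, 0.5, n)

# 80/20 train-test split
idx = rng.permutation(n)
train, test = idx[:160], idx[160:]

# least-squares fit with an intercept column, analogous to lm(y ~ x)
X_train = np.column_stack([np.ones(train.size), x[train]])
beta, *_ = np.linalg.lstsq(X_train, y[train], rcond=None)

# evaluate with RMSE on the held-out split
X_test = np.column_stack([np.ones(test.size), x[test]])
pred = X_test @ beta
rmse = np.sqrt(np.mean((y[test] - pred) ** 2))
```

The estimated coefficients land near the true (2, 3), and the test RMSE is close to the noise standard deviation of 0.5.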
k-nearest neighbors. Decision trees. Parametric versus nonparametric models. Using categorical features and interactions.
Bias-variance tradeoff. Regression overview.
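As a hedged sketch of the nonparametric idea behind these weeks (again illustrative only, with invented data): k-nearest neighbors estimates the conditional mean by averaging the responses of the k closest training points, with k controlling the bias-variance tradeoff — small k gives low bias but high variance, large k the reverse.

```python
import numpy as np

def knn_predict(x_train, y_train, x_new, k=5):
    """Nonparametric regression: average y over the k nearest training points."""
    dists = np.abs(x_train - x_new)
    nearest = np.argsort(dists)[:k]
    return y_train[nearest].mean()

rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 100)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.1, 100)

# true conditional mean at x = 0.25 is sin(pi/2) = 1
pred = knn_predict(x, y, 0.25, k=5)
```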
Introduction to classification. Probability models and the Bayes Classifier. k-nearest neighbors and decision trees again.
Specifics and metrics for binary classification. Logistic regression.
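A minimal sketch of logistic regression (illustrative only — the simulated data, learning rate, and iteration count are assumptions, not course material): model P(Y = 1 | x) with the sigmoid of a linear function and fit the coefficients by gradient ascent on the log-likelihood, then score with accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# simulated binary labels with P(Y=1 | x) = sigmoid(-1 + 4x)
n = 500
x = rng.uniform(0, 1, n)
p = 1 / (1 + np.exp(-(-1 + 4 * x)))
y = (rng.uniform(size=n) < p).astype(float)

X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)

# gradient ascent on the log-likelihood
for _ in range(5000):
    prob = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (y - prob) / n

# classify with the usual 0.5 probability cutoff
pred = (1 / (1 + np.exp(-X @ beta)) > 0.5).astype(float)
accuracy = (pred == y).mean()
```

Note that accuracy is capped by the Bayes classifier here, since the labels themselves are noisy.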
Generative versus discriminative models. LDA, QDA, and Naive Bayes.
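The generative approach can be sketched in one dimension (an assumed toy setup, not course code): model the class-conditional densities, then classify by the larger density. With equal priors and equal variances, the LDA rule reduces to the midpoint between the estimated class means.

```python
import numpy as np

rng = np.random.default_rng(2)

# generative model: x | y=0 ~ N(0, 1), x | y=1 ~ N(2, 1), equal priors
n = 500
y = rng.integers(0, 2, n)
x = rng.normal(2.0 * y, 1.0)

# 1-D LDA with equal priors: estimate class means, classify by the
# midpoint decision boundary between them
mu0, mu1 = x[y == 0].mean(), x[y == 1].mean()
pred = (x > (mu0 + mu1) / 2).astype(int)
accuracy = (pred == y).mean()
```

With means 0 and 2 and unit variance, the Bayes accuracy is about 0.84, and the fitted rule gets close to it.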
Resampling methods. Model tuning using cross-validation.
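A hedged sketch of k-fold cross-validation (simulated data; the fold count and model are assumptions for illustration): partition the data into folds, hold each fold out in turn, fit on the rest, and average the held-out errors to estimate test error.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100
x = rng.uniform(0, 1, n)
y = 1 + 2 * x + rng.normal(0, 0.3, n)

# 5-fold cross-validation estimate of test MSE for a linear fit
folds = np.array_split(rng.permutation(n), 5)
mses = []
for fold in folds:
    train = np.setdiff1d(np.arange(n), fold)
    X_tr = np.column_stack([np.ones(train.size), x[train]])
    beta, *_ = np.linalg.lstsq(X_tr, y[train], rcond=None)
    X_te = np.column_stack([np.ones(fold.size), x[fold]])
    mses.append(np.mean((y[fold] - X_te @ beta) ** 2))

cv_mse = np.mean(mses)
```

The CV estimate lands near the irreducible noise variance (0.09 here), which is what a well-specified model should achieve.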
Regularization with ridge and lasso. Dimension reduction.
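Ridge regression has a closed form, sketched below on simulated data (an illustrative toy, with the intercept omitted for brevity): adding a penalty lam * I to X'X shrinks the coefficients toward zero, trading a little bias for less variance.

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:2] = [3, -2]          # only two features actually matter
y = X @ beta_true + rng.normal(0, 1, n)

def ridge(X, y, lam):
    """Closed-form ridge estimate: (X'X + lam I)^(-1) X'y."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

b_ols = ridge(X, y, 0.0)     # lam = 0 recovers ordinary least squares
b_ridge = ridge(X, y, 10.0)  # positive lam shrinks the coefficients
```

The norm of the ridge solution is strictly smaller than the OLS norm for any positive penalty, which is the shrinkage effect in action.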
Ensemble methods. Bagging, random forest, and boosting.
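Bagging can be sketched with the simplest possible tree, a one-split stump (everything here — data, number of resamples, the stump learner — is an invented illustration): fit the base learner on many bootstrap resamples and average their predictions, which reduces variance relative to a single fit.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 200
x = rng.uniform(0, 1, n)
y = (x > 0.5).astype(float) + rng.normal(0, 0.2, n)

def fit_stump(x, y):
    """One-split regression tree: pick the threshold minimizing squared error."""
    best = (np.inf, None)
    for t in np.quantile(x, np.linspace(0.1, 0.9, 17)):
        left, right = y[x <= t], y[x > t]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best[0]:
            best = (sse, (t, left.mean(), right.mean()))
    return best[1]

def stump_predict(stump, x_new):
    t, lo, hi = stump
    return lo if x_new <= t else hi

# bagging: fit stumps on bootstrap resamples, then average their predictions
stumps = []
for _ in range(50):
    idx = rng.integers(0, n, n)          # bootstrap sample (draw n with replacement)
    stumps.append(fit_stump(x[idx], y[idx]))

bagged = np.mean([stump_predict(s, 0.9) for s in stumps])
```

Random forests add one more idea on top of this: each split also considers only a random subset of features, which decorrelates the trees.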
Not much happened this week.
Using machine learning for data analysis.
Work on analyses!