Learn about ensemble methods and how to implement them from scratch. This course covers both the intuition behind and the from-scratch implementation of several ensemble methods, including Bagging, Random Forest, AdaBoost, Stacking, and Gradient Boosting Machines (the algorithm family behind libraries such as XGBoost), without relying on high-level machine-learning libraries.
Ensemble Predictions with Bagging and Decision Trees
Navigating the Data Cosmos with Bootstrapping and Prediction Functions
Implementing Bootstrapping and Prediction in Ensemble Learning
Predicting with Bagging and Decision Trees
Observing Bagging with Decision Trees in Action
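The bagging lessons above center on two steps: drawing bootstrap samples and aggregating the predictions of trees trained on them. A minimal sketch of that idea, using a depth-1 decision stump in place of a full decision tree for brevity (the class names, the toy dataset, and the stump base learner here are illustrative assumptions, not the course's own code):

```python
import random
from collections import Counter

def bootstrap_sample(X, y, rng):
    # Draw len(X) indices with replacement (the bootstrap)
    idx = [rng.randrange(len(X)) for _ in range(len(X))]
    return [X[i] for i in idx], [y[i] for i in idx]

def majority(labels, default=0):
    return Counter(labels).most_common(1)[0][0] if labels else default

class DecisionStump:
    # Depth-1 tree: one feature, one threshold, found by exhaustive search
    def fit(self, X, y):
        best_acc = -1.0
        overall = majority(y)
        for f in range(len(X[0])):
            for t in sorted(set(x[f] for x in X)):
                left = [y[i] for i in range(len(X)) if X[i][f] <= t]
                right = [y[i] for i in range(len(X)) if X[i][f] > t]
                lp, rp = majority(left, overall), majority(right, overall)
                acc = sum(
                    (lp if X[i][f] <= t else rp) == y[i] for i in range(len(X))
                ) / len(X)
                if acc > best_acc:
                    best_acc, self.f, self.t, self.lp, self.rp = acc, f, t, lp, rp
        return self

    def predict_one(self, x):
        return self.lp if x[self.f] <= self.t else self.rp

class BaggingClassifier:
    def __init__(self, n_estimators=15, seed=42):
        self.n_estimators, self.rng = n_estimators, random.Random(seed)

    def fit(self, X, y):
        # Train each base learner on its own bootstrap sample
        self.models = []
        for _ in range(self.n_estimators):
            Xb, yb = bootstrap_sample(X, y, self.rng)
            self.models.append(DecisionStump().fit(Xb, yb))
        return self

    def predict(self, X):
        # Aggregate by majority vote across the ensemble
        return [
            Counter(m.predict_one(x) for m in self.models).most_common(1)[0][0]
            for x in X
        ]

X = [[1], [2], [3], [10], [11], [12]]
y = [0, 0, 0, 1, 1, 1]
model = BaggingClassifier(n_estimators=15, seed=42).fit(X, y)
print(model.predict(X))
```

Swapping the stump for a full decision tree recovers the bagging-with-trees setup the lessons describe; only the base learner changes.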
Evaluating Random Forest Accuracy on Iris Dataset
Adjusting the Depth of Our RandomForest
Seeding the Forest: Random State Initialization
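The Random Forest lessons above add two ingredients on top of bagging: each split considers only a random subset of features, and a random state seeds the whole process so runs are reproducible. A minimal sketch of both ideas (the helper name `random_feature_subset` and the sqrt heuristic shown are illustrative assumptions; 4 features stands in for the Iris dataset's four measurements):

```python
import math
import random

def random_feature_subset(n_features, rng):
    # Random forests typically consider only ~sqrt(n_features)
    # candidate features at each split, decorrelating the trees
    k = max(1, int(math.sqrt(n_features)))
    return sorted(rng.sample(range(n_features), k))

# Seeding: the same random_state reproduces the same sequence of
# bootstrap samples and feature subsets, hence the same forest
rng_a = random.Random(42)
rng_b = random.Random(42)
subsets_a = [random_feature_subset(4, rng_a) for _ in range(5)]
subsets_b = [random_feature_subset(4, rng_b) for _ in range(5)]
print(subsets_a)
```

The `max_depth` knob from the depth-adjustment lesson plugs into the tree-growing step itself: a shallower cap trades variance for bias, which is why accuracy on a held-out split of Iris shifts as the depth changes.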
AdaBoost Accuracy Demonstration
Tweaking the AdaBoost Learning Rate
Boosting the Weights in AdaBoost
AdaBoost Prediction Challenge
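The AdaBoost lessons above revolve around one update: compute the weak learner's weighted error, derive its vote weight alpha (scaled by the learning rate), and reweight the samples so misclassified points matter more in the next round. A minimal sketch of a single boosting round (the function name `adaboost_round` and the four-point example are illustrative assumptions):

```python
import math

def adaboost_round(weights, is_wrong, learning_rate=1.0):
    # Weighted error of the current weak learner
    err = sum(w for w, wrong in zip(weights, is_wrong) if wrong) / sum(weights)
    err = min(max(err, 1e-10), 1 - 1e-10)  # guard against log(0)
    # Learner's vote weight; the learning rate shrinks each learner's say
    alpha = learning_rate * 0.5 * math.log((1 - err) / err)
    # Boost the weights: misclassified points up, correct points down
    new_w = [w * math.exp(alpha if wrong else -alpha)
             for w, wrong in zip(weights, is_wrong)]
    total = sum(new_w)
    return alpha, [w / total for w in new_w]

# Four equally weighted points, one misclassified: err = 0.25
weights = [0.25, 0.25, 0.25, 0.25]
is_wrong = [True, False, False, False]
alpha, new_w = adaboost_round(weights, is_wrong)
# The misclassified point's weight rises from 0.25 to 0.5
print(round(alpha, 4), [round(w, 4) for w in new_w])
```

Halving the learning rate halves alpha here, so each weak learner pulls the ensemble less strongly, which is the trade-off the learning-rate lesson explores. The final prediction is the sign of the alpha-weighted sum of the weak learners' votes.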
Launching the Stacking Model into Orbit
Switching the Meta-Model in Stacking Ensemble
Stacking Ensemble: Combining Base Model Predictions
Assemble the Stacking Ensemble: Meta-Model Predictions
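The stacking lessons above hinge on one data-flow idea: the base models' predictions become the feature matrix on which a meta-model makes the final call, and that meta-model is swappable. A minimal sketch of that flow (the `ThresholdModel` base learners, the toy data, and the majority-vote meta-model are illustrative assumptions; the course swaps in trained models such as logistic regression as the meta-model):

```python
class ThresholdModel:
    # Trivial base model: thresholds a single feature
    def __init__(self, feature, threshold):
        self.f, self.t = feature, threshold

    def predict(self, X):
        return [1 if x[self.f] > self.t else 0 for x in X]

def make_meta_features(base_models, X):
    # Column j of the meta-feature matrix holds base model j's
    # prediction for each row of X
    preds = [m.predict(X) for m in base_models]
    return [list(row) for row in zip(*preds)]

class MajorityMeta:
    # Meta-model over base predictions; replacing this class is all
    # it takes to switch the meta-model in the stack
    def predict(self, meta_X):
        return [1 if sum(row) * 2 > len(row) else 0 for row in meta_X]

base_models = [ThresholdModel(0, 5), ThresholdModel(1, 5), ThresholdModel(0, 8)]
X = [[2, 9], [9, 9], [9, 2]]
meta_X = make_meta_features(base_models, X)
print(meta_X)                       # each row: one base prediction per model
print(MajorityMeta().predict(meta_X))
```

In practice the meta-model is fit on out-of-fold base predictions rather than in-sample ones, so it learns which base models to trust without leaking training labels.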