Ensembles in Machine Learning
Learn about ensemble learning techniques such as bagging, boosting, and stacking, which combine the predictions of multiple models to achieve better predictive performance than a single model typically reaches on its own.
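The core idea is easiest to see in code. Below is a minimal sketch, assuming scikit-learn and its built-in wine dataset, that compares a single decision tree with a bagged ensemble of trees; the datasets and hyperparameters used in the actual practices may differ.

```python
# Minimal bagging sketch (assumed setup: scikit-learn, wine dataset).
from sklearn.datasets import load_wine
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Baseline: a single decision tree.
tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

# Bagging: train many trees on bootstrap samples and aggregate their votes.
# The default base estimator of BaggingClassifier is a decision tree.
bagging = BaggingClassifier(n_estimators=50, random_state=42).fit(X_train, y_train)

print("Single tree accuracy:", tree.score(X_test, y_test))
print("Bagging accuracy:    ", bagging.score(X_test, y_test))
```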
Lessons and practices
Adjust the Number of Estimators
Train and Evaluate Bagging Classifier
Optimize Bagging Classifier for Wine Classification
Enhance Your Bagging Classifier
Adjusting Random Forest Tree Depth
Complete the Random Forest Classifier for the Wine Dataset
Improving Random Forest for Wine Classification
Evaluate Random Forest Accuracy with Varying Depths
Change the Weak Classifier in AdaBoost
Train and Predict with AdaBoost
AdaBoost vs. Random Forest
Adjust Gradient Boosting Estimators
Complete the Gradient Boosting Setup for Digit Classification
Gradient Boosting vs. AdaBoost on Synthetic Data
Comparing Model Efficiency
Gradient Boosting with Varying Estimators
Change Meta-Model to Gradient Boosting
Change the Meta-Model in Stacking Classifier
Complete the Stacking Classifier
Tune the Stacking Classifier
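The sketches below illustrate, under stated assumptions, the kind of scikit-learn code these practices exercise; the lessons themselves may use different datasets or hyperparameters.

The random forest practices center on tree depth and the wine dataset; here is a minimal sketch with an illustrative set of depths.

```python
# Minimal random forest sketch (assumed setup: wine dataset, illustrative depths).
from sklearn.datasets import load_wine
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Evaluate how limiting tree depth changes test accuracy.
for max_depth in (1, 2, 4, 8, None):
    forest = RandomForestClassifier(
        n_estimators=100, max_depth=max_depth, random_state=42
    ).fit(X_train, y_train)
    print(f"max_depth={max_depth}: accuracy={forest.score(X_test, y_test):.3f}")
```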
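The AdaBoost practices swap out the weak classifier; a minimal sketch, assuming decision trees of depth 1 and 2 as weak learners and the wine dataset (the `estimator` parameter name assumes scikit-learn 1.2 or later; older releases call it `base_estimator`).

```python
# Minimal AdaBoost sketch: change the weak learner's depth and compare accuracy.
from sklearn.datasets import load_wine
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

for depth in (1, 2):
    # `estimator` is the weak learner that AdaBoost reweights and combines.
    ada = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=depth),
        n_estimators=100,
        random_state=42,
    ).fit(X_train, y_train)
    print(f"weak learner depth={depth}: accuracy={ada.score(X_test, y_test):.3f}")
```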
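The gradient boosting practices vary the number of estimators, with the digits dataset named in the titles; a minimal sketch with illustrative estimator counts.

```python
# Minimal gradient boosting sketch (assumed setup: digits dataset).
from sklearn.datasets import load_digits
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# More estimators usually help up to a point, at the cost of training time.
for n_estimators in (10, 50, 100):
    gb = GradientBoostingClassifier(
        n_estimators=n_estimators, random_state=42
    ).fit(X_train, y_train)
    print(f"n_estimators={n_estimators}: accuracy={gb.score(X_test, y_test):.3f}")
```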
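The stacking practices involve completing a stacking classifier and changing its meta-model; a minimal sketch, assuming a few common base learners and gradient boosting as the swapped-in meta-model.

```python
# Minimal stacking sketch: base learners feed their predictions to a meta-model.
from sklearn.datasets import load_wine
from sklearn.ensemble import (
    GradientBoostingClassifier,
    RandomForestClassifier,
    StackingClassifier,
)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

# Base learners are assumptions; any mix of diverse classifiers works.
base_learners = [
    ("rf", RandomForestClassifier(n_estimators=50, random_state=42)),
    ("knn", KNeighborsClassifier()),
    ("lr", make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))),
]

# final_estimator is the meta-model; swapping it (e.g. LogisticRegression vs.
# GradientBoostingClassifier) is the "change the meta-model" exercise.
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=GradientBoostingClassifier(random_state=42),
    cv=5,
).fit(X_train, y_train)

print("Stacking accuracy:", stack.score(X_test, y_test))
```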