Disadvantages:
==============
* May not significantly improve performance if base learners are not diverse.

Boosting
========
* An ensemble technique aimed at improving the accuracy and stability of ML models.
* It combines weak learners (models that perform only slightly better than random chance) into a single strong learner.
* The strong learner is built iteratively, with each iteration focusing on the instances misclassified so far.

How Boosting Works:
===================
* Sequential Learning: Models are trained one after another, with each new model focusing on correcting the errors made by the previous ones.
* Weight Adjustment: Misclassified instances are given higher weights so that subsequent models pay more attention to them.
* Final Prediction: Predictions from all models are combined, typically by weighted voting (classification) or weighted averaging (regression); see the sketch after this list.
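
A minimal from-scratch sketch of these three steps in their AdaBoost form (the toy dataset, the 20 rounds, and the use of depth-1 decision "stumps" as the weak learners are illustrative assumptions, not part of the notes above):

    # AdaBoost-style boosting for binary labels in {-1, +1}.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, random_state=0)
    y = np.where(y == 1, 1, -1)               # relabel to the {-1, +1} convention

    weights = np.full(len(X), 1 / len(X))     # start with uniform instance weights
    stumps, alphas = [], []

    for _ in range(20):                       # sequential learning: one weak learner per round
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=weights)
        pred = stump.predict(X)
        err = weights[pred != y].sum()        # weighted error of this round's learner
        alpha = 0.5 * np.log((1 - err) / (err + 1e-10))  # this learner's say in the vote

        # Weight adjustment: raise weights on misclassified instances, shrink the rest.
        weights *= np.exp(-alpha * y * pred)
        weights /= weights.sum()

        stumps.append(stump)
        alphas.append(alpha)

    # Final prediction: sign of the alpha-weighted vote over all weak learners.
    vote = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    print("training accuracy:", np.mean(np.sign(vote) == y))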

Popular Boosting Algorithms:
============================
* AdaBoost
* Gradient Boosting
* XGBoost
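
The first two have ready-made implementations in scikit-learn; XGBoost ships as a separate package with the same fit/predict interface. A quick sketch, assuming scikit-learn is installed (the model parameters and toy dataset are illustrative):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
    from sklearn.model_selection import cross_val_score

    X, y = make_classification(n_samples=1000, random_state=0)

    for model in (AdaBoostClassifier(n_estimators=100, random_state=0),
                  GradientBoostingClassifier(n_estimators=100, random_state=0)):
        score = cross_val_score(model, X, y, cv=5).mean()  # 5-fold CV accuracy
        print(type(model).__name__, round(score, 3))

    # XGBoost (separate install) follows the same pattern:
    #   from xgboost import XGBClassifier
    #   XGBClassifier(n_estimators=100).fit(X, y)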

Advantages:
===========
* Often achieves high accuracy and performs well even with limited data.
* Can handle many different types of data and capture complex, non-linear relationships.

Disadvantages:
==============
* More prone to overfitting than bagging if not carefully tuned.
* Requires careful tuning of hyperparameters such as the learning rate and the number of estimators (see the sketch below).
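
A small sketch of what that tuning can look like in practice, assuming scikit-learn (the grid values are illustrative):

    from sklearn.datasets import make_classification
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.model_selection import GridSearchCV

    X, y = make_classification(n_samples=500, random_state=0)

    # Search the two parameters that most affect over/under-fitting in boosting.
    grid = GridSearchCV(
        GradientBoostingClassifier(random_state=0),
        param_grid={"learning_rate": [0.01, 0.1, 0.3],
                    "n_estimators": [50, 100, 200]},
        cv=5,  # cross-validation guards against tuning to noise
    )
    grid.fit(X, y)
    print(grid.best_params_, round(grid.best_score_, 3))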