Commit 1e36fb4

Update GeneralMLPrep.md
1 parent 5c1272a commit 1e36fb4

File tree

1 file changed: +29 -1 lines changed

DataScience/GeneralMLPrep.md

Lines changed: 29 additions & 1 deletion
@@ -99,4 +99,32 @@ Advantages:

Disadvantages:
=================
* May not significantly improve performance if base learners are not diverse.

Boosting
====================================
* This is an ensemble technique aimed at improving the accuracy and stability of ML models.
* It is done by combining weak learners (models that perform slightly better than random chance) to create a strong learner.
* The strong learner is built over iterations, with each iteration focusing on misclassified instances.

How Boosting Works:
===============
* Sequential Learning: Models are trained sequentially, where each new model focuses on correcting errors made by previous models.
* Weight Adjustment: Misclassified instances are given higher weights so that subsequent models pay more attention to them.
* Final Prediction: Combine predictions from all models, typically using weighted voting or averaging (a minimal sketch follows this list).
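
A from-scratch sketch of the loop above, in the style of discrete AdaBoost: decision stumps from scikit-learn act as the weak learners, sample weights are adjusted after each round, and the final prediction is a weighted vote. The dataset, number of rounds, and stump depth are illustrative assumptions, not part of the original notes.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Toy binary problem with labels recoded to {-1, +1}.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
y = np.where(y == 0, -1, 1)

n_rounds = 50
sample_weights = np.full(len(X), 1 / len(X))  # start with uniform weights
learners, alphas = [], []

for _ in range(n_rounds):
    # Weak learner: a decision stump trained on the current weights.
    stump = DecisionTreeClassifier(max_depth=1)
    stump.fit(X, y, sample_weight=sample_weights)
    pred = stump.predict(X)

    # Weighted error and this learner's vote (alpha).
    err = np.sum(sample_weights * (pred != y)) / np.sum(sample_weights)
    err = np.clip(err, 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)

    # Weight adjustment: misclassified points get larger weights.
    sample_weights *= np.exp(-alpha * y * pred)
    sample_weights /= sample_weights.sum()

    learners.append(stump)
    alphas.append(alpha)

# Final prediction: weighted vote over all weak learners.
scores = sum(a * l.predict(X) for a, l in zip(alphas, learners))
ensemble_pred = np.sign(scores)
print("training accuracy:", np.mean(ensemble_pred == y))
```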

Popular Boosting Algorithms:
==========
* AdaBoost
* Gradient Boosting
* XGBoost
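
All three algorithms above are available as library implementations. A minimal usage sketch, assuming scikit-learn for the first two; XGBoost comes from the separate `xgboost` package, and the parameter values here are placeholders rather than tuned settings.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AdaBoost: reweights misclassified samples each round.
ada = AdaBoostClassifier(n_estimators=100).fit(X_train, y_train)

# Gradient Boosting: each new tree fits the residual errors of the current ensemble.
gbm = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1).fit(X_train, y_train)

print("AdaBoost accuracy:", ada.score(X_test, y_test))
print("Gradient Boosting accuracy:", gbm.score(X_test, y_test))

# XGBoost exposes a scikit-learn compatible API from the separate `xgboost` package:
# from xgboost import XGBClassifier
# xgb = XGBClassifier(n_estimators=100, learning_rate=0.1).fit(X_train, y_train)
```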

Advantages:
==========
* Often achieves high accuracy and performs well even with limited data.
* Can handle various types of data and relationships.

Disadvantages:
=================
* More prone to overfitting than bagging if not carefully tuned.
* Requires careful tuning of parameters (a tuning sketch follows this list).
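
Since both disadvantages come down to hyperparameter tuning, a cross-validated grid search is one common mitigation. The grid below is an illustrative assumption covering the knobs that most affect overfitting (number of trees, learning rate, tree depth), not a recommendation from the original notes.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, random_state=0)

# Small illustrative grid over the main regularization-related parameters.
param_grid = {
    "n_estimators": [100, 300],
    "learning_rate": [0.01, 0.1],
    "max_depth": [2, 3],
}

search = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid,
    cv=5,  # 5-fold cross-validation guards against tuning to a single split
    scoring="accuracy",
)
search.fit(X, y)

print("best params:", search.best_params_)
print("best CV accuracy:", round(search.best_score_, 3))
```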
