Step 1: Total samples = 500, so each sample starts with weight = 1/500.
Step 2: Train the first weak learner, a stump on Age: age < 20 → class Young, age >= 20 → class Adult.
Step 3: Some points are misclassified, e.g. a 17-year-old whose true label is Adult and a 23-year-old whose true label is Young.
Step 4: Error = weighted sum of the misclassified points' weights (say, error = 0.3 for illustration).
Step 5: Compute the model importance: each weak learner gets a vote weight alpha = (1/2) ln((1 − error)/error); with error = 0.3, alpha = (1/2) ln(0.7/0.3) ≈ 0.42.
Step 6: Update observations: correctly classified samples get w_i ← w_i · e^(−alpha), misclassified samples get w_i ← w_i · e^(+alpha), then normalize so the weights sum to 1.
Step 7: Train the next weak learner on the reweighted data, and repeat for the chosen number of iterations.
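The error, importance, and weight-update arithmetic above can be checked in a few lines of Python (the error value 0.3 is illustrative, not derived from real data):

```python
import math

# Worked numbers from the steps above (error = 0.3 is illustrative)
n_samples = 500
w = 1 / n_samples          # initial weight per sample
error = 0.3                # assumed weighted error of the first stump

# Model importance (vote weight): alpha = 1/2 * ln((1 - error) / error)
alpha = 0.5 * math.log((1 - error) / error)
print(round(alpha, 4))     # ≈ 0.4236

# Weight update before normalization
w_correct = w * math.exp(-alpha)   # correctly classified samples shrink
w_wrong = w * math.exp(+alpha)     # misclassified samples grow
print(w_wrong > w > w_correct)     # mistakes get more attention next round
```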
AdaBoost stands for Adaptive Boosting.
It's a boosting ensemble method that:
Combines many weak learners (usually shallow decision trees)
Builds them one after another
Each new model focuses more on the mistakes made by earlier models
Produces a strong final classifier through weighted voting
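As a quick sketch, this whole pipeline is available off the shelf in scikit-learn; the synthetic dataset and parameter values here are illustrative, not from the notes:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Illustrative synthetic dataset (not from the notes)
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The default weak learner is a depth-1 decision tree (a stump)
clf = AdaBoostClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))   # typically well above chance on this data
```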
Key Concept (In Very Simple Terms)
Start with equal weights on all training samples.
Train a weak learner (e.g., a decision stump = depth‑1 tree).
Increase the weights of misclassified samples.
Train the next learner with these updated weights.
Combine all learners using a weighted vote (stronger models get higher weights).
Result: A powerful model that focuses on hard‑to-classify points.
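The five steps above can also be sketched from scratch. This is a minimal illustrative implementation (not the notes' code) using 1-D threshold stumps and labels in {−1, +1}:

```python
import numpy as np

def adaboost_fit(X, y, n_rounds=10):
    """Minimal AdaBoost sketch with threshold stumps.
    X is a 1-D feature array; y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1 / n)                      # step 1: equal weights
    stumps = []
    for _ in range(n_rounds):
        # step 2: pick the (threshold, sign) stump with lowest weighted error
        best = None
        for thr in X:
            for sign in (1, -1):
                pred = np.where(X < thr, -sign, sign)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, sign, pred)
        err, thr, sign, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)  # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)  # this learner's vote weight
        w = w * np.exp(-alpha * y * pred)      # step 3: up-weight mistakes
        w = w / w.sum()                        # keep weights a distribution
        stumps.append((alpha, thr, sign))
    return stumps

def adaboost_predict(stumps, X):
    # step 5: weighted vote of all weak learners
    score = sum(a * np.where(X < thr, -s, s) for a, thr, s in stumps)
    return np.sign(score)

# Usage on a tiny age-like dataset: Young = -1, Adult = +1
ages = np.array([5.0, 10.0, 15.0, 25.0, 30.0, 40.0])
labels = np.array([-1, -1, -1, 1, 1, 1])
model = adaboost_fit(ages, labels, n_rounds=5)
print(adaboost_predict(model, ages))
```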
Visual Analogy
Imagine a teacher testing students:
First test → some students struggle.
Teacher focuses more on those weak areas → next test.
Again focuses on remaining weak topics → next test.
After many small tests, the teacher combines all scores → final understanding.
This is exactly AdaBoost’s strategy!
