Search and compare course prices, ratings, and reviews. Over 350 Design and Technology courses in one place!

Ensemble Machine Learning in Python: Random Forest, AdaBoost

(12 customer reviews)
Product is rated #80 in the Development category

What you’ll learn

  • Understand and derive the bias-variance decomposition
  • Understand the bootstrap method and its application to bagging
  • Understand why bagging improves classification and regression performance
  • Understand and implement Random Forest
  • Understand and implement AdaBoost

In recent years, we’ve seen a resurgence in AI, or artificial intelligence, and machine learning.

Machine learning has led to some amazing results, like being able to analyze medical images and predict diseases on par with human experts.

Google’s AlphaGo program was able to beat a world champion at the strategy game Go using deep reinforcement learning.

Machine learning is even being used to program self-driving cars, which is going to change the automotive industry forever. Imagine a world with drastically reduced car accidents, simply by removing the element of human error.

Google famously announced that it is now “machine learning first”; companies like NVIDIA and Amazon have followed suit, and this is what’s going to drive innovation in the coming years.

Machine learning is embedded into all sorts of different products, and it’s used in many industries, like finance, online advertising, medicine, and robotics.

It is a widely applicable tool that will benefit you no matter what industry you’re in, and it will also open up a ton of career opportunities once you get good.

Machine learning also raises some philosophical questions. Are we building a machine that can think? What does it mean to be conscious? Will computers one day take over the world?

This course is all about ensemble methods.

We’ve already learned some classic machine learning models like k-nearest neighbors and decision trees. We’ve studied their limitations and drawbacks.

But what if we could combine these models to eliminate those limitations and produce a much more powerful classifier or regressor?

In this course you’ll study ways to combine models like decision trees and logistic regression to build models that can reach much higher accuracies than the base models they are made of.

In particular, we will study the Random Forest and AdaBoost algorithms in detail.
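
To give a flavor of the level of detail, here is a minimal sketch of discrete AdaBoost for binary labels in {-1, +1}. The scikit-learn decision stump used as the weak learner is purely an assumption to keep the sketch short; in the course the pieces are implemented from scratch.

```python
# Minimal sketch of discrete AdaBoost (binary labels in {-1, +1}).
# Assumptions: numpy arrays for X and y, scikit-learn decision stumps as weak learners.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)                      # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)  # weighted error rate
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this weak learner
        w *= np.exp(-alpha * y * pred)           # up-weight the examples it got wrong
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    # final prediction: sign of the alpha-weighted vote over all weak learners
    return np.sign(sum(a * s.predict(X) for s, a in zip(stumps, alphas)))
```

Calling adaboost_fit(X_train, y_train) and then adaboost_predict(X_test, stumps, alphas) gives the ensemble’s prediction; each round focuses the next stump on the examples the previous ones misclassified.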

To motivate our discussion, we will learn about an important topic in statistical learning, the bias-variance trade-off. We will then study the bootstrap technique and bagging as methods for reducing the variance of a model without increasing its bias.
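
For squared error, the expected test error decomposes into bias squared plus variance plus irreducible noise, and bagging attacks the variance term: train the same high-variance model on many bootstrap resamples of the training set and average the predictions. Below is a minimal sketch of bagged regression trees, assuming numpy arrays for X and y; the scikit-learn tree is used as the base model only for brevity, whereas the course builds its own.

```python
# Minimal sketch of bagging for regression.
# Assumptions: numpy arrays for X and y, scikit-learn trees as the base model.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def bagging_fit(X, y, n_models=100, seed=None):
    rng = np.random.default_rng(seed)
    n = len(y)
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, n, size=n)   # bootstrap resample: n rows drawn with replacement
        model = DecisionTreeRegressor()
        model.fit(X[idx], y[idx])
        models.append(model)
    return models

def bagging_predict(X, models):
    # averaging many high-variance trees lowers the variance of the ensemble
    return np.mean([m.predict(X) for m in models], axis=0)
```

A Random Forest adds one more ingredient on top of this: each tree also considers only a random subset of features at each split, which further decorrelates the trees.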

We’ll do plenty of experiments and use these algorithms on real datasets so you can see first-hand how powerful they are.

Since deep learning is so popular these days, we will study some interesting commonalities between random forests, AdaBoost, and deep learning neural networks.

All the materials for this course are FREE. You can download and install Python, Numpy, and Scipy with simple commands on Windows, Linux, or Mac.

This course focuses on “how to build and understand”, not just “how to use”. Anyone can learn to use an API in 15 minutes after reading some documentation. It’s not about “remembering facts”, it’s about “seeing for yourself” via experimentation. It will teach you how to visualize what’s happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you.

 

“If you can’t implement it, you don’t understand it”

  • Or as the great physicist Richard Feynman said: “What I cannot create, I do not understand”.
  • My courses are the ONLY courses where you will learn how to implement machine learning algorithms from scratch
  • Other courses will teach you how to plug in your data into a library, but do you really need help with 3 lines of code?
  • After doing the same thing with 10 datasets, you realize you didn’t learn 10 things. You learned 1 thing, and just repeated the same 3 lines of code 10 times…

 

Suggested Prerequisites:

  • Calculus (derivatives)
  • Probability
  • Object-oriented programming
  • Python coding: if/else, loops, lists, dicts, sets
  • Numpy coding: matrix and vector operations
  • Simple machine learning models like linear regression and decision trees

 

WHAT ORDER SHOULD I TAKE YOUR COURSES IN?

  • Check out the lecture “Machine Learning and AI Prerequisite Roadmap” (available in the FAQ of any of my courses, including the free Numpy course)

Who this course is for:

  • Anyone who wants to understand the types of models that win machine learning contests (Netflix Prize, Kaggle)
  • Students studying machine learning
  • Professionals who want to apply data science and machine learning to their work
  • Entrepreneurs who want to apply data science and machine learning to optimize their business
  • Students in computer science who want to learn more about data science and machine learning
  • Those who know some basic machine learning models but want to know how today’s most powerful models (Random Forest, AdaBoost, and other ensemble methods) are built

12 reviews for Ensemble Machine Learning in Python: Random Forest, AdaBoost

4.5 out of 5 (6 five-star ratings, 6 four-star ratings)
  1. Nikhil Kini

    I highly recommend this course for anyone learning or working on classification problems. For one, it teaches very useful tools that are known in the real world to improve models by 2-3% (a big deal). For another, it does a great job explaining the bias variance decomposition which sits at the heart of why or how these models work. 10 / 10. Would do it again. Major points for implementing boosting, bagging, and to some extent (pseudo) random forests ‘from scratch’. Really helps understand exactly how the algorithm works and helps solidify theory.

  2. Benjamin Stevenson

    Generally a fantastic course but I sometimes struggle to keep up with the maths. Having said that, it’s been pretty clear from the start that people who are not willing to put the hours in will struggle, so it comes as no surprise. Perhaps a compromise would be to have a bit more narration aimed at those people who do struggle with the maths. For example, in the lecture ‘coding with poly degree 12’, I was in the dark as to what the output would look like before it was plotted, which was a bit demoralising. A few more sentences putting the logic in layman’s terms would have helped – but aside from that I think these courses are fantastic.

  3. Weituo Hao

    I learned a lot about ensemble methods. A very interesting point in the lectures is how the author connects the ideas behind ensembles to deep learning. It was very impressive to me.

  4. César Juárez

    Best course for trees. Many concepts are taken from the ESL book, which can be hard. However, the instructor makes the topics really easy to understand, and the fact that he connects what is learned in each one of them with previous and advanced courses makes the learning curve less steep. The programming exercises are awesome, as they are a vital component to really get the nitty gritty of the algorithms.

  5. Sean Farias

    Learnt a ton from this course. The amount of insight I got was certainly rewarding. I can recommend it to help you on your own projects.

  6. Tomer Sh

    A very thorough course with an academic approach to it. For me this is great. All those “here is the API” courses out there really teach you nothing at the end of the day. Lazy Programmer is one of the few that actually goes in depth. Keep making those great courses 🙂

  7. Kiran Kura

    Overall, a good course. The math is sometimes hard to follow, the way it is presented. Looking forward to an extension to this course.

  8. Nguyen Tran Trung

    Your lecture is fascinating

  9. Charlotte Kuhn

    This course is a great follow-up to the supervised learning course. If you are into the nitty gritty of why things work, this is the right course for you. No boring cut-and-paste code here!

  10. Damon Gray

    Good start

  11. Sudesh Singh

    This was an amazing course for me to advance my machine learning journey. The instructor is to the point and gives a lot of info quickly. There are also lots of practical code samples in the course, so it’s not just theory. I recommend it as a great course for becoming an advanced data scientist.

  12. Ellie Wang

    A super well-constructed course that includes everything you need to understand random forests and boosted trees (some basic knowledge of math and statistics is still necessary). The host is sharp and doesn’t waste time, a true professional, every word counts. All in all, I loved every moment in the course and I can definitely recommend it to anyone interested in machine learning. P. S. This course helped me land a job, so I’m eternally grateful to the Lazy Programmer. Well done!



    Ensemble Machine Learning in Python: Random Forest, AdaBoost

    $29.99
