This Machine Learning: Decision Trees & Random Forests online course teaches you practical machine learning techniques to predict survival probabilities aboard the Titanic – a classic Kaggle problem!
Design and implement the solution to a famous machine learning problem: predicting survival probabilities aboard the Titanic. Understand the perils of overfitting and how random forests help overcome this risk. Identify the use cases for decision trees as well as random forests.
No prerequisites are required; some undergraduate-level mathematics helps but is not mandatory. A working knowledge of Python is helpful if you want to complete the coding exercises and understand the provided source code.
Taught by a Stanford-educated ex-Googler and an IIT- and IIM-educated ex-Flipkart lead analyst. This team has decades of practical experience in quant trading, analytics and e-commerce.
Python Activity: Surviving aboard the Titanic! Build a decision tree to predict the survival of a passenger on the Titanic. This is a challenge posed by Kaggle (a competitive online data science community). We’ll start off by exploring the data and transforming it into feature vectors that can be fed to a Decision Tree Classifier.
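As a taste of what this activity involves, here is a minimal sketch using scikit-learn's DecisionTreeClassifier (an assumption about the library used; this is not the course's actual code). The column names follow Kaggle's Titanic train.csv, but the rows below are made up for illustration:

```python
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

# Hypothetical stand-in for Kaggle's train.csv: same column names, made-up rows.
data = pd.DataFrame({
    "Pclass":   [3, 1, 3, 1, 2, 3],
    "Sex":      ["male", "female", "female", "male", "female", "male"],
    "Age":      [22.0, 38.0, 26.0, 54.0, 27.0, None],
    "Survived": [0, 1, 1, 0, 1, 0],
})

# Feature engineering: encode the categorical column as numbers
# and fill in missing ages with the median.
data["Sex"] = data["Sex"].map({"male": 0, "female": 1})
data["Age"] = data["Age"].fillna(data["Age"].median())

X = data[["Pclass", "Sex", "Age"]]  # feature vectors
y = data["Survived"]                # labels

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X, y)

# Predict the survival probability of a new passenger: 1st class, female, age 30.
new_passenger = pd.DataFrame([[1, 1, 30.0]], columns=["Pclass", "Sex", "Age"])
proba = clf.predict_proba(new_passenger)[0][1]
print(f"Predicted survival probability: {proba:.2f}")
```

The real exercise works with the full Kaggle dataset, which has many more columns (fare, port of embarkation, and so on) to engineer into features.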
Who is the target audience?
- Analytics professionals, modelers, big data professionals who haven’t had exposure to machine learning
- Engineers who want to understand or learn machine learning and apply it to problems they are solving
- Product managers who want to have intelligent conversations with data scientists and engineers about machine learning
- Tech executives and investors who are interested in big data, machine learning or natural language processing
Chapter 01: Decision Fatigue & Decision Trees
- Lesson 01: Introduction: You, This Course & Us!
- Lesson 02: Planting the seed: What are Decision Trees?
- Lesson 03: Growing the Tree: Decision Tree Learning
- Lesson 04: Branching out: Information Gain
- Lesson 05: Decision Tree Algorithms
- Lesson 06: Installing Python: Anaconda & PIP
- Lesson 07: Back to Basics: Numpy in Python
- Lesson 08: Back to Basics: Numpy & Scipy in Python
- Lesson 09: Titanic: Decision Trees predict Survival (Kaggle) – I
- Lesson 10: Titanic: Decision Trees predict Survival (Kaggle) – II
- Lesson 11: Titanic: Decision Trees predict Survival (Kaggle) – III
Chapter 02: A Few Useful Things to Know about Overfitting
- Lesson 01: Overfitting: The Bane of Machine Learning
- Lesson 02: Overfitting continued
- Lesson 03: Cross-Validation
- Lesson 04: Simplicity is a virtue: Regularization
- Lesson 05: The Wisdom of Crowds: Ensemble Learning
- Lesson 06: Ensemble Learning continued: Bagging, Boosting & Stacking
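To preview the cross-validation lesson: a minimal sketch (not the course's actual code, and using synthetic data rather than the Titanic set) of how cross-validation exposes the overfitting of an unconstrained decision tree, assuming scikit-learn:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in dataset: 300 samples, 10 features, slightly noisy labels.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

deep_tree = DecisionTreeClassifier(random_state=0)  # no depth limit
deep_tree.fit(X, y)

train_acc = deep_tree.score(X, y)                    # memorizes the training set
cv_acc = cross_val_score(deep_tree, X, y, cv=5).mean()  # honest estimate on held-out folds

print(f"training accuracy: {train_acc:.2f}, cross-validated accuracy: {cv_acc:.2f}")
```

The unconstrained tree scores perfectly on the data it was trained on, while the cross-validated score is lower: that gap is the overfitting the chapter is about.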
Chapter 03: Random Forests
- Lesson 01: Random Forests: Much more than trees
- Lesson 02: Back on the Titanic: Cross Validation & Random Forests
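As a preview of this chapter: a random forest averages the votes of many randomized trees, which typically generalizes better than a single deep tree. A minimal sketch (an illustration with synthetic data, not the course's actual code), assuming scikit-learn:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in dataset for illustration.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# 100 trees, each trained on a bootstrap sample with random feature subsets.
forest = RandomForestClassifier(n_estimators=100, random_state=0)

# Cross-validated accuracy of the ensemble.
cv_acc = cross_val_score(forest, X, y, cv=5).mean()
print(f"random forest cross-validated accuracy: {cv_acc:.2f}")
```

The course's final lessons apply exactly this combination of cross-validation and random forests back on the Titanic data.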
Length of Subscription: 12 Months Online On-Demand Access
Running Time: 4 hrs 50 min
Platform: Windows & macOS
Level: Beginner to Advanced
Project Files: Included
Learn anytime, anywhere, at home or on the go.
Stream your training via the internet, or download to your computer and supported mobile device, including iPad, iPhone, iPod Touch and most Android devices.
Need to train your Team? Contact Us for Discounts on Multiple Subscription Purchases.