You’ve just stumbled upon the most complete, in-depth Machine Learning course online.
Whether you want to:
– build the skills you need to get your first data science job
– move to a more senior software developer position
– become a computer scientist specializing in data science
– or just learn Machine Learning to be able to create your own projects quickly.
…this complete Machine Learning Masterclass is the course you need to do all of this, and more.
This course is designed to give you the machine learning skills you need to become a data science expert. By the end of the course, you will understand core machine learning methods extremely well, be able to apply them in your own data science projects, and be productive as a computer scientist and developer.
What makes this course a bestseller?
Like you, thousands of others were frustrated and fed up with fragmented YouTube tutorials, incomplete or outdated courses that assume you already know a bunch of stuff, and thick, college-like textbooks able to send even the most caffeine-fuelled coder to sleep.
Like you, they were tired of low-quality lessons, poorly explained topics, and confusing info presented in the wrong way. That's why so many find success in this complete Machine Learning course. Its content is designed with simplicity and seamless progression in mind.
This course assumes no previous data science experience and takes you from absolute beginner through the core concepts. You will learn the core machine learning skills and master data science. It's a one-stop shop for learning machine learning, and if you want to go beyond the core content you can do so at any time.
What if I have questions?
As if this course wasn’t complete enough, I offer full support, answering any questions you have.
This means you’ll never find yourself stuck on one lesson for days on end. With my hand-holding guidance, you’ll progress smoothly through this course without any major roadblocks.
There’s no risk either!
This course comes with a full 30-day money-back guarantee. If you are not completely satisfied with the course or your progress, simply let me know and I'll refund you 100%, every last penny, no questions asked.
You either end up with Machine Learning skills, go on to develop great programs and potentially make an awesome career for yourself, or you try the course and simply get all your money back if you don’t like it…
You literally can’t lose.
Moreover, the course is packed with practical exercises that are based on real-life case studies. So not only will you learn the theory, but you will also get lots of hands-on practice building your own models.
And as a bonus, this course includes Python code templates which you can download and use on your own projects.
Ready to get started, developer?
Enroll now using the “Add to Cart” button on the right, and get started on your way to creative, advanced Machine Learning brilliance. Or take this course for a free spin using the preview feature, so you can be 100% certain this course is for you.
See you on the inside (hurry, Machine Learning is waiting!)
Machine Learning Fundamentals
Introduction - Preprocessing and Analysis
Visualization - Principal Component Analysis
7. Introduction to PCA
In this lesson we introduce Principal Component Analysis (PCA) and give a brief background on the technique.
8. Introduction to the Dataset
In this lesson we introduce the dataset crabs.csv.
Later we will use PCA for the visualization of this dataset.
9. Initial Visualization
In this lesson we perform a basic exploration of the dataset and an initial visualization.
At the end we discuss the importance of applying dimensionality reduction techniques.
In the following lesson we will apply PCA.
10. Using PCA
In this lesson we use Principal Component Analysis to visualize the separation between the classes.
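To preview what this section builds, here is a minimal sketch of the PCA visualization workflow with scikit-learn. The class column name "sp" is an assumption for illustration; the actual crabs.csv layout may differ.

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crabs.csv")
X = df.select_dtypes("number")   # numeric measurements
y = df["sp"]                     # assumed class column (species)

# Standardize, then project onto the first two principal components
Z = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

for label in y.unique():
    m = (y == label).to_numpy()
    plt.scatter(Z[m, 0], Z[m, 1], label=str(label), alpha=0.7)
plt.xlabel("PC1"); plt.ylabel("PC2"); plt.legend(); plt.show()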
Visualization - Locally Linear Embedding (LLE)
11. Introduction to LLE
In this lesson we introduce the Locally Linear Embedding (LLE) algorithm.
12. Locally Linear Embedding Algorithm
In this lesson we introduce the steps of the LLE algorithm.
In future lessons we will apply this method in practice using Python.
13. Introduction to the Dataset
In this lesson we introduce the dataset crabs.csv.
Later we will use LLE for the dimensionality reduction and visualization of this dataset.
14. Using LLE
In this lesson we use the Locally Linear Embedding technique to reduce the dimensionality of our dataset.
We also visualize the new space with 2 components, i.e. a visualization in 2 dimensions.
15. LLE with 3 Dimensions
In this lesson we again use Locally Linear Embedding to reduce the dimensionality of our dataset, now with 3 components.
At the end we visualize the new 3-dimensional space. With this, we end the practical part on Locally Linear Embedding.
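A minimal sketch of the same workflow with LLE, under the assumed crabs.csv layout from the PCA sketch:

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crabs.csv")
X = StandardScaler().fit_transform(df.select_dtypes("number"))
y = df["sp"]  # assumed class column

# n_neighbors controls the size of the local patches LLE reconstructs
Z = LocallyLinearEmbedding(n_components=2, n_neighbors=10).fit_transform(X)

for label in y.unique():
    m = (y == label).to_numpy()
    plt.scatter(Z[m, 0], Z[m, 1], label=str(label), alpha=0.7)
plt.legend(); plt.show()

Changing n_components to 3 (and plotting on a 3D axis) gives the 3-dimensional variant used at the end of the section.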
Visualization - t-Distributed Stochastic Neighbor Embedding (t-SNE)
16. Introduction to t-SNE
In this lesson we give a brief introduction to the t-distributed Stochastic Neighbor Embedding (t-SNE) dimensionality reduction technique.
17. Dataset
In this lesson we mention that we will use the crabs.csv dataset in this section.
This is a dataset we have already worked with. Those who are familiar with it can skip the following lesson and go straight to the one after.
18. Introduction to the Dataset
In this lesson we introduce the dataset crabs.csv.
Later we will use t-SNE for the visualization of this dataset.
19t-SNE on Raw Data
On this lesson we apply the t-Stochastic Neighbor Embedding technique on the original dataset of the crabs. At the end, we visualize the separation between the classes using 2 dimensions and 3 dimensions.
20. t-SNE on Scaled Data
In this lesson we apply t-SNE to the scaled crabs dataset. At the end, we visualize the separation between the classes in 2 and 3 dimensions.
21. t-SNE on Standardized Data
In this lesson we apply t-SNE to the standardized crabs dataset. At the end, we visualize the separation between the classes in 2 and 3 dimensions.
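A minimal sketch contrasting t-SNE on raw and standardized features (same assumed layout; perplexity=30 is scikit-learn's default, not necessarily the course's setting):

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crabs.csv")
X_raw = df.select_dtypes("number").to_numpy()
X_std = StandardScaler().fit_transform(X_raw)
y = df["sp"]  # assumed class column

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
for ax, X, title in ((axes[0], X_raw, "raw"), (axes[1], X_std, "standardized")):
    Z = TSNE(n_components=2, perplexity=30).fit_transform(X)
    for label in y.unique():
        m = (y == label).to_numpy()
        ax.scatter(Z[m, 0], Z[m, 1], label=str(label), alpha=0.7)
    ax.set_title("t-SNE on " + title + " data")
axes[0].legend()
plt.show()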
Visualization - Multidimensional Scaling (MDS)
22. Introduction to MDS
In this lesson we introduce the Multidimensional Scaling (MDS) dimensionality reduction technique.
23. Using MDS with 2 Dimensions
In this lesson we apply the MDS dimensionality reduction technique to the crabs dataset.
At the end, we visualize the separation between classes in 2 dimensions.
24. Using MDS with 3 Dimensions
In this lesson we apply the MDS dimensionality reduction technique to the crabs dataset.
At the end, we visualize the separation between classes in 3 dimensions.
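A minimal sketch of the MDS step (same assumed crabs.csv layout):

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.manifold import MDS
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crabs.csv")
X = StandardScaler().fit_transform(df.select_dtypes("number"))
y = df["sp"]  # assumed class column

# Metric MDS looks for a 2D layout that preserves pairwise distances
Z = MDS(n_components=2).fit_transform(X)

for label in y.unique():
    m = (y == label).to_numpy()
    plt.scatter(Z[m, 0], Z[m, 1], label=str(label), alpha=0.7)
plt.legend(); plt.show()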
Visualization - ISOMAP
25. Introduction to ISOMAP
In this lesson we give a brief introduction to the ISOMAP dimensionality reduction technique.
26. ISOMAP with 2 Dimensions
In this lesson we use the ISOMAP technique to reduce the dimensionality of our dataset.
We also visualize the new space with 2 components, i.e. a visualization in 2 dimensions.
27. ISOMAP with 3 Dimensions
In this lesson we use the ISOMAP technique to reduce the dimensionality of our dataset.
We also visualize the new space with 3 components, i.e. a visualization in 3 dimensions.
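A minimal sketch of the 3-component ISOMAP variant (same assumed layout):

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.manifold import Isomap
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("crabs.csv")
X = StandardScaler().fit_transform(df.select_dtypes("number"))
y = df["sp"]  # assumed class column

# ISOMAP preserves geodesic distances along the neighborhood graph
Z = Isomap(n_components=3, n_neighbors=10).fit_transform(X)

ax = plt.figure().add_subplot(projection="3d")
for label in y.unique():
    m = (y == label).to_numpy()
    ax.scatter(Z[m, 0], Z[m, 1], Z[m, 2], label=str(label), alpha=0.7)
ax.legend(); plt.show()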
Visualization - Fisher Discriminant Analysis
28. Introduction to Fisher Discriminant Analysis
In this lesson we give a brief introduction to the Fisher Discriminant Analysis dimensionality reduction technique.
29. Dataset Information
In this lesson we mention that we will use the crabs.csv dataset in this section.
This is a dataset we have already worked with. Those who are familiar with it can skip the following lesson and go straight to the one after.
30. Introduction to the Dataset
In this lesson we introduce the dataset crabs.csv.
Later we will use Fisher Discriminant Analysis for the visualization of this dataset and the separation of the crabs.
31. Fisher Discriminant Analysis with 2 Dimensions
In this lesson we use Fisher Discriminant Analysis to reduce the dimensionality of our dataset.
We also visualize the new space with 2 components, i.e. a visualization in 2 dimensions.
32. Fisher Discriminant Analysis with 3 Dimensions
In this lesson we use Fisher Discriminant Analysis to reduce the dimensionality of our dataset.
We also visualize the new space with 3 components, i.e. a visualization in 3 dimensions.
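A minimal sketch using scikit-learn's LinearDiscriminantAnalysis, which implements Fisher's criterion. Note that it allows at most (number of classes - 1) components, so a 2D projection needs at least 3 classes; combining species and sex into one label (both column names are assumptions here) is one way to get there:

import pandas as pd
import matplotlib.pyplot as plt
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

df = pd.read_csv("crabs.csv")
X = df.select_dtypes("number")
# Assumed columns: combining them yields 4 classes, enough for 2-3 components
y = df["sp"].astype(str) + "-" + df["sex"].astype(str)

Z = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

for label in y.unique():
    m = (y == label).to_numpy()
    plt.scatter(Z[m, 0], Z[m, 1], label=label, alpha=0.7)
plt.legend(); plt.show()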
Visualization Final Project - Images
33. Images
In this lesson we explain that images can be prepared for dimensionality reduction methods by converting them to arrays of numeric values.
34. Introduction to the Image Dataset
In this lesson we introduce the digits image dataset.
Later we will use dimensionality reduction techniques for the visualization of this dataset and the separation of the images.
35. Fisher Discriminant Analysis
In this lesson we use Fisher Discriminant Analysis to reduce the dimensionality of our dataset.
We visualize the new space with 2 components in 2 dimensions, and then with 3 components in 3 dimensions.
36. Locally Linear Embedding
In this lesson we use Locally Linear Embedding to reduce the dimensionality of our dataset.
We visualize the new space with 2 components in 2 dimensions, and then with 3 components in 3 dimensions.
37. Principal Component Analysis
In this lesson we use Principal Component Analysis to reduce the dimensionality of our dataset.
We visualize the new space with 2 components in 2 dimensions, and then with 3 components in 3 dimensions.
38. ISOMAP
In this lesson we use ISOMAP to reduce the dimensionality of our dataset.
We visualize the new space with 2 components in 2 dimensions, and then with 3 components in 3 dimensions.
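For this image project, each image is flattened into a vector of pixel intensities before any of these methods is applied. A minimal sketch with scikit-learn's built-in 8x8 digits dataset (assumed here to match the section's digits data):

import matplotlib.pyplot as plt
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

digits = load_digits()
X, y = digits.data, digits.target   # each 8x8 image flattened to 64 values

# Project the 64-dimensional images onto the first two principal components
Z = PCA(n_components=2).fit_transform(X)
plt.scatter(Z[:, 0], Z[:, 1], c=y, cmap="tab10", s=10)
plt.colorbar(label="digit")
plt.show()

Swapping PCA for LocallyLinearEmbedding, Isomap, or LinearDiscriminantAnalysis reproduces the other lessons in this project.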
Linear Regression
39. Introduction to the Dataset
In this lesson we introduce the dataset LifeExpectancy.csv.
Later we will use Linear Regression to predict the life expectancy of different countries in different years.
40. Preprocessing
In this lesson we preprocess the dataset so that we can apply linear regression.
For this first model we must remove categorical variables and missing values.
41. Linear Regression
In this lesson we apply Linear Regression to predict the target variable: life expectancy.
42. Metrics
In this lesson we study the fundamental metrics used to evaluate our model.
43. Cross Validation
In this lesson we use cross-validation to obtain a generalization error as close as possible to the error we would get when using the model on completely new data.
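A minimal sketch of this regression workflow; LifeExpectancy.csv comes from the lessons, but the target column name is an assumption:

import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

df = pd.read_csv("LifeExpectancy.csv")
df = df.select_dtypes("number").dropna()   # drop categoricals and missing rows
X = df.drop(columns=["LifeExpectancy"])    # assumed target column name
y = df["LifeExpectancy"]

# 5-fold cross-validation estimates the generalization error
scores = cross_val_score(LinearRegression(), X, y, cv=5,
                         scoring="neg_mean_squared_error")
print("cross-validated MSE:", -scores.mean())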
Ridge Regression
Regression - Understand the Models
46. Analysis
In this lesson we perform an initial analysis of the models.
We visualize the weights and discuss the importance of scaling our data.
47. Data Scaling
In this lesson we scale our data and evaluate the effect on performance.
48. One-Hot Encoding
In this lesson we apply one-hot encoding to our categorical variables.
49. Regularization
In this lesson we apply regularization after the preprocessing (scaling + one-hot encoding).
We use the regularized regression models Ridge Regression and Lasso Regression.
50. Final Results
In this lesson we explore and visualize the final results of our models for predicting life expectancy.
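A minimal sketch of the full preprocessing-plus-regularization pipeline described in this section (target column name again assumed):

import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.linear_model import Lasso, Ridge
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.read_csv("LifeExpectancy.csv").dropna()
X = df.drop(columns=["LifeExpectancy"])    # assumed target column name
y = df["LifeExpectancy"]

num_cols = X.select_dtypes("number").columns
cat_cols = X.select_dtypes(exclude="number").columns
pre = ColumnTransformer([
    ("num", StandardScaler(), num_cols),                        # scaling
    ("cat", OneHotEncoder(handle_unknown="ignore"), cat_cols),  # one-hot
])

for model in (Ridge(alpha=1.0), Lasso(alpha=0.1)):
    score = cross_val_score(make_pipeline(pre, model), X, y, cv=5).mean()
    print(type(model).__name__, "cross-validated R^2:", round(score, 3))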
Classification
51. Introduction to the Dataset
In this lesson we introduce the breast cancer dataset.
Later we will use linear classification to predict whether a patient has a malignant or benign tumor.
52. Partition of the Dataset: Train and Test
In this lesson we partition our dataset into training and test sets.
53. Preprocessing
In this lesson we preprocess and prepare our data so that we can apply visualization techniques and classification models.
54. Principal Component Analysis
55. Linear Discriminant Analysis
In this lesson we implement our first classifier: Linear Discriminant Analysis.
We study the results in the classification report, and we plot the confusion matrix and the ROC curve.
56. Naive Bayes Classifier
In this lesson we implement the Naive Bayes classifier.
We study the results in the classification report, and we plot the confusion matrix and the ROC curve.
57. Quadratic Classifier
In this lesson we implement the quadratic classifier.
We study the results in the classification report, and we plot the confusion matrix and the ROC curve.
58. Logistic Regression
In this lesson we implement Logistic Regression for our classification task.
We study the results in the classification report, and we plot the confusion matrix and the ROC curve.
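A minimal sketch of this train-and-evaluate loop, using scikit-learn's bundled breast cancer data as a stand-in for the section's dataset:

from sklearn.datasets import load_breast_cancer
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

for clf in (LinearDiscriminantAnalysis(), GaussianNB(),
            LogisticRegression(max_iter=5000)):
    y_pred = clf.fit(X_train, y_train).predict(X_test)
    print(type(clf).__name__)
    print(confusion_matrix(y_test, y_pred))
    print(classification_report(y_test, y_pred))

Calling RocCurveDisplay.from_estimator(clf, X_test, y_test) from sklearn.metrics would add the ROC curve plotted in the lessons.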
Support Vector Machines for Regression
59. Introduction to Support Vector Machines
In this lesson we introduce Support Vector Machines for regression.
60. Introduction to the Dataset
In this lesson we introduce the energy dataset.
Later we will use Support Vector Machines to predict the energy consumption of household appliances.
61. Partition of the Dataset - Target Variable
In this lesson we begin the partition of our dataset.
We partition the target variable into training and test sets.
62. Partition of the Dataset - Time Series Windows
In this lesson we generate the data matrix that we will use to make the predictions.
We use the 4 previous time steps as features for each prediction.
63. Support Vector Machine - Linear Kernel
In this lesson we use a Support Vector Machine (SVM) with a linear kernel to predict the energy consumption of household appliances, using the 4 previous time steps as features.
64. Support Vector Machines - Polynomial Kernels
In this lesson we use a Support Vector Machine (SVM) with a polynomial kernel to predict the energy consumption of household appliances, using the 4 previous time steps as features.
65. Support Vector Machine - Radial Basis Function (RBF) Kernel
In this lesson we use a Support Vector Machine (SVM) with an RBF kernel to predict the energy consumption of household appliances, using the 4 previous time steps as features.
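A minimal sketch of the windowing step plus the three SVR kernels. The synthetic series stands in for the energy data; each row of the matrix holds the 4 previous values of the series, and the target is the value that follows:

import numpy as np
from sklearn.svm import SVR

def make_windows(series, n_lags=4):
    # Row i: the n_lags values before position i + n_lags; target: that value
    s = np.asarray(series, dtype=float)
    X = np.column_stack([s[i:len(s) - n_lags + i] for i in range(n_lags)])
    return X, s[n_lags:]

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)
X, y = make_windows(series, n_lags=4)

split = int(0.75 * len(y))            # keep the time order: no shuffling
X_train, X_test = X[:split], X[split:]
y_train, y_test = y[:split], y[split:]

for kernel in ("linear", "poly", "rbf"):
    svr = SVR(kernel=kernel, C=1.0).fit(X_train, y_train)
    print(kernel, "test R^2:", round(svr.score(X_test, y_test), 3))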
Support Vector Machines for Classification
66. Introduction to Support Vector Machines
In this lecture we briefly introduce Support Vector Machines (SVM) for classification.
67. Introduction to the Dataset
In this lesson we introduce the arxivs dataset that we will work with throughout this section.
Later we will use Support Vector Machines to predict which category each document belongs to.
68. Partition of the Dataset
In this lesson we partition the dataset into two parts: training and test.
69. Transformation to Data Matrix
In this lesson we transform our unstructured text dataset into a data matrix, which allows us to apply learning models.
70. Dimensionality Reduction
In this lecture we apply dimensionality reduction.
71. Support Vector Machine - Linear Kernel
In this lesson we use a Support Vector Machine (SVM) with a linear kernel to predict which category (class) each document (sample) belongs to.
72Support Vector Machines - Polynomial Kernels
On this lesson we use a Support Vector Machine (SVM) with Polynomial Kernel to predict which category (class) each document (sample) belongs to.
-
73. Support Vector Machine - Radial Basis Function (RBF) Kernel
In this lesson we use a Support Vector Machine (SVM) with an RBF kernel to predict which category (class) each document (sample) belongs to.
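A minimal sketch of the document-classification pipeline: turn raw text into a data matrix, reduce its dimensionality, then classify with an SVM. The four-document corpus is a placeholder for the arxivs dataset:

from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

docs = ["neural networks for image recognition",
        "support vector machines and kernel methods",
        "galaxy formation and dark matter halos",
        "spectral analysis of distant stars"]
labels = ["cs", "cs", "astro", "astro"]

pipe = make_pipeline(
    TfidfVectorizer(),             # unstructured text -> sparse data matrix
    TruncatedSVD(n_components=2),  # dimensionality reduction on the matrix
    SVC(kernel="linear"),          # swap in "poly" or "rbf" for later lessons
)
pipe.fit(docs, labels)
print(pipe.predict(["dark matter in spiral galaxies"]))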
Neural Networks
Neural Networks for Regression
75. Dataset Information
In this lesson we mention that we will use the Energy.csv dataset in this section.
This is a dataset we have already worked with. Those who are familiar with it can skip the following lesson and go straight to the one after.
76. Introduction to the Dataset
In this lesson we introduce the energy dataset.
Later we will use an MLP neural network to predict the energy consumption of household appliances.
77. Partition of the Dataset - Target Variable
In this lesson we begin the partition of our dataset.
We partition the target variable into training and test sets.
78. Partition of the Dataset - Time Series Windows
In this lesson we generate the data matrix that we will use to make the predictions.
We use the 4 previous time steps as features for each prediction.
79. Multilayer Perceptron Neural Network
In this lesson we use a Multilayer Perceptron (MLP) Neural Network to predict the energy consumption of household appliances, using the 4 previous time steps as features.
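A minimal sketch of the MLP regression step on windowed data (a synthetic series again stands in for Energy.csv):

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def make_windows(series, n_lags=4):
    # Each row: the 4 previous values; target: the value that follows them
    s = np.asarray(series, dtype=float)
    X = np.column_stack([s[i:len(s) - n_lags + i] for i in range(n_lags)])
    return X, s[n_lags:]

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)
X, y = make_windows(series)

split = int(0.75 * len(y))   # keep the time order when splitting
mlp = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
)
mlp.fit(X[:split], y[:split])
print("test R^2:", round(mlp.score(X[split:], y[split:]), 3))

Neural Networks for Classification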
How long do I have access to the course materials?
You can view and review the lecture materials indefinitely, like an on-demand channel.
Can I take my courses with me wherever I go?
Definitely! If you have an internet connection, courses on Udemy are available on any device at any time. If you don't have an internet connection, some instructors also let their students download course lectures. That's up to the instructor though, so make sure you get on their good side!
Ratings: 5 stars: 26, 4 stars: 12, 3 stars: 9, 2 stars: 0, 1 star: 2