Machine Learning

Course Description

SECTION 1: Machine Learning Basics and Its Life Cycle

Introduction to Machine Learning; the difference between a Business Intelligence team, a Data Analyst, and a Data Scientist

Purpose of Machine Learning, Deep Learning, NLP, and AI

What is Machine Learning?

Introduction to Supervised Learning and Unsupervised Learning

Introduction to Reinforcement Learning

How traffic control boards and bank fraud detection systems can use machine learning

Machine Learning life cycle

Introduction to data extraction

More details on online, batch, and streaming data systems

Introduction to NoSQL database sources

Types of NoSQL databases

Overview of key-value store databases

Document store databases

Columnar store databases

Graph store databases

3 types of data sets used in Machine Learning

3 approaches to creating train, validation, and test data sets as part of data preparation

Types of data extraction techniques

Data cleansing and transformations

How to clean missing values when the problem is a regression problem

How to clean missing values when the problem is a classification problem

Data cleansing and transformations

How to transform an input variable that is a character (string) continuous value

How to transform an input variable that is a character (string) categorical value

Develop a function for cleansing

Develop a function for transformations

Need for scaling data, and scaling techniques

When to use which type of scaling technique

Develop functions for scaling techniques with Python

Introduction to training a model

Evaluating a model

Model selection

Deployment of a model

Rebuilding a model

Summary of the Machine Learning life cycle

More on supervised, unsupervised, and reinforcement learning

Preparing train and test sets using Python and NumPy

Cleaning missing values in continuous variables with Python for regression models

How to transform an input variable that is a character categorical (classifier) for a regression model

Transforming string continuous variables into numerical scores using Python and NumPy

Transforming string categorical values into probabilities with random noise using Python and NumPy for regression models

Scaling input features and labels for regression models using Python and NumPy
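
To make the data-preparation topics above concrete, here is a minimal NumPy sketch of a train/validation/test split followed by min-max scaling; the array shapes, split ratios, and variable names are illustrative assumptions, not course code:

    import numpy as np

    # Illustrative feature matrix (rows = samples) and continuous labels
    X = np.random.rand(100, 3)
    y = np.random.rand(100)

    # Shuffle once, then carve out train / validation / test splits (60/20/20 here)
    idx = np.random.permutation(len(X))
    train, val, test = np.split(idx, [int(0.6 * len(X)), int(0.8 * len(X))])
    X_train, y_train = X[train], y[train]
    X_val, y_val = X[val], y[val]
    X_test, y_test = X[test], y[test]

    # Min-max scaling: fit the ranges on the training set only, reuse on the others
    x_min, x_max = X_train.min(axis=0), X_train.max(axis=0)

    def min_max(a):
        return (a - x_min) / (x_max - x_min)

    X_train, X_val, X_test = min_max(X_train), min_max(X_val), min_max(X_test)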

 

SECTION 2: Machine Learning Models Introduction, TensorFlow Basics, PyTorch Basics

Introduction to predictive models

Introduction to clustering models

Introduction to Recommender Systems models

Difference between prediction and forecasting

What are discriminative models

What are generative models

Introduction to Linear Regression

How to derive coefficients that correlate input features with target labels

More discussion on Linear Regression

How to handle non-linear regression using polynomial techniques such as quadratic and cubic models

Implementation of linear, quadratic polynomial, and cubic polynomial regression using the R language

Developing a predict function and an accuracy-testing function using the R language

Statistical approach to tuning coefficients, problems with the statistical approach, and a brief introduction to the gradient descent algorithm

How to prepare a linear input matrix and how to convert it into an n-degree polynomial matrix using Python NumPy

Implementation of linear and polynomial regression, accuracy testing, and predicting target labels of new data using Python NumPy
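
A minimal sketch of the NumPy approach described above: building a polynomial input matrix and fitting linear and quadratic regressions with the normal equation. The toy data and helper names (poly_matrix, fit, predict) are assumptions for illustration:

    import numpy as np

    def poly_matrix(x, degree):
        # Expand a 1-D input into [1, x, x^2, ..., x^degree] columns
        return np.column_stack([x ** d for d in range(degree + 1)])

    def fit(X, y):
        # Normal equation: w = (X^T X)^-1 X^T y (pinv for numerical safety)
        return np.linalg.pinv(X.T @ X) @ X.T @ y

    def predict(X, w):
        return X @ w

    # Toy non-linear data: y = 2 + 3x - x^2 plus noise
    x = np.linspace(-3, 3, 50)
    y = 2 + 3 * x - x ** 2 + np.random.normal(0, 0.3, x.shape)

    w_lin = fit(poly_matrix(x, 1), y)    # linear fit
    w_quad = fit(poly_matrix(x, 2), y)   # quadratic fit

    # Simple accuracy check: mean squared error of each model
    for name, w, d in [("linear", w_lin, 1), ("quadratic", w_quad, 2)]:
        mse = np.mean((predict(poly_matrix(x, d), w) - y) ** 2)
        print(name, "MSE:", round(float(mse), 3))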

Introduction to TensorFlow

A sample code explanation with TensorFlow and Keras

Step-by-step explanation of TensorFlow code, part 1

Step-by-step explanation of TensorFlow code, part 2

Step-by-step explanation of TensorFlow code, part 3

Low-level API of TensorFlow, part 1

Low-level API of TensorFlow, part 2

How to derive the weight matrix if you have multiple target variables

TensorFlow computations over matrices

Extracting the weight matrix using TensorFlow for linear regression

Introduction to PyTorch
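
As a taste of the TensorFlow/Keras material above, here is a minimal Keras sketch of linear regression with a single Dense unit; the synthetic data and training settings are illustrative assumptions:

    import numpy as np
    import tensorflow as tf

    # Toy regression data: y = 4x + 1 with a little noise
    X = np.random.rand(200, 1).astype("float32")
    y = 4 * X + 1 + np.random.normal(0, 0.05, X.shape).astype("float32")

    # One Dense unit with no activation is plain linear regression
    model = tf.keras.Sequential([
        tf.keras.Input(shape=(1,)),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=0.1), loss="mse")
    model.fit(X, y, epochs=100, verbose=0)

    # The learned kernel and bias approximate the true weight (4) and intercept (1)
    w, b = model.layers[0].get_weights()
    print(w.ravel(), b)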

 

SECTION 3: Gradient Descent Algorithm

What is a model

How a brainless model learns from data

What is the brain of a model?

Optimizers and how they work

The gradient descent algorithm as an optimizer

Mathematics of gradients

Linear algebra for gradients

How do you make sure training of a model is complete?

What is convergence, and why should you use it?

Importance of scaling features

Types of scaling techniques

When to use which type of scaling

Importance of the bias term in the feature matrix

Preparation of feature and label matrices

Gradient descent algorithm for regression problems

Gradient descent algorithm for classification problems

How do you measure the loss of a model

Types of loss functions

When to use which type of loss function

Mean squared error as a loss function

Cross entropy as a loss function

How gradients work to reduce the loss of a model

What is the learning rate and how it helps improve the learning speed of a model

How large a learning rate should be

Dangers with the learning rate

What is overfitting and underfitting

Python: scaling data

How to decide whether scaling of features is required

Develop a Python function for regression prediction

Develop a Python function for classification prediction

Develop a Python function for MSE

Develop a Python function for cross entropy

Develop a Python function for gradients with the derivative of MSE

Develop a Python function for gradients with the derivative of cross entropy

A full implementation of the gradient descent algorithm with Python for regression

A full implementation of the gradient descent algorithm with Python for classification

Types of classification: binary and multinomial

How to convert string labels into numeric labels with Python

How to convert numeric labels into a binary array

Why convert into a binary array

How to train models for multiple target variables for regression with Python

Upscaling predictions

How to train models for multiple target variables for classification with Python

How to transform predicted probabilities into a binary array with Python

How to transform a binary array into numeric labels

How to do accuracy testing on multiple target variables (all target variables are continuous)

How to do accuracy testing on multiple target variables (all target variables are binary classifiers)

How to do accuracy testing on multiple target variables (target variables are a combination of binary and multinomial classifiers)

How to do accuracy testing on multiple target variables (target variables are a combination of continuous and classification variables)

What is linearity and non-linearity in data

How to transform non-linear data to linear data

How polynomial techniques are used to transform non-linear data to linear data

Develop a function to transform non-linear data to linear data with Python

How to train models on non-linear data

Libraries used to train machine learning models

What is scikit-learn

What is TensorFlow

What is Keras

What is PyTorch

Building an end-to-end model with scikit-learn

Building an end-to-end model with TensorFlow

Building an end-to-end model with Keras

Building an end-to-end model with PyTorch

How to save a trained model

How to load a saved model

How to predict target labels of new data
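
A compact sketch of the gradient descent workflow this section walks through (prediction, MSE loss, gradients, weight updates, and a convergence check) using plain NumPy; all names and hyperparameters are illustrative assumptions:

    import numpy as np

    def predict(X, w):
        # Linear prediction; X already carries a bias column of ones
        return X @ w

    def mse(y_true, y_pred):
        return np.mean((y_true - y_pred) ** 2)

    def gradients(X, y, w):
        # Derivative of MSE with respect to the weights
        return (2.0 / len(y)) * X.T @ (predict(X, w) - y)

    def train(X, y, lr=0.1, epochs=2000, tol=1e-8):
        w = np.zeros(X.shape[1])
        prev = np.inf
        for _ in range(epochs):
            w -= lr * gradients(X, y, w)
            loss = mse(y, predict(X, w))
            if abs(prev - loss) < tol:    # convergence check
                break
            prev = loss
        return w

    # Toy data: two scaled features plus a bias column
    X = np.random.rand(200, 2)
    y = 3 * X[:, 0] - 2 * X[:, 1] + 0.5
    Xb = np.hstack([np.ones((len(X), 1)), X])
    print("weights (bias first):", np.round(train(Xb, y), 2))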

 

SECTION 4: Types of Gradient Descent Algorithms

Gradient descent algorithm: revision

Problems with the basic gradient descent algorithm

Solution: the batch gradient algorithm

What is a batch?

What is an epoch?

What is training iteration?

What is global minimum?

How to select the weights at the global minimum as the knowledge of the model

Problems with the batch gradient algorithm

Solution: the mini-batch gradient algorithm

Problems with the mini-batch gradient algorithm

Solution: the stochastic gradient algorithm

What is a stochastic process?

Types of sample sets

Sampling with replacement

Sampling without replacement

When to use sampling with and without replacement

Python implementation of the batch gradient algorithm with NumPy

Python implementation of the batch gradient algorithm with scikit-learn

Python implementation of the batch gradient algorithm with TensorFlow

Python implementation of the batch gradient algorithm with Keras

Python implementation of the mini-batch gradient algorithm with NumPy

Python implementation of the mini-batch gradient algorithm with scikit-learn

Python implementation of the mini-batch gradient algorithm with TensorFlow

Python implementation of the mini-batch gradient algorithm with Keras

Python implementation of the stochastic gradient algorithm with NumPy

Python implementation of the stochastic gradient algorithm with scikit-learn

Python implementation of the stochastic gradient algorithm with TensorFlow

Python implementation of the stochastic gradient algorithm with Keras
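
A minimal NumPy sketch of mini-batch gradient descent as described above, with epochs, reshuffling, and batches; batch gradient descent and stochastic gradient descent fall out as the special cases batch_size = n and batch_size = 1. The data and hyperparameters are illustrative assumptions:

    import numpy as np

    def minibatch_gd(X, y, lr=0.05, epochs=100, batch_size=32):
        # X is assumed to include a bias column; one epoch = one pass over all batches
        w = np.zeros(X.shape[1])
        n = len(y)
        for _ in range(epochs):
            order = np.random.permutation(n)          # reshuffle each epoch
            for start in range(0, n, batch_size):
                batch = order[start:start + batch_size]
                Xb, yb = X[batch], y[batch]
                grad = (2.0 / len(yb)) * Xb.T @ (Xb @ w - yb)
                w -= lr * grad
        return w

    # Toy data generated from known weights [1, 2, -3] (bias first)
    X = np.hstack([np.ones((300, 1)), np.random.rand(300, 2)])
    y = X @ np.array([1.0, 2.0, -3.0])
    print(np.round(minibatch_gd(X, y), 2))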

 

SECTION 5: Naive Bayes Classifier

Introduction to Naive Bayes

Probability basics

Mathematics behind Naive Bayes

What is posterior probability

Posterior probability for a single input variable

Posterior probability for multiple input variables

How to test predictions

Python implementation of a Naive Bayes classifier with NumPy

Python implementation of a Naive Bayes classifier with scikit-learn

Python implementation of a Naive Bayes classifier with TensorFlow

Python implementation of a Naive Bayes classifier with Keras
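
A minimal scikit-learn sketch of a Gaussian Naive Bayes classifier in the spirit of the items above, using the bundled iris data set; the split ratio and random_state are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB

    # Train/test split on the iris data set
    X_train, X_test, y_train, y_test = train_test_split(
        *load_iris(return_X_y=True), test_size=0.3, random_state=0)

    model = GaussianNB()                 # Gaussian likelihoods for continuous features
    model.fit(X_train, y_train)

    # Posterior-probability-based class predictions and accuracy testing
    print(model.predict_proba(X_test[:3]))
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))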

 

SECTION 6: Decision Tree Classifier

Introduction to decision tree classifier

When to use decision tree classifier

Conditional probability basics

How to construct decision tree

Components of decision tree

What is a root node

What is a branch

What is a leaf node

What is a terminal node

How to select variables for the nodes

Mathematics behind decision tree classifier

What is entropy

Entropy of target variable

Entropy of input variable on target variable

What is information gain

What is the Gini index

How to compute information gain with Python NumPy

How to compute the Gini index with Python NumPy

How to construct a decision tree with Python NumPy

Python implementation of a decision tree classifier with NumPy

Python implementation of a decision tree classifier with scikit-learn

Python implementation of a decision tree classifier with TensorFlow

How to test predictions
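
A small sketch of the entropy and information-gain computations listed above, plus a scikit-learn decision tree fit with the entropy criterion; the toy outlook/play example and parameter values are illustrative assumptions:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier

    def entropy(labels):
        # Shannon entropy of a label array
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels):
        # Entropy of the target minus the weighted entropy after splitting on the feature
        total = entropy(labels)
        for value in np.unique(feature):
            mask = feature == value
            total -= mask.mean() * entropy(labels[mask])
        return total

    # Tiny categorical example: does "outlook" tell us anything about "play"?
    outlook = np.array(["sunny", "sunny", "rain", "rain", "overcast", "overcast"])
    play = np.array(["no", "no", "yes", "no", "yes", "yes"])
    print("information gain:", round(information_gain(outlook, play), 3))

    # scikit-learn does this splitting automatically on numeric features
    X, y = load_iris(return_X_y=True)
    clf = DecisionTreeClassifier(criterion="entropy", max_depth=3, random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))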

 

SECTION 7: Decision Tree Regressor

Introduction to decision tree regressor

When to use decision tree regressor

Conditional probability basics

How to construct decision tree

Components of decision tree

What is a root node

What is a branch

What is a leaf node

What is a terminal node

How to select variables for the nodes

Mathematics behind decision tree regressor

What is entropy

Entropy of target variable

Entropy of input variable on target variable

What is information gain

What is the Gini index

How to compute information gain with Python NumPy

How to compute the Gini index with Python NumPy

How to construct a decision tree with Python NumPy

Python implementation of a decision tree regressor with NumPy

Python implementation of a decision tree regressor with scikit-learn

Python implementation of a decision tree regressor with TensorFlow

How to test predictions
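
A brief scikit-learn sketch of a decision tree regressor in the spirit of this section, fitting a piecewise-constant tree to a sine curve; the toy data and max_depth are illustrative assumptions:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    # Toy non-linear data: the regressor approximates sin(x) with a piecewise-constant tree
    X = np.linspace(0, 6, 200).reshape(-1, 1)
    y = np.sin(X).ravel()

    reg = DecisionTreeRegressor(max_depth=4, random_state=0)
    reg.fit(X, y)

    # Accuracy testing via mean squared error on the training grid
    mse = np.mean((reg.predict(X) - y) ** 2)
    print("MSE:", round(float(mse), 4))
    print(reg.predict([[1.5], [4.5]]))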

 

SECTION 8: Random Forest

Introduction to ensemble models

When to use Random forest

How to construct Random forest

Decision tree vs Random forest

Need for multiple trees in Random forest

Sampling with replacement

Sampling without replacement

Why Random forest opts for sampling with replacement

How to choose the number of trees in Random forest, and guidelines

Python implementation of Random forest with scikit-learn

How to test predictions

Implementation of Random forest for regression problems with Python scikit-learn
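
A minimal scikit-learn sketch of a random forest classifier as covered above, showing the number of trees, bootstrap (sampling-with-replacement) behaviour, and accuracy testing; the data set and parameters are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(
        *load_iris(return_X_y=True), test_size=0.3, random_state=0)

    # n_estimators is the number of trees; bootstrap=True is sampling with replacement
    forest = RandomForestClassifier(n_estimators=100, bootstrap=True, random_state=0)
    forest.fit(X_train, y_train)

    print("test accuracy:", forest.score(X_test, y_test))
    print("feature importances:", forest.feature_importances_.round(2))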

 

SECTION 9: SVM (Support Vector Machine) Classifier

Introduction to SVM

When to use SVM

Problem with probabilistic models

How SVM solves this problem

What is a hyperplane

What is an optimized hyperplane

What is the margin of a hyperplane

What are support vectors

How to construct an SVM

Mathematics behind SVM

Python implementation of SVM

How to test predictions
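
A minimal scikit-learn sketch of an SVM classifier in the spirit of this section, with feature scaling first (SVMs are scale-sensitive); the kernel, C value, and data set are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X_train, X_test, y_train, y_test = train_test_split(
        *load_iris(return_X_y=True), test_size=0.3, random_state=0)

    # SVMs are sensitive to feature scale, so standardize first
    scaler = StandardScaler().fit(X_train)
    svm = SVC(kernel="rbf", C=1.0)       # maximum-margin classifier with an RBF kernel
    svm.fit(scaler.transform(X_train), y_train)

    print("support vectors per class:", svm.n_support_)
    print("test accuracy:", svm.score(scaler.transform(X_test), y_test))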

 

SECTION 10: Ensemble Methods

Bagging meta-estimator

Single estimators vs. bagging: bias-variance decomposition

Forest of Randomized Trees:

     -> Random forests

     -> Extremely Randomized Trees

     -> Parameters

     -> Parallelization

          -> Examples:

               -> Plot the decision surfaces of ensembles of trees on the iris dataset

               -> Pixel importances with a parallel forest of trees

               -> Face completion with multi-output estimators

     -> Feature importance evaluation

          -> Examples:

               -> Pixel importances with a parallel forest of trees

               -> Feature importances with forests of trees

     -> Totally Random Trees Embedding

          -> Examples:

               -> Hashing feature transformation using Totally Random Trees

               -> Manifold learning on handwritten digits

               -> Feature transformations with ensembles of trees
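
A short scikit-learn sketch contrasting a bagging meta-estimator with extremely randomized trees, as introduced above; the estimator counts and cross-validation setup are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import BaggingClassifier, ExtraTreesClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Bagging meta-estimator: many trees fit on bootstrap samples of the data
    bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=50, random_state=0)

    # Extremely randomized trees: split thresholds are also chosen at random
    extra = ExtraTreesClassifier(n_estimators=50, random_state=0)

    for name, model in [("bagging", bagging), ("extra trees", extra)]:
        print(name, cross_val_score(model, X, y, cv=5).mean().round(3))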

 

SECTION 11: AdaBoost

      ->  Usage

            ->  Examples:

                     -> Discrete versus Real AdaBoost

                     -> Multi-class AdaBoosted Decision trees

                     -> Two-class AdaBoost

                     -> Decision Tree Regression with AdaBoost
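
A minimal scikit-learn sketch of a two-class AdaBoost classifier along the lines of the examples above; the data set and hyperparameters are illustrative assumptions:

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(
        *load_breast_cancer(return_X_y=True), test_size=0.3, random_state=0)

    # AdaBoost fits a sequence of weak learners, reweighting misclassified samples each round
    ada = AdaBoostClassifier(n_estimators=100, learning_rate=0.5, random_state=0)
    ada.fit(X_train, y_train)
    print("two-class test accuracy:", round(ada.score(X_test, y_test), 3))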

SECTION 12: Gradient Tree Boosting

       -> Classification

       -> Regression

       ->  Examples:

               ->  Gradient Boosting regression

               ->  Gradient Boosting Out-of-Bag estimates

       ->  Controlling the tree size

       ->  Mathematical formulation

       ->  Loss Functions

       ->  Regularization

                ->  Shrinkage

                ->  Subsampling

                ->  Examples:

                      ->  Gradient Boosting regularization

                      ->  Gradient Boosting Out-of-Bag estimates

                      ->  OOB Errors for Random Forests

       ->  Interpretation

                 ->  Feature importance

                            ->  Examples:

                                   ->  Gradient Boosting regression
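
A minimal scikit-learn sketch of gradient tree boosting for regression, showing the shrinkage (learning_rate), subsampling, and tree-size controls named above; the data set and parameter values are illustrative assumptions:

    from sklearn.datasets import load_diabetes
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(
        *load_diabetes(return_X_y=True), test_size=0.3, random_state=0)

    # learning_rate is the shrinkage term, subsample < 1.0 enables stochastic boosting,
    # and max_depth controls the size of each individual tree
    gbr = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                    subsample=0.8, max_depth=3, random_state=0)
    gbr.fit(X_train, y_train)
    print("test MSE:", round(mean_squared_error(y_test, gbr.predict(X_test)), 1))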

SECTION 13: Histogram-Based Gradient Boosting

        ->  Examples:

               ->  Partial Dependence Plots

       ->  Usage

       ->  Missing values support

       ->  Low-level parallelism

       ->  Why it’s faster
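
A small scikit-learn sketch of histogram-based gradient boosting with missing values left in place, illustrating the native NaN support mentioned above (assumes scikit-learn >= 1.0); the synthetic data is an illustrative assumption:

    import numpy as np
    from sklearn.ensemble import HistGradientBoostingClassifier

    # Histogram-based gradient boosting bins features and supports missing
    # values natively, so NaNs need no imputation
    rng = np.random.RandomState(0)
    X = rng.rand(500, 4)
    y = (X[:, 0] + X[:, 1] > 1).astype(int)
    X[rng.rand(*X.shape) < 0.1] = np.nan      # knock out 10% of the entries

    clf = HistGradientBoostingClassifier(max_iter=100, random_state=0)
    clf.fit(X, y)
    print("training accuracy with NaNs present:", round(clf.score(X, y), 3))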

SECTION 14: Voting Classifier

       ->  Majority Class Labels

       ->  Usage

       ->  Weighted Average Probabilities

       ->  Using the VotingClassifier with GridSearchCV

       ->  Usage
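
A minimal scikit-learn sketch of a soft-voting classifier with weighted average probabilities, plus a GridSearchCV over one nested parameter, as outlined above; the base estimators and weights are illustrative assumptions:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import VotingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import GridSearchCV, cross_val_score
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_iris(return_X_y=True)

    # Soft voting averages predicted class probabilities; the weights give the
    # decision tree twice the influence of the other two estimators
    voter = VotingClassifier(
        estimators=[("lr", LogisticRegression(max_iter=1000)),
                    ("nb", GaussianNB()),
                    ("dt", DecisionTreeClassifier(max_depth=3))],
        voting="soft", weights=[1, 1, 2])
    print("cross-validated accuracy:", cross_val_score(voter, X, y, cv=5).mean().round(3))

    # Nested parameters of the base estimators can be tuned through GridSearchCV
    grid = GridSearchCV(voter, {"dt__max_depth": [2, 3, 4]}, cv=5).fit(X, y)
    print("best parameters:", grid.best_params_)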

SECTION 15: Voting Regressor

       -> Usage

       -> Examples:

               -> Plot individual and voting regression predictions

 

SECTION 16: Stacked Generalization
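
Stacked generalization combines base estimators by training a final estimator on their cross-validated predictions. A minimal scikit-learn sketch, with illustrative base estimators and data set (all names here are assumptions, not course code):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier, StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Base estimators' cross-validated predictions become the inputs of a
    # final estimator that learns how to combine them
    stack = StackingClassifier(
        estimators=[("rf", RandomForestClassifier(n_estimators=50, random_state=0)),
                    ("svc", SVC(probability=True, random_state=0))],
        final_estimator=LogisticRegression(max_iter=1000))

    print("cross-validated accuracy:", cross_val_score(stack, X, y, cv=5).mean().round(3))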

 

SECTION 17: Recommendation Systems

     -> Graph-Based Recommendations

     -> Pattern-Based Recommendations with FPGrowth

     -> Collaborative filtering using IBCF and UBCF
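
A small NumPy sketch of item-based collaborative filtering (IBCF) with cosine similarity, in the spirit of the last item above; the tiny rating matrix and scoring helper are illustrative assumptions:

    import numpy as np

    # Tiny user x item rating matrix (0 = not rated); rows are users, columns are items
    R = np.array([[5, 3, 0, 1],
                  [4, 0, 0, 1],
                  [1, 1, 0, 5],
                  [0, 1, 5, 4]], dtype=float)

    # Item-based CF: cosine similarity between item columns
    norms = np.linalg.norm(R, axis=0)
    sim = (R.T @ R) / (np.outer(norms, norms) + 1e-9)

    def score(user, item):
        # Predicted rating = similarity-weighted average of the user's known ratings
        rated = R[user] > 0
        weights = sim[item, rated]
        return float(weights @ R[user, rated] / (weights.sum() + 1e-9))

    # Predict scores for the items user 0 has not rated yet
    unrated = np.where(R[0] == 0)[0]
    print({int(i): round(score(0, i), 2) for i in unrated})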
