LODO

SENHAJI RHAZI Hamza

Data Engineer @ Artefact

Passionate about AI and Data Science.

Presentation

Last course added/modified on 29/01/2019

The idea for this tutorial came from a frustration I felt when I was trying to understand what lies behind machine learning algorithms: I couldn't find an easy way to understand them, and it was frustrating. But I eventually came to understand some of it, and I was amazed. So this tutorial aims to share the amazement felt after overcoming those difficulties.


There are many rigorous ways to define machine learning, but I will define it practically: machine learning is a game where you have many samples (inputs) {x1, x2, ..., xn} and many labels (outputs) {y1, y2, ..., yn} that correspond to them respectively. The aim of the game is to guess the function f(x) that maps the samples to the labels. Machine learning is the collection of mathematical methods used to guess what f(x) is.
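To make the "guessing game" concrete, here is a minimal sketch in Python (my choice of language for the examples, using NumPy): samples x and labels y are generated by a hidden function f(x) = 2x + 1, and we "guess" f by fitting a line with least squares.

```python
import numpy as np

# Toy dataset: the hidden function is f(x) = 2x + 1.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2 * x + 1

# Guess f as a line y = a*x + b by solving the least-squares problem.
M = np.vstack([x, np.ones_like(x)]).T
a, b = np.linalg.lstsq(M, y, rcond=None)[0]
print(a, b)  # recovers a ≈ 2, b ≈ 1
```

With noisy labels the recovered coefficients would only approximate the hidden function, which is exactly the situation machine learning deals with.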


That being said, we can't really understand Machine learning without some notions of linear algebra.

So before attacking the subject directly, we are going to cover some linear algebra algorithms for problem solving, like gradient descent; then we will move on to machine learning algorithms.


For any feedback, please write me an email at hamza.senhajirhazi@gmail.com with the subject "[FEEDBACK]", so that I can find it again when searching in order to reply. Thank you!
PS: Sometimes the math equations don't render in the browser; if that happens to you, just refresh the page.
If that doesn't work, try another browser. Sorry for this inconvenience, I am working on finding a definitive solution!

Contents

  • Linear Algebra for system solving

    • Gauss method
    • LU factorization
    • Jacobi and Gauss-Seidel methods
    • Gradient and conjugate gradient method
    • Newton descent
    • Generalization of gradient and Newton descent + Lagrangian
  • Machine learning Algorithms

    • Linear regression
    • Support vector machine
    • Decision Tree
    • Random forest
    • Bayesian classification
    • Expectation maximization (generative mixture models, a generalization of k-means)
  • Dimensionality reduction

    • Principal component analysis
    • Linear discriminant analysis (LDA)
    • Kernel PCA
    • Kernel LDA

Linear Algebra

Gauss Method
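As a starting point for this section, here is a minimal sketch of Gaussian elimination with partial pivoting, the method this chapter covers: eliminate the entries below each pivot, then solve the upper-triangular system by back substitution.

```python
import numpy as np

def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n - 1):
        # Partial pivoting: bring the row with the largest pivot to position k.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]       # elimination multiplier
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # Back substitution on the resulting upper-triangular system.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

A = np.array([[2.0, 1.0], [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = gauss_solve(A, b)   # solution of 2x+y=3, x+3y=5
```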


LU Factorization
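A minimal sketch of the Doolittle variant (no pivoting, unit-diagonal L): the elimination multipliers from Gauss's method are stored in L, and the triangularized matrix becomes U, so that A = L U.

```python
import numpy as np

def lu(A):
    """Doolittle LU factorization without pivoting: A = L @ U."""
    n = A.shape[0]
    L = np.eye(n)               # unit lower-triangular
    U = A.astype(float).copy()  # becomes upper-triangular
    for k in range(n - 1):
        for i in range(k + 1, n):
            L[i, k] = U[i, k] / U[k, k]   # elimination multiplier
            U[i, k:] -= L[i, k] * U[k, k:]
    return L, U

A = np.array([[4.0, 3.0], [6.0, 3.0]])
L, U = lu(A)
```

Once L and U are known, any right-hand side b can be solved cheaply by one forward and one backward substitution, which is the practical appeal of the factorization.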


Jacobi and Gauss-Seidel methods
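A minimal sketch of the Jacobi iteration, the simpler of the two methods: split A into its diagonal D and remainder R, then iterate x ← D⁻¹(b − R x). Gauss-Seidel differs only in using the updated components of x within the same sweep.

```python
import numpy as np

def jacobi(A, b, iters=100):
    """Jacobi iteration for Ax = b: x_{k+1} = D^{-1} (b - R x_k)."""
    D = np.diag(A)          # diagonal of A
    R = A - np.diag(D)      # off-diagonal remainder
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x = (b - R @ x) / D
    return x

# Diagonally dominant system, for which Jacobi is guaranteed to converge.
A = np.array([[4.0, 1.0], [2.0, 5.0]])
b = np.array([7.0, 17.0])
x = jacobi(A, b)   # converges to the solution [1, 3]
```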


Gradient and conjugate gradient method
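A minimal sketch of plain gradient descent on the quadratic f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite: the gradient is Ax − b, so the minimizer is exactly the solution of Ax = b. Conjugate gradient improves on this by choosing A-conjugate search directions instead of the raw gradient.

```python
import numpy as np

def gradient_descent(A, b, lr=0.1, iters=500):
    """Minimize f(x) = 0.5 x^T A x - b^T x for SPD A.
    grad f(x) = A x - b, so the minimizer solves A x = b."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(iters):
        x -= lr * (A @ x - b)   # step against the gradient
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])  # symmetric positive definite
b = np.array([5.0, 5.0])
x = gradient_descent(A, b)   # converges to the solution [1, 2]
```

The fixed step lr=0.1 is an assumption that happens to work here; in general it must be smaller than 2 divided by the largest eigenvalue of A for the iteration to converge.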


Newton Descent
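A minimal sketch of Newton descent in one dimension: instead of stepping against the raw gradient, each step divides by the second derivative, x ← x − f′(x)/f″(x), which gives quadratic convergence near the minimum.

```python
def newton_minimize(grad, hess, x0, iters=20):
    """1D Newton descent: x_{k+1} = x_k - f'(x_k) / f''(x_k)."""
    x = x0
    for _ in range(iters):
        x = x - grad(x) / hess(x)
    return x

# Minimize f(x) = x**4 / 4 - x, whose unique minimum is at x = 1
# (f'(x) = x**3 - 1, f''(x) = 3 * x**2).
x_star = newton_minimize(lambda x: x**3 - 1, lambda x: 3 * x**2, x0=2.0)
```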


Generalization of gradient and Newton descent + Lagrangian
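A minimal sketch of the Lagrangian idea on the simplest constrained problem: minimize f(x) = x₁² + x₂² subject to x₁ + x₂ = 1. Setting the gradient of the Lagrangian L(x, λ) = f(x) + λ(Ax − b) to zero yields a linear KKT system in (x, λ), which we can solve directly.

```python
import numpy as np

# Constraint A x = b with A = [1 1], b = [1].
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

# Stationarity: 2x + A^T λ = 0; feasibility: A x = b.
# Stack both conditions into one linear KKT system.
KKT = np.block([[2 * np.eye(2), A.T],
                [A, np.zeros((1, 1))]])
rhs = np.concatenate([np.zeros(2), b])
sol = np.linalg.solve(KKT, rhs)
x, lam = sol[:2], sol[2]   # x = [0.5, 0.5], λ = -1
```

By symmetry the minimum splits the constraint evenly, which the KKT solution confirms; the multiplier λ measures how much the optimal value changes if the constraint level b is perturbed.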