Course Coordinator: 
Mohammad Emtiyaz Khan
Foundations of Machine Learning

Machine learning aims to design machines that can autonomously learn, as humans and animals do. This course introduces the basic principles and methods of machine learning. You will learn a few fundamental methods, how they relate to each other, and why they work.

Target Students

This course is intended for beginners in machine learning with some background in basic math and programming.

Course Materials

1) T. Hastie, R. Tibshirani, and J. Friedman: The Elements of Statistical Learning, free download from http://statweb.stanford.edu/~tibs/ElemStatLearn/
2) C. Bishop: Pattern Recognition and Machine Learning (available online)

Assessment items

1. [40%] Class summary: Students will summarize every two weeks of lectures in their own words (a total of 5 such reports). Each report needs to be a summary based on understanding and can be as short as 2 pages.
2. [40%] Project report and presentation: Students will submit a final project report in Week 13 and present their work in Week 14. Grading will be based on constructive feedback from the class on the project and presentation.
3. [20%] Class discussions

Project Description: Students will work on a project in a team of 2 people, and do the following:  

- Implement a method from scratch  
- Evaluate its performance and compare to a simple baseline  
- Write up findings in a 4-page report  
- Give a 15-minute oral presentation about it

Students successfully completing this course will be able to:
• Explain a few methods for regression and classification.
• Implement and apply these methods to real data.
• Discuss fundamental principles of machine learning.
• Continue to work through difficulties to find better solutions.
• Create an assessment of their current skill level, and devise a plan for ongoing learning.
Course Content: 

Week 1 (online): Intro + regression (linear models)
Week 2 (online): (Stochastic) gradient descent, Newton's method, Project starts
Week 3 (online): Overfitting, cross-validation, bias-variance decomposition
Week 4 (online): Classification: Logistic regression
Week 5 (online): Classification: Support vector machines
Week 10 (online): Deep Learning methods
Week 11 (online/in-person): Gaussian Process Regression and Classification
Week 12 (in-person): Machine Learning from a Bayesian Perspective
Week 13 (in-person): Machine Learning from a Bayesian Perspective, Project ends
Week 14 (in-person): Recap and project presentations

Course Type: 
Prior Knowledge: 

• Programming in at least one language, such as Python, MATLAB, or R (basic level)
• Basic linear algebra (systems of linear equations and the SVD)
• Basic multivariate calculus (deriving gradients)
• Basic probability and statistics (conditional and joint distributions, independence, Bayes' rule, random variables, expectation, mean, median, mode, central limit theorem)
• Univariate and multivariate Gaussian distributions (joint, conditional, and marginal)

It is a plus (but not a requirement) if you have taken either of these courses:
• Jun Tani, Cognitive Neurorobotics, https://groups.oist.jp/course/course-neurorobotics
• Tomoki Fukai, Statistical Modeling, https://groups.oist.jp/course/statistical-modeling