I will continue adding good resources to this post, but don’t get lost in them. Choose one and keep learning; after you finish one course, choose another.
Hope this can be helpful to you!
For a newcomer to the deep learning and machine learning area, faced with so many courses and resources, the first question is how to choose the right books and courses to begin the journey. This roadmap includes highly recommended courses, books, and papers that can help you get into the DL and ML areas quickly. Some of the materials may be quite difficult, but they are well worth reading and studying.
A machine translation reading list maintained by the Tsinghua Natural Language Processing Group.
This repository aims to provide simple and ready-to-use tutorials for TensorFlow. Each tutorial includes
source code and most of them are associated with a
In this course, they discuss how Bayesian methods can be combined with deep learning to achieve better results in machine learning applications.
Most importantly, this course provides full videos, slides, and assignments.
It is a Jupyter notebook version of this book.
It might be a good book for people who want to learn CV but have little background.
This project aims at teaching you the fundamentals of machine learning in Python. It contains the example code and solutions to the exercises in this book.
You can download a PDF version of this book here.
This Matlab package implements machine learning algorithms described in the great textbook: Pattern Recognition and Machine Learning by C. Bishop (PRML).
Essential Cheat Sheets for deep learning and machine learning researchers and practitioners.
The mission of Papers With Code is to create a free and open resource with Machine Learning papers, code and evaluation tables.
This book explains to you how to make (supervised) machine learning models interpretable.
Q&A of Some Concrete Topics
The full derivation of the Maximum Likelihood Estimators for the multivariate Gaussian.
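For quick reference, the estimators this derivation arrives at are the familiar sample mean and (biased) sample covariance. A sketch of the result, assuming i.i.d. data $x_1, \dots, x_N \sim \mathcal{N}(\mu, \Sigma)$:

```latex
% Maximizing the log-likelihood of N i.i.d. samples from a
% multivariate Gaussian gives the standard estimators:
\hat{\mu} = \frac{1}{N} \sum_{n=1}^{N} x_n,
\qquad
\hat{\Sigma} = \frac{1}{N} \sum_{n=1}^{N}
  (x_n - \hat{\mu})(x_n - \hat{\mu})^{\mathsf{T}}.
```

Note that $\hat{\Sigma}$ is the biased estimator; the linked note walks through the matrix calculus needed to obtain both.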
Page 3 of a CS229 lecture note.
This question puzzled me when I read page 96 of PRML. The original book uses the likelihood, but in the errata the author changed it to the negative likelihood to satisfy the conditions of Robbins and Monro (1951).
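As context for why the sign matters, here is a brief sketch of the Robbins–Monro scheme as presented in PRML §2.3.5 (my summary, not a quote from the errata):

```latex
% Robbins–Monro finds the root \theta^\star of a regression function
% f(\theta) = \mathbb{E}[z \mid \theta] via the sequential update
\theta^{(N)} = \theta^{(N-1)} - a_{N-1}\, z\bigl(\theta^{(N-1)}\bigr),
% which converges provided f(\theta) > 0 for \theta > \theta^\star,
% f(\theta) < 0 for \theta < \theta^\star, and the step sizes satisfy
\lim_{N\to\infty} a_N = 0, \qquad
\sum_{N=1}^{\infty} a_N = \infty, \qquad
\sum_{N=1}^{\infty} a_N^2 < \infty.
```

As I read it, choosing $z$ as the derivative of the *negative* log-likelihood makes its expectation positive to the right of the maximum likelihood solution and negative to the left (the expected negative log-likelihood has its minimum there), so the sign condition holds; with the positive likelihood the signs are reversed, which is what the errata corrects.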
In this reference tutorial, we demonstrate how to complete the square in both the univariate (scalar) and multivariate (matrix) cases. Relevant discussion can also be found in section 2.3.1 of PRML.
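As a reminder of the two identities the tutorial covers (a standard summary, not taken verbatim from it):

```latex
% Univariate: for a \neq 0,
a x^2 + b x + c
  = a\left(x + \frac{b}{2a}\right)^{2} + c - \frac{b^2}{4a}.
% Multivariate (the Gaussian-exponent form used in PRML 2.3.1):
% for symmetric positive-definite A,
-\tfrac{1}{2}\, x^{\mathsf{T}} A\, x + x^{\mathsf{T}} b
  = -\tfrac{1}{2}\,(x - A^{-1} b)^{\mathsf{T}} A\,(x - A^{-1} b)
    + \tfrac{1}{2}\, b^{\mathsf{T}} A^{-1} b.
```

In the Gaussian case this lets you read off the covariance $A^{-1}$ and mean $A^{-1} b$ directly from the quadratic and linear terms of the exponent.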
Derivation of PRML eq. (4.5).