Roadmap of Mathematical Disciplines for Machine Learning, Part 2 (Probabilities)

Instead of an introduction


Once upon a time there was the first part; now it is time for the second! This time we touch on topics related to probability theory.


As in the previous article, we will distinguish several "levels" of immersion in the subject. In fact, I believe every subject should be gone through several times at different levels of difficulty: first you immerse yourself in it, getting used to the vocabulary, the typical problem statements, and the standard methods for solving them. After a while, enriched with knowledge from other areas, you are ready to take the course again, but at a slightly higher level; now you may be interested not only in standard problems, but also in the limitations of the methodology, non-standard approaches, and perhaps the philosophy out of which the subject grew (the eternal debate between the "frequentist school" and the "Bayesian school").


Let me remind you that we distinguish three "levels" of complexity:


  1. Bring it on - the main workhorse; the books usually described as "must have".
  2. Hurt me plenty - a level higher; it lets you look at level 1 from a bird's-eye view, systematizes knowledge, and connects different areas.
  3. Nightmare - for the strong in spirit: mekhmat level, for lovers of mathematics and ivory towers.

In most cases I list books that I have either read myself or that are very popular in the (mathematical) community - the ones recommended on Stack Overflow, Goodreads, Quora, and so on.


Classical probability


I am convinced that it makes no sense to skip ahead and plunge straight into the world of Bayesian inference without first studying the classical material: probability theory and statistics.



Bring it on


Blitzstein & Hwang, "Introduction to Probability". The book grew out of Harvard's Stat 110 course: the lectures are on YouTube, and there is an edX version, Stat110x. The book is well suited for a first pass through the subject and also includes examples in R.


There are also online courses on Coursera and OpenEdu.



Hurt me plenty


The MIT probability course;
Grimmett & Stirzaker, "Probability and Random Processes", together with its companion problem book (with solutions!): One Thousand Exercises in Probability.
Michael Mitzenmacher, Eli Upfal: "Probability and Computing: Randomization and Probabilistic Techniques in Algorithms and Data Analysis (2nd Edition)" - probability with an eye toward "real-world" algorithmic applications; the second edition adds material on topics such as the VC dimension.



There are also recorded lecture courses from the Computer Science Center.


Nightmare


At this level probability rests on measure theory. Two standard references:
  1. Rick Durrett, "Probability: Theory and Examples";
  2. Kai Lai Chung, "A Course in Probability Theory".

Unclassified


A few resources that did not fit neatly into the levels above.


One book worth mentioning here: David Williams, "Weighing the Odds" - a single text that covers both probability and statistics.


Cheatsheets


While studying, it is convenient to keep cheatsheets at hand that condense the key definitions and formulas. For example:


  1. Cheatsheet 1;
  2. Cheatsheet 2;


It also pays to solve problems, not just read; a few good sources of exercises:


  1. One Thousand Exercises in Probability - the companion problem book to Grimmett & Stirzaker, with solutions;
  2. Blitzstein & Hwang, "Introduction to Probability" - the end-of-chapter exercises.

How to practice?


Reading alone is not enough: the material really sinks in once you start simulating things yourself, from simple Monte Carlo experiments up to sampling methods such as MCMC.


For hands-on work the QuantEcon lectures are a good starting point (they come in Python and Julia versions), and it is worth implementing the standard methods yourself in R/Python/Julia.
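

To make "implementing the methods yourself" concrete, here is a minimal sketch (mine, not taken from any of the books above) of a random-walk Metropolis sampler for a standard normal target in Python; the target density, step size, and sample counts are arbitrary illustrative choices.

    # A minimal random-walk Metropolis sampler for a standard normal target.
    # Everything here (target, step size, sample counts) is an illustrative choice,
    # not a recipe from any particular book.
    import math
    import random

    def log_target(x):
        # Unnormalized log-density of N(0, 1).
        return -0.5 * x * x

    def metropolis(n_samples=50_000, step=1.0, x0=0.0, burn_in=1_000):
        x = x0
        samples = []
        for i in range(n_samples + burn_in):
            proposal = x + random.gauss(0.0, step)  # symmetric random-walk proposal
            log_alpha = log_target(proposal) - log_target(x)
            # Accept with probability min(1, exp(log_alpha)).
            if log_alpha >= 0 or random.random() < math.exp(log_alpha):
                x = proposal
            if i >= burn_in:
                samples.append(x)
        return samples

    if __name__ == "__main__":
        s = metropolis()
        mean = sum(s) / len(s)
        var = sum((v - mean) ** 2 for v in s) / len(s)
        print(f"mean ~ {mean:.3f} (expect 0), variance ~ {var:.3f} (expect 1)")

Comparing the empirical mean and variance against the known answers (0 and 1) is exactly the kind of sanity check that makes the theory tangible.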


What's next?


Mathematical statistics and the Bayesian approach have been left completely untouched here. To my regret, I have read fewer books in these areas, so my advice there would be of less value; I have not yet decided whether to write about them, and in how much detail.

