Introduction to basic probabilistic concepts and methods used in computer science. Develops an understanding of the crucial role played by randomness in computing, both as a powerful tool and as a challenge to confront and analyze. Emphasis on rigorous reasoning, analysis, and algorithmic thinking. (Counts as a Group B course for the CS major, and a background course for the CS minor.)
More specifically, we focus on basic probability theory and its applications in computer science. This includes using probability to analyze data sets and algorithms and to prove the correctness of algorithms.
Prerequisites: CS 131 and MA 123 (or other elementary calculus class). We assume good working knowledge of elementary set theory and counting, and elementary calculus (i.e., integration and differentiation).
The readings are from the class textbook. A free copy of the textbook is available here.
Homeworks are released on Thursdays and are due the following Thursday at midnight. The homework pdf and tex are posted on Piazza.
You can try the Jupyter notebooks in your browser here. You can make changes and run the code in the browser, but the changes will not be saved: when you hit refresh, all the changes will be gone.
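As a taste of the kind of experiment the notebooks support, here is a minimal, hypothetical simulation (not from the class materials) of the classic "double sixes" question that comes up later in the schedule: the chance of rolling at least one double six in 24 rolls of two dice.

```python
import random

def estimate_double_sixes(trials=100_000, rolls_per_trial=24, seed=0):
    """Estimate P(at least one double six in 24 rolls of two fair dice)
    by repeating the experiment many times and counting successes."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        if any(rng.randint(1, 6) == 6 and rng.randint(1, 6) == 6
               for _ in range(rolls_per_trial)):
            hits += 1
    return hits / trials

# The exact answer is 1 - (35/36)**24, approximately 0.4914;
# the estimate should land close to it.
print(estimate_double_sixes())
```

Running the cell repeatedly with different seeds is a quick way to see sampling variability in action.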
The schedule is tentative and subject to change.
|L2 (1/24)||Probability cast: sample spaces, events, probability function||17.1, 17.2, 17.3, 17.5||Slides||HW1 out|
|D1 (1/25)||Discussion week 1||Slides||python intro|
|L3 (1/29)||Probability axioms||Slides|
|L4 (1/31)||Probability rules||Slides||HW1 due, HW2 out|
|L5 (2/5)||A guessing game, tree diagrams||Slides|
|L6 (2/7)||More tree diagrams, continuous probability spaces||Slides||strange dice; continuous uniform sampling||HW2 due, HW3 out|
|L7 (2/12)||Random variables: definition and examples||19.1, 19.3, MIT notes||Slides||hot hands|
|L8 (2/14)||Random variables: pdf and cdf||Slides||HW3 due, HW4 out|
|L9 (2/19)||No class (Monday schedule)|
|L10 (2/21)||More pdf/cdf (slides included in L8 slide deck above). Conditional probability: motivation, definition||18.2, 18.3, 18.4, 18.5||Slides||HW4 due, HW5 out|
|D5 (2/22)||Discussion week 5||Notes|
|L11 (2/26)||Conditional probability: examples, tree diagrams, product rule||Slides|
|L12 (2/28)||More conditional probability: more examples, law of total probability, Bayes' theorem||Slides||HW5 due, HW6 out|
|L13 (3/5)||Independent events and random variables||18.7, 19.2||Slides|
|L14 (3/7)||Mutual independence||Slides||HW6 due, HW7 out, midterm practice problems out|
|L15 (3/19)||Midterm review||Slides|
|L17 (3/26)||Expectation||19.4, 19.5||Slides|
|L18 (3/28)||Variance||20.2, 20.3||Slides||HW7 due, HW8 out|
|L19 (4/2)||Discrete distributions I: uniform, Bernoulli, binomial||19.3, 19.4, 19.4.3, 19.4.6, 20.3.2, 20.3.4||Slides||distributions, double sixes (L20)|
|L20 (4/4)||Discrete distributions II: geometric||Slides||HW8 due, HW9 out|
|L21 (4/9)||Continuous distributions I: uniform, exponential||MIT notes I, MIT notes II||Slides|
|L22 (4/11)||Continuous distributions II: normal||Slides||HW9 due, HW10 out|
|L23 (4/16)||Markov and Chebyshev inequalities, estimation by sampling||20.1, 20.2, 20.3.5, 20.4||Slides|
|L24 (4/18)||Chernoff inequality, load balancing||20.5.2, 20.5.3, 20.5.5||Slides||HW10 due, final practice problems out|
|L25 (4/23)||Random walks||21.1 (optional), 21.2||Slides|
|L26 (4/25)||Randomized algorithms, reservoir sampling||Slides||reservoir sampling|
|L27 (4/30)||Final exam review|
|L28 (5/2)||Course recap|
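The reservoir sampling topic from L26 can be sketched in a few lines of Python. This is a minimal version of the standard one-pass algorithm (Algorithm R), written as an illustration rather than the class's own code:

```python
import random

def reservoir_sample(stream, k, seed=None):
    """Return k items chosen uniformly at random from a stream whose
    length is not known in advance, using a single pass and O(k) memory."""
    rng = random.Random(seed)
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            # Fill the reservoir with the first k items.
            reservoir.append(item)
        else:
            # Keep item i with probability k / (i + 1), replacing a
            # uniformly chosen slot in the reservoir.
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

# Example: draw 5 items from a "stream" of a million integers.
print(reservoir_sample(range(1_000_000), 5, seed=42))
```

A short induction argument shows every item ends up in the final reservoir with probability exactly k/n, which is the kind of correctness proof the course emphasizes.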