Probability models (Fall 2011)
Lectures: Tue, Thu, Fri 2:00 - 3:00, in LH III
Teaching assistants: Indrajit Jana and Kartick Adhikari (Tutorials: Tue 5:30 PM)
Some generalities: This is a first course in probability theory. Probability is an intuitive concept, and we often use words such as probably, by chance, luckily, an unbelievable coincidence, etc., which indicate uncertainty. Yet probability theory is a part of mathematics. The aim of this course is to learn the basic concepts of this mathematical theory of probability, as well as to develop an intuitive understanding of what these concepts mean and how they are applied in "real life situations". In this course we will stick to discrete probability spaces, where little mathematical sophistication is needed. A natural continuation is the Probability theory course offered in the next semester, where you will learn the measure-theoretic foundations of probability.
Grading: The final grade will be based on weekly homeworks (25%), two class tests (25% together) and the final exam (50%). Homeworks are due on Mondays (late submissions will not be accepted). Solving problems (preferably many more than those given in the homeworks) is absolutely crucial to developing an understanding of the subject.
Texts and other resources: Sheldon Ross' Introduction to probability models is the primary text for the course. We will try to cover most of chapters 1-6 and chapter 11, although I will not strictly follow the book at all times. Another excellent book is William Feller's classic treatise An introduction to probability theory and its applications - vol. 1. Both books have lots of examples and problems (and are available at Tata Book House). In addition to books, there are various resources on the web (for example) regarding basic probability.
Tentative list of topics: Probability space, events. Basic rules for calculating probabilities. Inclusion-exclusion. Combinatorial examples. Independence and conditioning. Bayes' formula. Random variables. Distribution function. Simulation. Examples: Binomial, Geometric, Poisson, Hypergeometric, etc. Expectation, variance and covariance, generating functions. Independence and conditioning of random variables. Joint distribution, distribution of the sum. The conceptual difficulty of picking a point at random from [0,1] or tossing a coin infinitely many times. Working rules for continuous distributions and densities. Simulation. Examples: Normal, exponential and gamma, uniform and beta, etc. Useful inequalities: Markov, Chebyshev, Cauchy-Schwarz, Bonferroni. IID random variables (existential issues overlooked). WLLN, SLLN (?), De Moivre-Laplace CLT, general CLT. Interesting processes: (a) Random walks (b) Branching processes (c) Polya's urn scheme. Markov chains: Basic theory. Many examples. Irreducible aperiodic MC. Stationary distribution. Reversibility. Recurrence and positive recurrence. Convergence theorem. Strong Markov property. CLT for statistics. Mixing time idea introduced. Continuous time Markov chains: Infinitesimal description, generator. Poisson process. Examples.
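Since simulation appears among the topics above, here is a small illustrative sketch (not from the course materials) of the idea: estimate a binomial probability by Monte Carlo and compare it with the exact formula C(n,k) p^k (1-p)^(n-k). The function name and parameters are mine, chosen for illustration.

```python
import random
from math import comb

def simulate_binomial_prob(n, p, k, trials=100_000, seed=0):
    """Monte Carlo estimate of P(X = k) for X ~ Binomial(n, p):
    repeat `trials` experiments of n independent coin tosses with
    success probability p, and count how often exactly k succeed."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        successes = sum(rng.random() < p for _ in range(n))
        if successes == k:
            hits += 1
    return hits / trials

# Exact value from the binomial formula: C(10,3) * (1/2)^10
exact = comb(10, 3) * 0.5 ** 10
estimate = simulate_binomial_prob(10, 0.5, 3)
print(exact, estimate)
```

By the weak law of large numbers (also on the list above), the relative frequency converges to the true probability as the number of trials grows, so the two printed numbers should be close.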
Progress of lectures: