Probability and Statistics

Number of Credits: 7

Hours: 30 hours of lectures, 30 hours of tutorials (including exams), and 20 hours of flipped classrooms with tutor support.

General Presentation: This course introduces the student to the fundamentals of rigorous probability theory together with some classical applications in statistics. It provides a clear and intuitive approach while maintaining a good level of mathematical rigour, without using the Lebesgue integral. No previous course in probability or statistics is needed.

Part 1: In this first part we introduce the basic notions of elementary probability theory.

We start with the definition of a probability measure and discuss some of its properties. After defining the independence of events, we introduce conditional probabilities to obtain the law of total probability and Bayes' formula. Then, the notion of a random variable is studied in depth, starting from the univariate case. We treat both the discrete and the continuous cases, defining independence and moments and presenting in detail the classical parametric families of univariate distributions (Uniform, Bernoulli, Binomial, Poisson, Geometric, Gaussian, Exponential, Gamma, Chi-squared). We also study the effect of regular transformations of these random variables. We then extend the approach to the multivariate case, starting with pairs of discrete and jointly continuous random variables (change-of-variables theorem). In addition, we introduce the moment generating function and discuss some of the properties that make it a powerful tool for practical computations.
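For instance, in indicative notation (the lectures may use slightly different conventions), for a partition A_1, ..., A_n of the sample space and an event B with P(B) > 0, the law of total probability and Bayes' formula read

\[
P(B) = \sum_{i=1}^{n} P(B \mid A_i)\, P(A_i),
\qquad
P(A_k \mid B) = \frac{P(B \mid A_k)\, P(A_k)}{\sum_{i=1}^{n} P(B \mid A_i)\, P(A_i)},
\]

and the moment generating function of a random variable X is M_X(t) = E[e^{tX}], whose derivatives at t = 0 yield the moments of X.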

Finally, we introduce the notion of conditional distribution in order to define, in our setting, the conditional expectation of discrete and jointly continuous random variables.
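As an indicative example of the objects involved (the notation f_{X,Y}, f_Y is not fixed by the syllabus), for a jointly continuous pair (X, Y) with joint density f_{X,Y} and marginal density f_Y(y) > 0, the conditional density and the conditional expectation are

\[
f_{X \mid Y}(x \mid y) = \frac{f_{X,Y}(x, y)}{f_Y(y)},
\qquad
E[X \mid Y = y] = \int_{-\infty}^{+\infty} x\, f_{X \mid Y}(x \mid y)\, dx .
\]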

Part 2: The second part of the course starts with some complementary notions on random vectors and the related matrix notation. We also spend time on the very important case of Gaussian vectors. Several important topics in statistics are then introduced to conclude the course. We study properties of samples of random variables, with emphasis on the Gaussian case, introducing in particular the classical estimators of the expectation and the variance. We then present a primer on estimation theory, studying in particular maximum likelihood estimators and exploring some optimality criteria in relation to the Fisher information matrix. Finally, the basic concepts of classical hypothesis testing are presented, with emphasis on likelihood ratio tests.
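As a preview, in indicative notation, for an i.i.d. sample X_1, ..., X_n from a parametric density f(x; θ), the maximum likelihood estimator and the (per-observation) Fisher information are

\[
\hat{\theta}_n = \operatorname*{arg\,max}_{\theta} \sum_{i=1}^{n} \log f(X_i; \theta),
\qquad
I(\theta) = E\!\left[ \left( \frac{\partial}{\partial \theta} \log f(X; \theta) \right)^{2} \right],
\]

and, under the usual regularity conditions, the Cramér–Rao bound states that any unbiased estimator T of θ satisfies Var(T) ≥ 1 / (n I(θ)).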

Remark: Depending on students' levels and the available time, some extra topics, such as the convergence of random variables (strong law of large numbers, central limit theorem) or complementary aspects of estimation theory (sufficiency, Bayes estimation), may be covered.
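As an illustration of the first of these extra topics (statement given only as an indication), the central limit theorem asserts that for i.i.d. random variables X_1, X_2, ... with mean μ and finite variance σ² > 0,

\[
\frac{1}{\sigma \sqrt{n}} \sum_{i=1}^{n} (X_i - \mu) \;\xrightarrow{\ d\ }\; \mathcal{N}(0, 1)
\quad \text{as } n \to \infty .
\]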

Books:

- Mood, A. et al. (1974). Introduction to the Theory of Statistics. McGraw-Hill, New York (Chapters 1 to 5).

- Rice, J. (2007). Mathematical Statistics and Data Analysis. Thomson, Belmont, CA (Chapters 1 to 4 and 6).

- For statistics: Casella, G. and Berger, R. L. (1990, 2002). Statistical Inference. Wadsworth Publishing Co., Belmont, CA (Chapters 5 to 8).

Prerequisites: Logic and Sets