Advanced Statistical Methods -- Exam
====================================

General
-------

The exam will take place on Thursday, 29 January, from 9:15am to 10:45am
(1 1/2 hours). The exam will be based mostly on the analytical part of the
exercises on homework sheets 1-3. Calculators may be used during the exam
(no mobile phones, of course), but they are not strictly necessary. Knowledge
of the numerical algorithms mentioned below is required, and you are expected
to be able to explain them in words (or to write understandable pseudocode).
Note that you do not have to memorize the probability distribution functions
of the various distributions; if they are required, they will be provided.

The study material consists of the lecture slides and the homework solutions.
Solutions to homework sets 1 and 2 are online; solutions to homework set 3
will be available tomorrow evening. Note that the online material was updated
recently to remove a few typos, so please download the slides and homework
solutions again if you already did. If anything is unclear, or if you find
further typos, please let us know.

Q&A Session
-----------

On Tuesday, 27 Jan, the second hour of the lecture (10-11am) will be a Q&A
session where remaining questions about the study material etc. can be asked.

What to learn
-------------

An exhaustive list of what could be asked during the exam:

- Understanding central aspects of random variables, e.g.
  [Material: Barlow, Ch 2-4; Cowan, Ch 1-2; Gregory, Ch 5]
  - Basic definitions (mean, median, mode, variance, covariance, central
    moments, etc)
  - The relations between the most basic probability distribution functions
    (normal, chi-squared, Poisson, etc)
  - The central limit theorem and its application in simple examples

- Understanding of the basic definitions of estimators
  [Material: Barlow, Ch 5; Cowan, Ch 5]
  - Biased and unbiased estimators
  - The maximum likelihood estimator
  - Consistent and efficient estimators

- Understanding the basic concepts of frequentist statistics
  [Material: Slides; Barlow, Ch 5, 8; Cowan, Ch 4, 6, 9]
  - The meaning of p-values
  - The Neyman-Pearson lemma
  - The chi-squared goodness-of-fit test
  - Knowledge of the Cramer-Rao bound and the Fisher information
  - Confidence belt construction in one dimension
  - Construction of upper and lower limits, and of a central confidence
    interval
  - Confidence interval construction using the likelihood function or
    chi-squared [discussed in Cowan, Ch 9, in Vaughan, and on the slides]
  - Knowledge of Wilks' theorem and its application in deriving confidence
    regions (understanding of the proof or of limiting cases is not required)
  - Understanding of trial corrections in simple examples

- Understanding of basic Bayesian inference
  [Material: Slides; Gregory, Ch 1, 3]
  - Bayes' theorem and its application in simple examples
  - Knowledge and understanding of the odds ratio and the Bayes factor
  - Understanding of the connection between the "Occam's razor" principle and
    nuisance parameters in Bayesian analysis
  - Knowledge of the most basic priors (e.g. flat prior, Jeffreys prior)

- Understanding of the conceptual differences between frequentist and
  Bayesian statistical inference, e.g.
  [Material: Gregory, Ch 1, 3; above Cowan and Barlow chapters]
  - The difference between hypothesis testing and model comparison
  - The difference between confidence and credible intervals
  - The difference between marginalizing and profiling over nuisance
    parameters

- Knowledge of basic minimization algorithms
  [Material: Slides; Wikipedia (Newton's method in optimization)]
  - Gradient descent, Newton's method (the Gauss-Newton method is not
    required)

- Shannon entropy
  [Material: Slides; Gregory, Ch 8]
  - Definition of the Shannon entropy
  - Idea of the maximum entropy principle (no detailed calculations required)

- Monte Carlo algorithms
  [Material: Cowan, Ch 3; Gregory, Ch 12]
  - Understanding of Monte Carlo integration
  - Definition of a Markov chain, and of a reversible Markov chain
  - Understanding of the goals and mechanisms of the Metropolis-Hastings
    algorithm (the steps of the algorithm, the desired properties of the
    proposal distribution, the definition of the acceptance probability)
  - Knowledge of the "detailed balance" principle and its connection to
    reversible Markov chains and the Metropolis-Hastings algorithm

- General aspects
  [Material: Slides]
  - Understanding of why 1/sqrt(n) is the typical behaviour of error bars
  - Knowledge of the meaning of "statistics limited", "background limited"
    and "systematics limited"
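
The central limit theorem item above can be checked with a short simulation
(a minimal sketch, assuming Python with numpy; the variable names and the
choice of uniform variates are illustrative, not part of the exam material):

```python
import numpy as np

# Sketch: the standardized sum of n independent U(0,1) variates approaches
# N(0, 1) as n grows.  U(0,1) has mean 1/2 and variance 1/12, so we
# standardize the sum with mean n/2 and standard deviation sqrt(n/12).
rng = np.random.default_rng(42)
n, trials = 100, 50_000
sums = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)
z = (sums - n / 2.0) / np.sqrt(n / 12.0)

print(np.mean(z))  # close to 0
print(np.std(z))   # close to 1
```

A histogram of `z` would visually match the standard normal density, which
is the kind of "simple example" the list refers to.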
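
The Metropolis-Hastings steps listed above (proposal, acceptance probability,
accept/reject) can be sketched for a one-dimensional target density. This is
a toy illustration with an unnormalized standard normal target and a
symmetric Gaussian proposal; the function names and step size are my own:

```python
import numpy as np

def target(x):
    # Unnormalized N(0, 1) density; normalization cancels in the ratio.
    return np.exp(-0.5 * x * x)

def metropolis(n_steps, step_size=1.0, seed=0):
    rng = np.random.default_rng(seed)
    chain = np.empty(n_steps)
    x = 0.0
    for i in range(n_steps):
        # 1. Propose x' from a symmetric proposal q(x'|x).
        x_new = x + rng.normal(0.0, step_size)
        # 2. Accept with probability min(1, p(x')/p(x)); for a symmetric
        #    proposal the q-ratio in the Hastings factor drops out.
        if rng.uniform() < min(1.0, target(x_new) / target(x)):
            x = x_new
        # 3. Otherwise keep the current point (the state is always recorded).
        chain[i] = x
    return chain

chain = metropolis(100_000)
print(np.mean(chain), np.std(chain))  # roughly 0 and 1
```

Because the acceptance ratio uses only density ratios, detailed balance holds
and the chain is reversible with the target as its stationary distribution.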
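
Monte Carlo integration and the 1/sqrt(n) behaviour of error bars can be seen
together in one example (a sketch under the assumption that numpy is
available; the integrand x^2 on [0, 1], with exact value 1/3, is chosen only
for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_estimate(n):
    # Monte Carlo integration: E[f(X)] for X ~ U(0,1) estimates
    # the integral of f over [0, 1].
    x = rng.uniform(0.0, 1.0, size=n)
    return np.mean(x ** 2)

for n in (100, 10_000, 1_000_000):
    est = mc_estimate(n)
    print(n, est, abs(est - 1.0 / 3.0))
# Increasing n by a factor 100 shrinks the typical error by about
# a factor 10 -- the 1/sqrt(n) scaling of statistical uncertainties.
```

The same scaling underlies why "statistics limited" measurements improve with
more data, while systematics-limited ones do not.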