Study-unit INFORMATION AND ESTIMATION THEORY

Course name Computer engineering and robotics
Study-unit Code A003165
Curriculum Common to all curricula
Lecturer Giuseppe Baruffa
Lecturers
  • Giuseppe Baruffa
  • Luca Rugini (co-lecturer)
Hours
  • 48 hours - Giuseppe Baruffa
  • 24 hours (co-teaching) - Luca Rugini
CFU 9
Course Regulation Cohort 2022
Supplied 2022/23
Supplied other course regulation
Learning activities Related/supplementary
Area Related or supplementary learning activities
Sector ING-INF/03
Type of study-unit Obbligatorio (Required)
Type of learning activities Monodisciplinary learning activity
Language of instruction The course is held in Italian.
Contents Elements of information theory, elements of source coding, elements of estimation theory, elements of detection theory.
Reference texts Thomas M. Cover and Joy A. Thomas, “Elements of information theory”, 2nd ed., Wiley-Interscience, 2006.
Steven M. Kay, “Fundamentals of statistical signal processing, vols. I and II: estimation theory and detection theory”, Prentice-Hall, 1993.
Educational objectives Understanding the fundamental concepts of information theory.
Designing source coding schemes.
Understanding the fundamental concepts of estimation and detection theory.
Designing optimal estimators and detectors for data and information processing.
Prerequisites Signals and systems, telecommunications and internet basics, probability and measurement theory
Teaching methods Face-to-face lectures on theoretical topics are given using a PC with a digital projector; supplementary material is developed on the (multimedia) blackboard.
For information on support services for students with disabilities and/or SLD, visit the page https://www.unipg.it/en/international-students/general-information/facilities-for-special-needs-students.
Other information Further information will be available on the UniStudium page dedicated to this course, which is accessible to all students enrolled in the course.
Learning verification modality The examination is a 45-minute oral discussion of the topics introduced during the lessons, with open-ended questions and answers.
Extended program Elements of information theory. Measure of information. Entropy, relative entropy and mutual information. Relationship between entropy and mutual information. Chain rule for entropy, relative entropy and mutual information. Jensen's, data-processing and log-sum inequalities. Asymptotic equipartition property (AEP). Entropy rate of discrete stochastic processes.
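For quick reference (this summary is not part of the official program), the central quantities of this part are defined as in the Cover and Thomas textbook; for discrete random variables they read

\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad
D(p\,\|\,q) = \sum_{x} p(x)\log\frac{p(x)}{q(x)},
\]
\[
I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X\mid Y).
\]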
Source coding. Classes of source codes. Kraft inequality. Optimal codes: Huffman coding. Universal codes: arithmetic coding, Lempel-Ziv coding. Introduction to rate-distortion (RD) theory: definitions, RD function.
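As an illustrative sketch only (a minimal Python example, not part of the official course material; the source alphabet and probabilities are arbitrary assumptions), the Huffman procedure and the Kraft inequality mentioned above can be tried out as follows:

import heapq

def huffman_code(probs):
    """Build a binary Huffman code for a dict {symbol: probability}."""
    # Each heap entry is (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable nodes
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}    # arbitrary dyadic source
code = huffman_code(probs)
avg_len = sum(probs[s] * len(w) for s, w in code.items())
kraft = sum(2.0 ** -len(w) for w in code.values())       # Kraft sum: <= 1 for any prefix code
print(code, avg_len, kraft)

For this dyadic source the average codeword length equals the source entropy (1.75 bits/symbol) and the Kraft sum equals 1.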
Elements of estimation theory. Classical approach: minimum variance unbiased (MVU) estimator, Cramér-Rao lower bound (CRLB), best linear unbiased estimator (BLUE), maximum likelihood (ML) estimator, least squares (LS) estimator. Bayesian approach: minimum mean square error (MMSE) estimator, linear MMSE estimator.
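As an illustrative numerical sketch only (a minimal Python/NumPy example with arbitrary parameters, not part of the official course material), the classical problem of estimating a constant level A observed in white Gaussian noise, x[n] = A + w[n], shows the ML estimator (the sample mean) attaining the Cramér-Rao lower bound sigma^2/N:

import numpy as np

rng = np.random.default_rng(0)
A, sigma, N, trials = 1.0, 2.0, 100, 10000        # arbitrary illustrative parameters

x = A + sigma * rng.standard_normal((trials, N))  # independent data records
A_hat = x.mean(axis=1)                            # ML (and MVU) estimate for each record

print("empirical variance of the estimator:", A_hat.var())
print("Cramér-Rao lower bound sigma^2/N   :", sigma**2 / N)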
Elements of detection theory. Binary hypothesis testing. Neyman-Pearson theorem. ROC curve. Probability of error. Bayes risk. Multiple hypothesis testing. Generalized likelihood ratio test (GLRT).
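As an illustrative numerical sketch only (a minimal Python example with arbitrary parameters, assuming NumPy and SciPy are available; not part of the official course material), a Neyman-Pearson test for a known constant level A in white Gaussian noise can be simulated as follows; sweeping the target false-alarm probability traces the ROC curve:

import numpy as np
from scipy.stats import norm

# H0: x[n] = w[n]   vs.   H1: x[n] = A + w[n],  with w[n] ~ N(0, sigma^2).
# The NP test compares the sample mean to a threshold set by the target P_FA;
# theory gives P_D = Q(Q^{-1}(P_FA) - sqrt(N * A^2 / sigma^2)).
rng = np.random.default_rng(1)
A, sigma, N, PFA, trials = 0.5, 1.0, 20, 1e-2, 100000    # arbitrary illustrative parameters

gamma = sigma / np.sqrt(N) * norm.isf(PFA)               # threshold on the sample mean
x0 = sigma * rng.standard_normal((trials, N))            # data under H0
x1 = A + sigma * rng.standard_normal((trials, N))        # data under H1

pfa_emp = (x0.mean(axis=1) > gamma).mean()               # empirical false-alarm rate
pd_emp = (x1.mean(axis=1) > gamma).mean()                # empirical detection rate
pd_theory = norm.sf(norm.isf(PFA) - np.sqrt(N * A**2 / sigma**2))
print(pfa_emp, pd_emp, pd_theory)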