Seminar: The Jacob Ziv Communication and Information Theory Seminar

Recent information-theoretic contributions to statistical inference

Date: December 11, 2025  Time: 14:30-15:30
Location: Room 1061, Meyer Building
Lecturer: Dr. Sergio Verdú
Information measures, such as relative entropy (Kullback-Leibler divergence), Chernoff information, and f-divergences, have a long track record of contributing to the analysis of the fundamental limits of hypothesis testing and to necessary and sufficient conditions for the sufficiency of statistics of the data.
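
For reference, these are the standard definitions (not specific to the talk): for probability measures P and Q with P absolutely continuous with respect to Q,

\[
D(P\|Q) = \mathbb{E}_{P}\!\left[\log \frac{dP}{dQ}(X)\right],
\qquad
D_f(P\|Q) = \mathbb{E}_{Q}\!\left[f\!\left(\frac{dP}{dQ}(X)\right)\right],
\]

where f is convex with f(1) = 0; the choice f(t) = t log t recovers the relative entropy.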

This talk gives an account of several recent information-theoretic contributions to hypothesis testing and to the theory of sufficient statistics, obtained through analysis of the relative information spectrum, i.e., the distribution of the log-likelihood ratio.
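
In standard information-spectrum notation (following Han and Verdú; the talk's precise conventions may differ), the relative information spectrum is the cumulative distribution function of the relative information evaluated at X distributed according to P:

\[
\mathbb{F}_{P\|Q}(t) = \mathbb{P}\!\left[\imath_{P\|Q}(X) \le t\right],
\qquad
\imath_{P\|Q}(x) = \log \frac{dP}{dQ}(x),
\qquad X \sim P.
\]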

We introduce a new measure of discrepancy between probability measures, the NP-divergence, and show that it characterizes a nonasymptotic fundamental limit: the area of the region of conditional error-probability pairs achieved by optimal Neyman-Pearson tests.
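
To fix ideas, here is the standard setup (the NP-divergence itself is defined in the talk): for testing H_0: X ~ P against H_1: X ~ Q, a randomized test \varphi \colon \mathcal{X} \to [0,1] (the probability of deciding H_1) incurs the conditional error probabilities

\[
\epsilon_0(\varphi) = \mathbb{E}_{P}[\varphi(X)],
\qquad
\epsilon_1(\varphi) = \mathbb{E}_{Q}[1 - \varphi(X)].
\]

By the Neyman-Pearson lemma, the lower boundary of the achievable region of pairs (\epsilon_0(\varphi), \epsilon_1(\varphi)) is traced by likelihood-ratio threshold tests, so the area of that region is a functional of the distribution of the log-likelihood ratio.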

We also introduce a new, easy-to-check criterion for sufficient statistics and explore its relationships with the criteria introduced by Fisher, Blackwell, and Kolmogorov.
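
For background, the classical criteria referenced here include the Fisher-Neyman factorization theorem: a statistic T is sufficient for a family of densities \{p_\theta\} if and only if

\[
p_\theta(x) = g_\theta\big(T(x)\big)\, h(x)
\]

for some nonnegative functions g_\theta and h. Kolmogorov's formulation asks instead that the posterior distribution of \theta given X depend on X only through T(X), while Blackwell's compares statistical experiments via stochastic transformations.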

 
