Statistical Inference for Entropy, Divergences and Rényi Information
Estimation of entropy and divergence (Shannon and Kullback-Leibler) is a central problem in image processing, with applications in image compression, segmentation, calibration, registration, and related tasks. Mutual information, which is closely related to Shannon entropy and Kullback-Leibler divergence, is a widely used measure of similarity between images.
In a seminal paper, Kozachenko and Leonenko (1987) proposed an approach to entropy estimation based on the expected distance between a point and its nearest neighbour in the sample. In a series of papers by Prof. Leonenko and his collaborators, analogous nearest neighbour estimators of Rényi entropy were constructed and studied.
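To illustrate the idea, the following is a minimal sketch of the Kozachenko-Leonenko nearest-neighbour entropy estimator in Python. It is an illustrative implementation, not the specific construction studied in the project; the function name `kl_entropy` and the brute-force distance computation are choices made here for self-containment.

```python
import math
import numpy as np

def kl_entropy(x):
    """Kozachenko-Leonenko nearest-neighbour estimate of Shannon entropy.

    x : (n, d) array of i.i.d. samples from an unknown density.
    Returns the estimate in nats.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Pairwise Euclidean distances; brute force keeps the sketch
    # dependency-free (a k-d tree would be used in practice).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)   # exclude each point itself
    rho = dist.min(axis=1)           # distance to nearest neighbour
    # Volume of the unit ball in R^d.
    v_d = math.pi ** (d / 2) / math.gamma(d / 2 + 1)
    euler_gamma = 0.5772156649015329
    return (d * np.mean(np.log(rho))
            + math.log(v_d) + math.log(n - 1) + euler_gamma)
```

For example, on a large sample from the standard normal distribution the estimate approaches the true entropy (1/2) log(2πe) ≈ 1.419 nats; the questions of its bias and asymptotic normality are exactly the kind of issue the project addresses.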
One of the main aims of the project is to develop an asymptotic theory for the nearest neighbour estimators of Shannon and Rényi information, in particular to investigate their bias and to prove asymptotic normality. The project will also consider statistical methods for ε-entropy and quadratic Rényi entropy in the case of dependent data.
We are interested in pursuing this project and welcome applications if you are self-funded or have funding from other sources, including government sponsorships or your employer.
Please contact the supervisor if you wish to pursue this project, citing the project title in your email, or find out more about our PhD programme in Mathematics.