Statistical Inference for Entropy, Divergences and Rényi Information

Estimation of entropy and divergence (Shannon entropy and the Kullback-Leibler divergence) is a central problem in image processing, with applications in image compression, segmentation, calibration and registration. Mutual information, which is closely related to Shannon entropy and the Kullback-Leibler divergence, is a widely used measure of similarity between images.
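As a small illustration of this connection (not part of the project description), the identity I(X; Y) = H(X) + H(Y) - H(X, Y) yields a simple histogram plug-in estimate of the mutual information between two images. The sketch below, in Python, assumes two equally sized greyscale images stored as NumPy arrays; the function names and the number of bins are illustrative choices.

import numpy as np

def shannon_entropy(p):
    # Shannon entropy (in nats) of a discrete probability vector p
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def mutual_information(img_a, img_b, bins=64):
    # Plug-in estimate I(A; B) = H(A) + H(B) - H(A, B) from the joint histogram
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_joint = joint / joint.sum()
    p_a = p_joint.sum(axis=1)   # marginal distribution of the first image
    p_b = p_joint.sum(axis=0)   # marginal distribution of the second image
    return (shannon_entropy(p_a) + shannon_entropy(p_b)
            - shannon_entropy(p_joint.ravel()))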

In a seminal paper, Kozachenko and Leonenko (1987) proposed an approach to entropy estimation based on the distance between each sample point and its nearest neighbour. In a series of papers by Prof. Leonenko and his collaborators, analogous nearest neighbour estimators of Rényi entropy were constructed and studied.
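A minimal sketch of a nearest neighbour entropy estimator of this type is given below, assuming an (N, d) NumPy array of i.i.d. samples; the constants follow the standard form of the Kozachenko-Leonenko estimator (log-volume of the unit ball, log(N - 1) and the Euler-Mascheroni constant), and the function name is illustrative.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def kl_entropy(samples):
    # Nearest neighbour (Kozachenko-Leonenko) estimate of Shannon entropy, in nats
    n, d = samples.shape
    tree = cKDTree(samples)
    # query k=2 because each point's closest neighbour in the tree is itself (distance 0)
    dists, _ = tree.query(samples, k=2)
    rho = dists[:, 1]   # distance to the true nearest neighbour
    # log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return (d * np.mean(np.log(rho)) + log_c_d
            + np.log(n - 1) + np.euler_gamma)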

One of the main aims of the project is to develop an asymptotic theory for the nearest neighbour estimators of Shannon and Rényi information, in particular to investigate their bias and to prove asymptotic normality. The project will also consider statistical methods for ε-entropy and quadratic Rényi entropy in the case of dependent data.
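For the quadratic Rényi entropy H_2 = -log ∫ f(x)^2 dx, a k-nearest-neighbour estimator in the spirit of the estimators studied by Leonenko and his collaborators can be sketched as follows; the term (k - 1) / ((N - 1) V_d ρ_k^d) is the usual order-q formula specialised to q = 2, the method requires k ≥ 2, and the function name and default k are illustrative.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def renyi2_entropy(samples, k=3):
    # k-NN estimate of the quadratic (order-2) Renyi entropy, in nats; requires k >= 2
    n, d = samples.shape
    tree = cKDTree(samples)
    dists, _ = tree.query(samples, k=k + 1)   # column 0 is the point itself
    rho_k = dists[:, k]                        # distance to the k-th nearest neighbour
    # log-volume of the d-dimensional unit ball
    log_v_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # plug-in estimate of the integral of f^2: mean of (k-1) / ((N-1) V_d rho_k^d)
    log_terms = np.log(k - 1) - np.log(n - 1) - log_v_d - d * np.log(rho_k)
    i2_hat = np.mean(np.exp(log_terms))
    return -np.log(i2_hat)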

We are interested in pursuing this project and welcome applications if you are self-funded or have funding from other sources, including government sponsorships or your employer.

Please contact the supervisor if you wish to pursue this project, citing the project title in your email, or find out more about our PhD programme in Mathematics.

Supervisors


Professor Nikolai Leonenko

Professor

Email:
leonenkon@cardiff.ac.uk
Telephone:
+44 (0)29 2087 5521

Programme information

For programme structure, entry requirements and how to apply, visit the Mathematics programme.

