Shannon's entropy plays a central role in many fields of mathematics. In the first chapter, we present a sufficient condition for the asymptotic normality of the plug-in estimator of Shannon's entropy defined on a countable alphabet. The sufficien...
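The plug-in estimator studied in the first chapter replaces the unknown cell probabilities with their sample relative frequencies. A minimal sketch of that estimator (not the chapter's asymptotic-normality argument itself, and with the alphabet restricted to the letters actually observed) might look like:

```python
from collections import Counter
import math

def plugin_entropy(sample):
    """Plug-in (maximum-likelihood) estimator of Shannon's entropy, in nats:
    -sum of (n_k / n) * log(n_k / n) over the observed letters."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: a sample of size 8 over a three-letter alphabet with
# relative frequencies 1/4, 1/2, 1/4.
sample = list("aabbbbcc")
print(round(plugin_entropy(sample), 4))  # 1.5 * ln(2) ≈ 1.0397
```

On a countable alphabet the sum runs over all letters with positive probability; the estimator above automatically ignores unobserved letters, which is the source of its finite-sample bias.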
Jensen-Shannon divergence is one reasonable solution to the problem of measuring the level of difference, or "distance", between two probability distributions on a multinomial population. If one of the distributions is assumed to be known a priori,...
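For concreteness, the standard form of the Jensen-Shannon divergence is the average Kullback-Leibler divergence of each distribution from their midpoint. A small sketch on finite multinomial distributions (the symbols p, q, m below are illustrative, not the dissertation's notation):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in nats; terms with p_i = 0
    contribute 0 by the usual convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jsd(p, q):
    """Jensen-Shannon divergence: the mean of KL(p || m) and KL(q || m),
    where m is the equal-weight mixture of p and q."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

p = [0.5, 0.5, 0.0]
q = [0.25, 0.25, 0.5]
print(round(jsd(p, q), 4))  # ≈ 0.2158
```

Unlike the Kullback-Leibler divergence, this quantity is symmetric, finite even when the supports differ, and bounded above by ln 2, which is what makes it a reasonable "distance" between multinomial distributions.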
Introduced by Shannon, mutual information is a fundamental building block of information theory, owing to its essential role in measuring association on non-ordinal alphabets. Mutual information being zero is a golden property, as it indicates a probabilistic ind...
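The plug-in version of mutual information can be computed directly from a sample of paired observations on two non-ordinal alphabets; it is zero exactly when the empirical joint distribution factors into its marginals. A minimal sketch (illustrative only, not the estimators developed in the chapter):

```python
from collections import Counter
import math

def mutual_information(pairs):
    """Plug-in mutual information (nats) from a sample of (x, y) pairs:
    sum of p(x, y) * log( p(x, y) / (p(x) * p(y)) ) over observed cells."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum(
        (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Perfectly dependent pairs give MI = ln(2); a balanced product sample gives 0.
dependent = [(0, 0), (1, 1)] * 50
print(round(mutual_information(dependent), 4))  # ≈ 0.6931
```

Note that the alphabets here carry no order: permuting the labels of either coordinate leaves the value unchanged, which is why mutual information is suited to measuring association on non-ordinal alphabets.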
This dissertation discusses several statistical results under a multinomial distribution with infinitely many categories. Firstly, the discussion focuses on Simpson's diversity index and Turing's formula. We establish an unbiased estimator for the newly p...
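To fix notation, Simpson's index is the sum of squared cell probabilities, and Turing's formula estimates the total probability mass of unseen categories by the proportion of singletons in the sample. A minimal sketch of the plain plug-in versions (not the unbiased estimator constructed in the dissertation):

```python
from collections import Counter

def simpson_index(sample):
    """Plug-in estimate of Simpson's index: sum over observed categories
    of the squared sample relative frequency (n_k / n)^2."""
    n = len(sample)
    return sum((c / n) ** 2 for c in Counter(sample).values())

def turing_formula(sample):
    """Turing's formula: estimate the probability of unseen categories
    by N1 / n, where N1 is the number of categories seen exactly once."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

sample = list("aaabbcdd")  # 'c' is the only singleton
print(simpson_index(sample), turing_formula(sample))  # 0.28125 0.125
```

With infinitely many categories the plug-in Simpson estimator is biased in finite samples, which is part of what motivates constructing an unbiased alternative.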