MUTUAL INFORMATION FOR THE MULTINOMIAL DISTRIBUTION
The expression for the mutual information measure for the multinomial distribution is derived. The resulting information measure is the difference of two terms: a constant term that depends only on N and q, where N is the trial size and q is the dimension of the random vector, and a second term equal to the square of the magnitude of the probability vector of the multinomial distribution. An unbiased estimator of the information measure is also presented, together with the derivation of its asymptotic distribution.
Keywords: Kullback-Leibler measure of association, multinomial distribution, asymptotic sampling distribution, distribution of quadratic forms derived from a singular multivariate normal distribution, non-central chi-square distribution.