Abstract

Introduced by Shannon, mutual information is a fundamental building block of information theory, owing to its essential role in measuring association on non-ordinal alphabets. Mutual information being zero is a golden property, as it indicates probabilistic independence between the distributions. This article offers asymptotic chi-square distributions for the plug-in estimator and a non-parametric estimator of mutual information. The established distributions allow new tests of independence from an entropic perspective.
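The plug-in estimator mentioned in the abstract is the classical maximum-likelihood estimate: empirical joint and marginal frequencies are substituted directly into Shannon's mutual information formula. A minimal sketch (the function name `plugin_mutual_information` is our own; the paper's specific estimators and limiting distributions are not reproduced here):

```python
import math
from collections import Counter

def plugin_mutual_information(pairs):
    """Plug-in (empirical) estimator of mutual information, in nats.

    pairs: iterable of (x, y) observations on finite alphabets.
    Substitutes empirical joint and marginal frequencies into
    I(X;Y) = sum_{x,y} p(x,y) * log( p(x,y) / (p(x) p(y)) ).
    """
    pairs = list(pairs)
    n = len(pairs)
    joint = Counter(pairs)                  # empirical joint counts
    px = Counter(x for x, _ in pairs)       # marginal counts for X
    py = Counter(y for _, y in pairs)       # marginal counts for Y
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p_xy / (p_x * p_y) rewritten with counts to avoid extra division
        mi += p_xy * math.log(p_xy * n * n / (px[x] * py[y]))
    return mi

# Perfectly dependent sample: estimate equals log(2) nats.
dependent = [(0, 0), (1, 1)] * 5
print(plugin_mutual_information(dependent))   # ~0.6931

# Exactly balanced (independent-looking) sample: estimate is 0.
independent = [(0, 0), (0, 1), (1, 0), (1, 1)]
print(plugin_mutual_information(independent)) # 0.0
```

As a point of connection to the chi-square results the abstract refers to: for contingency tables, twice the sample size times this plug-in estimate equals the classical likelihood-ratio (G) statistic, which is asymptotically chi-square under independence.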
