Mutual information between discrete variables with many categories using recursive adaptive partitioning

Junhee Seok, Yeong Seon Kang

Research output: Contribution to journal › Article › peer-review


Abstract

Mutual information, a general measure of the relatedness between two random variables, has been actively used in the analysis of biomedical data. The mutual information between two discrete variables is conventionally calculated from their joint probabilities, estimated from the frequency of observed samples in each combination of variable categories. However, this conventional approach is no longer efficient for discrete variables with many categories, which are easily found in large-scale biomedical data such as diagnosis codes, drug compounds, and genotypes. Here, we propose a method that provides stable estimates of the mutual information between discrete variables with many categories. Simulation studies showed that the proposed method reduced the estimation errors 45-fold and improved the correlation coefficients with true values 99-fold, compared with the conventional calculation of mutual information. The proposed method was also demonstrated through a case study of diagnostic data in electronic health records. This method is expected to be useful in the analysis of various biomedical data with discrete variables.
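For reference, the conventional calculation criticized in the abstract can be sketched as follows. This is a minimal illustration of the baseline plug-in estimator (joint probabilities taken from observed cell frequencies), not the recursive adaptive partitioning method proposed in the paper; the function and variable names are illustrative only.

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Plug-in estimate of mutual information (in nats) between two
    discrete variables, using joint probabilities estimated from the
    observed frequency of each category combination."""
    x = np.asarray(x)
    y = np.asarray(y)
    # Build the contingency table of observed category combinations.
    x_cats, x_idx = np.unique(x, return_inverse=True)
    y_cats, y_idx = np.unique(y, return_inverse=True)
    counts = np.zeros((len(x_cats), len(y_cats)))
    np.add.at(counts, (x_idx, y_idx), 1)
    p_xy = counts / counts.sum()           # joint probabilities
    p_x = p_xy.sum(axis=1, keepdims=True)  # marginal of x
    p_y = p_xy.sum(axis=0, keepdims=True)  # marginal of y
    nz = p_xy > 0                          # skip empty cells (0 * log 0 = 0)
    return float(np.sum(p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.integers(0, 50, size=500)               # many-category variable
    y = (x + rng.integers(0, 5, size=500)) % 50     # noisy copy of x
    print(plugin_mutual_information(x, y))
```

When the number of categories is large relative to the sample size, most cells of this contingency table receive few or no observations, which is what makes the plug-in estimate unstable and motivates the more stable estimator proposed in the paper.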

Original language: English
Article number: 10981
Journal: Scientific Reports
Volume: 5
DOIs
State: Published - 5 Jun 2015

