mutualInformation {CPTtools}    R Documentation

Calculates Mutual Information for a two-way table.

Description

Calculates the mutual information for a two-way table of observed counts or a joint probability distribution. The mutual information is a measure of association between two random variables.

Usage

mutualInformation(table)

Arguments

table

A two-way table of observed counts or a joint probability distribution; possibly the output of the table function.
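
For instance (a minimal sketch, not taken from the package documentation; it assumes only base R and the mutualInformation function documented here), a two-way table built with table can be passed directly:

## Cross-tabulate two factors from the built-in mtcars data set
## and measure the association between them.
mutualInformation(table(mtcars$cyl, mtcars$gear))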

Details

The mutual information is the Kullback-Leibler divergence between the joint probability distribution and the product distribution formed by treating the two margins as independent. It is given by the following formula:

I[X; Y] = sum Pr(X=x, Y=y) log [ Pr(X=x, Y=y) / (Pr(X=x) Pr(Y=y)) ],

where the sum is taken over all possible values x of X and y of Y.
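
As a rough sketch of how this definition can be evaluated directly in R (an illustration of the formula above, not the package's internal implementation; the helper name miByHand is made up for this example):

## Compute I[X;Y] from a two-way table of counts, straight from the formula.
## Cells with zero joint probability contribute nothing to the sum.
miByHand <- function(tab) {
  p <- tab / sum(tab)                    # joint probabilities Pr(X=x, Y=y)
  pind <- outer(rowSums(p), colSums(p))  # product of marginals Pr(X=x) Pr(Y=y)
  sum(ifelse(p > 0, p * log(p / pind), 0))  # natural log; another base only rescales
}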

Author(s)

Russell Almond

References

http://planetmath.org/encyclopedia/MutualInformation.html

Shannon, C. E. (1948). "A Mathematical Theory of Communication." Bell System Technical Journal, 27, 379-423, 623-656.

See Also

table

Examples

## UCBAdmissions is a three-way table (Admit x Gender x Dept),
## so we first collapse it to a two-way table.
mutualInformation(apply(UCBAdmissions, c(1, 2), sum))

## Mutual information within each slice of one margin:
## association between Admit and Gender for each Dept,
apply(UCBAdmissions, 3, mutualInformation)
## between Admit and Dept for each Gender,
apply(UCBAdmissions, 2, mutualInformation)
## and between Gender and Dept for each Admit level.
apply(UCBAdmissions, 1, mutualInformation)
