Nonlinear Measures of Statistical Dependence (Mutual Information Dimension)

Measuring statistical dependence between two random variables


Mahito Sugiyama, Karsten Borgwardt

Measuring Statistical Dependence via the Mutual Information Dimension

Summary

An estimation algorithm for MID (Mutual Information Dimension), which measures statistical dependence between two random variables and produces a real-valued score from 0 (weak dependence) to 1 (strong dependence). The algorithm has the following advantages:

  • Both nonlinear and linear dependencies can be measured,
  • Scalable: the average-case time complexity is O(n log n), where n is the number of data points, and
  • Parameter-free.

Code

The C implementation can be downloaded here: code.zip (ZIP, 421 KB)

The code is also available on GitHub.

Further information and publication

Please see the following article for detailed information, and cite it in your published research.

Measuring Statistical Dependence via the Mutual Information Dimension

Mahito Sugiyama and Karsten Borgwardt
Proceedings of the 23rd International Joint Conference on Artificial Intelligence (IJCAI 2013), 1692-1698
Online  |  ETH Research Collection  |  Project page  |  GitHub
