Nonlinear Measures of Statistical Dependence (Mutual Information Dimension)


Measuring statistical dependence between two random variables

Mahito Sugiyama, Karsten Borgwardt

Measuring Statistical Dependence via the Mutual Information Dimension

Summary

An estimation algorithm for MID (Mutual Information Dimension), which measures the statistical dependence between two random variables and produces a real-valued score from 0 (weak dependence) to 1 (strong dependence). The algorithm has the following advantages (a minimal illustrative sketch follows the list):

  • Nonlinear dependencies (as well as linear dependencies) can be measured,
  • Scalable; the average-case time complexity is O(n log n), where n is the number of data points, and
  • Parameter-free.
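
To illustrate the idea behind MID (this is a minimal sketch, not the estimator implemented in the code below): the information dimension of a variable rescaled to [0,1] can be approximated by H_k / k, where H_k is the Shannon entropy of the data discretized into 2^k bins, and MID combines marginal and joint dimensions as MID(X;Y) = dim X + dim Y - dim(X,Y). The single fixed resolution k and the direct ratio estimate used here are simplifying assumptions made for brevity; the released implementation is parameter-free and does not require fixing k by hand.

/* Minimal sketch of the MID idea (illustrative only; not the authors' estimator). */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

/* Shannon entropy (in bits) of integer codes in [0, m). */
static double entropy(const int *code, int n, int m) {
    int *count = calloc(m, sizeof(int));
    double h = 0.0;
    for (int i = 0; i < n; i++) count[code[i]]++;
    for (int b = 0; b < m; b++) {
        if (count[b] > 0) {
            double p = (double)count[b] / n;
            h -= p * log2(p);
        }
    }
    free(count);
    return h;
}

/* Discretize values in [0,1] into 2^k bins. */
static void discretize(const double *x, int n, int k, int *code) {
    int m = 1 << k;
    for (int i = 0; i < n; i++) {
        int c = (int)(x[i] * m);
        code[i] = (c >= m) ? m - 1 : c;
    }
}

/* MID sketch: dim X + dim Y - dim(X,Y), each information dimension
 * approximated by H_k / k at a single resolution k (an assumption
 * made here for brevity). */
static double mid_sketch(const double *x, const double *y, int n, int k) {
    int *cx = malloc(n * sizeof(int));
    int *cy = malloc(n * sizeof(int));
    int *cxy = malloc(n * sizeof(int));
    discretize(x, n, k, cx);
    discretize(y, n, k, cy);
    for (int i = 0; i < n; i++) cxy[i] = cx[i] * (1 << k) + cy[i];
    double dx  = entropy(cx,  n, 1 << k)       / k;  /* dim X    in [0,1] */
    double dy  = entropy(cy,  n, 1 << k)       / k;  /* dim Y    in [0,1] */
    double dxy = entropy(cxy, n, 1 << (2 * k)) / k;  /* dim(X,Y) in [0,2] */
    free(cx); free(cy); free(cxy);
    return dx + dy - dxy;
}

int main(void) {
    /* Toy data: y = x^2 is a nonlinear but deterministic dependence,
     * so the sketch should return a value close to 1. */
    int n = 10000, k = 6;
    double *x = malloc(n * sizeof(double));
    double *y = malloc(n * sizeof(double));
    for (int i = 0; i < n; i++) {
        x[i] = (double)rand() / RAND_MAX;
        y[i] = x[i] * x[i];
    }
    printf("MID sketch: %f\n", mid_sketch(x, y, n, k));
    free(x);
    free(y);
    return 0;
}

Compile with, e.g., cc mid_sketch.c -lm. For real analyses, please use the released C implementation below.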

Code

The C implementation can be downloaded here: code.zip (ZIP, 421 KB)

The code is also available on GitHub.

Further information and reference

Please see the following article for detailed information and cite it in your published research.


Mahito Sugiyama and Karsten Borgwardt
Measuring Statistical Dependence via the Mutual Information Dimension,
Proceedings of the 23rd International Joint Conference on Artificial Intelligence (IJCAI 2013), 1692-1698. (Online)


@inproceedings{Sugiyama-2013-IJCAI,
title = {Measuring {S}tatistical {D}ependence via the {M}utual {I}nformation {D}imension},
author = {Sugiyama, Mahito and Borgwardt, Karsten M.},
booktitle = {Proceedings of the 23rd International Joint Conference on Artificial Intelligence (IJCAI 2013)},
pages = {1692--1698},
editor = {Rossi, Francesca},
publisher = {AAAI Press},
address = {Menlo Park, California},
year = {2013}
}

Contact: Mahito Sugiyama

 
 