Non-Parametric Entropy Estimation Toolbox (NPEET)
  Updated: July 7, 2013, Version 1.1

Greg Ver Steeg
Information Sciences Institute
University of Southern California



Please see updated documentation and code on github.

This package contains Python code implementing several entropy estimation functions for both discrete and continuous variables. Information theory provides a model-free way to find structure in complex systems, but the difficulty of estimating these quantities has traditionally made such techniques infeasible. This package attempts to allay these difficulties by making modern, state-of-the-art entropy estimation methods accessible in a single easy-to-use Python library.

The implementation is very simple: it requires only that numpy/scipy be installed. It includes estimators for entropy, mutual information, and conditional mutual information for both continuous and discrete variables. Additionally, it includes a KL divergence estimator for continuous distributions and a mutual information estimator between continuous and discrete variables, along with some non-parametric tests for evaluating estimator performance.
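The continuous estimators build on k-nearest-neighbor statistics (Kraskov et al., reference 1 below). As a rough illustration of the underlying idea, here is a minimal sketch, not NPEET's actual code, of the Kozachenko-Leonenko kNN entropy estimator that this family of methods is based on:

```python
# Sketch of a kNN differential entropy estimator (Kozachenko-Leonenko).
# This is an illustration of the general technique, not NPEET's implementation.
import math
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma

def knn_entropy(x, k=3):
    """Estimate differential entropy (in nats) of samples x with shape (n, d)."""
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    tree = cKDTree(x)
    # query returns each point itself as its own nearest neighbor,
    # so ask for k+1 neighbors and take the last column
    dists, _ = tree.query(x, k=k + 1)
    r = dists[:, -1]  # distance to the k-th neighbor
    # log volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_vol = (d / 2.0) * math.log(math.pi) - math.lgamma(d / 2.0 + 1)
    return digamma(n) - digamma(k) + log_vol + d * np.mean(np.log(r))
```

For a one-dimensional uniform distribution on [0, 1] (true entropy 0 nats), the estimate converges toward 0 as the sample size grows.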

This package is mainly geared to estimating information-theoretic quantities for continuous variables in a non-parametric way. If your primary interest is in discrete entropy estimation, particularly with undersampled data, please consider this package.

Example installation and usage:
$curl -O
$tar -xvf npeet.tgz
$cd npeet
>>> import entropy_estimators as ee
>>> x = [[1.3],[3.7],[5.1],[2.4],[3.4]]
>>> y = [[1.5],[3.32],[5.3],[2.3],[3.3]]
>>> ee.mi(x,y)
Out: 0.168
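The non-parametric tests mentioned above are shuffle-based: shuffling one variable destroys any dependence, so an observed mutual information well above the shuffled values indicates real structure. A minimal sketch of the idea, not NPEET's implementation, using a simple plug-in estimator for discrete MI:

```python
# Shuffle-based significance test for mutual information (sketch only;
# function names here are illustrative, not NPEET's API).
import numpy as np
from collections import Counter

def discrete_mi(xs, ys):
    """Plug-in mutual information (in nats) for discrete sequences."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * np.log((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())

def shuffle_test(xs, ys, trials=200, seed=0):
    """Return observed MI and the fraction of shuffled MIs that reach it."""
    rng = np.random.default_rng(seed)
    observed = discrete_mi(xs, ys)
    ys = np.asarray(ys)
    exceed = sum(discrete_mi(xs, rng.permutation(ys)) >= observed
                 for _ in range(trials))
    return observed, exceed / trials
```

For perfectly correlated binary data the observed MI is log 2 nats, and essentially no shuffled trial reaches it, giving an empirical p-value near zero.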


See documentation for references on all implemented estimators.

  1. A. Kraskov, H. Stögbauer, and P. Grassberger. Estimating Mutual Information. PRE, 2004.
  2. Greg Ver Steeg and Aram Galstyan. Information-Theoretic Measures of Influence Based on Content Dynamics. WSDM, 2013.
  3. Greg Ver Steeg and Aram Galstyan. Information Transfer in Social Media. WWW, 2012.