Description

Please see the updated documentation and code on GitHub. This package contains Python code implementing several entropy estimation functions for both discrete and continuous variables. Information theory provides a model-free way to find structure in complex systems, but difficulties in estimating these quantities have traditionally made such techniques infeasible. This package attempts to allay these difficulties by making modern, state-of-the-art entropy estimation methods accessible in a single easy-to-use Python library.

The implementation is very simple; it requires only that numpy/scipy be installed. It includes estimators for entropy, mutual information, and conditional mutual information for both continuous and discrete variables. Additionally, it includes a KL divergence estimator for continuous distributions and a mutual information estimator between continuous and discrete variables, along with some non-parametric tests for evaluating estimator performance.

This package is mainly geared toward estimating information-theoretic quantities for continuous variables in a non-parametric way. If your primary interest is discrete entropy estimation, particularly with undersampled data, please consider this package.

Example installation and usage:

    $ curl -O http://www.isi.edu/~gregv/npeet.tgz
    $ tar -xvf npeet.tgz
    $ cd npeet
    $ python
    >>> import entropy_estimators as ee
    >>> x = [[1.3], [3.7], [5.1], [2.4], [3.4]]
    >>> y = [[1.5], [3.32], [5.3], [2.3], [3.3]]
    >>> ee.mi(x, y)
    Out: 0.168

References

See documentation for references on all implemented estimators.
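For illustration, the other estimators described above might be invoked along the same lines as ee.mi in the example. The sketch below is not from the package documentation: the function names entropy, cmi, and kldiv are assumptions based on the description (only ee.mi appears in the example above), so check the documentation for the exact names and signatures.

    # Hedged sketch: function names other than ee.mi are assumptions
    # based on the package description; consult the docs for exact APIs.
    import random
    import entropy_estimators as ee

    random.seed(0)
    x = [[random.gauss(0, 1)] for _ in range(1000)]      # continuous samples
    y = [[xi[0] + random.gauss(0, 0.5)] for xi in x]     # noisy copy of x
    z = [[random.gauss(0, 1)] for _ in range(1000)]      # independent of x

    print(ee.entropy(x))     # differential entropy estimate for x
    print(ee.mi(x, y))       # mutual information I(X; Y), as in the example above
    print(ee.cmi(x, y, z))   # conditional mutual information I(X; Y | Z)
    print(ee.kldiv(x, z))    # KL divergence between the x and z samples

Because x and y are strongly dependent while z is independent of both, ee.mi(x, y) should come out clearly positive and ee.cmi(x, y, z) should be close to ee.mi(x, y) in this sketch.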