informationtheory

Functions and routines for basic Information Theory definitions, and source coding.

Information Theory

binaryn
Not implemented.
bscchannel
Returns the transition matrix of a Binary Symmetric Channel with error probability P.
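
For instance, with a crossover probability of 0.1 (the single-argument form is an assumption):

    bscchannel (0.1)
    % expected transition matrix:
    %   0.9000   0.1000
    %   0.1000   0.9000
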
condentr_seq
Calculates the information entropy of the sequence x conditional on the sequence y: H(X|Y) = H(X,Y) - H(Y); see the example below.
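
Reusing the sequences from the original example (the two-argument call form is an assumption):

    X = [1, 1, 2, 1, 1];
    Y = [2, 2, 1, 1, 2];
    condentr_seq (X, Y)    % H(X|Y) = H(X,Y) - H(Y)
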
conditionalentropy_XY
Computes H(X|Y) = SUM_i( P(Yi) * H(X|Yi) ), where H(X|Yi) = SUM_k( -P(Xk|Yi) * log(P(Xk|Yi)) ) and P(Xk|Yi) = P(Xk,Yi) / P(Yi).
conditionalentropy_YX
Computes H(Y|X) = SUM_i( P(Xi) * H(Y|Xi) ), where H(Y|Xi) = SUM_k( -P(Yk|Xi) * log(P(Yk|Xi)) ). The matrix XY must have Y along rows and X along columns.
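
A minimal sketch of a call on a joint distribution, assuming each function takes the joint matrix as its single argument:

    % Joint distribution P(X,Y); Y along rows, X along columns.
    XY = [0.125  0.125;
          0.250  0.500];
    conditionalentropy_YX (XY)    % H(Y|X)
    conditionalentropy_XY (XY)    % H(X|Y)
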
entropy
Computes the Shannon entropy of a discrete source whose symbol probabilities are given by SYMBOL_PROBABILITIES; optionally, the logarithm BASE can be specified.
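
For example, for a three-symbol source (the probabilities-vector call form is an assumption):

    entropy ([0.5, 0.25, 0.25])
    % H = 0.5*log2(2) + 0.25*log2(4) + 0.25*log2(4) = 1.5 bits
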
grayenc
Encodes the binary code P into the corresponding Gray code.
graydec
Decodes the Gray code P back to the original binary code.
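
The two functions are inverses, so a round trip should recover the input (the argument type is an assumption; the package may expect integers or bit vectors):

    x = 5;                      % hypothetical input
    g = grayenc (x);            % Gray-coded form of x
    assert (graydec (g) == x);  % decoding recovers the original
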
hartley_entropy
Computes the Hartley entropy, i.e. the Rényi entropy of order 0, for the given probability distribution.
infoentr_seq
Given a single input, calculates the Shannon information entropy of the sequence x: H(X) = \sum_{x \in X} p(x) log2(1/p(x)).
infogain_seq
Gives the information gain ratio (also known as the `uncertainty coefficient') of the sequence x conditional on y: I(X|Y) = I(X;Y) / H(X).
jointentropy
Computes the joint entropy of the given channel transition matrix.
kullback_leibler_distance
Computes the Kullback-Leibler divergence between the probability distributions P and Q: Dkl(P||Q) = \sum_x P(x) log( P(x) / Q(x) ).
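
A small check against the definition (the two-vector signature and the logarithm base are assumptions):

    P = [0.5, 0.5];
    Q = [0.9, 0.1];
    kullback_leibler_distance (P, Q)
    % by the definition, in bits:
    % 0.5*log2(0.5/0.9) + 0.5*log2(0.5/0.1), approximately 0.7370
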
laverage
Computes the average codeword length `L = SUM(i = 1:N) Li * Pi', where CODEBOOK is a struct of strings, each string representing one codeword.
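
The quantity itself is easy to verify by hand; for codewords "0", "10", "11" with probabilities [0.5 0.25 0.25]:

    lens = [1, 2, 2];          % codeword lengths Li
    P    = [0.5, 0.25, 0.25];  % symbol probabilities Pi
    sum (lens .* P)            % average length = 1.5 bits/symbol
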
marginalc
Computes marginal probabilities along columns.
marginalr
Computes marginal probabilities along rows.
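
Both functions take a joint probability matrix (the single-argument call is an assumption):

    XY = [0.125  0.125;
          0.250  0.500];
    marginalr (XY)    % marginal probabilities along rows
    marginalc (XY)    % marginal probabilities along columns
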
mutualinformation
Computes the mutual information of the given channel transition matrix.
mutualinfo_seq
Calculates the mutual information of the sequences x and y: I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = I(Y;X).
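
The identity above can be checked with the other sequence functions (call signatures assumed):

    X = [1, 1, 2, 1, 1];
    Y = [2, 2, 1, 1, 2];
    mutualinfo_seq (X, Y)                    % I(X;Y)
    infoentr_seq (X) - condentr_seq (X, Y)   % H(X) - H(X|Y), should agree
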
narysource
Creates the ORDER-th extension of an N-ary source: given the first-order PROBABILITY_DIST (a column vector), it builds a probability distribution of size length(PROBABILITY_DIST)^ORDER.
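
For instance, the second extension of a fair binary source should produce four equiprobable pairs (the argument order is an assumption):

    narysource ([0.5; 0.5], 2)
    % expected: a distribution of 2^2 = 4 probabilities, each 0.25
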
redundancy
Computes the redundancy of a coding scheme, i.e. the excess bits it spends beyond the source entropy.
relativeentropy
Computes the relative entropy between the two given probability distributions.
renyi_entropy
Computes the Rényi entropy of order ALPHA for the given probability distribution P.
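
As a quick sanity check, order 0 reduces to the Hartley entropy (the argument order is an assumption):

    P = [0.5, 0.25, 0.25];
    renyi_entropy (0, P)    % should equal log2(3), the Hartley entropy
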
shannon_entropy
An alias that redirects to the entropy function.

Source Coding

tunstallcode
Implements an `|A|'-bit Tunstall coder: given the probabilities of the `|A|' source symbols, it builds a code with `2^|A|' codewords.
unarydec
Decodes a unary-encoded value back to its decimal value.
unaryenc
Encodes a decimal value as a unary code.
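
A round trip should be the identity; the package's exact unary convention (zeros-then-one or its complement) is not confirmed:

    c = unaryenc (5);            % unary codeword for 5
    assert (unarydec (c) == 5);  % decoding recovers the value
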
arithmetic_decode
Recovers the message from an arithmetic code, given the symbol probabilities.
arithmetic_encode
Computes the arithmetic code for a message, given the symbol probabilities.
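
A hypothetical round trip; both call signatures are assumptions, and a real decoder may additionally need the message length:

    msg  = [1, 3, 2];          % message as symbol indices
    P    = [0.5, 0.25, 0.25];  % symbol probabilities
    code = arithmetic_encode (msg, P);
    arithmetic_decode (code, P)   % should recover msg
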

Example

info-theory
Not implemented.