informationtheory
Functions and routines for basic information theory definitions and source coding.
binaryn
Not implemented.
Returns the transition matrix for a Binary Symmetric Channel with error probability P.
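As an illustration, the BSC transition matrix can be written down directly; the following is a minimal Octave sketch with a hypothetical crossover probability, not the package routine itself:

    p = 0.1;                                 % crossover (error) probability
    T = [1-p, p; p, 1-p];                    % T(i,j) = Prob(output j | input i)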
Calculates the information entropy of the sequence x conditional on the sequence y: H(X|Y) = H(X,Y) - H(Y). Example: X = [1, 1, 2, 1, 1]; Y = [2, 2, 1, 1, 2]; condentr_seq(X, Y)
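The same quantity can be computed from empirical frequencies; a minimal sketch using the example sequences above (not the package implementation):

    X = [1 1 2 1 1];  Y = [2 2 1 1 2];
    [~, ~, idx] = unique([X(:) Y(:)], 'rows');
    pxy = accumarray(idx, 1) / numel(X);     % empirical joint pmf of (x, y)
    py  = accumarray(Y(:), 1) / numel(Y);
    py  = py(py > 0);                        % drop unused symbol slots
    Hxy = -sum(pxy .* log2(pxy));
    Hy  = -sum(py  .* log2(py));
    HxGivenY = Hxy - Hy                      % H(X|Y) in bits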
Computes H(X|Y) = SUM_i P(Y_i) * H(X|Y_i), where H(X|Y_i) = SUM_k -P(X_k|Y_i) * log(P(X_k|Y_i)) and P(X_k|Y_i) = P(X_k, Y_i) / P(Y_i).
Computes H(Y|X) = SUM_i P(X_i) * H(Y|X_i), where H(Y|X_i) = SUM_k -P(Y_k|X_i) * log(P(Y_k|X_i)). The matrix XY must have Y along rows and X along columns; a sketch of both quantities follows.
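A minimal sketch of both conditional entropies from a joint probability matrix, using a hypothetical matrix with strictly positive entries (zeros would need masking):

    XY = [0.125 0.375; 0.375 0.125];         % P(Y=i, X=j)
    px = sum(XY, 1);                         % P(X): column sums
    py = sum(XY, 2);                         % P(Y): row sums
    HxGivenY = -sum(sum(XY .* log2(XY ./ (py * ones(1, columns(XY))))));
    HyGivenX = -sum(sum(XY .* log2(XY ./ (ones(rows(XY), 1) * px))));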
Computes the Shannon entropy of a discrete source whose probabilities are given by SYMBOL_PROBABILITIES; optionally, the logarithm BASE can be specified.
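A one-line sketch of the definition with hypothetical probabilities; changing the logarithm base only rescales the result:

    p = [0.5 0.25 0.25];                     % symbol probabilities
    Hbits = -sum(p .* log2(p));              % 1.5 bits
    b = 10;
    Hb = -sum(p .* log(p)) / log(b);         % same entropy in base-b units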
Encodes the binary code P into its Gray code equivalent.
Decodes the Gray code P back to the original binary code. A sketch of both conversions on integers follows.
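A minimal integer-level sketch of Gray coding (not the package API): encoding is n XOR (n >> 1), and decoding folds the bits back down with repeated XORs:

    n = 0:7;
    g = bitxor(n, bitshift(n, -1));          % binary -> Gray
    b = g;  s = bitshift(b, -1);
    while any(s > 0)                         % Gray -> binary
      b = bitxor(b, s);
      s = bitshift(s, -1);
    end                                      % now b == n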
Computes the Hartley entropy, i.e. the Rényi entropy of order 0, for the given probability distribution.
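In the order-0 limit the Rényi entropy depends only on the support size, so a sketch reduces to one line (hypothetical distribution):

    p  = [0.7 0.2 0.1 0];
    H0 = log2(nnz(p));                       % log2(3), about 1.585 bits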
Given a single input, calculates the Shannon information entropy of the sequence x: H(X) = SUM_{x in X} p(x) * log2(1/p(x))
Gives the information gain ratio (also known as the `uncertainty coefficient') of the sequence x conditional on y: I(X|Y) = I(X;Y)/H(X)
Computes the joint entropy of the given channel transition matrix.
P and Q are probability distribution functions; computes the Kullback-Leibler divergence Dkl(P,Q) = SUM_x P(x) * log(P(x)/Q(x)).
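A direct sketch of the definition with hypothetical pmfs; it assumes Q(x) > 0 wherever P(x) > 0, and terms with P(x) = 0 are dropped:

    P = [0.5 0.5];  Q = [0.9 0.1];
    m = P > 0;
    Dkl = sum(P(m) .* log2(P(m) ./ Q(m)));   % about 0.737 bits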
Computes the average word length SUM(i = 1:N) L_i * P_i, where CODEBOOK is a struct of strings, each string representing a codeword.
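A sketch of the computation using a hypothetical codebook held in a cell array of strings (the container type is an illustration, not the package's exact argument format):

    codewords = {'0', '10', '110', '111'};
    P = [0.5 0.25 0.125 0.125];              % symbol probabilities
    L = cellfun(@length, codewords);         % word lengths Li
    Lavg = sum(L .* P);                      % 1.75 bits per symbol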
Computes marginal probabilities along columns.
Computes marginal probabilities along rows.
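Both marginals of a joint probability matrix are plain row and column sums; a minimal sketch with a hypothetical matrix:

    XY = [0.125 0.375; 0.375 0.125];
    margCols = sum(XY, 1);                   % marginal along columns
    margRows = sum(XY, 2);                   % marginal along rows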
Computes the mutual information of the given channel transition matrix.
Calculates mutual information of the sequences x and y: I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X) = I(Y;X)
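A sketch of mutual information from a joint probability matrix via the equivalent identity I(X;Y) = H(X) + H(Y) - H(X,Y), assuming strictly positive entries:

    XY = [0.125 0.375; 0.375 0.125];
    px = sum(XY, 1);  py = sum(XY, 2);
    H  = @(p) -sum(p(:) .* log2(p(:)));
    I  = H(px) + H(py) - H(XY);              % about 0.189 bits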
Creates the ORDER-fold extension of a source: given the PROBABILITY_DIST (as a column vector) of a first-order source, builds a probability distribution of size len(PROBABILITY_DIST)^ORDER.
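For an i.i.d. source the extension's probabilities are products of the first-order ones, which a Kronecker product captures; a minimal sketch:

    p = [0.5; 0.3; 0.2];                     % first-order pmf (column)
    order = 2;
    pn = p;
    for k = 2:order
      pn = kron(pn, p);                      % grows to length(p)^order
    end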
Computes the redundancy, i.e. the bits wasted beyond the entropy, when using a particular coding scheme.
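Redundancy is simply the average code length minus the source entropy; a sketch with hypothetical code lengths (here the code is optimal, so the redundancy is 0):

    P = [0.5 0.25 0.125 0.125];
    L = [1 2 3 3];                           % code word lengths
    R = sum(L .* P) + sum(P .* log2(P));     % Lavg - H(P) = 0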
Computes the relative entropy between the two given probability distributions.
Computes the Rényi entropy of order ALPHA for the given probability distribution P.
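A sketch of the definition for ALPHA != 1 (the Shannon entropy is recovered in the limit ALPHA -> 1), with a hypothetical distribution:

    p = [0.5 0.25 0.25];
    alpha = 2;
    Halpha = log2(sum(p .^ alpha)) / (1 - alpha);   % collision entropy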
Redirects to the entropy function; computes the Shannon entropy.
Implementation of a `|A|'-bit Tunstall coder: given the probabilities of the source symbols, builds a code with `2^|A|' code-words. A construction sketch follows.
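A sketch of the core Tunstall construction (not the package implementation): starting from single-symbol parse strings, repeatedly expand the most probable leaf until no further expansion fits within 2^K code words. K and the probabilities here are hypothetical:

    p = [0.7 0.2 0.1];                       % source symbol probabilities
    K = 3;                                   % bits per code word
    leaves = {[1], [2], [3]};                % parse strings (symbol indices)
    probs  = p(:)';
    while numel(leaves) + numel(p) - 1 <= 2^K
      [~, m] = max(probs);                   % most probable leaf
      base = leaves{m};  pb = probs(m);
      leaves(m) = [];  probs(m) = [];
      for s = 1:numel(p)                     % replace it by its extensions
        leaves{end+1} = [base, s];
        probs(end+1)  = pb * p(s);
      end
    end                                      % each leaf gets a K-bit code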
Decodes a unary-encoded value.
Encodes a decimal value in unary. A sketch of one common convention follows.
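Under one common convention, the non-negative integer n is written as n ones followed by a terminating zero; a minimal round-trip sketch:

    n = 4;
    code = [ones(1, n), 0];                  % encode: 1 1 1 1 0
    decoded = find(code == 0, 1) - 1;        % decode: ones before the 0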
Decodes the message from an arithmetic code, given the symbol probabilities.
Computes the arithmetic code for a message whose symbol probabilities are given. An encoding sketch follows.
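A floating-point sketch of the interval-narrowing step at the heart of arithmetic encoding (no renormalisation or precision handling; probabilities and message are hypothetical):

    p = [0.6 0.3 0.1];                       % symbol probabilities
    c = [0, cumsum(p)];                      % cumulative boundaries
    msg = [1 3 2];                           % message as symbol indices
    lo = 0;  hi = 1;
    for s = msg
      w  = hi - lo;
      hi = lo + w * c(s + 1);
      lo = lo + w * c(s);
    end
    tag = (lo + hi) / 2;                     % any value in [lo, hi) codes msg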
info-theory
Not implemented.