Function File: infoentr_seq (seq_x)
Function File: infoentr_seq (seq_x, seq_y)
With one input, returns the Shannon information entropy of the sequence seq_x: H(X) = \sum_{x \in X} p(x) log2(1/p(x))
With two inputs, returns the joint entropy of the concurrent sequences seq_x and seq_y: H(X,Y) = \sum_{x \in X, y \in Y} p(x,y) log2(1/p(x,y))
Examples:

  X = [1, 1, 2, 1, 1];
  infoentr_seq(X)

  infoentr_seq([1,2,2,2,1,1,1,1,1], [1,2,2,2,2,2,1,1,1])
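The computation behind both formulas can be sketched in Python. This is an assumption about how infoentr_seq estimates probabilities (by relative frequency of each symbol, and of each aligned pair for the joint case), not the actual Octave implementation; the function names `entropy` and `joint_entropy` are hypothetical.

```python
from collections import Counter
from math import log2

def entropy(xs):
    """Shannon entropy H(X) = sum over x of p(x) * log2(1/p(x)), in bits.

    Probabilities are estimated as relative frequencies of the symbols
    in the sequence (an assumption about infoentr_seq's method).
    """
    counts = Counter(xs)
    n = len(xs)
    return sum((c / n) * log2(n / c) for c in counts.values())

def joint_entropy(xs, ys):
    """Joint entropy H(X,Y): entropy of the aligned pairs (x_i, y_i)."""
    return entropy(list(zip(xs, ys)))

# The examples from the help text above:
X = [1, 1, 2, 1, 1]
print(entropy(X))  # p(1)=4/5, p(2)=1/5 -> about 0.7219 bits
print(joint_entropy([1, 2, 2, 2, 1, 1, 1, 1, 1],
                    [1, 2, 2, 2, 2, 2, 1, 1, 1]))
```

Note that the joint case reduces to the single-sequence case applied to the sequence of pairs, which is why both formulas share the same p log2(1/p) form.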