Meaning of joint entropy | Babel Free
Definitions
The Shannon entropy of a "script" whose "characters" are elements of the Cartesian product of the sets of characters of the component scripts.
countable, uncountable
Examples
“If random variables X and Y are mutually independent, then their joint entropy H(X,Y) is just the sum H(X)+H(Y) of their component entropies. If they are not mutually independent, then their joint entropy will be H(X)+H(Y)−I(X;Y), where I(X;Y) is the mutual information of X and Y.”
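The identity in the quotation above can be checked numerically. The sketch below (a minimal illustration; the joint distribution is made up for the example) computes H(X,Y), the marginal entropies H(X) and H(Y), and recovers the mutual information as I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import math

def entropy(probs):
    # Shannon entropy in bits; zero-probability outcomes contribute nothing
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution of two binary variables X and Y
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

# Marginal distributions obtained by summing out the other variable
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_xy = entropy(joint.values())   # joint entropy H(X,Y)
h_x = entropy(px.values())       # marginal entropy H(X)
h_y = entropy(py.values())       # marginal entropy H(Y)

# Mutual information via the identity I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = h_x + h_y - h_xy
```

Because this joint distribution does not factor into its marginals, the variables are dependent and the mutual information comes out strictly positive; for an independent pair it would be zero and H(X,Y) would equal H(X)+H(Y) exactly.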
CEFR level
B2
Upper Intermediate
This word is part of the CEFR B2 vocabulary — upper intermediate level.