The term information covers signs, signals and messages with their syntactic, semantic and pragmatic aspects.

The uncertainty (and thus the information content) of a random event i may be quantitatively described as the negative logarithm of its probability:

$I_i = -\operatorname{ld} p_i$

where $\operatorname{ld}$ denotes the dual (binary) logarithm, defined by $\operatorname{ld} M = \lg M / \lg 2 = \ln M / \ln 2$.
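This definition can be sketched in a few lines of Python; the function name `self_information` is an illustrative choice, and the unit is bits since the dual logarithm is used.

```python
import math

def self_information(p):
    """Information content I_i = -ld(p_i) of an event with probability p, in bits."""
    return -math.log2(p)

# A fair coin toss (p = 0.5) carries exactly one bit of information.
print(self_information(0.5))   # → 1.0
# A rarer event (p = 0.125) carries more information.
print(self_information(0.125)) # → 3.0
```

Note how the information content grows as the event becomes less probable, matching the intuition that rare events are more informative.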

The average information content of a stream produced by an ergodic source is given by Shannon's equation as

$H = -K \sum_{i=1}^{n} p_i \operatorname{ld} p_i.$

H is referred to as the entropy of the information source.
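Shannon's equation can be evaluated directly; the following is a minimal Python sketch, assuming the scaling constant $K = 1$ (so that $H$ is measured in bits) and a finite symbol alphabet with probabilities $p_1, \dots, p_n$. The function name `entropy` is an illustrative choice.

```python
import math

def entropy(probs, K=1.0):
    """Shannon entropy H = -K * sum(p_i * ld(p_i)) of a discrete source.

    Terms with p_i = 0 are skipped, using the convention 0 * ld(0) = 0.
    With K = 1 the result is in bits.
    """
    return -K * sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over 4 symbols attains the maximum entropy ld(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0
# A deterministic source (one symbol with p = 1) has zero entropy.
print(entropy([1.0]))                     # → -0.0
```

The entropy is largest when all symbols are equally likely and drops to zero when the source output is certain, which is why $H$ serves as a measure of the source's average uncertainty per symbol.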
