The Penguin Dictionary of Mathematics by David Nelson

Author: David Nelson
Language: eng
Format: epub
ISBN: 9780141030234
Publisher: Penguin Books Ltd
Published: 2008-11-18T16:00:00+00:00


See Cramér–Rao inequality.

information theory A branch of mathematics concerned with the transmission and processing of information. A general theory of the subject was propounded in 1948 by Claude E. Shannon, in his article ‘A Mathematical Theory of Communication’. The subject is based on the idea that it is possible to give a quantitative measure of information. The usual method of assigning such a measure can be illustrated by the example of transmitting and receiving a single letter of the alphabet (i.e. any one of the 26 letters). The amount of information in such a message (if correctly received) is measured with reference to the situation in which there are only two letters, and is given by log₂26/log₂2 ≈ 4.7, i.e. there is 4.7 times as much information in receiving a single letter of the 26-letter alphabet as in receiving a single *bit. The information content is said to be 4.7 bits. In fact, this applies only if the letters in the alphabet are equally likely to occur. In practice, this is not the case and information content is measured by a quantity known as entropy, given by

−p₁ log₂p₁ − p₂ log₂p₂ − p₃ log₂p₃ − …

where p₁, p₂, p₃, … are the probabilities of different values of the variable (in the example, the letter sent). This idea of entropy is similar to the concept originally developed in thermodynamics and statistical mechanics.
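The entropy sum above is easy to compute directly. The following is a minimal Python sketch (not part of the dictionary entry); the function name `entropy` and the uniform-alphabet example are illustrative assumptions:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: -sum of p * log2(p), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 26 equally likely letters: entropy equals log2(26), about 4.7 bits,
# matching the figure quoted in the entry.
uniform = [1 / 26] * 26
print(round(entropy(uniform), 2))  # 4.7
```

For a non-uniform distribution (as with real English letter frequencies) the same function gives a smaller value, reflecting the lower information content per letter.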

In considering information, it is usual to have a model consisting of:

(1) a source of information;

(2) an encoder, which changes this into a form suitable for transmission;

(3) a channel along which the information is transmitted;

(4) a decoder, which converts the information back into a useful form; and

(5) a destination or user, which receives the information.

The signal transmitted via the channel may be subject to extraneous noise. In its most restricted sense, information theory deals with the entropies of sources and channels. More generally, the term is also used to encompass *coding theory (ways of encoding information to ensure effective transmission). The term communication theory is often used to include both information theory and coding theory.

Information theory is essentially an application of probability theory. It has obvious uses in telegraphy, radio transmission, and the like, but has also been applied to language studies and cybernetics.

initial conditions See boundary conditions.

initial meridian plane See spherical coordinate system.

injection An injection from a *set A to a set B is a *one-to-one function whose *domain is A and whose *range is part of B. For example, if A = {3, 6} and B = {9, 36, 150} then the function f : x ↦ x² is an injection (or injective function). See also surjection; bijection.
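For a finite domain, injectivity can be checked mechanically: every image must lie in the codomain, and no two domain elements may share an image. A small Python sketch (the helper `is_injection` is an illustrative assumption, not from the dictionary):

```python
def is_injection(f, domain, codomain):
    """True if f maps domain one-to-one into codomain."""
    images = [f(x) for x in domain]
    in_codomain = all(y in codomain for y in images)
    one_to_one = len(set(images)) == len(images)  # no repeated image
    return in_codomain and one_to_one

# The entry's example: f(x) = x^2 on A = {3, 6} into B = {9, 36, 150}
print(is_injection(lambda x: x * x, {3, 6}, {9, 36, 150}))  # True
```

The same squaring function fails to be injective on a domain containing both −2 and 2, since both map to 4.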

inner product See scalar product.

inradius The radius of the *incircle of a polygon. Compare exradius.

inscribed Describes a figure that is *circumscribed by another figure. For example, a polygon lying inside a circle with all its vertices on the circumference is said to be inscribed in the circle. A circle inside a polygon with all the sides of the polygon tangent to the circle is inscribed in the polygon (it is the incircle of the polygon).
