Information Theory

The basic concepts of the theory were introduced in the 1940s by Claude Shannon. Information theory is the study of the processes of transmission, storage, and retrieval of data in natural, technical, and social systems. Many applied fields, such as computer science, linguistics, cryptography, control theory, image processing, genetics, psychology, economics, and production management, make use of this science.

Today information theory is closely linked to coding theory, which addresses the general problem of the correspondence between message and signal, and to signal processing theory, which studies the quantization and reconstruction of signals as well as their spectral and correlation analysis.
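
As a simple illustration of the signal-processing side mentioned above, the sketch below performs uniform quantization of a sampled signal and reconstructs an approximation from the quantized levels. The step size and the sample values are arbitrary choices for the example, not anything prescribed by the text.

    # Minimal sketch of uniform quantization and reconstruction (illustrative values).
    def quantize(samples, step):
        """Map each sample to the index of its nearest quantization level."""
        return [round(s / step) for s in samples]

    def reconstruct(indices, step):
        """Restore approximate sample values from the level indices."""
        return [i * step for i in indices]

    signal = [0.12, 0.47, -0.33, 0.90, -0.05]   # example samples
    levels = quantize(signal, step=0.25)
    approx = reconstruct(levels, step=0.25)
    print(levels)   # [0, 2, -1, 4, 0]
    print(approx)   # [0.0, 0.5, -0.25, 1.0, 0.0]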

Information theory treats the basic concept of "information" largely quantitatively, without regard to its value and, in some cases, its meaning. Under this approach, a page of text contains an amount of information determined only by the number of signs and symbols on it, independent of what, strictly speaking, is printed there, even if it is a completely senseless and chaotic set of symbols.
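
A minimal sketch of this quantitative view: Shannon's entropy estimated from symbol frequencies, H = -sum(p_i * log2(p_i)) bits per symbol. The sample strings below are arbitrary; the point is that only the symbol statistics enter the measure, not the meaning.

    from collections import Counter
    from math import log2

    def entropy_per_symbol(text):
        """Estimate Shannon entropy (bits per symbol) from symbol frequencies."""
        counts = Counter(text)
        n = len(text)
        return -sum(c / n * log2(c / n) for c in counts.values())

    # A meaningful sentence and a chaotic string are measured the same way:
    # only the distribution of symbols matters.
    print(entropy_per_symbol("information theory measures data quantitatively"))
    print(entropy_per_symbol("xq zvj kwpt rln bgf dmc hys oau eit xqz vjk wpt"))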

This approach is justified, but only for modelling communication systems, since their task is to transmit correctly over communication channels information that may be represented by any set of characters and symbols. When the value and meaning of the data must be taken into account, a purely quantitative approach is no longer adequate. This circumstance imposes significant restrictions on the areas where the theory can be applied.

The fundamentals of information theory cover a range of questions, including those directly related to the transmission and reception of data. The basic communication scheme considered in the theory is as follows. There is a source of messages, where a message is a word or a set of words written in the letters of some alphabet. The source may be text in any natural or artificial language, human speech, a database, or a mathematical model generating a sequence of letters. The transmitter converts the message into a signal matched to the physical nature of the communication channel, the medium over which the signal travels. While passing through the channel, the signal is subject to interference, which distorts the values of its information-bearing parameters. The receiver recovers the original message from the received, distorted signal, and the message in its restored form reaches the addressee, a person or a technical device.
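
A minimal sketch of this source-channel-receiver chain under simple assumptions: the message is encoded as bits, the channel flips each bit independently with a small probability (a binary symmetric channel), and the receiver rebuilds whatever arrives. The flip probability and the sample text are arbitrary.

    import random

    def to_bits(text):
        """Source/transmitter: encode the message as a sequence of bits."""
        return [int(b) for ch in text.encode("utf-8") for b in format(ch, "08b")]

    def noisy_channel(bits, p_flip):
        """Channel: flip each bit independently with probability p_flip."""
        return [b ^ 1 if random.random() < p_flip else b for b in bits]

    def from_bits(bits):
        """Receiver: rebuild the message from the (possibly distorted) bits."""
        chars = [int("".join(map(str, bits[i:i + 8])), 2) for i in range(0, len(bits), 8)]
        return bytes(chars).decode("utf-8", errors="replace")

    sent = to_bits("hello")
    received = noisy_channel(sent, p_flip=0.02)
    print(from_bits(received))   # usually "hello", sometimes distorted by the noise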

The source of messages is statistical in nature, i.e. the appearance of each message is governed by a certain probability. In Shannon's information theory, if the probability of a message is one, so that its appearance is certain and there is no uncertainty, the message is considered to carry no information.
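
This idea is captured by the self-information of a message with probability p, equal to -log2(p) bits: a certain message (p = 1) carries zero information, and the less probable a message is, the more information its appearance carries. A small sketch:

    from math import log2

    def self_information(p):
        """Self-information of a message with probability p, in bits (equal to -log2 p)."""
        return log2(1 / p)

    print(self_information(1.0))    # 0.0  -- a certain message carries no information
    print(self_information(0.5))    # 1.0 bit
    print(self_information(0.125))  # 3.0 bits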

One of the important problems of information theory is matching the communication channel to the properties of the message source. The throughput (capacity) of a channel is measured in bits per second.
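
As an illustration going slightly beyond the text above: for a binary symmetric channel with crossover probability p, the capacity is commonly written as C = 1 - H(p) bits per channel use, where H is the binary entropy function; multiplying by the number of channel uses per second gives a throughput in bits per second. A minimal sketch with arbitrary parameter values:

    from math import log2

    def binary_entropy(p):
        """Binary entropy function H(p) in bits."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * log2(p) - (1 - p) * log2(1 - p)

    def bsc_capacity(p_flip):
        """Capacity of a binary symmetric channel, in bits per channel use."""
        return 1.0 - binary_entropy(p_flip)

    uses_per_second = 1_000_000                   # assumed signalling rate
    print(bsc_capacity(0.02) * uses_per_second)   # throughput in bits per second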

One of the problems of communication systems is the interference that acts on the transmitted signal along its path. Shannon's theory, unfortunately, does not give specific methods for dealing with it. The simplest method, repeating the message many times, is not very effective, because it slows down the transfer of information. Much greater efficiency is achieved by using codes that can detect and correct transmission errors.
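
As a concrete illustration of this trade-off, the sketch below shows the simplest error-correcting code, a threefold repetition code with majority-vote decoding: it corrects a single bit error within each group, but triples the transmission time, which is why more efficient error-detecting and error-correcting codes are preferred in practice. The message bits are arbitrary example values.

    def encode_repetition(bits, n=3):
        """Repeat each bit n times (rate 1/n, so transmission takes n times longer)."""
        return [b for bit in bits for b in [bit] * n]

    def decode_repetition(received, n=3):
        """Majority vote over each group of n bits; corrects up to (n-1)//2 flips per group."""
        groups = [received[i:i + n] for i in range(0, len(received), n)]
        return [1 if sum(g) > n // 2 else 0 for g in groups]

    message = [1, 0, 1, 1]
    sent = encode_repetition(message)      # [1,1,1, 0,0,0, 1,1,1, 1,1,1]
    received = sent[:]
    received[4] = 1                        # a single transmission error
    print(decode_repetition(received))     # [1, 0, 1, 1] -- the error is corrected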