Information Theory - Principles of Communication

What is Information Theory?

Information is the foundation of any communication system, whether analog or digital. Information theory is a mathematical approach to the study of the coding of information, along with its quantification, storage, and communication.

Conditions of Occurrence of Events

If we consider an event, there are three conditions of occurrence.

  • If the event has not occurred, there is a condition of uncertainty.
  • If the event has just occurred, there is a condition of surprise.
  • If the event occurred some time back, there is a condition of having some information.

These three conditions occur at different times. The differences among these conditions help us gain knowledge of the probabilities of occurrence of events.

Entropy

When we consider the possibilities of occurrence of an event, and how surprising or uncertain it would be, we are trying to get an idea of the average information content coming from the source of the event.

Entropy can be defined as a measure of the average information content per source symbol. Claude Shannon, the “father of information theory”, gave a formula for it as

$$H = -\sum_{i} p_i \log_b p_i$$

where $p_i$ is the probability of occurrence of symbol $i$ in a given stream of symbols, and $b$ is the base of the logarithm used. This quantity is therefore also called Shannon's entropy.
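
As a quick illustration, here is a minimal Python sketch that computes Shannon's entropy; the symbols and probabilities are assumed purely for this example:

    import math

    def shannon_entropy(probabilities, base=2):
        # H = -sum(p_i * log_b(p_i)); zero-probability symbols contribute nothing
        return -sum(p * math.log(p, base) for p in probabilities if p > 0)

    # An assumed 4-symbol source
    probs = [0.5, 0.25, 0.125, 0.125]
    print(shannon_entropy(probs))  # 1.75 bits per source symbol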

The amount of uncertainty remaining about the channel input after the channel output has been observed is called conditional entropy. It is denoted by $H(x \mid y)$.
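
In standard form, conditional entropy is computed from the joint distribution of the channel input $x$ and the channel output $y$, with the double sum running over all input-output pairs:

$$H(x \mid y) = -\sum_{x} \sum_{y} p(x, y) \log_b p(x \mid y)$$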


Discrete Memoryless Source

A source from which data is produced at successive intervals, independently of previous values, can be termed a discrete memoryless source.

This source is discrete because it is considered not over a continuous time interval, but at discrete instants of time. It is memoryless because each new value is independent of the previous values.
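
The following minimal Python sketch models such a source; the alphabet and probabilities are assumed purely for illustration, and each symbol is drawn independently of all earlier ones:

    import random

    def discrete_memoryless_source(symbols, probabilities):
        # Each emission is independent of every previous emission (memoryless)
        while True:
            yield random.choices(symbols, weights=probabilities, k=1)[0]

    source = discrete_memoryless_source(['A', 'B', 'C'], [0.5, 0.3, 0.2])
    print([next(source) for _ in range(10)])  # e.g. ['A', 'C', 'A', 'B', ...]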

Source Coding

According to the definition, “Given a discrete memoryless source of entropy $H(\delta)$, the average code-word length $\bar{L}$ for any source encoding is bounded as $\bar{L} \geq H(\delta)$.”


In simpler words, the code word (for example, the Morse code for the word QUEUE is --.- ..- . ..- .) is always at least as long as the source word (QUEUE in the example). That is, the number of symbols in the code word is greater than or equal to the number of letters in the source word.
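
To see the bound in action, here is a small Python sketch that compares the average code-word length of an assumed binary prefix code against the source entropy; both the probabilities and the code words are illustrative:

    import math

    # Assumed source probabilities and an assumed binary prefix code
    probs = {'A': 0.5, 'B': 0.25, 'C': 0.125, 'D': 0.125}
    code  = {'A': '0', 'B': '10', 'C': '110', 'D': '111'}

    entropy    = -sum(p * math.log2(p) for p in probs.values())
    avg_length = sum(probs[s] * len(code[s]) for s in probs)

    print(entropy, avg_length)  # 1.75 1.75

The assumed code happens to be optimal for this distribution, so the bound is met with equality; in general, $\bar{L}$ can only equal or exceed $H(\delta)$.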

Channel Coding

Channel coding in a communication system introduces redundancy with control, so as to improve the reliability of the system. Source coding, by contrast, reduces redundancy to improve the efficiency of the system.

Channel coding consists of two actions:

  • Mapping incoming data sequence into a channel input sequence.
  • Inverse mapping the channel output sequence into an output data sequence.

The final target is that the overall effect of the channel noise should be minimized.
The mapping is done by the transmitter, with the help of an encoder, whereas the inverse mapping is done at the receiver by a decoder.
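
As a minimal sketch of this idea, the following Python example uses a 3-repetition code, a deliberately simple channel code assumed here only to show the encoder/decoder mapping; the decoder's majority vote corrects any single bit flip within a group of three:

    def encode(bits):
        # Mapping: each data bit becomes three identical channel bits (redundancy)
        return [b for bit in bits for b in (bit, bit, bit)]

    def decode(channel_bits):
        # Inverse mapping: majority vote over each group of three received bits
        return [1 if sum(channel_bits[i:i + 3]) >= 2 else 0
                for i in range(0, len(channel_bits), 3)]

    data = [1, 0, 1]
    sent = encode(data)      # [1, 1, 1, 0, 0, 0, 1, 1, 1]
    received = sent[:]
    received[1] = 0          # simulate a single channel-noise bit flip
    print(decode(received))  # [1, 0, 1] -- the noise effect is removed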
