Information is the foundation of a communication system, whether analog or digital. Information theory is a mathematical approach to the study of the coding of information, along with its quantification, storage, and communication.
If we consider an event, there are three conditions of occurrence: if the event has not occurred, there is a condition of uncertainty; if the event has just occurred, there is a condition of surprise; and if the event occurred some time back, there is a condition of having some information. These three conditions occur at different times, and the differences among them help us understand the probabilities of occurrence of events.
When we observe the possibilities of occurrence of an event, and how surprising or uncertain it would be, we are trying to form an idea of the average information content from the source of the event.
Entropy can be defined as a measure of the average information content per source symbol. Claude Shannon, the "father of information theory", gave a formula for it:

H = −Σᵢ pᵢ log_b(pᵢ)

where pᵢ is the probability of occurrence of character number i in a given stream of characters and b is the base of the logarithm used. This is therefore also called Shannon's entropy.
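As an illustration, Shannon's formula can be computed directly from a list of symbol probabilities. The sketch below is not from the original text and assumes base b = 2, so the result is in bits per symbol:

```python
from math import log2

def shannon_entropy(probs):
    """Average information content per source symbol, in bits (b = 2)."""
    # Terms with p = 0 contribute nothing, so they are skipped.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))   # a fair coin: 1.0 bit/symbol
print(shannon_entropy([0.25] * 4))   # four equally likely symbols: 2.0 bits
print(shannon_entropy([0.9, 0.1]))   # a biased source carries less than 1 bit
```

A skewed distribution yields lower entropy, matching the intuition that a predictable source conveys less information per symbol.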
The amount of uncertainty remaining about the channel input after observing the channel output is called the conditional entropy. It is denoted by H(X | Y).
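As a sketch (not part of the original text), H(X | Y) can be computed from a joint input-output distribution using H(X | Y) = −Σ p(x, y) log₂ p(x | y); the two channels below are hypothetical examples chosen to show the extremes:

```python
from math import log2

def conditional_entropy(joint):
    """H(X | Y) for a joint distribution given as {(x, y): p(x, y)}."""
    # Marginal p(y), needed for p(x | y) = p(x, y) / p(y).
    p_y = {}
    for (_, y), p in joint.items():
        p_y[y] = p_y.get(y, 0.0) + p
    return -sum(p * log2(p / p_y[y]) for (x, y), p in joint.items() if p > 0)

# Noiseless channel: the output determines the input, so no doubt remains.
noiseless = {(0, 0): 0.5, (1, 1): 0.5}
# Fully noisy channel: output independent of input, so all doubt remains.
noisy = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
print(conditional_entropy(noiseless))  # 0 bits of remaining uncertainty
print(conditional_entropy(noisy))      # 1.0 bit: H(X | Y) equals H(X)
```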
A source from which data is produced at successive intervals, independently of the previous values, can be termed a discrete memoryless source. The source is discrete because it is considered not over a continuous time interval but at discrete time instants, and it is memoryless because it is fresh at each instant, without depending on the previous values.
According to the source coding theorem, "Given a discrete memoryless source of entropy H(S), the average code-word length L̄ for any source encoding is bounded as L̄ ≥ H(S)."
In simpler words, the code word (for example, the Morse code for the word QUEUE is --.- ..- . ..- .) is always greater than or equal to the source code (QUEUE in the example). That is, the number of symbols in the code word is greater than or equal to the number of letters in the source word.
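The bound L̄ ≥ H(S) can be checked numerically. The sketch below uses the relative letter frequencies of QUEUE and a hypothetical prefix-free binary code (the code table is assumed for illustration, not taken from the text):

```python
from math import log2

def entropy(probs):
    """Source entropy H(S) in bits per symbol."""
    return -sum(p * log2(p) for p in probs.values() if p > 0)

def avg_length(probs, code):
    """Average code-word length L-bar, in code bits per source symbol."""
    return sum(p * len(code[s]) for s, p in probs.items())

# Relative frequencies of the letters in QUEUE: Q once, U twice, E twice.
probs = {'Q': 0.2, 'U': 0.4, 'E': 0.4}
# A hypothetical prefix-free binary code for these three symbols.
code = {'Q': '10', 'U': '11', 'E': '0'}

H = entropy(probs)           # about 1.52 bits/symbol
L = avg_length(probs, code)  # about 1.6 code bits/symbol
print(L >= H)                # True: the bound holds
```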
Channel coding in a communication system introduces redundancy with control, so as to improve the reliability of the system, whereas source coding reduces redundancy to improve the efficiency of the system.
Channel coding consists of two parts of action: mapping of the incoming data sequence into a channel input sequence, and inverse mapping of the channel output sequence into an output data sequence. The final target is that the overall effect of the channel noise should be minimized.
The mapping is done by the transmitter with the help of an encoder, whereas the inverse mapping is done at the receiver by a decoder.
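These two actions can be sketched with a triple-repetition code, a deliberately simple channel code assumed here purely for illustration: the encoder maps each data bit into three channel bits, and the decoder inverts the mapping by majority vote.

```python
def encode(bits, n=3):
    """Mapping at the transmitter: repeat each data bit n times."""
    return [b for bit in bits for b in [bit] * n]

def decode(received, n=3):
    """Inverse mapping at the receiver: majority vote over each n-bit block."""
    return [int(sum(received[i:i + n]) > n // 2)
            for i in range(0, len(received), n)]

data = [1, 0, 1]
sent = encode(data)          # [1, 1, 1, 0, 0, 0, 1, 1, 1]
sent[1] ^= 1                 # channel noise flips one bit
print(decode(sent) == data)  # True: the single error is voted out
```

The added redundancy lowers the data rate but lets the receiver survive isolated bit flips, which is exactly the reliability-versus-efficiency trade the section describes.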
All rights reserved © 2020 Wisdom IT Services India Pvt. Ltd