




























































































ECE-V-INFORMATION THEORY & CODING [10EC55]-NOTES
Subject Code: 06EC55                 IA Marks: 25
No. of Lecture Hrs/Week: 04          Exam Hours: 03
Total No. of Lecture Hrs.: 52        Exam Marks: 100
PART - A
Unit – 1: Information Theory: Introduction, Measure of information, Average information content of symbols in long independent sequences, Average information content of symbols in long dependent sequences, Markoff statistical model for information source, Entropy and information rate of Markoff source. 6 Hours
Unit – 2: Source Coding: Encoding of the source output, Shannon's encoding algorithm, Communication channels, Discrete communication channels, Continuous channels. 6 Hours
Unit – 3: Fundamental Limits on Performance: Source coding theorem, Huffman coding, Discrete memoryless channels, Mutual information, Channel capacity. 6 Hours
Unit – 4: Channel coding theorem, Differential entropy and mutual information for continuous ensembles, Channel capacity theorem. 6 Hours
PART - B
Unit – 5: Introduction to Error Control Coding: Introduction, Types of errors, Examples, Types of codes. Linear Block Codes: Matrix description, Error detection and correction, Standard arrays and table look-up for decoding. 7 Hours
Unit – 6: Binary Cyclic Codes: Algebraic structure of cyclic codes, Encoding using an (n-k)-bit shift register, Syndrome calculation, BCH codes. 7 Hours
Unit – 7: RS codes, Golay codes, Shortened cyclic codes, Burst-error-correcting codes, Burst- and random-error-correcting codes. 7 Hours
Unit – 8: Convolution Codes: Time domain approach, Transform domain approach. 7 Hours
Text Books:
1. K. Sam Shanmugam, "Digital and Analog Communication Systems", John Wiley, 1996.
2. Simon Haykin, "Digital Communication", John Wiley, 2003.
Reference Books:
1. Ranjan Bose, "ITC and Cryptography", TMH, 2nd edition, 2007.
2. Glover and Grant, "Digital Communications", Pearson Education, 2nd edition, 2008.
INDEX SHEET

PART - A

UNIT – 1: INFORMATION THEORY
o Introduction
o Measure of information
o Average information content of symbols in long independent sequences
o Average information content of symbols in long dependent sequences
o Markoff statistical model for information source
o Entropy and information rate of Markoff source
o Review questions

UNIT – 2: SOURCE CODING
o Encoding of the source output
o Shannon's encoding algorithm
o Communication channels
o Discrete communication channels
o Review questions

UNIT – 3: FUNDAMENTAL LIMITS ON PERFORMANCE
o Source coding theorem
o Huffman coding
o Discrete memoryless channels
o Mutual information
o Channel capacity
o Review questions

UNIT – 4
o Continuous channel
o Differential entropy and mutual information for continuous ensembles
o Channel capacity theorem
o Review questions

PART - B

UNIT – 5: INTRODUCTION TO ERROR CONTROL CODING
o Introduction
o Types of errors
o Types of codes
o Linear block codes: matrix description
o Error detection and correction
o Standard arrays and table look-up for decoding
Unit – 1: Information Theory
Syllabus: Introduction, Measure of information, Average information content of symbols in long independent sequences, Average information content of symbols in long dependent sequences. Mark-off statistical model for information source, Entropy and information rate of mark-off source. 6 Hours
1.1 Introduction:
Communication explicitly involves the transmission of information from one point to another, through a succession of processes:
o Transmitter
o Channel, and
o Receiver
Information sources are of two types:
Analog: emit a continuous-amplitude, continuous-time electrical waveform.
Discrete: emit a sequence of letters or symbols. The output of a discrete information source is a string or sequence of symbols.
1.2 Measure of Information:
To measure the information content of a message quantitatively, we are required to arrive at an intuitive concept of the amount of information.
Consider the following examples: A trip to Mercara (Coorg) in the winter time during evening hours,
[Figure: Classification of information sources into analog and discrete, and block diagram of a communication system: source of information → (message signal) → transmitter → (transmitted signal) → channel → (received signal) → receiver → (estimate of message signal) → user of information.]
Unit of information measure
The base of the logarithm determines the unit assigned to the information content:
o Natural logarithm base: 'nat'
o Base 10: 'Hartley' or 'decit'
o Base 2: 'bit'
The use of the binary digit as the unit of information is based on the fact that if two possible binary digits occur with equal probability (p1 = p2 = 1/2), then the correct identification of either binary digit conveys an amount of information
I(m1) = I(m2) = - log2(1/2) = 1 bit
∴ One bit is the amount of information that we gain when one of two possible and equally likely events occurs.
Illustrative Example
A source puts out one of five possible messages during each message interval. The probabilities of these messages are
p1 = 1/2, p2 = 1/4, p3 = 1/8, p4 = 1/16 and p5 = 1/16.
What is the information content of these messages?

I(m1) = - log2(1/2) = 1 bit
I(m2) = - log2(1/4) = 2 bits
I(m3) = - log2(1/8) = 3 bits
I(m4) = - log2(1/16) = 4 bits
I(m5) = - log2(1/16) = 4 bits
HW: Calculate I for the above messages in nats and Hartley
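As a quick check on the example above (and a starting point for the homework), here is a minimal Python sketch that evaluates the information content of each message in bits, nats and Hartleys; the probabilities are the ones listed in the example. Recall that 1 bit = ln 2 ≈ 0.693 nats ≈ 0.301 Hartleys.

```python
import math

# Message probabilities from the illustrative example above
probs = {"m1": 1/2, "m2": 1/4, "m3": 1/8, "m4": 1/16, "m5": 1/16}

for name, p in probs.items():
    bits = -math.log2(p)       # base-2 logarithm  -> bits
    nats = -math.log(p)        # natural logarithm -> nats
    hartleys = -math.log10(p)  # base-10 logarithm -> Hartleys (decits)
    print(f"I({name}) = {bits:g} bits = {nats:.3f} nats = {hartleys:.3f} Hartleys")
```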
Digital Communication System:
Entropy and Rate of Information of an Information Source / Model of a Markoff Source
1.3 Average Information Content of Symbols in Long Independent Sequences

Suppose that a source is emitting one of M possible symbols s1, s2, ..., sM in a statistically independent sequence.

Let p1, p2, ..., pM be the probabilities of occurrence of the M symbols respectively. Suppose further that during a long period of transmission a sequence of N symbols has been generated.

On an average:
s1 will occur Np1 times
s2 will occur Np2 times
:
si will occur Npi times
The information content of the i-th symbol is
I(si) = log2(1/pi) bits
∴ The Npi occurrences of si contribute an information content of
Npi · I(si) = Npi · log2(1/pi) bits
∴ The total information content of the message is the sum of the contributions due to each of the M symbols:
Itotal = Σ (i = 1 to M) Npi log2(1/pi) bits
and the average information content per symbol, called the entropy H of the source, is
H = Itotal / N = Σ (i = 1 to M) pi log2(1/pi) bits/symbol
[Figure: Block diagram of a digital communication system. Transmitter side: source of information → (message signal) → source encoder → (source code word) → channel encoder → (channel code word) → modulator → (waveform) → channel. Receiver side: (received signal) → demodulator → (estimate of channel code word) → channel decoder → (estimate of source code word) → source decoder → (estimate of the message signal) → user of information.]
If the source is emitting symbols at a fixed rate of rs symbols/sec, the average source information rate R is defined as
R = rs · H bits/sec
Illustrative Examples:

1. A source A emits one of three symbols with probabilities p0 = 1/4, p1 = 1/4 and p2 = 1/2, at a rate of rs = 1 symbol per second. Find the entropy and the information rate of the source.

Solution: By definition, the entropy of a source is given by
H = Σ (i = 1 to M) pi log2(1/pi) bits/symbol
H for this example is
H(A) = Σ (i = 0 to 2) pi log2(1/pi)
Substituting the values given, we get
H(A) = p0 log2(1/p0) + p1 log2(1/p1) + p2 log2(1/p2)
     = 1/4 log2 4 + 1/4 log2 4 + 1/2 log2 2
     = 1.5 bits/symbol
If rs = 1 symbol per sec, then
H′(A) = rs H(A) = 1.5 bits/sec
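The entropy and information-rate calculations in these worked examples all follow the same pair of formulas, H = Σ pi log2(1/pi) and R = rs H. The following is a minimal Python sketch of that calculation; the probability list and symbol rate below are those of Example 1, and substituting the values from Examples 2 and 3 below reproduces those answers as well.

```python
import math

def entropy(probs):
    """Average information content H = sum(p * log2(1/p)), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def info_rate(probs, rs):
    """Average source information rate R = rs * H, in bits/sec."""
    return rs * entropy(probs)

# Example 1: three symbols with probabilities 1/4, 1/4, 1/2 and rs = 1 symbol/sec
print(entropy([0.25, 0.25, 0.5]))          # 1.5 bits/symbol
print(info_rate([0.25, 0.25, 0.5], rs=1))  # 1.5 bits/sec
```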
2. An analog signal is band-limited to B Hz, sampled at the Nyquist rate, and the samples are quantized into 4 levels. The quantization levels Q1, Q2, Q3 and Q4 (messages) are assumed independent and occur with probabilities p1 = p4 = 1/8 and p2 = p3 = 3/8. Find the information rate of the source.
Solution: By definition, the average information H is given by
H = p1 log2(1/p1) + p2 log2(1/p2) + p3 log2(1/p3) + p4 log2(1/p4)
Substituting the values given, we get
H = 1/8 log2 8 + 3/8 log2 (8/3) + 3/8 log2 (8/3) + 1/8 log2 8
  = 1.8 bits/message
The information rate of the source is, by definition, R = rs H. Since the signal is sampled at the Nyquist rate, rs = 2B messages/sec, so
R = 2B × 1.8 = 3.6B bits/sec
3. Compute the values of H and R if, in the above example, the quantization levels are so chosen that they are equally likely to occur.

Solution: The average information per message is
H = 4 × (1/4 log2 4) = 2 bits/message
and R = rs H = 2B × 2 = 4B bits/sec
1.5 Markoff Model for Information Sources
Assumption
A source puts out symbols belonging to a finite alphabet according to certain probabilities
depending on preceding symbols as well as the particular symbol in question.
A system that produces a sequence of symbols as stated above, governed by a set of probabilities, is statistically modelled as a random process.
Therefore, we may consider a discrete source as a random process, and the converse is also true: a random process that produces a discrete sequence of symbols chosen from a finite set may be considered as a discrete source.
The Markoff model provides a statistical model for the symbol sequences emitted by a discrete source. A general description of the model can be given as below:
o At the beginning of each symbol interval, the source will be in one of n possible states 1, 2, ..., n, where 'n' is defined by n ≤ (M)^m, with M = the number of letters/symbols in the alphabet and m = the number of symbols for which the residual influence lasts.
o Transition probabilities and the symbols emitted corresponding to each transition are marked along the lines of the graph. A typical example of such a source is given below.
o It is an example of a source emitting one of three symbols A, B, and C.
o The probability of occurrence of a symbol depends on the particular symbol in question and the symbol immediately preceding it.
o Residual or past influence lasts only for a duration of one symbol.
Last symbol emitted by this source
o The last symbol emitted by the source can be A or B or C. Hence past history can be represented by three states- one for each of the three symbols of the alphabet.
o Suppose that the system is in state (1) and the last symbol emitted by the source was A. o The source now emits symbol (A) with probability ½ and returns to state (1). OR o The source emits letter (B) with probability ¼ and goes to state (3) OR o The source emits symbol (C) with probability ¼ and goes to state (2).
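To make the state description concrete, the source can be stored as a table that maps each state to its outgoing transitions, i.e. (probability, symbol emitted, next state) triples, and then walked to generate symbol sequences. The sketch below is a minimal Python illustration: the state-1 row is taken from the bullets above, while the state-2 and state-3 rows are assumed by symmetry (each state re-emits its own last symbol with probability 1/2 and each of the other two symbols with probability 1/4), which is consistent with the P(AB) calculation that follows.

```python
import random

# State = last symbol emitted: state 1 <-> A, state 2 <-> C, state 3 <-> B.
# Each row: state -> list of (probability, symbol emitted, next state).
# The state-1 row comes from the text above; states 2 and 3 are assumed by symmetry.
TRANSITIONS = {
    1: [(0.50, "A", 1), (0.25, "B", 3), (0.25, "C", 2)],
    2: [(0.50, "C", 2), (0.25, "A", 1), (0.25, "B", 3)],
    3: [(0.50, "B", 3), (0.25, "A", 1), (0.25, "C", 2)],
}
INITIAL = {1: 1/3, 2: 1/3, 3: 1/3}  # P1(1) = P2(1) = P3(1) = 1/3

def emit_sequence(length):
    """Pick an initial state, then walk the chain, emitting one symbol per transition."""
    state = random.choices(list(INITIAL), weights=list(INITIAL.values()))[0]
    symbols = []
    for _ in range(length):
        weights = [p for p, _, _ in TRANSITIONS[state]]
        _, symbol, next_state = random.choices(TRANSITIONS[state], weights=weights)[0]
        symbols.append(symbol)
        state = next_state
    return "".join(symbols)

print(emit_sequence(20))  # e.g. 'AABACCB...' (a different sequence on every run)
```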
State transition and symbol generation can also be illustrated using a tree diagram.
Tree diagram
[Figure: State diagram and tree diagram for the three-symbol source. The initial state probabilities are P1(1) = P2(1) = P3(1) = 1/3; each branch of the tree is labelled with a transition probability (1/2 or 1/4) and the symbol (A, B or C) emitted, and the tree records the state at the end of the first and second symbol intervals together with the resulting symbol sequence and its probability.]
Recall the Markoff property.
Transition probability to S 3 depends on S 2 but not on how the system got to S 2.
Therefore, P(S1 = 1, S2 = 1, S3 = 3) = 1/3 × 1/2 × 1/4
Similarly the other terms on the RHS of equation (2) can be evaluated.
Therefore, P(AB) = 1/3 × 1/2 × 1/4 + 1/3 × 1/4 × 1/4 + 1/3 × 1/4 × 1/4 = 4/48 = 1/12
Similarly the probs of occurrence of other symbol sequences can be computed.
Therefore,
In general the probability of the source emitting a particular symbol sequence can be computed by summing the product of probabilities in the tree diagram along all the paths that yield the particular sequences of interest.
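The path-summing rule just described is easy to mechanize: keep a list of partial paths, each carrying its accumulated probability, extend every path by the transitions that emit the next required symbol, and finally add up what remains. A minimal sketch, reusing the TRANSITIONS and INITIAL tables from the sampler above (so the state-2 and state-3 rows remain an assumption):

```python
def sequence_probability(sequence):
    """P(sequence) = sum over all tree paths that emit exactly that symbol sequence."""
    # Each partial path is (current state, probability accumulated so far).
    paths = [(state, p0) for state, p0 in INITIAL.items()]
    for wanted in sequence:
        extended = []
        for state, prob in paths:
            for p, symbol, next_state in TRANSITIONS[state]:
                if symbol == wanted:
                    extended.append((next_state, prob * p))
        paths = extended
    return sum(prob for _, prob in paths)

print(sequence_probability("AB"))  # 1/24 + 1/48 + 1/48 = 4/48 ≈ 0.0833
```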
Illustrative Example:
The source given emits one of three symbols A, B and C.
Tree diagram for the source outputs can be easily drawn as shown.
[Figure: State diagram of a two-state source with initial state probabilities p1 = 1/2 and p2 = 1/2; state 1 emits A with probability 3/4 (remaining in state 1) or C with probability 1/4 (moving to state 2), and state 2 emits B with probability 3/4 (remaining in state 2) or C with probability 1/4 (moving to state 1).]

Messages of length (1) and their probabilities:
A: 1/2 × 3/4 = 3/8
B: 1/2 × 3/4 = 3/8
C: 1/2 × 1/4 + 1/2 × 1/4 = 1/4
Messages of length (2)

How many such messages are there?
Seven
Which are they?
AA, AC, CB, CC, BB, BC & CA
What are their probabilities?
Message AA: 1/2 × 3/4 × 3/4 = 9/32

Message AC: 1/2 × 3/4 × 1/4 = 3/32
and so on.
Tabulate the various probabilities
No. of states: n ≤ (M)^m; here 4 ≤ M^2, ∴ M = 2,
where m = the number of symbols for which the residual influence lasts (here, a duration of two symbols) and M = the number of letters/symbols in the alphabet.

Say the system is in state 3 at the beginning of a symbol interval; then the last two symbols emitted by the source were B A.

A similar comment applies to the other states.
1.6 Entropy and Information Rate of Markoff Sources
Assume that the probability of being in state i at the beginning of the first symbol interval is the same as the probability of being in state i at the beginning of the second symbol interval, and so on.

The probability of going from state i to state j also does not depend on time. The entropy of state i is defined as the average information content of the symbols emitted from the i-th state:
Hi = Σ (j = 1 to n) pij log2(1/pij) bits/symbol ------ (1)
Entropy of the source is defined as the average of the entropy of each state.
H = Σ (i = 1 to n) Pi Hi ------ (2)
where Pi = the probability that the source is in state i. Using eqn. (1), eqn. (2) becomes
H = Σ (i = 1 to n) Pi Σ (j = 1 to n) pij log2(1/pij) bits/symbol ------ (3)
The average information rate for the source is defined as R = rs · H bits/sec, where rs is the number of state transitions per second, i.e. the symbol rate of the source. The above concepts can be illustrated with an example.
Illustrative Example:
[Figure: State diagram for the illustrative example — the two-state source used earlier, with p1 = 1/2, p2 = 1/2 and transition probabilities 3/4 and 1/4.]
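A short Python sketch of how equations (1)-(3) apply to this example. The two-state structure below is read off the earlier tree diagram (P1 = P2 = 1/2; each state emits its own symbol with probability 3/4 and C with probability 1/4), so treat the numbers as assumptions recovered from that diagram rather than as given data.

```python
import math

# Two-state source: P1 = P2 = 1/2 (assumed from the tree diagram above);
# state 1 emits A (p = 3/4) or C (p = 1/4), state 2 emits B (p = 3/4) or C (p = 1/4).
state_probs = {1: 0.5, 2: 0.5}
transition_probs = {1: [3/4, 1/4], 2: [3/4, 1/4]}

def state_entropy(pij):
    """Eqn. (1): Hi = sum_j pij * log2(1/pij), in bits/symbol."""
    return sum(p * math.log2(1.0 / p) for p in pij if p > 0)

# Eqns. (2)-(3): H = sum_i Pi * Hi
H = sum(state_probs[i] * state_entropy(pij) for i, pij in transition_probs.items())
print(f"H = {H:.3f} bits/symbol")  # about 0.811 bits/symbol; R = rs * H then gives the rate
```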