The Kraft Inequality in Digital Communication, Instantaneous Codes, and Optimal Codes

**9.19.3.6. Instantaneous Codes**

A uniquely decodable code is called an instantaneous code if the end of any codeword is recognizable without examining subsequent code symbols. Instantaneous codes have the property, mentioned previously, that no codeword is a prefix of another codeword. For this reason, prefix-free codes are sometimes known as instantaneous codes.

**9.19.3.7. Optimal Codes**

A code is said to be optimal if it is instantaneous and has the minimum average length *L* for a given source with a given probability assignment for the source symbols.

**9.19.4. The Kraft Inequality**

Let X be a DMS with alphabet {x_{i}}, i = 1, 2, …, m. Assume that the length of the binary codeword assigned to x_{i} is n_{i}.

A necessary and sufficient condition for the existence of an instantaneous binary code is

K = Σ_{i=1}^{m} 2^{-n_{i}} ≤ 1     … (9.56)

which is known as the Kraft inequality.

NOTE: The Kraft inequality assures us of the existence of an instantaneously decodable code with codeword lengths that satisfy the inequality. However, it does not show us how to obtain these codewords, nor does it say that any code whose codeword lengths satisfy the inequality is automatically uniquely decodable.
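As a quick numerical sketch, the Kraft test for a set of binary codeword lengths can be checked directly. The helper names below are our own, not from the text:

```python
# Hypothetical helpers (names are ours): check the Kraft inequality
# K = sum of 2**(-n_i) <= 1 for a list of binary codeword lengths n_i.
def kraft_sum(lengths):
    """Return K = sum over i of 2**(-n_i)."""
    return sum(2.0 ** -n for n in lengths)

def satisfies_kraft(lengths):
    """True if an instantaneous binary code with these lengths can exist."""
    return kraft_sum(lengths) <= 1.0

print(satisfies_kraft([2, 2, 2, 2]))   # K = 1, prints True
print(satisfies_kraft([1, 2, 2, 3]))   # K = 1.125, prints False
```

Note that a `True` result only guarantees that some instantaneous code with those lengths exists, in line with the NOTE above.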

**EXAMPLE 9.37. Given a DMS X with two symbols x_{1} and x_{2}, and P(x_{1}) = 0.9, P(x_{2}) = 0.1. Symbols x_{1} and x_{2} are encoded as follows (Table 9.3):**

**Table 9.3.**

| x_{i} | P(x_{i}) | Code |
|---|---|---|
| x_{1} | 0.9 | 0 |
| x_{2} | 0.1 | 1 |

**Find the efficiency η and the redundancy γ of this code.**

**Solution:** We know that the average code length *L* per symbol is

L = Σ_{i=1}^{2} P(x_{i}) n_{i} = 0.9(1) + 0.1(1) = 1 b/symbol

We know that

H(X) = -Σ_{i=1}^{2} P(x_{i}) log_{2} P(x_{i})

= -0.9 log_{2} 0.9 - 0.1 log_{2} 0.1 = 0.469 b/symbol

Also, the code efficiency η is

η = H(X)/L = 0.469/1 = 0.469 = 46.9%

And the code redundancy γ is given by

γ = 1 - η = 0.531 = 53.1% **Ans.**
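The arithmetic above can be reproduced in a few lines. This is only a sketch of the Example 9.37 computation, with the probabilities and unit codeword lengths taken from Table 9.3:

```python
import math

p = [0.9, 0.1]   # P(x1), P(x2) from Table 9.3
n = [1, 1]       # lengths of the codewords "0" and "1"

L = sum(pi * ni for pi, ni in zip(p, n))     # average code length = 1
H = -sum(pi * math.log2(pi) for pi in p)     # source entropy ~ 0.469
eta = H / L                                  # efficiency ~ 0.469
gamma = 1 - eta                              # redundancy ~ 0.531
print(round(H, 3), round(eta, 3), round(gamma, 3))   # 0.469 0.469 0.531
```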

**EXAMPLE 9.38. The second-order extension of a DMS X, denoted by X^{2}, is formed by taking the source symbols two at a time. The coding of this extension is shown in Table 9.4. Find the efficiency η and the redundancy γ of this extension code.**

**Solution:**

**Table 9.4.**

| a_{i} | P(a_{i}) | Code |
|---|---|---|
| a_{1} = x_{1}x_{1} | 0.81 | 0 |
| a_{2} = x_{1}x_{2} | 0.09 | 10 |
| a_{3} = x_{2}x_{1} | 0.09 | 110 |
| a_{4} = x_{2}x_{2} | 0.01 | 111 |

We have L = Σ_{i=1}^{4} P(a_{i}) n_{i} = 0.81(1) + 0.09(2) + 0.09(3) + 0.01(3) = 1.29 b/symbol

The entropy of the second-order extension of X, H(X^{2}), is given by

H(X^{2}) = -Σ_{i=1}^{4} P(a_{i}) log_{2} P(a_{i})

= -0.81 log_{2} 0.81 - 0.09 log_{2} 0.09 - 0.09 log_{2} 0.09 - 0.01 log_{2} 0.01

or H(X^{2}) = 0.938 b/symbol

Therefore, the code efficiency η is

η = H(X^{2})/L = 0.938/1.29 = 0.727 = 72.7%

Also, the code redundancy γ will be

γ = 1 - η = 0.273 = 27.3% **Ans.**
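The same sketch extends to the second-order computation, with the probabilities and code lengths taken from Table 9.4:

```python
import math

p = [0.81, 0.09, 0.09, 0.01]   # P(a_i) from Table 9.4
n = [1, 2, 3, 3]               # lengths of the codewords 0, 10, 110, 111

L = sum(pi * ni for pi, ni in zip(p, n))      # 1.29 b/symbol
H2 = -sum(pi * math.log2(pi) for pi in p)     # H(X^2) ~ 0.938 b/symbol
eta = H2 / L                                  # ~ 0.727
print(round(L, 2), round(H2, 3), round(eta, 3))
```

Extending the source raises the efficiency from 46.9% to 72.7% because the codeword lengths better match the symbol probabilities.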

**EXAMPLE 9.39. Consider a DMS X with symbols x_{i}, i = 1, 2, 3, 4. Table 9.5 lists four possible binary codes.**

**Table 9.5.**

| x_{i} | Code A | Code B | Code C | Code D |
|---|---|---|---|---|
| x_{1} | 00 | 0 | 0 | 0 |
| x_{2} | 01 | 00 | 10 | 100 |
| x_{3} | 10 | 11 | 11 | 110 |
| x_{4} | 11 | 110 | 110 | 111 |


**(i) Show that all the codes except code B satisfy the Kraft inequality.**

**(ii) Show that codes A and D are uniquely decodable but codes B and C are not uniquely decodable.**

**Solution:** We have

(i) For code A, n_{1} = n_{2} = n_{3} = n_{4} = 2

therefore, K = Σ_{i=1}^{4} 2^{-n_{i}} = 4(2^{-2}) = 1 ≤ 1

For code B, n_{1} = 1, n_{2} = n_{3} = 2, n_{4} = 3

therefore, K = 2^{-1} + 2^{-2} + 2^{-2} + 2^{-3} = 1.125 > 1

For code C, n_{1} = 1, n_{2} = 2, n_{3} = n_{4} = 3

therefore, K = 2^{-1} + 2^{-2} + 2^{-3} + 2^{-3} = 1 ≤ 1

For code D, n_{1} = 1, n_{2} = n_{3} = n_{4} = 3

therefore, K = 2^{-1} + 2^{-3} + 2^{-3} + 2^{-3} = 0.875 ≤ 1

Thus, all codes except code B satisfy the Kraft inequality.

(ii) Codes A and D are prefix-free codes; they are therefore uniquely decodable. Code B does not satisfy the Kraft inequality, and it is not uniquely decodable. Although code C does satisfy the Kraft inequality, it is not uniquely decodable. This can be seen from the following example:

Consider the binary sequence 0110110. This sequence may correspond to the source sequence x_{1}x_{3}x_{1}x_{4} or to x_{1}x_{4}x_{4}.
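The ambiguity of code C can be demonstrated by brute force. The sketch below (our own helper, not from the text) enumerates every way a bit string can be split into codewords of code C:

```python
# Code C from Table 9.5
code_c = {"x1": "0", "x2": "10", "x3": "11", "x4": "110"}

def parses(bits, code):
    """Return all symbol sequences whose concatenated codewords equal bits."""
    if not bits:
        return [[]]
    result = []
    for sym, word in code.items():
        if bits.startswith(word):
            for rest in parses(bits[len(word):], code):
                result.append([sym] + rest)
    return result

# More than one parse exists, so code C is not uniquely decodable.
print(parses("0110110", code_c))
```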

**EXAMPLE 9.40. Verify the following expression:**

**L ≥ H(X)**

where *L* is the average codeword length per symbol and *H*(X) is the source entropy. *(GATE Examination, 2000)*

**Solution:** We know that

Σ_{i=1}^{m} P_{i} log_{2} (Q_{i}/P_{i}) ≤ 0

where the equality holds only if Q_{i} = P_{i}.

Let Q_{i} = 2^{-n_{i}}/K

where K = Σ_{i=1}^{m} 2^{-n_{i}}

Now, we have

Σ_{i=1}^{m} Q_{i} = (1/K) Σ_{i=1}^{m} 2^{-n_{i}} = 1

and Σ_{i=1}^{m} P_{i} log_{2} (Q_{i}/P_{i}) = Σ_{i=1}^{m} P_{i} log_{2} Q_{i} - Σ_{i=1}^{m} P_{i} log_{2} P_{i}

= -Σ_{i=1}^{m} P_{i} n_{i} - log_{2} K + H(X)

= H(X) - L - log_{2} K ≤ 0

From the Kraft inequality (9.56), we have K ≤ 1, so that

log_{2} K ≤ 0

Thus, H(X) – L ≤ log_{2} K ≤ 0

or L ≥ H(X)

The equality holds when K = 1 and P_{i} = Q_{i}. **Hence proved.**

**EXAMPLE 9.41. Let X be a DMS with symbols x_{i} and corresponding probabilities P(x_{i}) = P_{i}, i = 1, 2, …, m. Show that for the optimum source encoding, we require that**

**K = Σ_{i=1}^{m} 2^{-n_{i}} = 1     … (i)**

**and n_{i} = log_{2} (1/P_{i}) = I_{i}     … (ii)**

**where n_{i} is the length of the codeword corresponding to x_{i} and I_{i} is the information content of x_{i}.**

**Solution:** From the result of the last problem, the optimum source encoding with L = H(X) requires K = 1 and P_{i} = Q_{i}. Thus, we have

K = Σ_{i=1}^{m} 2^{-n_{i}} = 1

and P_{i} = Q_{i} = 2^{-n_{i}}

Hence, n_{i} = -log_{2} P_{i} = log_{2} (1/P_{i}) = I_{i}

**NOTE:** Equation (ii) implies the following commonsense principle: symbols which occur with high probability should be assigned shorter codewords than symbols which occur with low probability.

**Hence proved.**
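As a numerical illustration of equation (ii), assume a dyadic source, i.e. one whose probabilities are powers of 1/2 (our own illustrative choice, not from the text). The optimum lengths then come out as integers, K = 1, and L equals H(X):

```python
import math

# Illustrative dyadic source (our own example, not from the text)
p = [0.5, 0.25, 0.125, 0.125]

n = [-math.log2(pi) for pi in p]            # n_i = I_i = [1, 2, 3, 3]
K = sum(2.0 ** -ni for ni in n)             # Kraft sum, exactly 1
L = sum(pi * ni for pi, ni in zip(p, n))    # average codeword length
H = -sum(pi * math.log2(pi) for pi in p)    # source entropy
print(n, K, L == H)                         # lengths, 1.0, True
```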

**EXAMPLE 9.42. Consider a DMS X with symbols x_{i} and corresponding probabilities P(x_{i}) = P_{i}, i = 1, 2, …, m. Let n_{i} be the length of the codeword of x_{i} such that**

**log_{2} (1/P_{i}) ≤ n_{i} ≤ log_{2} (1/P_{i}) + 1     … (i)**

**Show that this relationship satisfies the Kraft inequality (9.56), and find the bound on K in equation (9.56).**

**Solution:** Equation (i) can be rewritten as

-log_{2} P_{i} ≤ n_{i} ≤ -log_{2} P_{i} + 1     … (ii)

or log_{2} P_{i} ≥ -n_{i} ≥ log_{2} P_{i} - 1

Then 2^{log_{2} P_{i}} ≥ 2^{-n_{i}} ≥ 2^{log_{2} P_{i}} 2^{-1}

or P_{i} ≥ 2^{-n_{i}} ≥ P_{i}/2

Thus, summing over i,

Σ_{i=1}^{m} P_{i} ≥ Σ_{i=1}^{m} 2^{-n_{i}} ≥ (1/2) Σ_{i=1}^{m} P_{i}

or 1 ≥ K ≥ 1/2

which indicates that the Kraft inequality (9.56) is satisfied, and the bound on K will be

1/2 ≤ K ≤ 1

**Hence proved.**
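One length assignment satisfying equation (i) is n_{i} = ⌈log_{2} (1/P_{i})⌉ (the Shannon code lengths). A sketch of the bound, using illustrative probabilities of our own choosing:

```python
import math

p = [0.81, 0.09, 0.09, 0.01]                    # illustrative probabilities
n = [math.ceil(-math.log2(pi)) for pi in p]     # Shannon code lengths
K = sum(2.0 ** -ni for ni in n)                 # Kraft sum
print(n, K)                                     # K falls in [1/2, 1]
assert 0.5 <= K <= 1.0
```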

**EXAMPLE 9.43. Consider a DMS X with symbols x_{i} and corresponding probabilities P(x_{i}) = P_{i}, i = 1, 2, …, m. Show that a code constructed in agreement with equation (i) in the last problem will satisfy the following relation:**

**H(X) ≤ L ≤ H(X) + 1**

**where H(X) is the source entropy and L is the average codeword length.**

**Solution:** Multiplying equation (ii) in the last problem by P_{i} and summing over i yields

-Σ_{i=1}^{m} P_{i} log_{2} P_{i} ≤ Σ_{i=1}^{m} P_{i} n_{i} ≤ -Σ_{i=1}^{m} P_{i} log_{2} P_{i} + Σ_{i=1}^{m} P_{i}

Now, -Σ_{i=1}^{m} P_{i} log_{2} P_{i} = H(X), Σ_{i=1}^{m} P_{i} n_{i} = L, and Σ_{i=1}^{m} P_{i} = 1

Thus, the above inequality reduces to

H(X) ≤ L ≤ H(X) + 1

**Hence proved.**
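The bound can be spot-checked with the Shannon lengths n_{i} = ⌈log_{2} (1/P_{i})⌉, which satisfy equation (i) of the last problem; the probabilities below are an illustrative choice, not from the text:

```python
import math

p = [0.4, 0.3, 0.2, 0.1]                        # illustrative probabilities
n = [math.ceil(-math.log2(pi)) for pi in p]     # lengths satisfying (i)
L = sum(pi * ni for pi, ni in zip(p, n))        # average codeword length
H = -sum(pi * math.log2(pi) for pi in p)        # source entropy
assert H <= L <= H + 1                          # the relation just proved
print(round(H, 3), round(L, 3), round(H + 1, 3))
```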