
**TRANSMISSION OF CONTINUOUS SIGNALS**

Now, let us further illustrate the Hartley-Shannon law for the exchange of bandwidth and signal-to-noise ratio using a continuous signal which is bandlimited to *f_m* Hz.

According to the sampling theorem, the information of a continuous-time signal which is bandlimited to *f_m* Hz is completely specified by 2f_m samples per second. Hence, to transmit the information of such a signal, it is necessary to transmit only these discrete samples.
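As a quick numerical sketch of this claim (assuming NumPy, an illustrative band limit f_m = 100 Hz, and a hypothetical 70 Hz test tone — none of these values come from the text), the 2f_m samples per second are enough to rebuild the signal at any instant by ideal sinc interpolation:

```python
import numpy as np

fm = 100.0                # assumed band limit of the signal (Hz)
fs = 2 * fm               # sampling theorem: 2*fm samples per second suffice
f0 = 70.0                 # hypothetical test tone below fm

t_samp = np.arange(0.0, 1.0, 1.0 / fs)    # one second of sample instants
x_samp = np.cos(2 * np.pi * f0 * t_samp)  # the discrete samples

def reconstruct(t, t_samp, x_samp, fs):
    """Ideal (sinc) interpolation of the continuous signal from its samples."""
    return float(np.sum(x_samp * np.sinc(fs * (t - t_samp))))

t0 = 0.5025               # an instant midway between two sample points
exact = np.cos(2 * np.pi * f0 * t0)
approx = reconstruct(t0, t_samp, x_samp, fs)
```

With a finite record the truncated sinc series has a small tail error, but near the middle of the interval the reconstructed value agrees closely with the exact signal value.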

**Theoretical Aspect**

Now, one important question arises: How much information does each sample contain?

It depends upon how many discrete levels or values the samples may assume. In fact, these samples can assume any value, and hence to transmit such samples we would require pulses capable of assuming an infinite number of levels. Clearly, the information carried by each sample is then infinite bits.

Therefore, the information contained in a continuous bandlimited signal is infinite.


This is true even if the bandwidth *B* is infinite. The noise signal is white noise with a uniform power density spectrum over the entire frequency range. Therefore, as the bandwidth *B* is increased, the noise power *N* also increases, and hence the channel capacity remains finite even if B = ∞.

If η/2 is the two-sided noise power spectral density, then we have

N = ηB

and

C = B log₂(1 + S/ηB)

and

lim(B→∞) C = lim(B→∞) B log₂(1 + S/ηB) = (S/η) log₂ e

The above limit may be found with the help of the following standard expression:

lim(x→0) (1/x) log₂(1 + x) = log₂ e

Therefore, we have

C(B→∞) = (S/η) log₂ e = 1.44 (S/η) bits per second
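The finiteness of C as B → ∞ is easy to check numerically. The sketch below (assumed illustrative values S = 1 W and η = 10⁻³ W/Hz) evaluates C = B log₂(1 + S/ηB) for growing B and compares it with the limit 1.44 S/η:

```python
import math

S = 1.0      # signal power in watts (assumed illustrative value)
eta = 1e-3   # noise power spectral density in W/Hz (assumed)

def capacity(B):
    """Hartley-Shannon capacity with white noise, where N = eta*B."""
    return B * math.log2(1.0 + S / (eta * B))

# the B -> infinity limit: (S/eta) * log2(e) = 1.44 * S/eta
limit = (S / eta) * math.log2(math.e)

for B in (1e3, 1e5, 1e7, 1e9):
    print(f"B = {B:12.0f} Hz  ->  C = {capacity(B):10.2f} bits/s")
print(f"limit 1.44 S/eta       ->  C = {limit:10.2f} bits/s")
```

The capacity grows with B but saturates: widening the band also admits more noise power, so C approaches 1.44 S/η instead of increasing without bound.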

In this light, the improvement in the signal-to-noise ratio in wideband FM and PCM can be properly understood.

In the presence of noise*, the channel capacity is finite. Therefore, it is impossible to transmit the complete information in a bandlimited signal by a physical channel in the presence of noise**.

In the absence of noise (N = 0), the channel capacity is infinite, and hence any desired signal can be transmitted.

It is quite obvious that it is impossible to transmit the complete information contained in a continuous signal unless the transmitted signal power is made infinite.

Because of the presence of noise, there is always a certain amount of uncertainty in the received signal. In fact, the transmission of complete information in a signal would mean a zero amount of uncertainty. Actually, the amount of uncertainty can be made arbitrarily small by increasing the channel capacity***. However, it can never be made zero.

**9.17 UNCERTAINTY IN THE TRANSMISSION PROCESS**

As a matter of fact, uncertainty is introduced in the process of transmission. Therefore, although it is possible to transmit the complete information in a continuous signal at the transmitter end, it is impossible to recover this infinite amount of information at the receiver end. The amount of information that can be recovered per second at the receiver is *C* bits per second, where *C* is the channel capacity.

In place of transmitting all of the information at the transmitter, we can approximate the signal so that its information content is reduced to *C* bits per second, and transmit this approximated signal, which has a finite information content.

Now, it will be possible to recover all of the information that has been transmitted. In fact, this is what happens in a pulse code modulation (PCM) system.

Now, the question arises: How can we approximate a signal so that the approximated signal has a finite information content per second?

In fact, this can be done by a process known as **quantization**.

For illustration, let us consider a continuous signal bandlimited to *f_m* Hz, as shown in figure 9.14. Now, to transmit the information in this signal, we need to transmit only 2f_m samples per second. Figure 9.14 also shows these samples.

**Figure 9.14**

* That finite value of N.

** Existing over the same band.

*** By increasing the bandwidth and/or increasing the signal power.

The samples can take any value, and to transmit them directly, we require pulses which can assume an infinite number of levels.

Therefore, instead of transmitting the exact values of these pulses, we round off the amplitudes to the nearest one of a finite number of permitted values.

In this example, all of the pulses are approximated to the nearest tenth of a volt.

It may be noted from the figure that each of the pulses transmitted assumes any one of 16 levels, and thus carries an amount of information of log₂ 16 = 4 bits. Also, since there are 2f_m samples per second, the total information content of the approximated signal is 8f_m bits per second.
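The rounding step above can be sketched in a few lines of Python; only the 0.1 V step and the 16 levels come from the example in the text, while the sample amplitudes and the value of f_m are hypothetical:

```python
import math

step = 0.1    # quantization step: nearest tenth of a volt (from the example)
levels = 16   # permitted amplitude levels (from the example)

# hypothetical sample amplitudes (volts) of the bandlimited signal
samples = [0.734, -0.218, 0.502, 0.049, -0.651]

# round each sample to the nearest permitted level
quantized = [round(s / step) * step for s in samples]

# each pulse then carries log2(16) = 4 bits of information
bits_per_sample = math.log2(levels)

# with 2*fm samples per second, the information rate is 2*fm*4 = 8*fm bits/s
fm = 1000.0                           # assumed band limit (Hz)
info_rate = 2 * fm * bits_per_sample  # = 8 * fm
```

Note that the rounding error of each quantized pulse never exceeds half a step (0.05 V), which is the uncertainty the receiver must tolerate.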

_{m}It the channel capacity is greater than or equal to 8 f

*bits per second, all of the information that has been transmitted will be recovered completely without any uncertainty. This means that the received signal will be an exact replica of the approximated signal that was transmitted.*

_{m}It can be shown that if the channel capacity is 8 f

*bits per second, the process of transmission does not introduce an additional degree of uncertainty.*

Now, let us consider that we are using a channel of bandwidth *f_m* Hz to transmit these samples. Then, since the channel capacity required is 8f_m bits per second, the required signal-to-noise power ratio is given by

8f_m = f_m log₂(1 + S/N)

Therefore, 1 + S/N = 256

We have already discussed that the number of levels that can be distinguished at the receiver is √(1 + S/N) = √256 = 16.

It is obvious that in this case, the receiver can distinguish the 16 states without error. Thus, although the process of transmission introduces some noise into the desired signal, the levels are far enough apart to be distinguishable at the receiver. In other words, we can say that a channel of capacity 8f_m bits per second can transmit information at 8f_m bits per second virtually without error.
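Solving the capacity formula for the required signal-to-noise ratio can be checked directly. A minimal sketch (f_m = 1000 Hz is just an assumed value; the 8f_m rate and bandwidth f_m come from the text):

```python
import math

fm = 1000.0          # assumed band limit (Hz)
info_rate = 8 * fm   # bits per second to be transmitted
B = fm               # channel bandwidth equal to fm

# C = B log2(1 + S/N)  =>  S/N = 2**(C/B) - 1
snr = 2 ** (info_rate / B) - 1

# number of levels distinguishable at the receiver: sqrt(1 + S/N)
distinguishable = math.sqrt(1 + snr)

print(f"required S/N = {snr:.0f}")                       # 255, i.e. 1 + S/N = 256
print(f"distinguishable levels = {distinguishable:.0f}")  # 16
```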

_{m}**9.18 EXCHANGE OF BANDWIDTH FOR SIGNAL-TO-NOISE RATIO**

As a matter of fact, a given signal can be transmitted with a given amount of uncertainty by a channel of finite capacity. We have already discussed that a given channel capacity may be obtained by any number of combinations of bandwidth and signal power. Actually, it is possible to exchange one for the other. Now, let us discuss how such an exchange can be effected.

Again, let us consider the transmission of the signal x(t) shown in figure 9.14. We have already observed that if an uncertainty of 0.1 volt is tolerated, the information content of the signal is 8f_m bits per second. Now, let us show that this information can be transmitted by various combinations of bandwidth and signal power.

One possible way of transmission is to send the 2f_m samples per second directly. Each of the samples can assume any of the 16 states. In this case, we must have a signal-to-noise ratio which allows us to distinguish 16 states.

Clearly, √(1 + S/N) = 16, i.e., S/N = 255.

Moreover, in order to transmit 2f_m pulses per second, we require a channel of bandwidth *f_m* Hz. Therefore, the required channel capacity *C* is expressed as

C = f_m log₂(1 + S/N)

or

C = f_m log₂(16²) = 8f_m bits per second

Hence, the channel capacity *C* is exactly equal to the amount of information per second in the signal x(t).

In another method of transmission, we may transmit the samples in figure 9.14 by quaternary pulses (pulses that can assume four states). Thus, it is obvious that we require a group of two quaternary pulses to transmit each sample, since a pair of quaternary pulses can together represent 4 × 4 = 16 states.

In this case, the signal-to-noise ratio required at the receiver to distinguish pulses that assume four distinct states is given by √(1 + S/N) = 4, i.e., S/N = 15.

Hence, in this mode of transmission, the required signal power is reduced. But now we have to transmit twice as many pulses per second, i.e., 4f_m pulses per second. Hence, the required bandwidth is 2f_m Hz.

In this case, the channel capacity *C* will be

C = 2f_m log₂(1 + S/N) = 2f_m log₂(4²) = 8f_m bits per second
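The two transmission schemes can be compared side by side. The sketch below (f_m again an assumed illustrative value) confirms that both trade-offs yield the same capacity of 8f_m bits per second:

```python
import math

fm = 1000.0   # assumed band limit of the signal (Hz)

# Scheme 1: 16-level pulses, bandwidth fm, requires 1 + S/N = 16**2 = 256
C1 = fm * math.log2(16 ** 2)

# Scheme 2: quaternary (4-level) pulses, bandwidth 2*fm,
# requires only 1 + S/N = 4**2 = 16, i.e. S/N = 15 instead of 255
C2 = (2 * fm) * math.log2(4 ** 2)

print(C1, C2)   # both equal 8*fm bits per second
```

Doubling the bandwidth lets the pulses carry fewer levels each, so a far smaller signal power (S/N = 15 versus 255) delivers the same information rate.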