GBPPR Tech Bulletin #10 - Communication Channel Capacity


Excerpt from Signals - The Science of Telecommunications

Communication Channel Capacity

In his mathematical theory of communication, Claude Shannon showed that even a noisy communication channel has a channel capacity measured in bits per second.  Consider a communication channel of bandwidth B Hz.  Suppose that the power received over the channel consists of a signal power S and a noise power N added during transmission and amplification.  The channel capacity C in bits per second is:

C = B log2 (1 + S / N)

The equation shows the advantage of using a broad bandwidth to transmit messages.
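
As a rough illustration, the capacity formula can be evaluated directly.  The following Python sketch uses assumed example values (a 3000 Hz channel with a 30 dB signal-to-noise ratio); they are not taken from the text.

import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    # Shannon capacity C = B * log2(1 + S/N), in bits per second
    return bandwidth_hz * math.log2(1.0 + signal_power / noise_power)

# Assumed example: 3000 Hz bandwidth, S/N = 1000 (30 dB)
print(channel_capacity(3000.0, 1000.0, 1.0))  # about 29,900 bits per second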

Noise power is proportional to bandwidth.  For thermal, or Johnson, noise, the noise power and bandwidth are related by the equation:

N = kTB

where k is Boltzmann's constant (1.38 * 10^-23 joules per kelvin) and T is the noise temperature in kelvins.
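
As a quick numerical sketch (the temperature and bandwidth below are assumed values, not from the text):

BOLTZMANN = 1.38e-23  # Boltzmann's constant, joules per kelvin

def thermal_noise_power(temperature_k, bandwidth_hz):
    # Johnson (thermal) noise power N = k * T * B, in watts
    return BOLTZMANN * temperature_k * bandwidth_hz

# Assumed example: room temperature (293 K) and a 1 MHz bandwidth
print(thermal_noise_power(293.0, 1.0e6))  # about 4.0e-15 watt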

If N is all the noise that is added to the signal:

C = B log2 (1 + S / (kTB))

Note that in this equation the bandwidth B appears twice.  For a given signal power, as B increases the channel capacity increases.  Thus, if possible, it is better to use more rather than less bandwidth in transmitting a signal with a given power.
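
A short sketch makes the point concrete; the signal power and temperature below are assumed purely for illustration.  Holding S and T fixed and widening B raises the capacity, though with diminishing returns.

import math

BOLTZMANN = 1.38e-23  # joules per kelvin

def capacity(bandwidth_hz, signal_power_w, temperature_k):
    # C = B * log2(1 + S / (k*T*B)), in bits per second
    noise_power = BOLTZMANN * temperature_k * bandwidth_hz
    return bandwidth_hz * math.log2(1.0 + signal_power_w / noise_power)

S, T = 1.0e-12, 293.0  # assumed: 1 picowatt received at room temperature
for B in (1.0e3, 1.0e6, 1.0e9):
    print(f"B = {B:.0e} Hz -> C = {capacity(B, S, T):.3e} bit/s")
# prints roughly 1.8e4, 8.0e6, and 3.2e8 bit/s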

As the bandwidth B is made larger and larger, the channel capacity approaches the limiting value:

C = S / (kT ln 2)

Because the energy needed per bit is the signal power S divided by the capacity C, this equation shows that the energy necessary to transmit one bit of information can never be less than kT ln 2 = 0.693 kT joule.
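
Continuing the sketch above (same assumed S and T), pushing the bandwidth still higher shows the capacity flattening out at the limiting value S / (kT ln 2):

import math

BOLTZMANN = 1.38e-23  # joules per kelvin
S, T = 1.0e-12, 293.0  # same assumed values as in the earlier sketch

def capacity(bandwidth_hz):
    noise_power = BOLTZMANN * T * bandwidth_hz
    return bandwidth_hz * math.log2(1.0 + S / noise_power)

for B in (1.0e9, 1.0e12, 1.0e15):
    print(f"B = {B:.0e} Hz -> C = {capacity(B):.3e} bit/s")

limit = S / (BOLTZMANN * T * math.log(2))
print(f"limiting value S / (kT ln 2) = {limit:.3e} bit/s")
# the energy per bit at the limit, S / limit, is kT ln 2: about 2.8e-21 joule at 293 K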

That limit is approached most closely in painstakingly designed communication systems for interplanetary missions.  To gain an idea of the energy involved, consider that the electricity used in lighting homes is measured by the kilowatt-hour.  A kilowatt-hour is equal to 3.6 * 10^6 joules.  Boltzmann's constant is equal to 1.38 * 10^-23 joules per kelvin.  Room temperature is about 293 K.  At this temperature, kT ln 2 = 2.8 * 10^-21 joule, or about 7.8 * 10^-28 kilowatt-hour per bit (a decimal point followed by 27 zeros followed by 78).  In principle, it doesn't take much power to transmit a bit of information.  In practice it takes somewhat more - a few times more in well designed systems.
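
The arithmetic in the preceding paragraph can be checked directly (293 K, as in the text):

import math

BOLTZMANN = 1.38e-23      # joules per kelvin
JOULES_PER_KWH = 3.6e6    # joules in one kilowatt-hour
T = 293.0                 # room temperature, kelvins

energy_per_bit_j = BOLTZMANN * T * math.log(2)   # kT ln 2, joules per bit
print(energy_per_bit_j)                          # about 2.8e-21 joule per bit
print(energy_per_bit_j / JOULES_PER_KWH)         # about 7.8e-28 kilowatt-hour per bit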

The foregoing calculations are "classical"; they were derived without taking into account the quantum nature of electromagnetic radiation.  I was surprised to find, some years ago, that kT ln 2 joules per bit is the correct quantum mechanical result, but it is almost impossible to approach closely in signaling with light.  Still, workers at Caltech's Jet Propulsion Laboratory have attained signaling rates of several bits per photon.

Notes

log2 is the logarithm to the base 2, and for any number X:

log2(X) = log10(X) / log10(2)
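
For example, a trivial check in Python:

import math

print(math.log10(8) / math.log10(2))  # 3.0, since 2**3 = 8
print(math.log2(8))                   # same result using a base-2 logarithm directly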
