Partha P. Mitra, Jason B. Stark, and Andrew G. Green
Through widespread usage, the word “information” has come to have a tangible, concrete feel to it. Today we speak glibly of information flow and of information superhighways. Information is an abstraction, nevertheless, and it speaks to the insights of Claude Shannon1 that we have a formal theory that allows us to describe in fairly precise terms the flow of information through a communication channel. Although the basic formalism is quite general, information theory was initially developed in the context of telephonic communication through copper cable, and dealt mostly with a linear propagation channel, as exemplified by Shannon’s famous formula for the capacity of a channel (Table 1) with additive white Gaussian noise (AWGN). Such a channel is defined by a linear relationship between the output time series Y(t) and input X(t) through the relation Y(t) = X(t) + N(t), where N(t) is a Gaussian noise process with a flat power spectrum. All three processes are assumed to be bandlimited with a bandwidth W. The capacity of this channel is given by C = W log2(1 + S/N), where S and N are the signal and noise powers, respectively.
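The AWGN capacity formula can be evaluated directly. Below is a minimal sketch; the function name and the example channel parameters are illustrative choices, not values from the article:

```python
import math

def awgn_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon capacity C = W log2(1 + S/N) of an AWGN channel.

    bandwidth_hz: channel bandwidth W in hertz.
    snr: ratio of signal power S to noise power N (dimensionless).
    Returns the capacity in bits per second.
    """
    return bandwidth_hz * math.log2(1.0 + snr)

# Example (illustrative numbers): a 1 MHz channel with S/N = 3,
# so log2(1 + 3) = 2 bits per second per hertz of bandwidth.
print(awgn_capacity(1e6, 3.0))  # → 2000000.0
```

Note that capacity grows only logarithmically with signal power: each doubling of (1 + S/N) adds a fixed W bits per second, which is why increasing bandwidth is usually a cheaper route to capacity than increasing power.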