Error Control Coding

Many communication channels suffer from noise, interference, or distortion due to hardware imperfections or physical limitations. The goal of error control coding is to encode information in such a way that even if the channel (or storage medium) introduces errors, the receiver can correct them and recover the original transmitted information.

1948: Dawn of the Information Age

It is arguable that the information age had its beginnings in the late 1940s. Although they built upon much earlier mathematical and scientific work, three discoveries in particular were about to change forever the way the world communicated. Not only were the announcements of these discoveries almost simultaneous, they all stemmed from a single commercial research laboratory.

C. Shannon
Claude Shannon (1916-2001), a mathematician then working at Bell Laboratories, is generally regarded as the father of information theory. Prior to Shannon, it was believed that achieving higher reliability in communications over noisy channels required ever-increasing transmit power. In his paper "A Mathematical Theory of Communication" (Bell System Technical Journal, Vol. 27, pp. 379-423 and 623-656, July and October 1948), Shannon introduced a fundamental measure of information, entropy, and coined the term "bit", referring to the amount of information in a single binary digit. He went on to show that every noisy channel has associated with it an information capacity C bits/second, below which it is possible to encode messages (by representing them as long binary strings) such that they can be received and decoded with arbitrarily low probability of error. His proof of the capacity theorem, however, relied on a mathematical "trick": he showed that for transmission rates below C, the error probability averaged over all randomly selected codes can be made as small as desired. While this implies the existence of good codes, it left open the problem of designing such codes. Today, thanks to reliable communications, you can read Shannon's paper online. Claude Shannon is today honored by the "Shannon Lectures", delivered by recipients of the "Shannon Award", which is awarded each year by the IEEE Information Theory Society to an outstanding researcher in the field of information theory.
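To make the notion of capacity concrete: for the binary symmetric channel, which flips each transmitted bit independently with probability p, Shannon's capacity has the well-known closed form C = 1 - H(p), where H is the binary entropy function. The following sketch (function names are illustrative, not from Shannon's paper) computes it:

```python
import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with the convention H(0) = H(1) = 0
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of a binary symmetric channel with crossover probability p,
    # in bits per channel use: C = 1 - H(p)
    return 1.0 - binary_entropy(p)
```

A noiseless channel (p = 0) has capacity 1 bit per use; a channel that flips bits half the time (p = 0.5) carries no information at all, since its output is independent of its input.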

R. Hamming
Shannon did however refer to a practical code in his paper. This code was found by his Bell Labs colleague, Richard Hamming (1915-1998). You may have already heard Hamming's name in connection with the Hamming window used in Fourier analysis. Hamming became interested in the idea of codes that could correct errors when he was working on an early "relay computer" at Bell Labs. Upon detecting an error, the machine would sound an alarm, and the operator would try to fix it. Unfortunately for Hamming, but fortunately for error control coding, he only had access to the computer on weekends, when there was no operator to fix errors. He decided that he needed a way of enabling the computer to correct errors automatically. Hamming constructed a code which adds three "parity" bits to four information bits. The parity bits are chosen based on the information bits in such a way that the resulting codeword must satisfy a set of linear equations. If any single error occurs (a bit is flipped), the codeword no longer satisfies the equations, and furthermore, it is possible to tell which position is in error. The Hamming code was the first result in the field that is now known as coding theory, the object of which is to find good codes that are also easy to implement. Although the Hamming code was referred to by Shannon in 1948, patent considerations delayed its separate publication until 1950 ("Error Detecting and Error Correcting Codes", Bell System Technical Journal, Vol. 29, pp. 147-160, Apr. 1950). In 1988 Hamming received the first IEEE Richard W. Hamming Medal "For exceptional contributions to information sciences and systems".
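Hamming's construction can be sketched in a few lines. Below is the standard (7,4) Hamming code, with the parity bits placed at positions 1, 2 and 4 so that the three parity-check equations, read as a binary number (the "syndrome"), directly give the position of any single bit error. The function names are my own; only the code itself is Hamming's.

```python
def hamming74_encode(d):
    """Encode 4 data bits into a 7-bit Hamming codeword (positions 1..7)."""
    c = [0] * 8                      # index 0 unused; positions 1..7 hold the bits
    c[3], c[5], c[6], c[7] = d       # data bits go in positions 3, 5, 6, 7
    c[1] = c[3] ^ c[5] ^ c[7]        # parity over positions with bit 1 set
    c[2] = c[3] ^ c[6] ^ c[7]        # parity over positions with bit 2 set
    c[4] = c[5] ^ c[6] ^ c[7]        # parity over positions with bit 4 set
    return c[1:]

def hamming74_decode(r):
    """Correct any single bit error; return (data bits, syndrome)."""
    c = [0] + list(r)
    s = ((c[1] ^ c[3] ^ c[5] ^ c[7]) * 1 +
         (c[2] ^ c[3] ^ c[6] ^ c[7]) * 2 +
         (c[4] ^ c[5] ^ c[6] ^ c[7]) * 4)
    if s:
        c[s] ^= 1                    # the syndrome is the error position
    return [c[3], c[5], c[6], c[7]], s
```

Encoding [1, 0, 1, 1], flipping any one of the seven codeword bits, and decoding recovers the original four data bits, with the syndrome pointing at the flipped position.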

Transistor Inventors
When it comes to implementation, one cannot overlook the impact of the transistor on the information age. Dr. John Bardeen (left), Dr. William Shockley (middle) and Dr. Walter Brattain (right), all members of Bell Laboratories, first observed the transistor effect on December 15, 1947. The invention was announced June 30, 1948. The three inventors were awarded the Nobel Prize in physics in 1956. You can find out about the first fifty years of the transistor in Lucent Technologies' retrospective. Without the transistor and other solid-state devices, computing and communications would have languished inside house-sized boxes filled with vacuum tubes or relays.

Fifty Years of Progress

A. Barbulescu and Turbo Codec
Australian researcher A. Barbulescu with one of the world's first implementations of a Turbo Codec
Loosely speaking, the history of error control coding can be divided into "pre-turbo code" and "post-turbo code" eras. Turbo codes and their ingenious iterative decoder, invented in 1993 by French researchers Claude Berrou and Alain Glavieux, revolutionized the area. Prior to the invention of these codes, no one really knew how to get close to the theoretical performance limits promised by Shannon. You can find out about the fascinating series of events that led to the discovery of the turbo code in a tutorial paper written by Claude Berrou and Alain Glavieux.

Before turbo codes, coding research concentrated on two main areas: algebraic codes and trellis codes. Algebraic codes, such as the celebrated Reed-Solomon and BCH codes, build algebraic structure into the code so that it can be decoded using computationally efficient algorithms for solving systems of equations. Trellis codes are easily decoded using trellis-based algorithms such as the Viterbi algorithm.

Today, thanks to the discovery of the "turbo principle", we can get very close to Shannon's capacity limits for a range of different channels. In practical terms, this means the most efficient use of bandwidth and power, which is very important for portable wireless devices.

Somewhat ironically, another class of codes that turns out to approach Shannon's bound, namely low-density parity-check (LDPC) codes, was actually discovered by Robert Gallager in the 1960s. Like turbo codes, LDPC codes have an iterative decoder. The true power of these codes was however overlooked, due in part to the lack of sufficient computing power for their implementation at the time.

Today, the codes coming closest to the Shannon bound are LDPC codes, and much work has recently been focussed on their construction and analysis. The IEEE Information Theory Society recognised major contributions in this area through the award of the 2002 paper prize jointly to T.J. Richardson and R.L. Urbanke "The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding" and M.G. Luby, M. Mitzenmacher, M.A. Shokrollahi and D.A. Spielman, "Improved Low-Density Parity-Check Codes Using Irregular Graphs". Both of these papers appeared in the February 2001 issue of the IEEE Transactions on Information Theory.

The timeline below shows some of the more important developments in the theory and application of error control coding. You can click on the various technologies in the lower portion of the figure to find out about the types of codes used.

Error Control Coding Timeline.
Click on technologies in the lower section to find out more.

Key Research Challenges

Emerging Media

The discovery of turbo codes and related techniques has resulted in nearly optimal codes for certain classes of channels. There are however still many channels of interest for which the gap to Shannon's bound is yet to be closed.

Of particular interest are the hostile channels experienced by the transmission of high bandwidth data from highly mobile terminals (think broadband internet access to trains for example).

Other research challenges are emerging from new transmission media, such as quantum channels, or non-linear optical channels, which have very different properties to the channels considered by classical coding theory.

Network Coding

The current paradigm for digital communications networks is to separate the various functions of the network. For example, the physical layer (perhaps using error control coding) provides an "error free channel" to the next layer in the network. However, information theory indicates that a more coordinated system design may be more efficient. The area of network coding is an exciting field that is only just beginning to see activity.
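The core idea can be illustrated with the classic "butterfly network" example (an illustration of network coding generally, not taken from this page): two receivers each want two packets, but a shared bottleneck link can carry only one packet per use. Instead of forwarding one packet and starving the other receiver, the relay transmits the XOR of both, and each receiver recovers the packet it is missing from the one it already has.

```python
# Two source packets, modelled here as bit patterns.
a = 0b1011
b = 0b0110

# The bottleneck link carries a single coded packet instead of a or b alone.
coded = a ^ b

# Receiver 1 already has a (via its direct link) and hears the coded packet:
b_recovered = a ^ coded

# Receiver 2 already has b and hears the same coded packet:
a_recovered = b ^ coded
```

With plain routing, the bottleneck would need two uses to serve both receivers; with coding, one use suffices, which is the kind of gain a cross-layer, information-theoretic design can capture.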

More Information

Australian Error Control Coding Researchers

Barbulescu, Sorin Adrian
Bhaskaran Pillai, Sibi Raj
Clarkson, I. Vaughan L.
Davis, Linda M
Grant, Alex J
Ho, Mark S C
Jayalath, Dhammika
Johnson, Sarah J
Kellett, Christopher M
Kind, Adriel P.
Kuijper, Margreta
Land, Ingmar R
Letzepis, Nicholas Alexander
Li, Yonghui
Libman, Lavy
Naguleswaran, Sanjeev
Ngo, Nghia Hieu
Ning, Jun
Pietrobon, Steven S
Rasmussen, Lars K
Reed, Mark C
Reid, Aaron Barry
Rice, Mark
Rueffer, Bjoern S
Sadeghi, Parastoo
Seberry, Jennifer
Shi, Zhenning
Trajkovic, Vladimir
Vucetic, Branka S
Weller, Steven R
Xiang, Wei
Yuan, Jinhong
Zhang, Weimin
Zhou, Zhendong

Note: You can search for ACoRN Members using the Member Search facility