# Error Control Coding

Many communication channels suffer from noise, interference, or distortion arising from hardware imperfections or physical limitations. The goal of error control coding is to encode information in such a way that even if the channel (or storage medium) introduces errors, the receiver can correct them and recover the original transmitted information.

## 1948: Dawn of the Information Age

It is arguable that the information age had its beginnings in the late 1940s. Although they built upon much earlier mathematical and scientific work, three discoveries in particular were about to change forever the way the world communicated. Not only were the announcements of these discoveries almost simultaneous, they all stemmed from a single commercial research laboratory: Bell Laboratories.

*C. Shannon*

In "A Mathematical Theory of Communication" (*Bell System Technical Journal*, Vol. 27, pp. 379-423 and 623-656, July and October 1948), Shannon introduced a fundamental measure of information, entropy, and coined the term "bit" for the amount of information carried by a single binary digit. He went on to show that every noisy channel has an associated information capacity C bits/second, below which it is possible to encode messages (by representing them as long binary strings) such that they can be received and decoded with arbitrarily low probability of error. His proof of the capacity theorem, however, relied on a mathematical "trick": he showed that for transmission rates below C, the error probability averaged over all randomly selected codes can be made as small as desired. While this implies the existence of good codes, it left open the problem of designing them. Today, thanks to reliable communications, you can read Shannon's paper online. Claude Shannon is honoured today by the "Shannon Lectures", delivered by recipients of the "Shannon Award", which is presented each year by the IEEE Information Theory Society to an outstanding researcher in the field of information theory.
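For the simplest channel model, the binary symmetric channel that flips each transmitted bit with probability p, Shannon's capacity has the closed form C = 1 - H(p) bits per channel use, where H is the binary entropy function. A minimal sketch in Python (the function names are ours, for illustration only):

```python
import math

def binary_entropy(p: float) -> float:
    """Entropy H(p) in bits of a binary random variable with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity in bits per channel use of a binary symmetric channel
    that flips each transmitted bit with probability p."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # 1.0: a noiseless channel carries one full bit per use
print(bsc_capacity(0.5))  # 0.0: a coin-flip channel conveys nothing
print(bsc_capacity(0.1))  # roughly half a bit survives each use
```

Shannon's theorem says that below this rate, arbitrarily reliable communication is possible; above it, it is not.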

*R. Hamming*

In "Error Detecting and Error Correcting Codes" (*Bell System Technical Journal*, Vol. 29, pp. 147-160, April 1950), Hamming introduced the first family of practical error-correcting codes, now known as Hamming codes, which can correct any single bit error in a block. In 1988 Hamming received the first IEEE Richard W. Hamming Medal "For exceptional contributions to information sciences and systems".
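Hamming's construction adds parity bits so that any single flipped bit can be located and corrected. As an illustration, here is a sketch of the (7,4) Hamming code in Python; the systematic generator and parity-check matrices below are one standard choice:

```python
# (7,4) Hamming code: 4 message bits, 3 parity bits, corrects 1 error.
G = [[1,0,0,0, 1,1,0],   # generator matrix, systematic form [I | P]
     [0,1,0,0, 1,0,1],
     [0,0,1,0, 0,1,1],
     [0,0,0,1, 1,1,1]]
H = [[1,1,0,1, 1,0,0],   # parity-check matrix [P^T | I]
     [1,0,1,1, 0,1,0],
     [0,1,1,1, 0,0,1]]

def encode(msg):
    """4 message bits -> 7-bit codeword (msg * G mod 2)."""
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(recv):
    """7 received bits -> 4 message bits, correcting one flipped bit."""
    syndrome = [sum(h * b for h, b in zip(row, recv)) % 2 for row in H]
    if any(syndrome):
        # A nonzero syndrome equals the column of H at the error position.
        pos = list(zip(*H)).index(tuple(syndrome))
        recv = recv[:]
        recv[pos] ^= 1
    return recv[:4]  # systematic code: the message is the first 4 bits

msg = [1, 0, 1, 1]
cw = encode(msg)
cw[5] ^= 1                  # the channel flips one bit
assert decode(cw) == msg    # the decoder recovers the message
```

The syndrome lookup works because every single-bit error produces a distinct nonzero syndrome, one per column of H.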

*The inventors of the transistor*

The third of these discoveries was the transistor, invented at Bell Laboratories in 1947, which would eventually provide the hardware on which digital communications was built.

## Fifty Years of Progress

*Australian researcher A. Barbulescu with one of the world's first implementations of a Turbo Codec*

In the decades that followed, coding research concentrated on two main areas: *algebraic codes* and *trellis codes*. Algebraic codes, such as the celebrated Reed-Solomon and BCH codes, build algebraic structure into the code so that it can be decoded using computationally efficient algorithms for solving systems of equations. Trellis codes are easily decoded using trellis-based algorithms such as the Viterbi algorithm.
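The trellis idea can be illustrated with hard-decision Viterbi decoding of a small rate-1/2 convolutional code. The sketch below uses the textbook code with octal generators 7 and 5; it is an illustration of the algorithm, not a production decoder:

```python
def conv_encode(bits):
    """Rate-1/2 convolutional encoder, generators 111 and 101 (7 and 5 octal).
    State (s1, s2) holds the two previous input bits."""
    s1 = s2 = 0
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 111
        out.append(b ^ s2)       # generator 101
        s1, s2 = b, s1
    return out

def viterbi_decode(received, nbits):
    """Find the input sequence whose codeword is closest (in Hamming
    distance) to the received bits, by dynamic programming on the trellis."""
    INF = float('inf')
    metric = {(0, 0): 0, (0, 1): INF, (1, 0): INF, (1, 1): INF}
    paths = {s: [] for s in metric}
    for t in range(nbits):
        r = received[2 * t : 2 * t + 2]
        new_metric = {s: INF for s in metric}
        new_paths = {}
        for (s1, s2), m in metric.items():
            if m == INF:
                continue
            for b in (0, 1):            # try both branches out of this state
                expected = [b ^ s1 ^ s2, b ^ s2]
                cost = m + sum(x != y for x, y in zip(expected, r))
                nxt = (b, s1)
                if cost < new_metric[nxt]:   # keep only the survivor path
                    new_metric[nxt] = cost
                    new_paths[nxt] = paths[(s1, s2)] + [b]
        metric, paths = new_metric, new_paths
    best = min(metric, key=metric.get)
    return paths[best]

msg = [1, 0, 1, 1, 0, 0, 1, 0]
tx = conv_encode(msg)
rx = tx[:]
rx[3] ^= 1; rx[10] ^= 1          # two well-separated channel errors
assert viterbi_decode(rx, len(msg)) == msg
```

Because this code has free distance 5, isolated pairs of errors like the one above fall within its correcting power.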

Today, thanks to the discovery of the "turbo principle", we can get very close to Shannon's capacity for a range of different channels. In practical terms, this means the most efficient use of bandwidth and power, which is very important for portable wireless devices.

Somewhat ironically, another class of codes that turned out to approach Shannon's bound, *low-density parity-check* (LDPC) codes, was actually discovered by Robert Gallager in the 1960s. Like turbo codes, LDPC codes have an iterative decoder. The true power of these codes was overlooked for decades, however, due in part to the lack of sufficient computing power to implement them at the time.
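The iterative idea can be sketched with Gallager's simplest hard-decision scheme, "bit-flipping": repeatedly flip the bit involved in the most unsatisfied parity checks. The toy parity-check matrix below (borrowed from the (7,4) Hamming code) is small and dense rather than genuinely low-density, so this illustrates the principle only; real LDPC decoders run message-passing on large sparse matrices:

```python
# Gallager-style hard-decision bit-flipping on a toy parity-check matrix.
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def bit_flip_decode(word, max_iters=10):
    word = word[:]
    n = len(word)
    for _ in range(max_iters):
        # Which parity checks does the current word violate?
        unsatisfied = [row for row in H
                       if sum(h * b for h, b in zip(row, word)) % 2]
        if not unsatisfied:
            return word          # all checks pass: a valid codeword
        # Count, for each bit, how many failed checks it participates in,
        # then flip the most-suspect bit and iterate.
        votes = [sum(row[i] for row in unsatisfied) for i in range(n)]
        word[votes.index(max(votes))] ^= 1
    return word                  # gave up (possible on dense toy matrices)

codeword = [1, 0, 1, 1, 0, 1, 0]   # satisfies all three checks
noisy = codeword[:]
noisy[2] ^= 1                      # one channel error
assert bit_flip_decode(noisy) == codeword
```

On a large sparse matrix each bit touches few checks, so the "votes" localise errors far more reliably than this miniature example suggests.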

Today, the codes coming closest to the Shannon bound are LDPC codes, and much recent work has focussed on their construction and analysis. The IEEE Information Theory Society recognised major contributions in this area by awarding its 2002 paper prize jointly to T.J. Richardson and R.L. Urbanke for "The Capacity of Low-Density Parity-Check Codes Under Message-Passing Decoding" and to M.G. Luby, M. Mitzenmacher, M.A. Shokrollahi and D.A. Spielman for "Improved Low-Density Parity-Check Codes Using Irregular Graphs". Both papers appeared in the February 2001 issue of the *IEEE Transactions on Information Theory*.

The timeline below shows some of the more important developments in the theory and application of error control coding.

*Error Control Coding Timeline*

## Key Research Challenges

### Emerging Media

The discovery of turbo codes and related techniques has resulted in nearly optimal codes for certain classes of channels. There are, however, still many channels of interest for which the gap to Shannon's bound has yet to be closed.

Of particular interest are the hostile channels experienced when transmitting high-bandwidth data to and from highly mobile terminals (broadband internet access to trains, for example).

Other research challenges are emerging from new transmission media, such as quantum channels, or non-linear optical channels, which have very different properties to the channels considered by classical coding theory.

### Network Coding

The current paradigm for digital communications networks is to separate the various functions of the network. For example, the physical layer (perhaps using error control coding) provides an "error free channel" to the next layer in the network. However, information theory indicates that a more coordinated system design may be more efficient. Network coding is an exciting field that is only just beginning to see activity.
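The classic illustration of the potential gain is the "butterfly network": two sources each want to deliver one bit to two sinks through a shared middle link that can carry only one bit. Pure routing must choose which bit to forward over the bottleneck, but coding forwards the XOR of both, and each sink recovers the missing bit by XOR-ing again. A toy sketch (the function and variable names are ours, for illustration):

```python
def butterfly(a: int, b: int):
    """Butterfly-network toy: sources hold bits a and b; the shared
    bottleneck link carries their XOR; each sink combines the coded
    bit with the bit it hears directly to recover both."""
    coded = a ^ b                 # the one-bit bottleneck link
    sink1 = (a, coded ^ a)        # sink 1 hears a directly, recovers b
    sink2 = (coded ^ b, b)        # sink 2 hears b directly, recovers a
    return sink1, sink2

assert butterfly(1, 0) == ((1, 0), (1, 0))   # both sinks get both bits
```

With routing alone one sink would always miss a bit per use of the bottleneck; the single XOR doubles the achievable multicast rate in this example.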

## More Information

- InformationTheory.net, a site promoting Information Theory in Australia
- IEEE Information Theory Society
- The Error Control Code Site, by Robert Morelos-Zaragoza. A page with programs implementing many different coding schemes as well as many links to other coding sites.
- An Introduction to the Analysis of Iterative Coding Systems, by Thomas Richardson and Rudiger Urbanke.
- Turbo Code Site at Ecole Nationale Superieure des Telecommunications de Bretagne (birthplace of turbo codes).

## Australian Error Control Coding Researchers

Note: You can search for ACoRN Members using the Member Search facility