Posted by: atri | April 27, 2009

## Lecture 39: Achieving List Decoding Capacity

In today’s lecture, we looked at the intuition behind why folded RS codes can achieve list decoding capacity in polynomial time. The intuition is from this survey. Unfortunately, we did not have any time to talk about the details. You can look at the corresponding lecture notes from Fall 07 (which have yet to be polished). Alternatively, you can look at this chapter from my thesis.

A semester is too short a time to do justice to the numerous interesting topics in coding theory. Let me mention some of these topics (which we either mentioned only in passing in class or did not mention at all), in no particular order.

• LDPC codes. We just saw the definition of these codes. As I mentioned in class, these were defined by Gallager in his thesis in the 60s. The late 90s saw a resurgence of research activity in LDPC codes. These newer codes provably achieve the capacity of the $BEC_{\alpha}$. They also experimentally seem to achieve the capacity of the $BSC_p$. The big advantage of these codes is their linear time encoding and decoding algorithms (the dependence on $\epsilon$, the distance from capacity, is also a small polynomial). For more details see this survey by Venkat Guruswami.
• Expander codes. In class we showed that expander codes are asymptotically good. These give us the only known linear time encodable and decodable binary codes (in the worst-case noise model). For more details on expander codes, as well as another class of applications of expanders to codes, see this survey by Venkat Guruswami.
• Algebraic Geometry codes. These codes beat the Gilbert-Varshamov bound for alphabet sizes $q \ge 49$. Steve will tell us more about these codes in his presentation. See my post on the corresponding project topic from Fall 07 for more pointers.
• Linear Programming bounds. The best known upper bounds on the rate vs. distance question are achieved via the so-called Linear Programming bound. See my post on the project topic from Fall 07 for more pointers.
• Convolutional Codes. All the codes that we covered in class were block codes. That is, the block length of such codes is fixed. However, there are many applications where having a fixed block length might be too wasteful. Convolutional codes allow for variable block lengths and “on the fly” encoding and decoding.
• Applications in Complexity theory. There are numerous applications of coding theory in theoretical computer science and in particular, complexity theory (and cryptography). We only had time to cover two such applications: communication complexity and $\ell$-wise independent sources. However, some of the biggest recent advances in complexity theory (for example the PCP theorem) have used tools from coding theory. For more details, see this survey by Luca Trevisan or my blog posts on the following project topics: codeword testing, extractors, locally decodable codes, and hardness amplification. For applications of list decoding in complexity theory see this survey by Madhu Sudan.
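To make the LDPC decoding claim above concrete, here is a minimal sketch of the standard iterative “peeling” decoder for erasures on the $BEC_{\alpha}$: any parity check with exactly one erased position determines that bit, and one repeats until no erasures remain. The parity-check matrix below is the (tiny, non-sparse) [7,4] Hamming code, used purely for illustration; a real LDPC code would be a long, sparse random-like code.

```python
def peel_decode(H, received):
    """Iterative erasure peeling over the BEC.
    H: parity checks, each a list of bit positions.
    received: list of 0/1 or None (erasure).
    Returns the decoded word, or None if peeling gets stuck."""
    word = list(received)
    progress = True
    while progress and any(b is None for b in word):
        progress = False
        for check in H:
            erased = [i for i in check if word[i] is None]
            if len(erased) == 1:
                # The single erased bit must make the check's parity even.
                i = erased[0]
                word[i] = sum(word[j] for j in check if j != i) % 2
                progress = True
    return word if all(b is not None for b in word) else None

# Toy example (hypothetical choice): [7,4] Hamming code checks.
H = [[0, 1, 2, 4], [0, 1, 3, 5], [0, 2, 3, 6]]
codeword = [1, 0, 1, 1, 0, 0, 1]  # satisfies all three checks
```

For capacity-achieving LDPC ensembles, the point is that this local, linear time rule succeeds with high probability as long as the erasure rate is below capacity.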
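The linear time decoding of expander codes mentioned above is achieved by a Sipser–Spielman style bit-flipping rule: repeatedly flip any bit that sits in more unsatisfied than satisfied checks. A sketch, again with a hypothetical toy check set (the provable guarantee needs the check graph to be a sufficiently good expander, which this toy example is not):

```python
def flip_decode(checks, word, max_iters=100):
    """Bit-flipping decoding (Sipser-Spielman style sketch).
    checks: parity checks as lists of bit positions.
    word: received 0/1 list, modified greedily until no bit helps."""
    word = list(word)
    for _ in range(max_iters):
        # Checks whose parity is currently odd (unsatisfied).
        unsat = [c for c in checks if sum(word[i] for i in c) % 2 == 1]
        flipped = False
        for i in range(len(word)):
            in_unsat = sum(1 for c in unsat if i in c)
            in_sat = sum(1 for c in checks if i in c) - in_unsat
            if in_unsat > in_sat:  # flipping bit i satisfies more checks
                word[i] ^= 1
                flipped = True
                break
        if not flipped:
            break
    return word
```

On an expander, each flip strictly decreases the number of unsatisfied checks, which is what yields the linear time bound.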
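The “on the fly” encoding of convolutional codes mentioned above can be sketched in a few lines: the encoder keeps a small shift register, and each input bit immediately produces output bits, so no block boundary is ever needed. Below is the textbook rate-1/2 encoder with generator polynomials $(7, 5)_8$ (a standard example, not something from lecture):

```python
def conv_encode(bits, gens=(0b111, 0b101), k=3):
    """Rate-1/2 convolutional encoder, constraint length k.
    gens: generator polynomials as bit masks over the register."""
    state = 0
    out = []
    for b in bits + [0] * (k - 1):  # trailing zeros flush the register
        # Shift the new bit into the k-bit register.
        state = ((state << 1) | b) & ((1 << k) - 1)
        for g in gens:
            # Output the parity of the register bits selected by g.
            out.append(bin(state & g).count("1") % 2)
    return out
```

Note that the loop body touches only the current register contents, so a streaming implementation can emit output as input arrives, in contrast to a block code which must buffer a full message.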