In today’s lecture, we looked at the intuition behind why folded RS codes can achieve list-decoding capacity in polynomial time. The intuition is from this survey. Unfortunately, we did not have time to talk about the details. You can look at the corresponding lecture notes from Fall 07 (which have yet to be polished). Alternatively, you can look at this chapter from my thesis.

A semester is too short a time to do justice to the numerous interesting topics in coding theory. Let me mention, in no particular order, some of these topics that we either touched on in passing in class or did not mention at all.

*LDPC codes*. We just saw the definition of these codes. As I mentioned in class, these were defined by Gallager in his thesis in the 1960s. The late 90s saw a resurgence of research activity in LDPC codes. These latter codes provably achieve the capacity of the binary erasure channel. They also experimentally seem to achieve the capacity of the binary symmetric channel. The big advantage of these codes is their linear-time encoding and decoding algorithms (the dependence on ε, the distance from capacity, is also some small polynomial). For more details, see this survey by Venkat Guruswami.

*Expander codes*. In class we showed that expander codes are asymptotically good. These give us the only known linear-time encodable and decodable binary codes (in the worst-case noise model). For more details on expander codes, as well as another class of applications of expanders to codes, see this survey by Venkat Guruswami.

*Algebraic Geometry codes*. These codes beat the Gilbert-Varshamov bound for alphabets of size at least 49. Steve will tell us more about these codes in his presentation. See my post on the corresponding project topic from Fall 07 for more pointers.

*Linear Programming bounds*. The best known upper bounds on the rate vs. distance tradeoff are achieved via the so-called linear programming bound. See my post on the project topic from Fall 07 for more pointers.

*Convolutional Codes*. All the codes that we covered in class were block codes; that is, the block length of such codes is fixed. However, there are many applications where having a fixed block length might be too wasteful. Convolutional codes allow for variable block lengths and “on the fly” encoding and decoding.

*Applications in Complexity theory*. There are numerous applications of coding theory in theoretical computer science and, in particular, complexity theory (and cryptography). We only had time to cover two such applications: communication complexity and k-wise independent sources. However, some of the biggest recent advances in complexity theory (for example, the PCP theorem) have used tools from coding theory. For more details, see this survey by Luca Trevisan or my blog posts on the following project topics: codeword testing, extractors, locally decodable codes, and hardness amplification. For applications of list decoding in complexity theory, see this survey by Madhu Sudan.
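For concreteness, here is a toy sketch of the defining property of LDPC codes mentioned above: a binary code specified by a sparse parity-check matrix, where a word belongs to the code iff every parity check is satisfied. The matrix `H` below is made up for illustration and is of course far too small to exhibit any asymptotic behavior.

```python
# Toy illustration (hypothetical, not from the course) of the LDPC
# definition: a binary code given by a sparse parity-check matrix H,
# in which each row and each column contains only a few ones.
H = [
    [1, 1, 0, 1, 0, 0],  # each row is a parity check on 3 of the 6 bits
    [0, 1, 1, 0, 1, 0],
    [1, 0, 1, 0, 0, 1],
]

def is_codeword(w):
    """w is in the code iff every parity check sums to 0 mod 2 (Hw = 0)."""
    return all(sum(h * x for h, x in zip(row, w)) % 2 == 0 for row in H)
```

For instance, `is_codeword([1, 1, 1, 0, 0, 0])` holds, while flipping any single bit violates some check (every column of `H` is nonzero). The sparsity of `H` is exactly what the fast message-passing decoders for LDPC codes exploit.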
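To illustrate the “on the fly” point for convolutional codes, here is a sketch (with assumed parameters, not anything specific from the course) of a rate-1/2 convolutional encoder using the classic generator polynomials (7, 5) in octal. Each input bit immediately produces two output bits computed from the current bit and a two-bit shift register, so the encoder streams output as input arrives, with no fixed block length.

```python
# Sketch of a rate-1/2 convolutional encoder (assumed constraint length 3,
# generators 7 and 5 in octal); every input bit yields two output bits.
def conv_encode(bits):
    """Encode a bit sequence "on the fly": output depends only on the
    current bit and the two previous bits held in the shift register."""
    s1 = s2 = 0  # shift-register state: the two most recent input bits
    out = []
    for b in bits:
        out.append(b ^ s1 ^ s2)  # generator 111 (octal 7)
        out.append(b ^ s2)       # generator 101 (octal 5)
        s1, s2 = b, s1           # shift the register
    return out
```

Note that `conv_encode` works for an input of any length, which is the contrast with block codes drawn above; decoding such codes is typically done with the Viterbi algorithm.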
