In class today, we proved the positive part of Shannon’s capacity theorem modulo the so-called Markov or averaging argument. At the end of this post, a proof of the Markov argument in general is presented, and in the next lecture we will see its use in the specific context of Shannon’s proof. The scribed notes for Lecture 10 from Fall 07 contain the material covered today.

I also handed out feedback forms: thanks to everyone who filled in the form. I’ll address the issues that came up in the next post.

Below the fold is the proof of the Markov argument (the proof was typeset by Luca Trevisan's latex2wp program).

The “Markov argument,” or the “averaging argument,” is a simple yet quite effective lemma. Next, we state the lemma in a more general form than we will need and then present its proof.

**Lemma 1 (Averaging argument)** *Let $B$ be a finite set and let $f : B \rightarrow \mathbb{R}^{\geq 0}$ be a function, where $\mathbb{R}^{\geq 0}$ denotes the set of all non-negative reals. Further, let*

$$\bar{f} = \frac{1}{|B|} \sum_{b \in B} f(b). \qquad (1)$$

*Then for every real $\epsilon > 0$ (which can depend on $|B|$) and any subset $S \subseteq B$ such that*

$$f(b) > (1+\epsilon) \cdot \bar{f} \quad \text{for every } b \in S, \qquad (2)$$

*we have $|S| < \frac{|B|}{1+\epsilon}$.*

*Proof:* For the sake of contradiction, assume that there is a subset $S$ with $|S| \geq \frac{|B|}{1+\epsilon}$ that satisfies (2). Now consider the following sequence of relationships:

$$\sum_{b \in B} f(b) \;>\; |S| \cdot (1+\epsilon) \cdot \bar{f} \;\geq\; \frac{|B|}{1+\epsilon} \cdot (1+\epsilon) \cdot \bar{f} \;=\; |B| \cdot \bar{f},$$

where in the first inequality we have used the fact from (2) that $f(b) > (1+\epsilon)\bar{f}$ for $b \in S$ and the fact that $f(b) \geq 0$ for $b \notin S$. Thus, we get $\frac{1}{|B|}\sum_{b \in B} f(b) > \bar{f}$, which contradicts (1). $\Box$
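As a quick numerical sanity check of Lemma 1 (the variable names below are ours, chosen for illustration), we can draw random non-negative values and verify that the set of elements exceeding $(1+\epsilon)$ times the average is smaller than $\frac{|B|}{1+\epsilon}$:

```python
import random

# Sanity check of Lemma 1 on random non-negative data (illustrative only).
random.seed(0)
B = [random.random() for _ in range(1000)]   # f-values over a finite set
f_bar = sum(B) / len(B)                      # the average, as in (1)

eps = 0.5
S = [x for x in B if x > (1 + eps) * f_bar]  # elements satisfying (2)
assert len(S) < len(B) / (1 + eps)           # the bound |S| < |B|/(1+eps)
```

The assertion can never fail, for any data and any $\epsilon > 0$: that is exactly what the counting argument in the proof shows.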

We will actually use the following corollary of the above lemma in our random coding with expurgation argument:

**Lemma 2** *Let $B$ be a finite set and let $f : B \rightarrow \mathbb{R}^{\geq 0}$ be a function such that $\bar{f} \leq \epsilon$. Let $B' \subseteq B$ be the set of $\frac{|B|}{2}$ elements of $B$ with the smallest $f$ values. Then, for every $b \in B'$,*

$$f(b) \leq 2\epsilon.$$

*Proof:* If there exists a $b \in B'$ such that $f(b) > 2\epsilon$, then every element of $B \setminus B'$ also has $f$ value greater than $2\epsilon \geq 2\bar{f}$ (since $B'$ contains the smallest values), so the set $S = \{ b \in B \mid f(b) > 2\bar{f} \}$ has size greater than $\frac{|B|}{2}$, which violates Lemma 1 (applied with $\epsilon = 1$). $\Box$
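The expurgation step that Lemma 2 licenses can be sketched as follows (a toy example with names of our choosing, not code from the lecture): discard the worse half of the messages, and every survivor has error value at most twice the overall average.

```python
import random

# Expurgation sketch: keep the half of the "messages" with the smallest
# error values; by Lemma 2, each survivor's value is at most 2 * average.
random.seed(1)
err = [random.uniform(0.0, 0.1) for _ in range(1024)]  # per-message values
avg = sum(err) / len(err)

survivors = sorted(err)[: len(err) // 2]  # half with smallest values (B')
assert all(e <= 2 * avg for e in survivors)
```

As with Lemma 1, the final assertion holds for any non-negative data, which is the point of the lemma.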

In Shannon’s proof, we will use the lemma in the following way: $B$ will be the set of all messages, and for any message $m \in B$, $f(m)$ will denote the decoding error probability for $m$.

**Remark 1** *The lemmas above have been stated for finite $B$ and the (implicitly) uniform distribution over elements in $B$. One can easily generalize them to any distribution $D$ over a (possibly non-finite) set $B$: the only difference is that the “size” of a subset $S$ or $B'$ will be replaced by $\Pr_D[S]$, the total probability mass of $S$ under $D$.*
