Posted by: atri | February 16, 2009

Lecture 14: Randomized Communication Complexity

In today’s lecture, we studied the notion of randomized communication complexity, where the protocols are allowed to use random bits and have a bounded probability of error (over the randomness used by the protocol). In particular, let $CC_{\epsilon}(f)$ denote the minimum number of bits exchanged by any protocol for $f$ that errs with probability at most $\epsilon$. We first saw that any asymptotically good binary code (that can be generated by a deterministic algorithm) implies that $CC_{\frac{1}{3}}(Equality)\le O(\log{n})$. We then saw how, using a Reed-Solomon code over an alphabet of size $\Theta(n^2)$, we can show that $CC_{\frac{1}{n}}(Equality)\le O(\log{n})$. This is a log factor better than repeating the binary-code protocol $O(\log{n})$ times to drive the error down to $\frac{1}{n}$, which would cost $O(\log^2{n})$ bits.
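To make the Reed-Solomon-based protocol concrete, here is a minimal Python sketch of the standard polynomial-fingerprint instantiation (the helper names and the specific field-size choice below are my own, not from the lecture): Alice interprets her $n$-bit string as the coefficient vector of a polynomial over a prime field of size $\Theta(n^2)$, evaluates it at a random point, and sends the point together with the value, i.e. $2\log q = O(\log n)$ bits.

```python
import random

def next_prime(m):
    """Smallest prime >= m (trial division; fine for the small n used here)."""
    def is_prime(k):
        if k < 2:
            return False
        d = 2
        while d * d <= k:
            if k % d == 0:
                return False
            d += 1
        return True
    while not is_prime(m):
        m += 1
    return m

def poly_eval(coeffs, a, q):
    """Evaluate sum_i coeffs[i] * a^i mod q via Horner's rule."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * a + c) % q
    return acc

def equality_protocol(x, y):
    """One round of the randomized Equality protocol with error <= 1/n.

    x, y: equal-length bit strings held by Alice and Bob.
    Alice sends a random field element a and P_x(a); Bob accepts iff
    it equals P_y(a). Total communication: 2 log q = O(log n) bits.
    """
    n = len(x)
    q = next_prime(max(n * n, 2))      # field size Theta(n^2)
    a = random.randrange(q)            # Alice's random evaluation point
    fx = poly_eval([int(b) for b in x], a, q)
    fy = poly_eval([int(b) for b in y], a, q)
    # If x == y the protocol always accepts. If x != y, then P_x - P_y is
    # a nonzero polynomial of degree < n, so it has fewer than n roots in
    # F_q, and Pr[P_x(a) == P_y(a)] <= n/q <= 1/n.
    return fx == fy
```

Note the one-sided error: equal strings are always accepted, and only unequal strings can (rarely) fool the protocol, which is exactly the $\frac{1}{n}$ error bound above.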

Next lecture, we’ll go back to our good old tradeoff of rate vs. distance and we’ll prove the Plotkin bound.