Posted by: atri | February 16, 2009

Lecture 14: Randomized Communication Complexity

In today’s lecture, we studied the notion of randomized communication complexity, where protocols are allowed to use random bits and may err with bounded probability (over the randomness used by the protocol). In particular, let CC_{\epsilon}(f) denote the minimum number of bits exchanged by any protocol for f that errs with probability at most \epsilon. We first saw that any asymptotically good binary code (that can be constructed by a deterministic algorithm) implies that CC_{\frac{1}{3}}(Equality)\le O(\log{n}). We then saw that using a Reed-Solomon code over an alphabet of size \Theta(n^2), one can show that CC_{\frac{1}{n}}(Equality)\le O(\log{n}). This is a log factor better than the O(\log^2{n}) bits one would need to repeat the binary-code-based protocol O(\log{n}) times to drive the error down to \frac{1}{n}.
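The Reed-Solomon-based protocol for Equality can be sketched roughly as follows (a minimal Python sketch, not necessarily the exact protocol from lecture; the function names and the choice of a prime field of size roughly n^2 are my assumptions). Alice views her n-bit string as the coefficients of a polynomial of degree at most n-1 over a field of size \Theta(n^2), evaluates it at a random field element, and sends the element and the value. Two distinct such polynomials agree on at most n-1 of the \approx n^2 points, so the protocol errs with probability at most about 1/n while exchanging only O(\log n) bits.

```python
import random

def next_prime(m):
    """Smallest prime >= m (trial division suffices for this sketch)."""
    def is_prime(k):
        if k < 2:
            return False
        i = 2
        while i * i <= k:
            if k % i == 0:
                return False
            i += 1
        return True
    while not is_prime(m):
        m += 1
    return m

def fingerprint(bits, alpha, p):
    """Evaluate the polynomial whose coefficients are `bits` at alpha mod p
    (Horner's rule)."""
    acc = 0
    for b in bits:
        acc = (acc * alpha + b) % p
    return acc

def equality_protocol(x, y, rng=random):
    """One-sided-error randomized protocol for Equality on n-bit strings.

    Alice sends (alpha, fingerprint(x, alpha, p)); Bob compares with his own
    fingerprint of y. Both alpha and the fingerprint fit in O(log n) bits.
    If x == y the protocol always accepts; if x != y it wrongly accepts with
    probability at most (n-1)/p <= 1/n, since distinct degree-(n-1)
    polynomials agree on at most n-1 points.
    """
    n = len(x)
    p = next_prime(max(n * n, 2))   # field size Theta(n^2)
    alpha = rng.randrange(p)        # Alice's random evaluation point
    return fingerprint(x, alpha, p) == fingerprint(y, alpha, p)
```

Note that repetition is unnecessary here: a single evaluation already brings the error down to 1/n, which is exactly the log-factor saving over the binary-code protocol mentioned above.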

Next lecture, we’ll go back to our good old tradeoff of rate vs. distance and we’ll prove the Plotkin bound.

