Tight bounds for synchronous communication of information using bits and silence
| Published in: | Discrete Applied Mathematics 2003-06, Vol. 129 (1), p. 195-209 |
|---|---|
| Main Authors: | , |
| Format: | Article |
| Language: | English |
| Summary: | We establish worst-case and average-case lower bounds on the trade-off between time and bit complexity for two-party communication in synchronous networks. We prove that the bounds are tight by presenting protocols whose bit-time complexity matches that expressed by the lower bounds. We actually show that the algorithms are everywhere optimal: at any point of the trade-off and for any universe of data to be communicated, no other solution has better complexity for communicating any element of that universe (within a fixed relabeling). Similar results are derived when transmissions are subject to corruption. In these results, the number of bits is agreed upon a priori. We also derive lower bounds on the worst-case complexity of two-party communication when the number of bits is variable; the bounds prove that any improvement would be by an additive constant (even in the presence of an oracle). (An illustrative sketch of the bit/silence idea follows this record.) |
| ISSN: | 0166-218X; 1872-6771 |
| DOI: | 10.1016/S0166-218X(02)00240-8 |
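The summary turns on the observation that, in a synchronous network, time itself (silence, i.e., idle slots) can carry information alongside explicit bits. The sketch below is a minimal illustration of that general idea only, under simplifying assumptions: a perfectly synchronized, error-free slot model, an agreed bit budget `b`, and illustrative function names of my own choosing. It is not the protocol analyzed in the paper. With `b` bits, the sender here waits at most roughly N/2^b slots for a universe of size N, which conveys the flavor of the time-bit trade-off whose tight bounds the paper establishes.

```python
# Minimal sketch (not the paper's protocol): communicate x in {0, ..., N-1}
# over a synchronous, error-free channel using b explicit bits plus "silence"
# (idle slots). The sender stays silent for x // 2**b slots, then transmits
# the low b bits of x; the receiver counts the silent slots and reassembles x.
# The slot model and the split of x are assumptions made for illustration.

def encode(x: int, b: int) -> tuple[int, list[int]]:
    """Split x into (number of silent slots, list of b bits, MSB first)."""
    silent_slots, remainder = divmod(x, 1 << b)
    bits = [(remainder >> i) & 1 for i in reversed(range(b))]
    return silent_slots, bits

def decode(silent_slots: int, bits: list[int]) -> int:
    """Recombine the silence count and the transmitted bits into x."""
    remainder = 0
    for bit in bits:
        remainder = (remainder << 1) | bit
    return silent_slots * (1 << len(bits)) + remainder

if __name__ == "__main__":
    N, b = 1000, 4                       # universe size and agreed bit budget
    for x in (0, 17, 999):
        d, bits = encode(x, b)
        assert decode(d, bits) == x
        # Time used is about d + b slots with b bits sent:
        # a larger bit budget shrinks the worst-case waiting time.
        print(f"x={x:3d}: wait {d} slots, send bits {bits}")
```

Increasing `b` shortens the silent phase at the cost of more transmitted bits; the paper's results characterize, tightly and at every point of this trade-off, how well any protocol can do when the bit budget is fixed in advance, and how little can be gained when it is variable.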