STOC 2007, ACM

Lower bounds for randomized read/write stream algorithms

Motivated by the capabilities of modern storage architectures, we consider the following generalization of the data stream model, in which the algorithm has sequential access to multiple streams. Unlike the data stream model, where the stream is read-only, in this new model (introduced in [8, 9]) the algorithm can also write to the streams. There is no limit on the size of the streams, but the number of passes made over the streams is restricted. On the other hand, the amount of internal memory available to the algorithm is scarce, as in the data stream model. We resolve the main open problem in [7] of proving lower bounds in this model for algorithms that are allowed 2-sided error. Previously, such lower bounds were known only for deterministic and 1-sided-error randomized algorithms [9, 7]. We consider the classical set disjointness problem, which has proved invaluable for deriving lower bounds for many other problems involving data streams and other randomized models of computation...
Paul Beame, T. S. Jayram, Atri Rudra
Type: Conference
Year: 2007
Where: STOC
Authors: Paul Beame, T. S. Jayram, Atri Rudra
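
To make the model in the abstract concrete, below is a minimal Python sketch under a simple interpretation of read/write streams: each stream supports a sequential read head per pass and append-only writes, and the algorithm's internal memory is just its local variables. The names RWStream and merge_sorted are hypothetical; this is an illustration of the model, not the paper's construction. The merge routine shows the kind of one-pass, constant-memory step that write access enables (the building block of external merge sort), something a read-only stream model cannot express.

class RWStream:
    """One external stream: sequential reads plus append-only writes."""

    def __init__(self, items=None):
        self.items = list(items or [])  # current contents (size is unbounded)
        self.out = []                   # contents written during the current pass

    def read_pass(self):
        """Yield the items left to right; one call models one sequential pass."""
        for x in self.items:
            yield x

    def write(self, x):
        """Append x to the version of the stream produced in this pass."""
        self.out.append(x)

    def end_pass(self):
        """If anything was written, it becomes the stream's new contents."""
        if self.out:
            self.items, self.out = self.out, []


def merge_sorted(stream_a, stream_b, out_stream):
    """Merge two sorted streams onto a third using O(1) internal memory
    and a single pass over each input -- a step that needs write access."""
    it_a, it_b = stream_a.read_pass(), stream_b.read_pass()
    a, b = next(it_a, None), next(it_b, None)
    while a is not None or b is not None:
        if b is None or (a is not None and a <= b):
            out_stream.write(a)
            a = next(it_a, None)
        else:
            out_stream.write(b)
            b = next(it_b, None)
    out_stream.end_pass()


if __name__ == "__main__":
    s1, s2, s3 = RWStream([1, 4, 7]), RWStream([2, 3, 9]), RWStream()
    merge_sorted(s1, s2, s3)
    print(s3.items)  # [1, 2, 3, 4, 7, 9]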