DEXA
2009
Springer

Significance-Based Failure and Interference Detection in Data Streams

Detecting the failure of a data stream is relatively easy when the stream is continually full of data. The transfer of large amounts of data allows for simple detection of interference, whether accidental or malicious. During interference, however, data transmission can become irregular rather than smooth. When traffic is intermittent, it is harder to detect that failure has occurred, which may lead an application at the receiving end to request retransmission or to disconnect. Requesting retransmission places additional load on the system, and disconnection can lead to unnecessary reversion to a checkpointed database before reconnecting and reissuing the same request or response. In this paper, we model the traffic in data streams as a set of significant events whose arrivals follow a Poisson distribution. Once an arrival rate has been determined, overdue or lost events can be detected with greater reliability. This model also allows for the alte...
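A minimal sketch of the idea in the abstract, assuming nothing about the paper's actual algorithm: in a Poisson process with rate λ, inter-arrival gaps are exponentially distributed, so the probability of a silence longer than t under normal traffic is exp(−λt). One can then flag a gap as a likely failure when that probability drops below a chosen false-alarm level α. The function and variable names below are illustrative, not from the paper.

```python
import math

def silence_threshold(rate, alpha=0.01):
    """Gap length beyond which silence is flagged as a likely failure.

    For a Poisson process with arrival rate `rate` (events per second),
    inter-arrival gaps are exponential, so P(gap > t) = exp(-rate * t).
    Solving exp(-rate * t) = alpha for t gives the threshold below.
    """
    return -math.log(alpha) / rate

def estimate_rate(arrival_times):
    """Maximum-likelihood arrival-rate estimate from observed timestamps."""
    span = arrival_times[-1] - arrival_times[0]
    return (len(arrival_times) - 1) / span

# Hypothetical intermittent stream: events roughly every two seconds.
arrivals = [0.0, 2.1, 3.9, 6.2, 8.0, 10.1]
rate = estimate_rate(arrivals)
threshold = silence_threshold(rate, alpha=0.01)
print(f"rate ~ {rate:.2f}/s, flag failure after {threshold:.1f}s of silence")
```

With α = 0.01, a gap exceeding the threshold would occur less than 1% of the time under normal traffic, so flagging it trades a small false-alarm rate against faster failure detection.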
Nickolas J. G. Falkner, Quan Z. Sheng
Added 16 Aug 2010
Updated 16 Aug 2010
Type Conference
Year 2009
Where DEXA