ICPR 2006, IEEE
Onset Detection through Maximal Redundancy Detection

We propose a criterion, called `maximal redundancy', for onset detection in time series. The concept of redundancy is adopted from information theory and indicates how well a signal can locally be explained by an underlying model. It is shown that a local maximum in the redundancy is a good indicator of an onset. It is proven that `maximal redundancy' detection is a statistically asymptotically optimal detector for AR processes. The criterion also accounts for potentially non-Gaussian time series and non-Gaussian innovations in the AR processes. Several applications are shown in which the new criterion has been successfully applied.
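The abstract's idea — fit a local model, measure how much of the signal's variability the model explains, and flag local maxima of that measure as onsets — can be sketched as follows. This is an illustrative reconstruction, not the paper's exact estimator: it assumes a least-squares AR fit in sliding windows and uses the Gaussian log variance-ratio `0.5 * log(var(x) / var(residual))` as the redundancy measure (for a Gaussian AR process this equals the mutual information between a sample and its past). Window length, hop size, and AR order are arbitrary choices for the sketch.

```python
import numpy as np

def ar_redundancy(window, order=2):
    """Redundancy of a window under a least-squares AR(order) fit.

    Returns 0.5 * log(var(x) / var(residual)): near zero for white
    noise (the model explains nothing), large for strongly predictable
    signals. Gaussian estimator chosen for illustration only.
    """
    x = np.asarray(window, dtype=float)
    n = len(x)
    # Regressor matrix: column k holds x[t-1-k] for t = order .. n-1.
    X = np.column_stack([x[order - 1 - k: n - 1 - k] for k in range(order)])
    y = x[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    return 0.5 * np.log(np.var(x) / max(np.var(resid), 1e-12))

def detect_onsets(signal, win=64, hop=8, order=2):
    """Sliding-window redundancy; local maxima are onset candidates."""
    starts = list(range(0, len(signal) - win + 1, hop))
    red = np.array([ar_redundancy(signal[s:s + win], order)
                    for s in starts])
    peaks = [starts[i] for i in range(1, len(red) - 1)
             if red[i] > red[i - 1] and red[i] > red[i + 1]]
    return peaks, red
```

As a sanity check, the redundancy of a strongly autocorrelated AR(1) signal comes out well above that of white noise, so a transition from noise to structured signal raises the redundancy curve around the onset.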
Gert Van Dijck, Marc M. Van Hulle
Added: 09 Nov 2009
Updated: 09 Nov 2009
Type: Conference
Year: 2006
Where: ICPR
Authors: Gert Van Dijck, Marc M. Van Hulle