ESANN 2007
Classifying n-back EEG data using entropy and mutual information features

In this work we show that entropy (H) and mutual information (MI) can be used to extract spatially localized features for classification. To increase the accuracy of entropy estimation, we use a Bayesian approach with a Dirichlet prior to derive the estimation equations. We compute the H and MI features for each electrode (H) and each pair of electrodes (MI) in three frequency bands and use them to train a Naive Bayes classifier. We test the H and MI features on one- and five-trial-long segments of n-back memory EEG signals and show that they outperform power spectrum and linear correlation features, respectively.
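The feature extraction described above can be sketched in a few lines. This is a simplified illustration, not the paper's estimator: it uses a posterior-mean plug-in with a symmetric Dirichlet prior (strength `alpha`, an assumed parameter) rather than the full Bayesian estimation equations the authors derive, and it computes MI for a pair of binned signals via the identity MI(X;Y) = H(X) + H(Y) - H(X,Y).

```python
import numpy as np

def dirichlet_entropy(counts, alpha=1.0):
    """Plug-in entropy (in nats) from Dirichlet-smoothed bin counts.

    Posterior-mean probabilities p_i = (n_i + alpha) / (N + K * alpha);
    a simplified stand-in for the paper's Bayesian entropy estimator.
    """
    counts = np.asarray(counts, dtype=float)
    p = (counts + alpha) / (counts.sum() + alpha * counts.size)
    return -np.sum(p * np.log(p))

def mutual_information(x, y, bins=8, alpha=1.0):
    """MI(X;Y) = H(X) + H(Y) - H(X,Y) from binned signal segments.

    x, y: 1-D arrays, e.g. band-filtered EEG from two electrodes.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    h_x = dirichlet_entropy(joint.sum(axis=1), alpha)   # marginal of x
    h_y = dirichlet_entropy(joint.sum(axis=0), alpha)   # marginal of y
    h_xy = dirichlet_entropy(joint.ravel(), alpha)      # joint entropy
    return h_x + h_y - h_xy
```

In this sketch, per-electrode H values and per-electrode-pair MI values, computed per frequency band, would form the feature vector fed to the Naive Bayes classifier.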
Type: Conference
Year: 2007
Where: ESANN
Authors: Liang Wu, Predrag Neskovic, Etienne Reyes, Elena Festa, Heindel William