ICPR
2008
IEEE

Optimal feature weighting for the discrete HMM

We propose a modified discrete HMM that includes a feature weighting discrimination component. We assume that the feature space is partitioned into subspaces and that the relevance weights of the different subspaces depend on the symbols and the states. In particular, we associate a partial probability with each symbol in each subspace. The overall observation state probability is then computed as an aggregation of the partial probabilities and their relevance weights. We consider two aggregation models: the first is based on a linear combination, while the second is based on a geometric combination. For both models, we reformulate the Baum-Welch learning algorithm and derive the update equations for the relevance weights and the partial state probabilities. The proposed approach is validated using synthetic and real data sets, and the results are shown to outperform the baseline HMM.
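The two aggregation models described above can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, weight values, and partial probabilities below are illustrative assumptions. It only shows how, for a given state, per-subspace partial symbol probabilities and relevance weights could be combined linearly or geometrically into an overall observation probability.

```python
import numpy as np

# Illustrative sketch (names and numbers are assumptions, not from the paper):
# the feature space is split into K subspaces; for a given state and symbol,
# subspace k contributes a partial probability p_k, weighted by a relevance
# weight w_k (weights are assumed to sum to 1).

def linear_aggregation(partials, weights):
    """Overall observation probability as a weighted sum of partials."""
    return float(np.dot(weights, partials))

def geometric_aggregation(partials, weights):
    """Overall observation probability as a weighted geometric combination."""
    return float(np.prod(np.power(partials, weights)))

# Example: K = 3 subspaces, hypothetical partial probabilities and weights
partials = np.array([0.2, 0.5, 0.1])
weights = np.array([0.3, 0.5, 0.2])

b_lin = linear_aggregation(partials, weights)   # 0.3*0.2 + 0.5*0.5 + 0.2*0.1 = 0.33
b_geo = geometric_aggregation(partials, weights)
```

In the linear model the aggregated probability is a convex combination of the partials, while the geometric model behaves like a weighted geometric mean, so a near-zero partial probability in any relevant subspace suppresses the overall probability more strongly.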
Oualid Missaoui, Hichem Frigui