Bayesian networks, equivalently graphical Markov models determined by acyclic digraphs or ADGs (also called directed acyclic graphs or dags), have proved to be both effective and ...
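As a point of reference for the terminology above (a standard fact about ADG models, not a claim specific to this paper), a Bayesian network over variables X_1, ..., X_n encodes the factorization

    P(x_1, \dots, x_n) = \prod_{i=1}^{n} P\bigl(x_i \mid x_{\mathrm{pa}(i)}\bigr),

where pa(i) denotes the set of parents of vertex i in the acyclic digraph.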
A Markov Decision Process (MDP) is a general model for solving planning problems under uncertainty. It has been extended to multiobjective MDP to address multicriteria or multiagen...
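For readers unfamiliar with the base model (standard notation assumed here, not drawn from the truncated abstract), an MDP is usually specified as a tuple (S, A, T, R, \gamma) of states, actions, transition probabilities, rewards, and a discount factor, and an optimal value function satisfies the Bellman optimality equation

    V^{*}(s) = \max_{a \in A} \Bigl[ R(s, a) + \gamma \sum_{s' \in S} T(s' \mid s, a)\, V^{*}(s') \Bigr].

A multiobjective MDP replaces the scalar reward R with a vector of rewards, one per criterion, so that policies are compared by vector-valued returns rather than a single expected sum.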
The observation models in tracking algorithms are critical to both tracking performance and applicable scenarios but are often simplified to focus on a fixed level of certain target...
The notion of using context information for solving high-level vision problems has been increasingly realized in the field. However, how to learn an effective and efficient context...
This paper is focused on the co-segmentation problem [1], where the objective is to segment a similar object from a pair of images. The background in the two images may be ar...
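One common way the co-segmentation objective is formalized in the literature (a hedged sketch of a typical formulation, not necessarily the model of the paper abstracted here) is as a joint labeling problem over the two images: each image contributes a standard MRF segmentation energy, and an extra term couples the two foreground regions by penalizing the difference of their appearance statistics,

    E(L_1, L_2) = E_{\mathrm{MRF}}(L_1; I_1) + E_{\mathrm{MRF}}(L_2; I_2) + \lambda \, d\bigl(h(L_1; I_1), h(L_2; I_2)\bigr),

where L_k is the foreground/background labeling of image I_k, h(\cdot) is the foreground appearance histogram, and d(\cdot,\cdot) measures histogram dissimilarity.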