Markov logic networks (MLNs) combine logic and probability by attaching weights to first-order clauses, and viewing these as templates for features of Markov networks. Learning ML...
We recall the basic idea of an algebraic approach to learning Bayesian network (BN) structures, namely to represent every BN structure by a certain (uniquely determined) vector, c...
We consider a class of learning problems that involve a structured sparsity-inducing norm defined as the sum of -norms over groups of variables. Whereas a lot of effort has been pu...
We are interested in the relationship between learning efficiency and representation in the case of supervised neural networks for pattern classification trained by conti...
In the study of information flow in the nervous system, component processes can be investigated using a range of electrophysiological and imaging techniques. Although data is diff...