Recurrent neural networks are theoretically capable of learning complex temporal sequences, but training them through gradient descent is too slow and unstable for practical use i...
The best recent supervised sequence learning methods use gradient descent to train networks of miniature nets called memory cells. The most popular cell structure seems somewhat ar...
In this paper, we propose a post-randomization technique to learn a Bayesian network (BN) from distributed heterogeneous data in a privacy-sensitive fashion. In this case, two or ...
While much work on learning in planning has focused on learning domain physics (i.e., action models) and search control knowledge, little attention has been paid to learning use...
Nan Li, William Cushing, Subbarao Kambhampati, Sun...
This work investigates bandwidth learning algorithms in a version of a distributed heterogeneous data dissemination system called the Agile Information Control Environment (AICE)...