Sciweavers

Search results for: Learning and Generalization with the Information Bottleneck
ICML 2006 (IEEE)
Constructing informative priors using transfer learning
Many applications of supervised learning require good generalization from limited labeled data. In the Bayesian setting, we can try to achieve this goal by using an informative pr...
Rajat Raina, Andrew Y. Ng, Daphne Koller
IEICET 2007
Analytic Optimization of Adaptive Ridge Parameters Based on Regularized Subspace Information Criterion
In order to obtain better learning results in supervised learning, it is important to choose model parameters appropriately. Model selection is usually carried out by preparing a ...
Shun Gokita, Masashi Sugiyama, Keisuke Sakurai
NN (Neural Networks) 2002 (Springer)
Optimal design of regularization term and regularization parameter by subspace information criterion
The problem of designing the regularization term and regularization parameter for linear regression models is discussed. Previously, we derived an approximation to the generalizat...
Masashi Sugiyama, Hidemitsu Ogawa
JMLR 2008
Generalization from Observed to Unobserved Features by Clustering
We argue that when objects are characterized by many attributes, clustering them on the basis of a random subset of these attributes can capture information on the unobserved attr...
Eyal Krupka, Naftali Tishby
K-CAP 2003 (ACM)
Building large knowledge bases by mass collaboration
Acquiring knowledge has long been the major bottleneck preventing the rapid spread of AI systems. Manual approaches are slow and costly. Machine-learning approaches have limitatio...
Matthew Richardson, Pedro Domingos