ICML
2005
PAC-Bayes risk bounds for sample-compressed Gibbs classifiers

We extend the PAC-Bayes theorem to the sample-compression setting, where each classifier is represented by two independent sources of information: a compression set, consisting of a small subset of the training data, and a message string encoding the additional information needed to obtain a classifier. The new bound is obtained by using a prior over a data-independent set of objects, where each object yields a classifier only once the training data is provided. The new PAC-Bayes theorem states that a Gibbs classifier defined on a posterior over sample-compressed classifiers can have a smaller risk bound than any such (deterministic) sample-compressed classifier.
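For context, the classical PAC-Bayes theorem that this work extends can be sketched as follows. This is the standard Seeger/Langford form for a data-independent prior over ordinary (non-compressed) classifiers, not the paper's new sample-compression bound:

```latex
% Classical PAC-Bayes theorem (standard form, assumed for illustration).
% P is a prior over classifiers fixed before seeing the i.i.d. sample S of size m;
% with probability at least 1 - \delta over S, for all posteriors Q:
\[
\mathrm{kl}\bigl(R_S(G_Q)\,\big\|\,R(G_Q)\bigr)
\;\le\;
\frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{m+1}{\delta}}{m}\,,
\]
% where R_S(G_Q) is the empirical risk of the Gibbs classifier G_Q,
% R(G_Q) its true risk, KL(Q || P) the Kullback-Leibler divergence
% between posterior and prior, and kl(q || p) the binary KL divergence
% kl(q || p) = q ln(q/p) + (1-q) ln((1-q)/(1-p)).
```

The extension described in the abstract replaces the prior over classifiers with a prior over data-independent objects (compression-set indices paired with message strings), each of which determines a classifier only once the training sample is given.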
François Laviolette, Mario Marchand
Type Conference
Year 2005
Where ICML
Authors François Laviolette, Mario Marchand