Sciweavers

100 search results - page 7 / 20
» Distributed training of large scale exponential language mod...
OSDI 2008 (ACM)
DryadLINQ: A System for General-Purpose Distributed Data-Parallel Computing Using a High-Level Language
DryadLINQ is a system and a set of language extensions that enable a new programming model for large scale distributed computing. It generalizes previous execution environments su...
Yuan Yu, Michael Isard, Dennis Fetterly, Mihai Bud...
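As a rough illustration of the query style the abstract describes (not DryadLINQ itself, which extends .NET/LINQ), the Python sketch below shows the kind of declarative map/group/reduce pipeline that such systems compile into a distributed dataflow plan. The partitioned collection and the two helper functions are hypothetical stand-ins, and execution here is local and sequential.

```python
# Minimal sketch (not DryadLINQ): a word count written as a declarative
# chain of relational-style operators over a "partitioned" collection.
# DryadLINQ-style systems take queries of this shape and compile them into
# a distributed dataflow; here the partitions are local lists, for
# illustration only.
from itertools import chain, groupby

def flat_map(fn, partitions):
    """Apply fn to every record in every partition, flattening the results."""
    return [list(chain.from_iterable(fn(rec) for rec in part)) for part in partitions]

def group_and_reduce(partitions):
    """Local stand-in for a GroupBy + aggregate step across partitions."""
    records = sorted(chain.from_iterable(partitions))
    return {key: sum(1 for _ in grp) for key, grp in groupby(records)}

# Two "partitions" of input text, as a distributed store would hold them.
docs = [["the quick brown fox", "the lazy dog"],
        ["the fox jumps over the dog"]]

words = flat_map(lambda line: line.split(), docs)
counts = group_and_reduce(words)
print(counts)  # {'brown': 1, 'dog': 2, 'fox': 2, ..., 'the': 4}
```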
ACL 2010
Towards Open-Domain Semantic Role Labeling
Current Semantic Role Labeling technologies are based on inductive algorithms trained over large scale repositories of annotated examples. Frame-based systems currently make use o...
Danilo Croce, Cristina Giannone, Paolo Annesi, Rob...
IPPS 2009 (IEEE)
Scalable RDMA performance in PGAS languages
Partitioned Global Address Space (PGAS) languages provide a unique programming model that can span shared-memory multiprocessor (SMP) architectures, distributed memory machines, o...
Montse Farreras, George Almási, Calin Casca...
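As a rough illustration of the one-sided put/get access pattern that PGAS languages map onto RDMA hardware, here is a minimal sketch using mpi4py's one-sided communication (MPI windows). It assumes mpi4py and NumPy are installed and that the script is launched with at least two ranks; it is not the runtime described in the paper.

```python
# Minimal sketch: run under an MPI launcher, e.g. `mpiexec -n 2 python pgas_sketch.py`.
# Each rank exposes a local buffer as a window into a shared "global address
# space"; rank 0 writes directly into rank 1's memory without rank 1 calling
# a matching receive, which is the one-sided style PGAS runtimes build on.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

local = np.zeros(4, dtype='i')            # this rank's share of the global space
win = MPI.Win.Create(local, comm=comm)

win.Fence()                                # open an access epoch on all ranks
if rank == 0:
    payload = np.arange(4, dtype='i')
    win.Put(payload, 1)                    # write into rank 1's buffer
win.Fence()                                # close the epoch; the put is now visible

if rank == 1:
    print("rank 1 sees:", local)           # -> [0 1 2 3]
win.Free()
```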
EMNLP 2010
Efficient Graph-Based Semi-Supervised Learning of Structured Tagging Models
We describe a new scalable algorithm for semi-supervised training of conditional random fields (CRF) and its application to part-of-speech (POS) tagging. The algorithm uses a simil...
Amarnag Subramanya, Slav Petrov, Fernando Pereira
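For readers unfamiliar with the graph-based family the abstract refers to, here is a toy label-propagation sketch in Python/NumPy: labeled seed nodes spread their label distributions over a similarity graph to unlabeled nodes. The graph, seeds, and update rule are illustrative assumptions, not the paper's algorithm.

```python
# Toy label propagation over a similarity graph (illustrative only).
# Nodes are items to tag, edges carry similarity weights, and labeled
# seed nodes spread their distributions to unlabeled neighbours.
import numpy as np

# Symmetric similarity graph over 4 nodes (e.g. word types or n-grams).
W = np.array([[0., 1., 0., 0.],
              [1., 0., 1., 0.],
              [0., 1., 0., 1.],
              [0., 0., 1., 0.]])

n_labels = 2
q = np.full((4, n_labels), 1.0 / n_labels)   # label distributions, start uniform
seeds = {0: 0, 3: 1}                         # node -> observed label (supervision)

for _ in range(50):
    # Each node takes the similarity-weighted average of its neighbours' labels.
    q = W @ q / W.sum(axis=1, keepdims=True)
    for node, label in seeds.items():        # clamp the labeled seeds
        q[node] = np.eye(n_labels)[label]

print(q.round(2))   # unlabeled nodes 1 and 2 drift toward their nearest seeds
```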
KDD 2008 (ACM)
Training structural svms with kernels using sampled cuts
Discriminative training for structured outputs has found increasing applications in areas such as natural language processing, bioinformatics, information retrieval, and computer ...
Chun-Nam John Yu, Thorsten Joachims
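To make the training setting concrete, here is a hedged Python/NumPy sketch of discriminative structured-output training with a hinge loss and loss-augmented inference. The joint feature map, toy data, and subgradient update are illustrative assumptions; they do not reproduce the paper's kernelized cutting-plane method with sampled cuts.

```python
# Illustrative sketch of discriminative training with a structured hinge loss.
# The toy "structure" is just a class label with a joint feature map phi(x, y);
# the key step is loss-augmented inference: find the output that most violates
# the margin, then step toward the correct output's features and away from the
# violator's.
import numpy as np

def phi(x, y, n_classes):
    """Joint feature map: x copied into the block indexed by class y."""
    f = np.zeros(n_classes * x.size)
    f[y * x.size:(y + 1) * x.size] = x
    return f

rng = np.random.default_rng(0)
n_classes, dim = 3, 4
means = rng.normal(scale=3.0, size=(n_classes, dim))   # toy, well-separated classes
y_all = rng.integers(n_classes, size=60)
X = means[y_all] + rng.normal(size=(60, dim))

w = np.zeros(n_classes * dim)
lr, lam = 0.1, 0.01

for epoch in range(20):
    for x, y in zip(X, y_all):
        # Loss-augmented inference: score + 0/1 loss, maximised over outputs.
        scores = [w @ phi(x, k, n_classes) + (k != y) for k in range(n_classes)]
        y_hat = int(np.argmax(scores))
        if y_hat != y:                               # margin violated -> update
            w += lr * (phi(x, y, n_classes) - phi(x, y_hat, n_classes))
        w *= (1 - lr * lam)                          # L2 regularization shrinkage

acc = np.mean([np.argmax([w @ phi(x, k, n_classes) for k in range(n_classes)]) == y
               for x, y in zip(X, y_all)])
print(f"training accuracy: {acc:.2f}")
```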