Sciweavers

284 search results - page 54 / 57
» Design and Implementation of a Document Database Extension
KDD 2010 (ACM)
Using data mining techniques to address critical information exchange needs in disaster affected public-private networks
Crisis Management and Disaster Recovery have gained immense importance in the wake of recent man-made and natural calamities. A critical problem in a crisis situation is how t...
Li Zheng, Chao Shen, Liang Tang, Tao Li, Steven Lu...
VLDB 2007 (ACM)
Executing Stream Joins on the Cell Processor
Low-latency and high-throughput processing are key requirements of data stream management systems (DSMSs). Hence, multi-core processors that provide high aggregate processing capa...
Bugra Gedik, Philip S. Yu, Rajesh Bordawekar
SIGMOD 2009 (ACM)
Optimizing complex extraction programs over evolving text data
Most information extraction (IE) approaches have considered only static text corpora, over which we apply IE only once. Many real-world text corpora, however, are dynamic. They evol...
Fei Chen 0002, Byron J. Gao, AnHai Doan, Jun Yang ...
EDBT 2010 (ACM)
DEDUCE: at the intersection of MapReduce and stream processing
MapReduce and stream processing are two emerging, but different, paradigms for analyzing, processing and making sense of large volumes of modern-day data. While MapReduce offers t...
Vibhore Kumar, Henrique Andrade, Bugra Gedik, Kun-...
BMCBI 2008
High-throughput bioinformatics with the Cyrille2 pipeline system
Background: Modern omics research involves the application of high-throughput technologies that generate vast volumes of data. These data need to be pre-processed, analyzed and in...
Mark W. E. J. Fiers, Ate van der Burgt, Erwin Date...