Sciweavers

12 search results - page 1 / 3
ICDE 2008, IEEE
DIPBench: An independent benchmark for Data-Intensive Integration Processes
— The integration of heterogeneous data sources is one of the main challenges within the area of data engineering. Due to the absence of an independent and universal benchmark fo...
Matthias Böhm, Dirk Habich, Wolfgang Lehner, ...
ICDE 2008, IEEE
DIPBench Toolsuite: A Framework for Benchmarking Integration Systems
So far, the optimization of integration processes between heterogeneous data sources remains an open challenge. A first step towards sufficient techniques was the specification of ...
Dirk Habich, Matthias Böhm, Uwe Wloka, Wolfga...
CLOUD 2010, ACM
Comet: batched stream processing for data intensive distributed computing
Batched stream processing is a new distributed data processing paradigm that models recurring batch computations on incrementally bulk-appended data streams. The model is inspired...
Bingsheng He, Mao Yang, Zhenyu Guo, Rishan Chen, B...
ICCTA 2007, IEEE
Register Sharing Verification During Data-Path Synthesis
The variables of the high-level specifications and the automatically generated temporary variables are mapped onto the data-path registers during the data-path synthesis phase of hig...
Chandan Karfa, Chittaranjan A. Mandal, Dipankar Sa...
ISCAS 2008, IEEE
Sigma-delta learning for super-resolution independent component analysis
— Many source separation algorithms fail to deliver robust performance in the presence of artifacts introduced by cross-channel redundancy, non-homogeneous mixing and high-dimensional...
Amin Fazel, Shantanu Chakrabartty