epiC: an extensible and scalable system for processing Big Data

The Big Data problem is characterized by the so-called 3V features: Volume (a huge amount of data), Velocity (a high data-ingestion rate), and Variety (a mix of structured, semi-structured, and unstructured data). State-of-the-art solutions to the Big Data problem are largely based on the MapReduce framework (or its open-source implementation, Hadoop). Although Hadoop handles the data-volume challenge successfully, it does not deal well with data variety: its programming interfaces and the associated data processing model are inconvenient and inefficient for handling structured data and graph data. This paper presents epiC, an extensible system that tackles Big Data's data-variety challenge. epiC introduces a general Actor-like concurrent programming model, independent of any particular data processing model, for specifying parallel computations. Users process multi-structured datasets with appropriate epiC extensions, the implementation of a data processing model best...
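To make the Actor-like model concrete, here is a minimal sketch of independent computation units that communicate only by message passing, in the spirit described above. All names here (Unit, Message, run_epoch) are hypothetical illustrations, not epiC's actual API.

```python
# Sketch of an Actor-like concurrent programming model: units hold local
# state and interact solely through messages delivered between epochs.
# These class and function names are hypothetical, not epiC's real interface.
from collections import defaultdict

class Message:
    def __init__(self, recipient, payload):
        self.recipient = recipient
        self.payload = payload

class Unit:
    """An independent computation unit: consumes its inbox, may emit messages."""
    def __init__(self, name):
        self.name = name
        self.state = 0

    def process(self, inbox):
        # Toy logic: fold incoming payloads into local state and emit nothing.
        # A real unit would produce new Messages addressed to other units.
        self.state += sum(inbox)
        return []

def run_epoch(units, messages):
    """Deliver queued messages to their recipients, collect outgoing messages."""
    inboxes = defaultdict(list)
    for m in messages:
        inboxes[m.recipient].append(m.payload)
    out = []
    for u in units:
        out.extend(u.process(inboxes[u.name]))
    return out

units = [Unit("mapper"), Unit("reducer")]
pending = [Message("mapper", 3), Message("mapper", 4), Message("reducer", 10)]
pending = run_epoch(units, pending)
# units[0].state == 7, units[1].state == 10, pending == []
```

Because units share no state and communicate only via messages, different data processing models (MapReduce-style, graph-style, relational) can in principle be layered on top of the same runtime, which is the extensibility argument the abstract makes.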
Added: 11 Apr 2016
Updated: 11 Apr 2016
Type: Journal
Year: 2016
Where: VLDB
Authors: Dawei Jiang, Sai Wu, Gang Chen, Beng Chin Ooi, Kian-Lee Tan, Jun Xu