HPDC 2010, IEEE

Massive Semantic Web data compression with MapReduce

The Semantic Web consists of many billions of statements made of terms that are either URIs or literals. Since these terms usually consist of long character sequences, an effective compression technique is needed to reduce the data size and improve application performance. One of the best-known compression techniques is dictionary encoding. In this paper we propose a MapReduce algorithm that efficiently compresses and decompresses large amounts of Semantic Web data. We have implemented a prototype using the Hadoop framework and report an evaluation of its performance. The evaluation shows that our approach efficiently compresses large amounts of data and scales linearly with both the input size and the number of nodes.

Categories and Subject Descriptors: E.4 [Coding and Information Theory]: Data compaction and compression
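To illustrate the dictionary-encoding idea the abstract names: the map phase keys every occurrence of a term by its text so that identical terms meet at the same reducer, and the reduce phase assigns one numeric ID per distinct term and rewrites the statements with those IDs. The minimal single-machine Python sketch below mimics the map, shuffle, and reduce steps; the sample triples and helper names (e.g. map_terms) are invented for illustration, and this is not the paper's actual Hadoop implementation.

    from collections import defaultdict
    from itertools import groupby
    from operator import itemgetter

    # Toy input: RDF statements as (subject, predicate, object) term triples.
    # The sample data is invented; the paper targets billions of statements.
    statements = [
        ("http://example.org/alice", "http://xmlns.com/foaf/0.1/knows",
         "http://example.org/bob"),
        ("http://example.org/bob", "http://xmlns.com/foaf/0.1/name", '"Bob"'),
    ]

    # Map phase: key each occurrence of a term by its text, so all
    # occurrences of the same term meet at the same reducer.
    def map_terms(stmt_id, stmt):
        for position, term in enumerate(stmt):
            yield term, (stmt_id, position)

    # Shuffle: group mapped pairs by term (Hadoop does this between phases).
    mapped = sorted(
        (pair for i, s in enumerate(statements) for pair in map_terms(i, s)),
        key=itemgetter(0),
    )

    # Reduce phase: assign one numeric ID per distinct term (the dictionary)
    # and record (statement, position) -> ID to rebuild compressed triples.
    dictionary = {}
    encoded = defaultdict(lambda: [None, None, None])
    for next_id, (term, occurrences) in enumerate(
            groupby(mapped, key=itemgetter(0))):
        dictionary[term] = next_id
        for _, (stmt_id, position) in occurrences:
            encoded[stmt_id][position] = next_id

    print(dictionary)                                    # term -> ID table
    print([tuple(encoded[i]) for i in sorted(encoded)])  # compressed triples

Decompression is the inverse pass, which the paper also expresses as a MapReduce job: invert the term-to-ID table and substitute each numeric ID back with its original term.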
Jacopo Urbani, Jason Maassen, Henri E. Bal
Type Conference
Year 2010
Where HPDC
Authors Jacopo Urbani, Jason Maassen, Henri E. Bal