GCC 2005, Springer

Parallel Web Spiders for Cooperative Information Gathering

A web spider is a widely used means of gathering information for search engines. As the Web grows, parallelizing the spider’s crawling process becomes a natural choice. This paper presents a parallel web spider model, based on a multi-agent system, for cooperative information gathering. It uses a dynamic assignment mechanism to eliminate the redundant page downloads that parallelization can cause. Experiments show that the parallel spider improves information-gathering performance at an acceptable interaction cost for coordination. This approach offers a novel perspective on the next generation of advanced search engines.
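The dynamic assignment idea in the abstract can be illustrated with a minimal sketch: a coordinator claims each URL exactly once under a lock before handing it to any spider, so parallel spiders never download the same page twice. The names here (`FAKE_WEB`, `Coordinator`, `spider`) and the in-memory link graph standing in for real HTTP fetches are illustrative assumptions, not the paper’s actual implementation.

```python
import threading
import queue

# Hypothetical in-memory "web": page -> outgoing links (stands in for HTTP fetches).
FAKE_WEB = {
    "a": ["b", "c"],
    "b": ["c", "d"],
    "c": ["d"],
    "d": [],
}

class Coordinator:
    """Dynamically assigns URLs to spiders, skipping URLs already claimed.

    A rough stand-in for the paper's dynamic assignment mechanism that
    keeps parallel spiders from fetching redundant pages.
    """
    def __init__(self, seeds):
        self.frontier = queue.Queue()
        self.seen = set()
        self.lock = threading.Lock()
        for url in seeds:
            self.submit(url)

    def submit(self, url):
        # Claim the URL under a lock so only one spider ever receives it.
        with self.lock:
            if url not in self.seen:
                self.seen.add(url)
                self.frontier.put(url)

def spider(coordinator, crawled):
    while True:
        try:
            url = coordinator.frontier.get(timeout=0.2)
        except queue.Empty:
            return  # frontier drained: this spider stops
        crawled.append(url)          # "download" the page
        for link in FAKE_WEB.get(url, []):
            coordinator.submit(link)  # discovered links go back to the coordinator

coordinator = Coordinator(seeds=["a"])
crawled = []
threads = [threading.Thread(target=spider, args=(coordinator, crawled)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Every reachable page is fetched exactly once despite four parallel spiders.
print(sorted(crawled))
```

Because claiming and enqueuing happen atomically in `submit`, deduplication cost is one lock acquisition per discovered link, which is the kind of bounded interaction overhead the abstract refers to.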
Jiewen Luo, Zhongzhi Shi, Maoguang Wang, Wei Wang
Type: Conference
Year: 2005
Where: GCC