CORR 2012 (Springer)

Optimal Threshold Control by the Robots of Web Search Engines with Obsolescence of Documents

A typical web search engine consists of three principal parts: a crawling engine, an indexing engine, and a searching engine. The present work aims to optimize the performance of the crawling engine, which finds new web pages and updates the pages already in the search engine's database. The crawling engine has several robots collecting information from the Internet. We first calculate various performance measures of the system (e.g., the probability that an arbitrary page is lost due to buffer overflow, the probability of system starvation, and the average waiting time in the buffer). Intuitively, we would like to avoid system starvation and at the same time minimize information loss. We formulate the problem as a multi-criteria optimization problem and, attributing a weight to each criterion, solve it in the class of threshold policies. We consider a very general web page arrival process modeled by a Batch Marked Markov Arrival Process and a very general service t...
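As a rough illustration of the weighted threshold trade-off sketched in the abstract, the toy simulation below models a single crawler robot as a simple M/M/1/K queue under a hypothetical N-type threshold policy: the robot sleeps when its buffer empties (starvation) and wakes only once `threshold` pages have accumulated; pages arriving to a full buffer are lost. This is only an assumed stand-in for the paper's far more general model (Batch Marked Markov Arrival Process, general service times); all parameter names, rates, and weights here are illustrative.

```python
import random

def simulate(threshold, buffer_size=20, lam=1.0, mu=1.2,
             horizon=50_000.0, seed=42):
    """Toy M/M/1/K crawler queue under an N-type threshold policy.

    Returns (page-loss probability due to buffer overflow,
             fraction of time the robot is starved / idle)."""
    rng = random.Random(seed)
    q, serving = 0, False
    arrivals, lost = 0, 0
    t, idle_time = 0.0, 0.0
    while t < horizon:
        rate = lam + (mu if serving else 0.0)
        dt = rng.expovariate(rate)      # time to the next event
        t += dt
        if not serving:
            idle_time += dt
        if rng.random() < lam / rate:   # event is a page arrival
            arrivals += 1
            if q >= buffer_size:
                lost += 1               # buffer overflow: page lost
            else:
                q += 1
                if q >= threshold:
                    serving = True      # wake the robot
        else:                           # robot finishes crawling a page
            q -= 1
            if q == 0:
                serving = False         # buffer empty: robot starves
    return lost / arrivals, idle_time / t

def weighted_cost(threshold, w_loss=10.0, w_starve=1.0):
    """Scalar objective: weighted sum of the two competing criteria."""
    p_loss, p_starve = simulate(threshold)
    return w_loss * p_loss + w_starve * p_starve

# Search the class of threshold policies for the best threshold.
best = min(range(1, 21), key=weighted_cost)
```

Sweeping the threshold and the weights shows how the weighted objective balances page loss against starvation, which is the flavor of the multi-criteria optimization the paper carries out analytically for its much richer arrival and service model.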
Added: 20 Apr 2012
Updated: 20 Apr 2012
Type: Journal
Year: 2012
Where: CORR
Authors: Konstantin Avrachenkov, Alexander N. Dudin, Valentina I. Klimenok, Philippe Nain, Olga V. Semenova