We propose a new algorithm for verifying concurrent programs, which uses concrete executions to partition the program into a set of lean partitions called concurrent trace program...
This paper proposes a random Web crawl model. A Web crawl is a (biased and partial) image of the Web. This paper deals with the hyperlink structure, i.e. a Web crawl is a graph, w...
In an environment of distributed text collections, the first step in the information retrieval process is to identify which of the available collections are most relevant to a giv...
The presence of replicas or near-replicas of documents is very common on the Web. Documents may be replicated completely or partially for different reasons (versions, mirrors, etc...
Ernesto Di Iorio, Michelangelo Diligenti, Marco Go...
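As a generic illustration only (not the method described in this entry), near-replica detection is often sketched with w-shingling and Jaccard similarity; the function names, shingle size, and threshold below are arbitrary choices for the sketch.

```python
# Minimal sketch of near-replica detection via w-shingling and Jaccard
# similarity. This is a generic technique, not necessarily the approach
# of the paper above; shingle size and threshold are illustrative values.

def shingles(text, w=4):
    """Return the set of w-word shingles of a document."""
    words = text.lower().split()
    return {tuple(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}

def jaccard(a, b):
    """Jaccard similarity of two shingle sets."""
    return len(a & b) / len(a | b) if a | b else 1.0

def near_replicas(doc1, doc2, threshold=0.7):
    """Flag two documents as near-replicas if their shingle sets overlap heavily."""
    return jaccard(shingles(doc1), shingles(doc2)) >= threshold

if __name__ == "__main__":
    a = "the quick brown fox jumps over the lazy dog today"
    b = "the quick brown fox jumps over the lazy dog tonight"
    print(near_replicas(a, b))  # True: the documents differ only in the last word
```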
We present a framework that unifies unit testing and runtime verification (as well as static verification and static debugging). A key contribution of our overall approac...
Edison Mera, Manuel V. Hermenegildo, Pedro Ló...
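As a hedged, generic sketch (not necessarily this framework's actual mechanism), one common way to illustrate unifying unit testing and runtime verification is to write a property once and check it both as a runtime assertion on every call and as a unit-test oracle; the names check_postcondition, is_sorted_permutation, and sorted_copy are invented for the example.

```python
# Generic sketch: one property serves both as a runtime check (via a
# decorator asserting the postcondition on each call) and as a unit-test
# oracle. All names here are illustrative, not taken from the paper above.
import functools
import unittest

def check_postcondition(prop):
    """Wrap a function so that prop(args, result) is asserted at runtime."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            assert prop(args, result), f"postcondition failed for {fn.__name__}"
            return result
        return wrapper
    return decorator

def is_sorted_permutation(args, result):
    """Property: the result is a sorted permutation of the single input list."""
    (xs,) = args
    return sorted(xs) == result

@check_postcondition(is_sorted_permutation)
def sorted_copy(xs):
    return sorted(xs)

class TestSortedCopy(unittest.TestCase):
    def test_property_reused_as_unit_test(self):
        xs = [3, 1, 2]
        # The same property that guards runtime calls is the test oracle.
        self.assertTrue(is_sorted_permutation((xs,), sorted_copy(xs)))

if __name__ == "__main__":
    unittest.main()
```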