Service robots will have to accomplish more and more complex, open-ended tasks and regularly acquire new skills. In this work, we propose a new approach to generating plans for su...
Crawling the web is deceptively simple: the basic algorithm is (a) Fetch a page (b) Parse it to extract all linked URLs (c) For all the URLs not seen before, repeat (a)–(c). Howev...
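
A minimal sketch of that basic fetch/parse/repeat loop in Python, for illustration only; the seed URL, the requests and BeautifulSoup dependencies, and the page limit are assumptions, not part of the original description:

    import requests
    from urllib.parse import urljoin, urlparse
    from bs4 import BeautifulSoup

    def crawl(seed_url, max_pages=100):
        """Breadth-first crawl: fetch, extract links, enqueue unseen URLs."""
        seen = {seed_url}          # URLs already discovered
        frontier = [seed_url]      # URLs waiting to be fetched
        while frontier and len(seen) < max_pages:
            url = frontier.pop(0)
            try:
                response = requests.get(url, timeout=10)   # (a) fetch the page
            except requests.RequestException:
                continue                                   # skip unreachable pages
            soup = BeautifulSoup(response.text, "html.parser")
            for anchor in soup.find_all("a", href=True):   # (b) parse out linked URLs
                link = urljoin(url, anchor["href"])
                if urlparse(link).scheme in ("http", "https") and link not in seen:
                    seen.add(link)                         # (c) repeat for unseen URLs
                    frontier.append(link)
        return seen

In practice this loop is only the starting point; politeness delays, robots.txt handling, deduplication of near-identical pages, and fault tolerance are what make real crawlers hard.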
Ontologies represent the next important phase of the World Wide Web, creating a semantic web which links together disparate pieces of information and knowledge. Creating ontologie...
The current World Wide Web is essentially a network of documents, a continuously evolving information universe. But the potential of internetworking goes far beyond information ac...
The aim of this paper is to examine the domain of World Wide Web site development and propose a methodology to assist with this process. Methodologies have both their proselytizers...