Sciweavers

WWW 2010 (ACM)
Not so creepy crawler: easy crawler generation with standard xml queries
Web crawlers are increasingly used for focused tasks such as the extraction of data from Wikipedia or the analysis of social networks like last.fm. In these cases, pages are far m...
Franziska von dem Bussche, Klara A. Weiand, Benedi...
SOQUA 2004
Measuring Component Performance Using A Systematic Approach and Environment
As more third-party software components become available in the commercial market, more people are beginning to use the component-based software engineering approach to developing component...
Jerry Gao, Chandra S. Ravi, Raquel Espinoza
KBSE 2005 (IEEE)
Designing and implementing a family of intrusion detection systems
Intrusion detection systems are distributed applications that analyze events in a networked system to identify malicious behavior. The analysis is performed using a number of ...
Richard A. Kemmerer
ICML 1999 (IEEE)
Learning Hierarchical Performance Knowledge by Observation
Developing automated agents that intelligently perform complex real world tasks is time consuming and expensive. The most expensive part of developing these intelligent task perfo...
Michael van Lent, John E. Laird
ICDM 2008 (IEEE)
xCrawl: A High-Recall Crawling Method for Web Mining
Web mining systems exploit the redundancy of data published on the Web to automatically extract information from existing web documents. The first step in the Information Extract...
Kostyantyn M. Shchekotykhin, Dietmar Jannach, Gerh...