Sciweavers

23 search results - page 1 / 5
Search query: Knowledge Decay in a Normalised Knowledge Base

DEXA 2000 (Springer)
Knowledge Decay in a Normalised Knowledge Base
Knowledge ‘decay’ is a measure of the degradation of knowledge integrity. In a unified knowledge representation, data, information and knowledge are all represented in a single...
John K. Debenham

KCAP 2003 (ACM)
Modularisation of domain ontologies implemented in description logics and related formalisms including OWL
Modularity is a key requirement for large ontologies in order to achieve re-use, maintainability, and evolution. Mechanisms for ‘normalisation’ to achieve analogous aims are s...
Alan L. Rector

SIGMOD 2001 (ACM)
Preservation of Digital Data with Self-Validating, Self-Instantiating Knowledge-Based Archives
Digital archives are dedicated to the long-term preservation of electronic information and have the mandate to enable sustained access despite rapid technology changes. Persistent...
Bertram Ludäscher, Richard Marciano, Reagan Moore

ISLPED 2005 (ACM)
A simple mechanism to adapt leakage-control policies to temperature
Leakage power reduction in cache memories continues to be a critical area of research because of the promise of a significant pay-off. Various techniques have been developed so fa...
Stefanos Kaxiras, Polychronis Xekalakis, Georgios ...

KDD 2007 (ACM)
Density-based clustering for real-time stream data
Existing data-stream clustering algorithms such as CluStream are based on k-means. These clustering algorithms are incompetent to find clusters of arbitrary shapes and cannot hand...
Yixin Chen, Li Tu