We study the randomized version of a computation model (introduced in [9, 10]) that restricts random access to external memory and internal memory space. Essentially, this model c...
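As an illustration of the kind of restriction such a model captures (a hypothetical sketch, not the model of [9, 10]): some predicates over large external data, such as checking that a file is sorted, can be decided with one sequential scan and constant internal memory, with no random access at all. The function name `is_sorted_stream` and the simulated input are assumptions made for this example.

```python
# Hypothetical illustration: deciding sortedness of a large external file with
# a single sequential scan and O(1) internal memory, i.e. no random access
# into the data and only constant working space.

def is_sorted_stream(lines):
    """Return True if the numeric values arrive in non-decreasing order."""
    previous = None
    for line in lines:
        value = float(line)
        if previous is not None and value < previous:
            return False        # out-of-order pair found; no rescan needed
        previous = value        # keep only the last value in internal memory
    return True

if __name__ == "__main__":
    # Simulate an external sequential source with an in-memory iterator.
    print(is_sorted_stream(["1", "2", "2.5", "7"]))   # True
    print(is_sorted_stream(["3", "1", "4"]))          # False
```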
Feature selection is fundamental to knowledge discovery from massive amounts of high-dimensional data. In an effort to establish theoretical justification for feature selection al...
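As a generic illustration of filter-style feature selection (not the specific algorithm analysed in the paper): features are scored individually for relevance to the class label and the top-k are retained. The helper `select_top_k`, the synthetic data, and the use of scikit-learn's `mutual_info_classif` are assumptions made for this sketch.

```python
# Generic filter-style feature selection sketch (illustrative only): rank
# features by mutual information with the class label and keep the k best.

import numpy as np
from sklearn.feature_selection import mutual_info_classif

def select_top_k(X, y, k):
    """Return the column indices of the k features most informative about y."""
    scores = mutual_info_classif(X, y, random_state=0)
    return np.argsort(scores)[::-1][:k]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=200)
    noise = rng.normal(size=(200, 9))
    signal = y.reshape(-1, 1) + 0.1 * rng.normal(size=(200, 1))
    X = np.hstack([noise, signal])          # column 9 carries the signal
    print(select_top_k(X, y, k=2))          # feature index 9 should rank first
```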
This paper presents a case study of a web-based on-line collaborative academic writing project by students of English as a Foreign Language (EFL) at two Higher Education instituti...
A large amount of the world's data is both sequential and imprecise. Such data is commonly modeled as Markovian streams; examples include words/sentences inferred f...
Julie Letchner, Christopher Ré, Magdalena Balazins...
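A minimal sketch, assuming one common representation of a Markovian stream (a sequence of correlated distributions over states, encoded here by an initial marginal and per-step transition matrices); this is illustrative and not the paper's formal definition or API. The state names and probabilities are invented.

```python
# Minimal sketch of a Markovian stream: correlated uncertainty over a
# sequence, here an imprecise two-word transcription.

import numpy as np

states = ["hello", "yellow"]                     # candidate words per timestep

initial = np.array([0.7, 0.3])                   # P(state_0)
transitions = [np.array([[0.9, 0.1],
                         [0.2, 0.8]])]           # P(state_1 | state_0)

def sequence_probability(path):
    """Probability of one concrete reading of the imprecise sequence."""
    i = states.index(path[0])
    p = initial[i]
    for t in range(1, len(path)):
        j = states.index(path[t])
        p *= transitions[t - 1][i, j]
        i = j
    return p

print(sequence_probability(["hello", "hello"]))   # 0.7 * 0.9 = 0.63
print(sequence_probability(["yellow", "hello"]))  # 0.3 * 0.2 = 0.06
```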
We review a method of generating logical rules, or axioms, from empirical data. This method, using closed set properties of formal concept analysis, has been previously described ...
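A toy sketch of the closed-set idea behind such rule generation, under an assumed binary object-attribute context (the `context` data and the `closure` helper are invented for illustration): an implication A → B holds in the data exactly when B is contained in the closure of A.

```python
# Toy sketch of deriving an implication from a formal context via attribute
# closure (an assumed illustration of the closed-set idea, not the cited
# procedure).

# Binary context: object -> set of attributes it possesses.
context = {
    "o1": {"winged", "feathered", "flies"},
    "o2": {"winged", "feathered", "flies"},
    "o3": {"winged", "scaled"},
}

def closure(attrs):
    """Attributes shared by every object that has all attributes in `attrs`."""
    extent = [obj for obj, a in context.items() if attrs <= a]
    if not extent:                       # no supporting objects: closure is everything
        return set().union(*context.values())
    return set.intersection(*(context[obj] for obj in extent))

premise = {"feathered"}
print(closure(premise) - premise)
# -> {'winged', 'flies'} (set order may vary): the rule
# "feathered -> winged, flies" holds in this context.
```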