Background: Normalization is the process of removing non-biological sources of variation between array experiments. Recent investigations of data in gene expression databases for ...
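As an illustration of what such normalization can look like in practice, the sketch below applies quantile normalization, one common way to remove array-wide, non-biological intensity differences, to a toy genes-by-arrays matrix. This is not necessarily the method investigated in the work above; the data and the function name are purely illustrative.

```python
import numpy as np

def quantile_normalize(expr):
    """Quantile-normalize a genes x arrays expression matrix.

    Forces every array (column) onto the same empirical distribution,
    a common correction for array-wide, non-biological variation.
    """
    # Rank each value within its column, then replace each rank with
    # the mean of the values holding that rank across all arrays.
    ranks = np.argsort(np.argsort(expr, axis=0), axis=0)
    sorted_cols = np.sort(expr, axis=0)
    rank_means = sorted_cols.mean(axis=1)
    return rank_means[ranks]

# Toy example: 4 genes measured on 3 arrays with different scales.
expr = np.array([[5.0, 4.0, 3.0],
                 [2.0, 1.0, 4.0],
                 [3.0, 4.0, 6.0],
                 [4.0, 2.0, 8.0]])
print(quantile_normalize(expr))
```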
Timothy Lu, Christine M. Costello, Peter J. P. Cro...
In this paper, we focus on the design of Markov Chain Monte Carlo techniques in a statistical registration framework based on a finite element (FE) basis. Due to the use of the FE basis...
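For readers unfamiliar with the sampling side, the following is a minimal random-walk Metropolis sketch over a generic parameter vector (for instance, the coefficients of a small basis expansion). The target density, step size, and names are stand-ins and do not represent the registration sampler developed in the paper.

```python
import numpy as np

def metropolis_hastings(log_post, init, n_iter=5000, step=0.1, rng=None):
    """Random-walk Metropolis sampler over a parameter vector.

    log_post : callable returning the (unnormalized) log posterior,
               e.g. over basis coefficients of a deformation field.
    init     : starting parameter vector.
    step     : standard deviation of the Gaussian proposal.
    """
    rng = rng or np.random.default_rng(0)
    theta = np.asarray(init, dtype=float)
    current_lp = log_post(theta)
    samples = []
    for _ in range(n_iter):
        proposal = theta + step * rng.standard_normal(theta.shape)
        proposal_lp = log_post(proposal)
        # Accept with probability min(1, posterior ratio).
        if np.log(rng.random()) < proposal_lp - current_lp:
            theta, current_lp = proposal, proposal_lp
        samples.append(theta.copy())
    return np.array(samples)

# Toy target: independent standard-normal "coefficients".
log_post = lambda t: -0.5 * np.sum(t ** 2)
chain = metropolis_hastings(log_post, init=np.zeros(4), step=0.5)
print(chain.mean(axis=0), chain.std(axis=0))
```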
It is becoming increasingly common to construct network services using redundant resources geographically distributed across the Internet. Content Distribution Networks are a prim...
Data availability, collection and storage have increased dramatically in recent years, raising new technological and algorithmic challenges for database design and data management...
In self-adaptive systems, metadata about resources in the system (e.g., services, nodes) must be dynamically published, updated, and removed. Current middleware approaches use sta...
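A minimal sketch of the kind of dynamic publish/update/remove behaviour described above, assuming a hypothetical in-memory registry whose entries expire unless refreshed; the class and method names (`MetadataRegistry`, `publish`, `update`, `remove`, `lookup`) are illustrative and do not correspond to any particular middleware API.

```python
import time

class MetadataRegistry:
    """Toy in-memory registry for dynamically published resource metadata.

    Entries expire after `ttl` seconds unless re-published or updated,
    so stale descriptions of services or nodes disappear automatically.
    """

    def __init__(self, ttl=30.0):
        self.ttl = ttl
        self._entries = {}  # resource_id -> (metadata dict, expiry time)

    def publish(self, resource_id, metadata):
        self._entries[resource_id] = (dict(metadata), time.time() + self.ttl)

    def update(self, resource_id, **changes):
        metadata, _ = self._entries[resource_id]
        metadata.update(changes)
        self._entries[resource_id] = (metadata, time.time() + self.ttl)

    def remove(self, resource_id):
        self._entries.pop(resource_id, None)

    def lookup(self, resource_id):
        entry = self._entries.get(resource_id)
        if entry is None or entry[1] < time.time():
            self._entries.pop(resource_id, None)  # drop expired entries lazily
            return None
        return entry[0]

registry = MetadataRegistry(ttl=5.0)
registry.publish("node-42", {"cpu_load": 0.2, "services": ["search"]})
registry.update("node-42", cpu_load=0.8)
print(registry.lookup("node-42"))
registry.remove("node-42")
```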