Sciweavers

Search: "Caching queues in memory buffers" (169 results, page 6 of 34)

SIGMOD 2004 (ACM)
Buffering Database Operations for Enhanced Instruction Cache Performance
As more and more query processing work can be done in main memory, memory access is becoming a significant cost component of database operations. Recent database research has show...
Jingren Zhou, Kenneth A. Ross

ICPP 2003 (IEEE)
Enabling Partial Cache Line Prefetching Through Data Compression
Hardware prefetching is a simple and effective technique for hiding cache miss latency and thus improving overall performance. However, it comes with the addition of prefetch buff...
Youtao Zhang, Rajiv Gupta

ISLPED 2003 (ACM)
Reducing data cache energy consumption via cached load/store queue
High-performance processors use a large set-associative L1 data cache with multiple ports. As clock speeds and sizes increase, such a cache consumes a significant percentage of t...
Dan Nicolaescu, Alexander V. Veidenbaum, Alexandru...

PADS 1998 (ACM)
A Nonblocking Algorithm for the Distributed Simulation of FCFS Queueing Networks with Irreducible Markovian Routing
In this paper we consider the distributed simulation of queueing networks of FCFS servers with infinite buffers and irreducible Markovian routing. We first show that for either t...
Manish Gupta 0007, Anurag Kumar

SIGCOMM 1995 (ACM)
Pipelined Memory Shared Buffer for VLSI Switches
Switch chips are building blocks for computer and communication systems. Switches need internal buffering because of output contention; shared buffering is known to perf...
Manolis Katevenis, Panagiota Vatsolaki, Aristides ...