Sharing event data in optimistically scheduled multicast applications
Abstract
Publisher: Winter Simulation Conference