Efficient Context Switching for the Stack Cache: Implementation and Analysis

Sahar Abbaspourseyedi, Florian Brandner, Amine Naji, Mathieu Jan

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Abstract

The design of tailored hardware has proven a successful strategy for reducing the timing-analysis overhead of (hard) real-time systems. The stack cache is an example of such a design: it has been shown to provide good average-case performance while remaining easy to analyze.

So far, however, the analysis of the stack cache was limited to individual tasks, ignoring aspects related to multitasking. A major drawback of the original stack cache design is that, due to its simplicity, it cannot hold the data of multiple tasks at the same time. Consequently, the entire cache content needs to be saved and restored when a task is preempted.

We propose (a) an analysis that exploits the simplicity of the stack cache to bound the overhead induced by task preemption and (b) an extension of the design that (partially) hides this overhead by virtualizing stack caches.
Original language: English
Title of host publication: Proceedings of the 23rd International Conference on Real Time and Networks Systems (RTNS 2015)
Publisher: Association for Computing Machinery
Publication date: 2015
Pages: 119-128
ISBN (Print): 978-1-4503-3591-1
DOIs
Publication status: Published - 2015
Event: 23rd International Conference on Real-Time Networks and Systems - Lille, France
Duration: 4 Nov 2015 - 6 Nov 2015
Conference number: 23

Conference

Conference: 23rd International Conference on Real-Time Networks and Systems
Number: 23
Country/Territory: France
City: Lille
Period: 04/11/2015 - 06/11/2015

Keywords

  • Program Analysis
  • Stack Cache
  • Cache-Related Preemption Delays
  • Real-Time Systems

