Practical Temporal and Stereoscopic Filtering for Real-time Ray Tracing

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review


Abstract

We present a practical method for temporal and stereoscopic filtering that generates stereo-consistent rendering. Existing methods for stereoscopic rendering often reuse samples from one eye for the other or average samples between the two eyes. These approaches fail in the presence of ray tracing effects such as specular reflections and refractions. We derive a new blending strategy that leverages variance to compute per-pixel blending weights for both temporal and stereoscopic rendering. In the temporal domain, our method works well in a low-noise context and is robust in the presence of inconsistent motion vectors, where existing methods such as temporal anti-aliasing (TAA) and deep learning super sampling (DLSS) produce artifacts. In the stereoscopic domain, our method provides a new way to ensure consistency between the left and right eyes. The stereoscopic version of our method can be used with our new temporal method or with existing methods such as DLSS and TAA. In all combinations, it reduces the error and significantly increases the consistency between the eyes, making it practical for real-time settings such as virtual reality (VR).
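The abstract describes computing per-pixel blending weights from variance. The paper's actual derivation is not given here; as a purely illustrative sketch of the underlying idea, the snippet below shows generic inverse-variance weighting of two per-pixel estimates (e.g., a noisy current frame and a history buffer), where the noisier estimate receives the smaller weight. All names and the weighting formula are assumptions for illustration, not the authors' method.

```python
import numpy as np

def variance_blend(a, var_a, b, var_b, eps=1e-8):
    """Blend two per-pixel estimates by inverse-variance weighting.

    Illustrative only: the weight on estimate `a` grows as the variance
    of `b` grows, so the less noisy estimate dominates the blend.
    """
    w = (var_b + eps) / (var_a + var_b + 2 * eps)  # per-pixel weight on `a`
    return w * a + (1.0 - w) * b

# Example: a noisy current frame blended toward a cleaner history buffer.
current = np.array([1.0, 2.0, 3.0])   # high-variance estimate
history = np.array([1.1, 1.9, 3.2])   # low-variance estimate
blended = variance_blend(current, np.full(3, 0.5), history, np.full(3, 0.1))
```

With equal variances the weight reduces to 0.5 (a plain average); as one estimate's variance shrinks, the blend converges to that estimate.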
Original language: English
Title of host publication: Proceedings of Eurographics Symposium on Rendering 2023
Number of pages: 8
Publisher: Eurographics Association
Publication date: 2023
ISBN (Print): 978-3-03868-229-5
DOIs
Publication status: Published - 2023
Event: 34th Eurographics Symposium on Rendering - Delft, Netherlands
Duration: 28 Jun 2023 - 30 Jun 2023

Conference

Conference: 34th Eurographics Symposium on Rendering
Country/Territory: Netherlands
City: Delft
Period: 28/06/2023 - 30/06/2023

