Abstract
We present a practical method for temporal and stereoscopic filtering that generates stereo-consistent renderings. Existing methods for stereoscopic rendering often reuse samples from one eye for the other or average between the two eyes. These approaches fail in the presence of ray-tracing effects such as specular reflections and refractions. We derive a new blending strategy that leverages variance to compute per-pixel blending weights for both temporal and stereoscopic rendering. In the temporal domain, our method works well in a low-noise context and is robust in the presence of inconsistent motion vectors, where existing methods such as temporal anti-aliasing (TAA) and deep learning super sampling (DLSS) produce artifacts. In the stereoscopic domain, our method provides a new way to ensure consistency between the left and right eyes. The stereoscopic version of our method can be used with our new temporal method or with existing methods such as DLSS and TAA. In all combinations, it reduces the error and significantly increases the consistency between the eyes, making it practical for real-time settings such as virtual reality (VR).
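The abstract does not give the paper's exact weighting formula, but the general idea of variance-driven per-pixel blending can be illustrated with classic inverse-variance weighting: the noisier of two estimates (e.g., the current frame versus reprojected history, or one eye versus the other) receives the lower weight. The function and constants below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def variance_blend(est_a, var_a, est_b, var_b, eps=1e-8):
    """Blend two per-pixel estimates by inverse-variance weighting.

    Generic sketch only (NOT the paper's derived weights): each pixel's
    weight for estimate A is proportional to the variance of estimate B,
    so the lower-variance (less noisy) estimate dominates the blend.
    """
    w_a = var_b / (var_a + var_b + eps)  # per-pixel weight for estimate A
    return w_a * est_a + (1.0 - w_a) * est_b

# Toy example: a noisy current frame blended with a cleaner history buffer.
current = np.array([1.0, 0.0, 0.5])
history = np.array([0.9, 0.1, 0.5])
var_cur = np.full(3, 0.04)   # higher variance -> lower weight
var_hist = np.full(3, 0.01)  # lower variance -> dominates the blend
blended = variance_blend(current, var_cur, history, var_hist)
```

In this toy setup the history weight is 0.8, so the blend stays close to the cleaner history while still incorporating the current frame, which is the qualitative behavior a variance-driven scheme aims for.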
Original language | English |
---|---|
Title of host publication | Proceedings of Eurographics Symposium on Rendering 2023 |
Number of pages | 8 |
Publisher | Eurographics Association |
Publication date | 2023 |
ISBN (Print) | 978-3-03868-229-5 |
DOIs | |
Publication status | Published - 2023 |
Event | 34th Eurographics Symposium on Rendering, Delft, Netherlands, 28 Jun 2023 → 30 Jun 2023 |
Conference
Conference | 34th Eurographics Symposium on Rendering |
---|---|
Country/Territory | Netherlands |
City | Delft |
Period | 28/06/2023 → 30/06/2023 |