
Optix7Filter

Jacco Bikker edited this page Oct 21, 2019 · 3 revisions

The Optix7Filter core implements the spatio-temporal variance-guided filter (SVGF) proposed by Christoph Schied, combined with the diamond-square motion vector search for reprojection of speculars (todo: glossies) described by Victor Voorhuis in his master's thesis. SVGF applies an edge-avoiding A-Trous filter with an adaptive kernel size (steered by estimated variance) to the samples taken for the current frame, combined with reprojected samples from previous frames.

The filter uses a broad set of so-called feature buffers for input:

  • albedo: this buffer stores the color of the first (non-specular) surface the camera sees for each pixel. When rendering using a pinhole camera, this data is noise-free, even in a path tracer. Storing this data separately lets us filter the noisy illumination, without blurring detailed textures.
  • direct and indirect illumination: these are easily stored separately in a path tracer. Direct illumination typically has sharper details than indirect illumination, and should therefore be filtered with a smaller kernel.
  • normals and worldspace positions: filters work by averaging the illumination of nearby pixels. This should, however, not be done if the pixels belong to very different surfaces. Normals and worldspace positions (as well as material IDs) help identify these cases.
  • depth derivatives: Lighthouse2 also stores the rate at which depth changes from one pixel to the next, both horizontally and vertically. This helps adjust the filter kernel width, to prevent overblurring distant lighting details.

The buffers are constructed during execution of the shade kernel, which you can find in pathtracer.h.

To visualize these buffers, just press F4 when using the filtering core. This will launch kernel finalizeFilterDebug (at the end of finalize_shared.h) which shows a different feature buffer in each quadrant of the screen. The finalizeFilterDebug kernel code may be helpful to understand the contents of these buffers.

Filter flow

As in the other cores, the Optix7Filter core executes the shade kernel from rendercore.cpp. This happens in a loop, which implements the wavefront algorithm:

  1. Setup primary rays ('extension rays')
  2. Trace extension rays to find nearest intersections
  3. Evaluate material model and produce bounced rays and shadow rays
  4. Trace shadow rays
  5. For the bounced rays: go to step 2, until no rays remain.

After step 5 the feature buffers are ready to be finalized by the code in finalize_shared.h.

PrepareFilter

Based on the feature buffers and information from the previous frame, two additional buffers are filled. This happens in the prepareFilter kernel. The buffers are:

  • motion: this buffer contains estimated motion vectors, to be used for reprojection.
  • moments: first and second moments (i.e., the mean and the mean of squares) of the direct and indirect illumination, from which per-pixel variance is estimated.

The final A-Trous filter is then executed by repeatedly calling the applyFilter kernel.

Future work

The feature buffers produced by the path tracer can be used to feed other filters, such as an ANN filter. The SVGF filter implementation can be extended with motion vector calculation for glossy materials and dielectrics.
