Me

I've Moved On

After 5 rewarding years at Columbia, I
have completed my PhD and moved on.

It really has been a tremendous privilege
to be at Columbia University and work
with the
amazing faculty and students. 

I will keep this page here as long as
Columbia keeps it here, but some of the
information may become out of date.


Please visit my new website at
http://www.ryanoverbeck.com.

Cheers!
-ryan
January 24, 2010

Ryan Overbeck

PhD

Graduated: August 2009

Computer Graphics Group
Computer Science Department
Columbia University

Advisor: Ravi Ramamoorthi

Publications


2009

Adaptive Wavelet Rendering

Overbeck, Ryan S.; Donner, Craig; Ramamoorthi, Ravi.
SIGGRAPH Asia 2009.  [project]

Abstract:  Effects such as depth of field, area lighting, antialiasing and global illumination require evaluating a complex high-dimensional integral at each pixel of an image.  We develop a new adaptive rendering algorithm that greatly reduces the number of samples needed for Monte Carlo integration. Our method renders directly into an image-space wavelet basis.  First, we adaptively distribute Monte Carlo samples to reduce the variance of the wavelet basis' scale coefficients, while using the wavelet coefficients to find edges.  Working in wavelets, rather than pixels, allows us to sample not only image-space edges but also other features that are smooth in the image plane but have high variance in other integral dimensions.  In the second stage, we reconstruct the image from these samples by using a suitable wavelet approximation.  We achieve this by subtracting an estimate of the error in each wavelet coefficient from its magnitude, effectively producing the smoothest image consistent with the rendering samples.  Our algorithm renders scenes with significantly fewer samples than basic Monte Carlo or adaptive techniques.  Moreover, the method introduces minimal overhead, and can be efficiently included in an optimized ray-tracing system.
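
The reconstruction stage can be sketched in a heavily simplified form. Below is a toy, one-level 1D Haar version of the idea of shrinking each wavelet coefficient by an estimate of its own error; the function names and the simple error-propagation model are my own illustrative assumptions, not code from the paper, and the adaptive sampling stage is omitted entirely.

    import numpy as np

    def haar_forward(x):
        # One level of a 1D Haar transform: scale (smooth) and wavelet (detail) coefficients.
        scale = (x[0::2] + x[1::2]) / np.sqrt(2.0)
        detail = (x[0::2] - x[1::2]) / np.sqrt(2.0)
        return scale, detail

    def haar_inverse(scale, detail):
        x = np.empty(scale.size * 2)
        x[0::2] = (scale + detail) / np.sqrt(2.0)
        x[1::2] = (scale - detail) / np.sqrt(2.0)
        return x

    def reconstruct(pixel_mean, pixel_stderr):
        # Shrink each detail coefficient toward zero by an estimate of its own error,
        # giving the smoothest signal consistent with the noisy Monte Carlo estimates.
        scale, detail = haar_forward(pixel_mean)
        # Standard error of (a - b) / sqrt(2) for independent per-pixel errors.
        detail_err = np.sqrt(pixel_stderr[0::2]**2 + pixel_stderr[1::2]**2) / np.sqrt(2.0)
        shrunk = np.sign(detail) * np.maximum(np.abs(detail) - detail_err, 0.0)
        return haar_inverse(scale, shrunk)

    # Example: eight noisy pixel estimates with their per-pixel standard errors.
    smooth = reconstruct(np.array([0.10, 0.12, 0.50, 0.52, 0.90, 0.88, 0.91, 0.90]),
                         np.full(8, 0.02))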

2008

Large Ray Packets for Real-time Whitted Ray Tracing

Overbeck, Ryan; Ramamoorthi, Ravi; Mark, William R.
IEEE/EG Symposium on Interactive Ray Tracing 2008.  [pdf]

Abstract:  In this paper, we explore large ray packet algorithms for acceleration structure traversal and frustum culling in the context of Whitted ray tracing, and examine how these methods respond to varying ray packet size, scene complexity, and ray recursion complexity.  We offer a new algorithm for acceleration structure traversal which is robust to degrading coherence and a new method for generating frustum bounds around reflection and refraction ray packets.  We compare, adjust, and finally compose the most effective algorithms into a real-time Whitted ray tracer.  With the aid of multi-core CPU technology, our system renders complex scenes with reflections, refractions, and/or point-light shadows anywhere from 4--20 FPS.
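
One piece of this pipeline, culling an acceleration-structure node against the packet's bounding frustum, can be illustrated with a small sketch. This is not the traversal algorithm from the paper; the plane convention and names are assumptions, and the per-ray fallback used when coherence degrades is left out.

    import numpy as np

    def box_outside_plane(bmin, bmax, normal, offset):
        # "p-vertex" test: take the box corner farthest along the plane normal;
        # if even that corner is in the negative half-space, the whole box is.
        p = np.where(normal >= 0.0, bmax, bmin)
        return np.dot(normal, p) + offset < 0.0

    def frustum_rejects_node(frustum_planes, bmin, bmax):
        # Conservative cull: if the node's box lies entirely outside any bounding
        # plane of the packet frustum, no ray in the packet can hit the node.
        return any(box_outside_plane(bmin, bmax, n, d) for n, d in frustum_planes)

    # Example: a frustum bounded by the half-space x >= 1 rejects a box near the origin.
    planes = [(np.array([1.0, 0.0, 0.0]), -1.0)]
    print(frustum_rejects_node(planes, np.zeros(3), np.full(3, 0.5)))   # True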

2007

A Real-time Beam Tracer with Application to Exact Soft Shadows

Overbeck, Ryan; Ramamoorthi, Ravi; Mark, William R.
EGSR 2007.  [pdf][video]

Abstract:  Efficiently calculating accurate soft shadows cast by area light sources remains a difficult problem.  Ray tracing based approaches are subject to noise or banding, and most other accurate methods either scale poorly with scene geometry or place restrictions on geometry and/or light source size and shape.  Beam tracing is one solution which has historically been considered too slow and complicated for most practical rendering applications.  Beam tracing's performance has been hindered by complex geometry intersection tests, and a lack of good acceleration structures with efficient algorithms to traverse them.  We introduce fast new algorithms for beam tracing, specifically for beam--triangle intersection and beam--kd-tree traversal.  The result is a beam tracer capable of calculating precise primary visibility and point light shadows in real-time.  Moreover, beam tracing provides full area elements instead of point samples, which allows us to maintain coherence through to secondary effects and utilize the GPU for high quality antialiasing and shading with minimal extra cost.  More importantly, our analysis shows that beam tracing is particularly well suited to soft shadows from area lights, and we generate essentially exact noise-free soft shadows for complex scenes in seconds rather than minutes or hours.
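
As a rough illustration of one ingredient, a triangle can be classified against a beam's bounding side planes before any exact clipping is attempted. The plane convention and names below are assumptions, and the actual beam-triangle intersection and beam-kd-tree traversal in the paper are considerably more involved.

    import numpy as np

    def classify_triangle(beam_planes, tri):
        # Beam interior is where n . x + d >= 0 for every side plane (n, d).
        # Returns 'outside', 'inside', or 'straddling'. 'straddling' is conservative:
        # a triangle slipping past a beam corner may still be reported as straddling.
        fully_inside = True
        for n, d in beam_planes:
            dist = [float(np.dot(n, v)) + d for v in tri]
            if all(s < 0.0 for s in dist):
                return 'outside'        # every vertex is behind this one plane
            if any(s < 0.0 for s in dist):
                fully_inside = False    # the triangle crosses this plane
        return 'inside' if fully_inside else 'straddling'

    # Example: a beam bounded by the half-spaces x >= 0 and y >= 0.
    planes = [(np.array([1.0, 0.0, 0.0]), 0.0), (np.array([0.0, 1.0, 0.0]), 0.0)]
    tri = [np.array([1.0, 1.0, 5.0]), np.array([2.0, 1.0, 5.0]), np.array([1.0, 2.0, 5.0])]
    print(classify_triangle(planes, tri))   # 'inside'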

2006

Real-Time BRDF Editing in Complex Lighting

Ben-Artzi, Aner; Overbeck, Ryan; Ramamoorthi, Ravi.
SIGGRAPH 2006.  [pdf] [video]

Abstract: Current systems for editing BRDFs typically allow users to adjust analytic parameters while visualizing the results in a simplified setting (e.g. unshadowed point light). This paper describes a real-time rendering system that enables interactive edits of BRDFs, as rendered in their final placement on objects in a static scene, lit by direct, complex illumination. All-frequency effects (ranging from near-mirror reflections and hard shadows to diffuse shading and soft shadows) are rendered using a precomputation-based approach. Inspired by real-time relighting methods, we create a linear system that fixes lighting and view to allow real-time BRDF manipulation. In order to linearize the image's response to BRDF parameters, we develop an intermediate curve-based representation, which also reduces the rendering and precomputation operations to 1D while maintaining accuracy for a very general class of BRDFs. Our system can be used to edit complex analytic BRDFs (including anisotropic models), as well as measured reflectance data. We improve on the standard precomputed radiance transfer (PRT) rendering computation by introducing an incremental rendering algorithm that takes advantage of frame-to-frame coherence. We show that it is possible to render reference-quality images while only updating 10% of the data at each frame, sustaining frame rates of 25-30 fps.
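
The linear response to BRDF edits can be sketched in a very reduced form: if each pixel is a dot product between precomputed transport data and the 1D curve representation, an edit only needs to touch the curve samples that actually changed. The dense matrix and names below are illustrative assumptions, not the system's data structures.

    import numpy as np

    def render_full(transport, curve):
        # PRT-style evaluation: pixel_i = transport_row_i . curve
        return transport @ curve

    def render_incremental(image, transport, old_curve, new_curve):
        # Exploit frame-to-frame coherence: only the edited curve samples contribute.
        delta = new_curve - old_curve
        changed = np.nonzero(delta)[0]
        if changed.size:
            image = image + transport[:, changed] @ delta[changed]
        return image

    # Toy check that the incremental path matches a full re-render.
    rng = np.random.default_rng(1)
    transport = rng.standard_normal((100, 32))   # 100 pixels, 32 curve samples
    curve = rng.standard_normal(32)
    image = render_full(transport, curve)
    edited = curve.copy()
    edited[5] += 0.3                             # user tweaks one curve sample
    image = render_incremental(image, transport, curve, edited)
    assert np.allclose(image, render_full(transport, edited))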

Exploiting Temporal Coherence for Incremental All-frequency Relighting

Overbeck, Ryan; Ben-Artzi, Aner; Ramamoorthi, Ravi; Grinspun, Eitan.
EGSR 2006.  [pdf]  [video] [presentation: ppt, files]

Abstract:  Precomputed radiance transfer (PRT) enables all-frequency relighting with complex illumination, materials and shadows. To achieve real-time performance, PRT exploits angular coherence in the illumination, and spatial coherence in the scene. Temporal coherence of the lighting from frame to frame is an important, but unexplored additional form of coherence for PRT. In this paper, we develop incremental methods for approximating the differences in lighting between consecutive frames. We analyze the lighting wavelet decomposition over typical motion sequences, and observe differing degrees of temporal coherence across levels of the wavelet hierarchy. To address this, our algorithm treats each level separately, adapting to available coherence. The proposed method is orthogonal to other forms of coherence, and can be added to almost any all-frequency PRT algorithm with minimal implementation, computation or memory overhead. We demonstrate our technique within existing codes for nonlinear wavelet approximation, changing view with BRDF factorization, and clustered PCA. Exploiting temporal coherence of dynamic lighting yields a 3×–4× performance improvement, e.g., all-frequency effects are achieved with 30 wavelet coefficients for the lighting, about the same as low-frequency spherical harmonic methods.  Distinctly, our algorithm smoothly converges to the exact result within a few frames of the lighting becoming static.
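
A simplified sketch of the per-frame update is given below: only the largest differences between the current lighting wavelets and what has already been accumulated are sent, with a separate budget for each level of the hierarchy. The names, data layout, and fixed budgets are assumptions; the paper adapts the per-level treatment to the coherence it actually observes.

    import numpy as np

    def incremental_relight(image, transport, accumulated, target, level_of, budgets):
        # Apply only the largest wavelet *differences*, budgeted per hierarchy level.
        delta = target - accumulated
        picked = []
        for level, budget in enumerate(budgets):
            idx = np.nonzero(level_of == level)[0]
            if idx.size == 0 or budget == 0:
                continue
            order = np.argsort(np.abs(delta[idx]))[::-1]
            picked.append(idx[order[:budget]])
        if picked:
            sel = np.concatenate(picked)
            image = image + transport[:, sel] @ delta[sel]
            accumulated[sel] += delta[sel]
        return image, accumulated

Once the lighting stops changing, the remaining differences are consumed over the next few frames, so the rendered image settles smoothly onto the exact static solution.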

Exploiting Temporal Coherence for Pre-computation Based Rendering

Overbeck, Ryan.
Master's Thesis.  Tech Report #: CUCS-025-06.  May, 2006.  [pdf]


Abstract:  Precomputed radiance transfer (PRT) generates impressive images with complex illumination, materials and shadows with real-time interactivity. These methods separate the scene's static and dynamic components allowing the static portion to be computed as a preprocess. In this work, we hold geometry static and allow either the lighting or BRDF to be dynamic. To achieve real-time performance, both static and dynamic components are compressed by exploiting spatial and angular coherence. Temporal coherence of the dynamic component from frame to frame is an important, but unexplored additional form of coherence. In this thesis, we explore temporal coherence of two forms of all-frequency PRT: BRDF material editing and lighting design. We develop incremental methods for approximating the differences in the dynamic component between consecutive frames. For BRDF editing, we find that a pure incremental approach allows quick convergence to an exact solution with smooth real-time response.  For relighting, we observe vastly differing degrees of temporal coherence across levels of the lighting's wavelet hierarchy. To address this, we develop an algorithm that treats each level separately, adapting to available coherence. The proposed methods are orthogonal to other forms of coherence, and can be added to almost any PRT algorithm with minimal implementation, computation, or memory overhead. We demonstrate our technique within existing codes for nonlinear wavelet approximation, changing view with BRDF factorization, and clustered PCA. Exploiting temporal coherence of dynamic lighting yields a 3×–4× performance improvement, e.g., all-frequency effects are achieved with 30 wavelet coefficients, about the same as low-frequency spherical harmonic methods. Distinctly, our algorithm smoothly converges to the exact result within a few frames of the lighting becoming static.
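
The convergence behavior can be illustrated with a toy, self-contained sketch (made-up sizes and names, not the thesis code): a pure incremental update that applies a fixed budget of the largest remaining coefficient differences each frame reaches the exact image a few frames after the dynamic component stops changing.

    import numpy as np

    def apply_budgeted_delta(image, transport, accumulated, target, budget):
        # One frame of a pure incremental update: apply only the 'budget' largest
        # remaining coefficient differences; repeating this converges to the exact image.
        delta = target - accumulated
        top = np.argsort(np.abs(delta))[::-1][:budget]
        image = image + transport[:, top] @ delta[top]
        accumulated[top] += delta[top]
        return image, accumulated

    # Toy run: once the dynamic component (128 coefficients here) stops changing,
    # a budget of 32 per frame reaches the exact result by the fourth frame.
    rng = np.random.default_rng(0)
    transport = rng.standard_normal((64, 128))
    target = rng.standard_normal(128)
    accumulated = np.zeros(128)
    image = transport @ accumulated
    for frame in range(6):
        image, accumulated = apply_budgeted_delta(image, transport, accumulated, target, 32)
    assert np.allclose(image, transport @ target)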