|
Puddles Screen Saver (2012)
Mac OS X Installer: [ZIP] Movie: [MP4, YOUTUBE] Source Code: [ZIP]
The Puddles Screen Saver displays a 2D water surface simulation over an ever-changing background of pleasant warm-to-cool Gooch tone gradients. This screen saver was implemented using the cross-platform XScreenSaver framework. To install it, just download the zip and follow the instructions in the readme file. For good performance you'll want a GPU from roughly 2006 or later, and OS X 10.5 or newer is required. The source code is available, but to build it you will first have to download the XScreenSaver source and then integrate the Puddles files into XScreenSaver's Xcode project.
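The simulation details live in the source zip; purely as an illustration of how a 2D water surface is commonly animated, here is a minimal C++ sketch of the classic two-buffer height-field ripple update (the grid size, damping constant, and names are my own, not taken from the Puddles source):

#include <cstdio>
#include <utility>
#include <vector>

// Minimal two-buffer height-field ripple update, a common way to animate a
// 2D water surface. Constants and names here are illustrative only.
struct WaterGrid {
    int w, h;
    std::vector<float> prev, curr;  // height fields for the previous and current step

    WaterGrid(int w_, int h_) : w(w_), h(h_), prev(w_ * h_, 0.0f), curr(w_ * h_, 0.0f) {}

    float& at(std::vector<float>& buf, int x, int y) { return buf[y * w + x]; }

    void disturb(int x, int y, float amount) { at(curr, x, y) += amount; }

    // Each interior cell moves toward the average of its neighbours, with damping.
    void step(float damping = 0.99f) {
        for (int y = 1; y < h - 1; ++y) {
            for (int x = 1; x < w - 1; ++x) {
                float neighbours = at(curr, x - 1, y) + at(curr, x + 1, y) +
                                   at(curr, x, y - 1) + at(curr, x, y + 1);
                // Classic ripple recurrence: new = avg(neighbours) * 2 - old, then damp.
                at(prev, x, y) = (neighbours * 0.5f - at(prev, x, y)) * damping;
            }
        }
        std::swap(prev, curr);  // the freshly written buffer becomes the current surface
    }
};

int main() {
    WaterGrid grid(64, 64);
    grid.disturb(32, 32, 1.0f);                // drop a "raindrop" in the middle
    for (int i = 0; i < 10; ++i) grid.step();
    std::printf("height near centre after 10 steps: %f\n", grid.curr[32 * 64 + 30]);
}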
|
|
Practical and Realistic Facial Wrinkles Animation (2011)
Jorge Jimenez, Jose I. Echevarria, Christopher Oat, and Diego Gutierrez
Webpage: [HTML]
We present a method to add expressive animated wrinkles to characters, helping enrich stories through subtle visual cues. Our system allows the animator to independently blend multiple wrinkle maps across regions of a character's face. We demonstrate how combining our technique with state-of-the-art real-time skin rendering can produce stunning results that bring out the personality and emotional state of a character.
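The paper describes the full system; purely as an illustration of the blending idea, here is a small C++ sketch that combines several wrinkle normals per pixel using per-region masks and animated weights (the additive blend and all names are assumptions, not the paper's exact math):

#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

// Per-pixel combination of a base normal with several wrinkle-map normals.
// Each wrinkle map gets an animator-driven weight, and a per-region mask
// confines its influence to one area of the face. The additive x/y blend is a
// common cheap way to combine detail normals; it is illustrative only.
Vec3 blendWrinkles(Vec3 baseNormal, const Vec3 wrinkleNormals[],
                   const float masks[], const float weights[], int count) {
    Vec3 n = baseNormal;
    for (int i = 0; i < count; ++i) {
        float w = masks[i] * weights[i];   // regional mask x animated weight
        n.x += wrinkleNormals[i].x * w;
        n.y += wrinkleNormals[i].y * w;
    }
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}

int main() {
    Vec3 base = { 0.0f, 0.0f, 1.0f };
    Vec3 wrinkles[2] = { { 0.3f, 0.1f, 0.95f }, { -0.2f, 0.4f, 0.9f } };
    float masks[2]   = { 1.0f, 0.0f };     // this pixel lies only in region 0
    float weights[2] = { 0.8f, 1.0f };     // current animated blend weights
    Vec3 n = blendWrinkles(base, wrinkles, masks, weights, 2);
    std::printf("blended normal: %f %f %f\n", n.x, n.y, n.z);
}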
|
|
Order Independent Transparency (2009)
Demo: [EXE] Movie: [MOV] Webpage: [HTML]
A new technique for order-independent transparency is demonstrated. This technique allows complex, overlapping semi-transparent surfaces to be rendered correctly without first sorting the surfaces. By eliminating the need to sort surfaces, this technique allows for far more complex transparent surfaces than would be possible with traditional alpha-blending approaches. The OIT algorithm stores fully shaded, transparent fragments in per-pixel linked lists and then sorts and blends each list during a final post-processing pass.
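As a rough CPU-side illustration of the resolve step described above (the actual technique runs on the GPU, and the names here are illustrative), sorting and blending one pixel's fragment list might look like this in C++:

#include <algorithm>
#include <cstdio>
#include <vector>

// One transparent fragment as it would be pulled out of a per-pixel linked list.
struct Fragment {
    float depth;        // distance from the camera
    float r, g, b, a;   // pre-shaded colour and opacity
};

// Resolve pass for a single pixel: sort the fragments back-to-front and blend
// them over the opaque background with standard "over" compositing. On the GPU
// this runs per pixel in a full-screen pass; this CPU version is just a sketch.
void resolvePixel(std::vector<Fragment>& frags, const float bg[3]) {
    std::sort(frags.begin(), frags.end(),
              [](const Fragment& a, const Fragment& b) { return a.depth > b.depth; });
    float out[3] = { bg[0], bg[1], bg[2] };
    for (const Fragment& f : frags) {
        out[0] = f.r * f.a + out[0] * (1.0f - f.a);
        out[1] = f.g * f.a + out[1] * (1.0f - f.a);
        out[2] = f.b * f.a + out[2] * (1.0f - f.a);
    }
    std::printf("resolved pixel: %f %f %f\n", out[0], out[1], out[2]);
}

int main() {
    // Fragments arrive in arbitrary order, exactly as they would from rasterization.
    std::vector<Fragment> frags = {
        { 2.0f, 1.0f, 0.0f, 0.0f, 0.5f },   // red, farther
        { 1.0f, 0.0f, 0.0f, 1.0f, 0.5f },   // blue, nearer
    };
    float background[3] = { 0.0f, 0.0f, 0.0f };
    resolvePixel(frags, background);
}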
|
|
Ladybug (2009)
Demo: [EXE] Movie: [MOV] Webpage: [HTML]
The Ladybug demo showcases a new technique for simulating physically based, lens-accurate depth-of-field effects driven by real-world parameters of focal length and focus distance. This new approach avoids many of the artifacts that are present in previous screen-space depth-of-field effects.
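The demo's exact shader math isn't given here, but the real-world parameters it exposes plug into the standard thin-lens circle-of-confusion formula, sketched below in C++ (function and parameter names are my own):

#include <cmath>
#include <cstdio>

// Thin-lens circle-of-confusion diameter, in the same units as focalLength.
// focusDistance: distance at which the lens is focused
// objectDistance: distance to the point being shaded
// fStop: aperture f-number (aperture diameter = focalLength / fStop)
// This is the textbook formula, not code lifted from the Ladybug demo.
float circleOfConfusion(float focalLength, float fStop,
                        float focusDistance, float objectDistance) {
    float aperture = focalLength / fStop;
    return aperture * std::fabs(objectDistance - focusDistance) / objectDistance
                    * focalLength / (focusDistance - focalLength);
}

int main() {
    // 50 mm lens at f/1.8 focused at 2 m; how blurry is a point 5 m away?
    float coc = circleOfConfusion(0.05f, 1.8f, 2.0f, 5.0f);
    std::printf("circle of confusion: %f m\n", coc);
}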
|
|
Samurai (2009)
Real-time cloth simulation on the GPU using a modified Havok cloth backend. The cloth simulation backend was developed at AMD as a proof of concept to show a fully rigged character and environment cloth simulation running entirely on the GPU. Dynamic cloth was used for all of the samurai's clothing as well as the banners and flags throughout the environment. The samurai employed a sophisticated rig that allowed dynamic cloth to blend with the character's choreographed, motion-captured performance. Image-based lighting, cascaded shadow maps, screen-space ambient occlusion, subsurface scattering, and anisotropic shading techniques were employed to give the character and environment a rich and realistic feel.
|
|
Efficient Spatial Binning (2009)
Christopher Oat, Joshua Barczak, and Jeremy Shopf
Slides: [PDF] Tech Report: [PDF] Movie: [MOV]
A technique for sorting data into spatial bins or buckets using a graphics processing unit (GPU). Our method takes unsorted point data as input and scatters the points, in sorted order, into a set of bins. This is a key operation in the construction of spatial data structures, which are essential for applications such as particle simulation or collision detection. Our technique achieves better performance scaling than previous methods by exploiting geometry shaders to progressively trim the size of the working set. We also leverage predicated rendering functionality to allow early termination without CPU/GPU synchronization.
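For intuition, here is a CPU analogue in C++ of the bin-then-scatter layout the technique produces (a counting sort into cells); the GPU method reaches the same result with geometry-shader trimming and predicated rendering, which this sketch does not attempt to reproduce, and all names are illustrative:

#include <cstdio>
#include <vector>

struct Point { float x, y; };

// Count points per cell, prefix-sum the counts to get each cell's start offset,
// then scatter points so that all points in a cell end up contiguous.
std::vector<Point> binPoints(const std::vector<Point>& pts, int gridDim, float cellSize,
                             std::vector<int>& cellStart /* out: offset of each cell */) {
    auto cellOf = [&](const Point& p) {
        int cx = static_cast<int>(p.x / cellSize);
        int cy = static_cast<int>(p.y / cellSize);
        return cy * gridDim + cx;
    };
    std::vector<int> counts(gridDim * gridDim, 0);
    for (const Point& p : pts) counts[cellOf(p)]++;

    cellStart.assign(gridDim * gridDim + 1, 0);
    for (int c = 0; c < gridDim * gridDim; ++c)
        cellStart[c + 1] = cellStart[c] + counts[c];            // exclusive prefix sum

    std::vector<Point> sorted(pts.size());
    std::vector<int> cursor(cellStart.begin(), cellStart.end() - 1);
    for (const Point& p : pts) sorted[cursor[cellOf(p)]++] = p;  // scatter in bin order
    return sorted;
}

int main() {
    std::vector<Point> pts = { {0.1f, 0.1f}, {1.6f, 0.2f}, {0.4f, 0.3f} };
    std::vector<int> cellStart;
    std::vector<Point> sorted = binPoints(pts, 2, 1.0f, cellStart);
    std::printf("cell 0 holds %d point(s)\n", cellStart[1] - cellStart[0]);
}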
|
|
Froblins (2008)
Course Notes: [PDF] Course Slides: [PDF] GPU Crowd Management Sketch: [PDF] GPU Crowd Simulation Sketch: [PDF] Movie: [MOV] Webpage: [HTML]
The Froblins demo was designed to showcase GPU-based crowd simulation and scene management. In this interactive environment, thousands of animated, intelligent characters are rendered from a variety of viewpoints ranging from extreme close-ups to far away “bird’s eye” views. The demo utilizes parallel artificial intelligence computation for dynamic pathfinding and local avoidance, large crowd rendering with LOD management, dynamic tessellation for characters and the terrain, cascaded shadows, and high dynamic range rendering with spherical harmonic light maps.
|
|
Ping-Pong (2008)
Slides: [PDF] White Paper: [PDF] Webpage: [HTML] Movie: [MOV, YOUTUBE]
An interactive demo in which the player has a limited amount of time to blow thousands of ping-pong balls through designated goals. A dynamic GPU-based global illumination simulation is used as the primary lighting method along with a real-time ambient occlusion approximation. Deferred shading with a custom, edge-based, multi-sampled anti-aliasing technique is used for rendering. A custom multi-core optimized physics simulator was developed to model the behavior of thousands of ping-pong balls in the interactive environment. The dynamic ambient occlusion technique used in this demo was published in ShaderX 7 (Deferred Occlusion from Analytic Surfaces).
|
|
Animated Wrinkles (2007)
Course Notes: [PDF] Course Slides: [PDF] Performance Driven Animation: [MOV] Wrinkle Test Movie: [MOV]
Two different techniques for real-time animated wrinkles. The first technique uses animated wrinkle maps to allow artists to independently control different regions of a character's face with squish and stretch weights. We used this technique to animate Ruby's face in the Ruby: Whiteout demo. Our animated wrinkle weights came from a performance-driven animation system that took video of an actress as input and produced the animated weights as output; these weights can also be hand-animated by an artist. The second wrinkle technique does not require artist-animated weights but instead derives stretch and squish coefficients by comparing an animated triangle's surface area against the corresponding triangle's area in a non-animated neutral pose. Wrinkle weight smoothing is used to convert the per-triangle weights into smooth, per-vertex weights. For more information, please consult the SIGGRAPH 2007 course notes and slides (the slides actually contain more information than the notes). See the Ruby: Whiteout demo or movie for examples of these techniques in action.
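A minimal C++ sketch of the second technique's area-ratio idea follows; the exact weighting and smoothing in the course notes may differ, and all names here are illustrative:

#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static float length(Vec3 v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }
static float triArea(Vec3 a, Vec3 b, Vec3 c) { return 0.5f * length(cross(sub(b, a), sub(c, a))); }

// Per-triangle weight = animated area / neutral area (>1 stretch, <1 squish),
// then averaged onto the vertices so the per-vertex weights vary smoothly.
void wrinkleWeights(const std::vector<Vec3>& neutral, const std::vector<Vec3>& animated,
                    const std::vector<int>& indices, std::vector<float>& vertexWeight) {
    vertexWeight.assign(neutral.size(), 0.0f);
    std::vector<int> valence(neutral.size(), 0);
    for (size_t t = 0; t + 2 < indices.size(); t += 3) {
        int tri[3] = { indices[t], indices[t + 1], indices[t + 2] };
        float ratio = triArea(animated[tri[0]], animated[tri[1]], animated[tri[2]]) /
                      triArea(neutral[tri[0]], neutral[tri[1]], neutral[tri[2]]);
        for (int i : tri) { vertexWeight[i] += ratio; valence[i]++; }
    }
    for (size_t v = 0; v < vertexWeight.size(); ++v)
        if (valence[v] > 0) vertexWeight[v] /= static_cast<float>(valence[v]);
}

int main() {
    std::vector<Vec3> neutral  = { {0, 0, 0}, {1, 0, 0}, {0, 1, 0} };
    std::vector<Vec3> animated = { {0, 0, 0}, {0.8f, 0, 0}, {0, 0.8f, 0} };  // compressed triangle
    std::vector<int> indices = { 0, 1, 2 };
    std::vector<float> w;
    wrinkleWeights(neutral, animated, indices, w);
    std::printf("vertex 0 squish/stretch weight: %f\n", w[0]);   // < 1 means squish
}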
|
|
Ruby: Whiteout (2007)
Movie: [MOV] Webpage: [HTML]
A three-minute cinematic sequence, rendered in real time, was developed as a launch demo for the Radeon HD 2900. This demo was created by a team of four engineers and three full-time artists and includes new methods for character facial animation, dynamic clothing and facial wrinkles, human hair and skin shading, GPU fur physics, a real-time global illumination approximation, GPU-based spherical harmonic projection and rotation, procedural snow accumulation, volumetric snow and ice shading, and HDR rendering with dynamic tone mapping using a real-time histogram generation technique.
|
|
Ambient Aperture Lighting (2007)
Christopher Oat and Pedro V. Sander
ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games
Paper: [PDF] Movie: [WMV]
A real-time shading model that uses spherical cap intersections to approximate a surface's incident lighting from dynamic area light sources. Our method uses precomputed visibility information for static meshes to compute illumination with approximated shadows in a single rendering pass. Because this technique relies on precomputed visibility data, the mesh is assumed to be rigid at render time. Due to its high efficiency and low memory footprint this method is highly suitable for games and other interactive visualization applications.
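As a rough illustration of the spherical-cap idea (the paper's exact approximation is not reproduced here, and the smoothing polynomial below is an assumption), the visible portion of an area light can be estimated from the overlap of a visibility ("aperture") cap and a light cap:

#include <algorithm>
#include <cmath>
#include <cstdio>

// Solid angle of a spherical cap with angular radius r (radians).
float capSolidAngle(float r) { return 2.0f * 3.14159265f * (1.0f - std::cos(r)); }

float smoothstep(float e0, float e1, float x) {
    float t = std::min(std::max((x - e0) / (e1 - e0), 0.0f), 1.0f);
    return t * t * (3.0f - 2.0f * t);
}

// Approximate solid angle of the intersection of two spherical caps whose axes
// are separated by angle d. Lighting is proportional to how much of the light
// cap falls inside the visibility cap.
float capIntersection(float r1, float r2, float d) {
    float rmin = std::min(r1, r2), rmax = std::max(r1, r2);
    if (d <= rmax - rmin) return capSolidAngle(rmin);   // smaller cap fully covered
    if (d >= r1 + r2)     return 0.0f;                  // caps do not overlap
    // Smoothly fade from full coverage to none as the caps slide apart.
    return capSolidAngle(rmin) * (1.0f - smoothstep(rmax - rmin, r1 + r2, d));
}

int main() {
    float visibilityCap = 1.0f;   // aperture angular radius from precomputed visibility
    float lightCap = 0.4f;        // angular radius of the area light's cap
    float separation = 0.8f;      // angle between aperture axis and light direction
    std::printf("approx. visible light solid angle: %f sr\n",
                capIntersection(visibilityCap, lightCap, separation));
}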
|
|
Rendering Gooey Materials with Multiple Layers (2006)
Christopher Oat
ACM SIGGRAPH, Course 26: Advanced Real-Time Rendering in 3D Graphics and Games
Course Notes: [PDF] Slides: [PDF] Movie: [MP4, AVI]
An efficient method for rendering semi-transparent, multi-layered materials. This method achieves the look of a volumetric material by exploiting several perceptual cues, based on depth and illumination, while combining multiple material layers on the surface of an otherwise non-volumetric, multi-textured surface such as the human heart shown here. Multiple implementation strategies were developed to allow for trade-offs to be made between visual quality and runtime performance so that the technique may be scaled appropriately for different platforms and applications.
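One depth cue commonly used for this kind of layered look is a view-dependent parallax offset of the inner layer's texture coordinates combined with depth-based light attenuation; the C++ sketch below is an assumption-laden illustration of that idea, not the course notes' exact formulation:

#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Offset the inner layer's UVs along the tangent-space view direction, scaled
// by the layer's depth, so the inner layer appears to sit below the surface.
Vec2 innerLayerUV(Vec2 uv, Vec3 viewTS /* tangent-space view dir, normalized */, float layerDepth) {
    return { uv.x + viewTS.x * layerDepth, uv.y + viewTS.y * layerDepth };
}

// Simple depth-based attenuation so light reaching the inner layer looks absorbed.
float innerLayerAttenuation(float layerDepth, float absorption) {
    return std::exp(-absorption * layerDepth);
}

int main() {
    Vec2 uv = { 0.5f, 0.5f };
    Vec3 viewTS = { 0.3f, 0.1f, 0.95f };
    Vec2 inner = innerLayerUV(uv, viewTS, 0.05f);
    std::printf("inner layer uv: %f %f, attenuation: %f\n",
                inner.x, inner.y, innerLayerAttenuation(0.05f, 8.0f));
}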
|
|
Ruby: The Assassin (2005)
Demo: [EXE] Movie: [MPG, MOV] Webpage: [HTML]
This demo was originally developed for the Radeon X1800 launch but was also ported to the Xbox 360 by a small team of engineers over the course of three (sleepless) weeks. I was responsible for porting the rendering engine's scriptable render-target management layer. I also ported the demo's shadow mapping, depth of field, texture space lighting, and final composite rendering paths so that they could make maximal use of the platform's unique EDRAM rendering architecture.
|
|
Real-Time Irradiance Volumes & Irradiance Volumes for Games (2005)
Slides: [PDF] DirectX 9.0 SDK Sample: [HTML] ShaderX 5 Article: [HTML]
An adaptively subdivided octree is used to place irradiance sample points in the environment. The octree is automatically subdivided using heuristics based on local scene complexity. The subdivision scheme uses GPU-based tests to choose finer levels of subdivision in areas where irradiance changes quickly and coarser levels of subdivision in areas where fewer samples are necessary because irradiance changes slowly (top image). Irradiance distribution functions and their first-order derivatives are computed at each sample point and are stored using 3rd order spherical harmonics. At render time, this data is used to extrapolate a final, per-pixel irradiance estimate for dynamic objects moving through the scene (bottom images). This work was incorporated into Microsoft's DirectX 9.0 SDK, presented at GDC 2005 as "Irradiance Volumes for Games", and published in ShaderX 5.
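Concretely, the per-pixel extrapolation described above amounts to a first-order Taylor step per SH coefficient; a minimal C++ sketch (the data layout and names are illustrative):

#include <cstdio>

struct Vec3 { float x, y, z; };

// One irradiance sample: 9 spherical-harmonic coefficients (3rd order) plus a
// spatial gradient per coefficient, as described above.
struct IrradianceSample {
    Vec3 position;
    float sh[9];        // SH coefficients of irradiance at 'position'
    Vec3 gradient[9];   // first-order spatial derivative of each coefficient
};

// First-order extrapolation of the stored irradiance to a shaded point p:
// E_i(p) ~= E_i(x0) + dE_i(x0) . (p - x0), done per SH coefficient.
void extrapolateSH(const IrradianceSample& s, Vec3 p, float outSH[9]) {
    Vec3 d = { p.x - s.position.x, p.y - s.position.y, p.z - s.position.z };
    for (int i = 0; i < 9; ++i) {
        outSH[i] = s.sh[i]
                 + s.gradient[i].x * d.x
                 + s.gradient[i].y * d.y
                 + s.gradient[i].z * d.z;
    }
}

int main() {
    IrradianceSample s = {};
    s.sh[0] = 1.0f;                         // constant (ambient) term
    s.gradient[0] = { 0.2f, 0.0f, 0.0f };   // irradiance grows along +x
    float sh[9];
    extrapolateSH(s, { 0.5f, 0.0f, 0.0f }, sh);
    std::printf("extrapolated DC term: %f\n", sh[0]);   // 1.0 + 0.2 * 0.5 = 1.1
}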
|
|
Ruby: Dangerous Curves (2004)
Demo: [EXE] Movie: [MPG, MOV] Webpage: [HTML]
The Dangerous Curves demo was developed for the Radeon X850. I implemented a radiance caching system for the diffuse lighting in this demo. Radiance and radiance gradients are sampled (both spatially and temporally) as a preprocess and stored in a cache using spherical harmonics for compactness. At run-time the cache is queried (the gradients are used to extrapolate radiance functions in the event of cache misses) and the results are integrated with per-pixel and per-vertex radiance transfer functions for final diffuse lighting.
|
|
Real-Time Subsurface Scattering (2004)
Demo: [EXE] Movie: [MPG, MOV] Webpage: [HTML]
This demo was developed for the ATI Radeon X800 launch. It incorporates Precomputed Radiance Transfer techniques to simulate global illumination effects such as bounced lighting and subsurface scattering. Post-processing techniques for depth of field and light blooms along with environmental effects such as atmospheric scattering give this demo a very natural look. In the process of creating this demo I also extended our effect framework and art pipeline to expose PRT-based material and lighting features to the artists. This work was featured as part of a conference presentation at GDC 2004.
|
|
Lava Caves Screensaver (2003)
Demo: [EXE] Movie: [MPG] Webpage: [HTML] Slides: [PDF] GPG4 Article: [HTML]
The Lava Caves screen saver was developed for the ATI Radeon 9800 launch. For this demo I used image-space post-processing techniques for the simulation of atmospheric effects such as heat-haze distortion. Additionally, I worked with the project's artist to develop a pixel shader technique to approximate the appearance of lava flow. I presented these techniques at GDC Europe 2003 and published them along with Natalya Tatarchuk in Game Programming Gems 4.
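As a rough illustration of image-space heat haze (the demo's actual shader is not reproduced here), the post pass perturbs the coordinates used to sample the rendered frame with a small animated offset; the sine-based "noise" and constants below are placeholders:

#include <cmath>
#include <cstdio>

struct Vec2 { float x, y; };

// Perturb the screen-space sample position with a small, animated offset so the
// scene behind the haze appears to shimmer.
Vec2 hazeDistortedUV(Vec2 uv, float time, float strength) {
    float ox = std::sin(uv.y * 40.0f + time * 3.0f);
    float oy = std::cos(uv.x * 37.0f + time * 2.3f);
    return { uv.x + ox * strength, uv.y + oy * strength };
}

int main() {
    Vec2 uv = { 0.5f, 0.5f };
    Vec2 distorted = hazeDistortedUV(uv, /*time=*/1.0f, /*strength=*/0.004f);
    std::printf("sample the scene texture at: %f %f\n", distorted.x, distorted.y);
}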
|
|
Nature (2001)
Demo: [EXE] Movie: [AVI] Webpage: [HTML] Textures as Lookup Tables GPG3 Article: [HTML]
The Nature demo was developed for the ATI Radeon 8500 launch. This was one of the first fully shader-based demos that we created. The water surface is computed by scrolling a number of sine waves over a tessellated mesh. Because this demo pre-dates sin/cos shader instructions, the sine waves were computed the old-fashioned way, with carefully optimized shader assembly code, using a Taylor series expansion. A similar technique is used to make the grass look like it is swaying in the wind. My primary contribution to this demo was in developing a technique for doing per-pixel specular lighting with a per-pixel exponent using lookup tables. This technique was used instead of the standard approach, which requires a normalization cubemap and which, due to texture filtering precision limitations at the time, resulted in banding artifacts.
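For illustration, the kind of Taylor polynomial such a shader evaluates in place of a sin instruction looks like this (the term count and range reduction are illustrative, written in C++ rather than shader assembly):

#include <cmath>
#include <cstdio>

// A few Taylor terms after reducing the angle to [-pi, pi):
// sin(x) ~= x - x^3/3! + x^5/5! - x^7/7!
float taylorSin(float x) {
    const float pi = 3.14159265f;
    x = x - std::floor((x + pi) / (2.0f * pi)) * 2.0f * pi;   // range-reduce to [-pi, pi)
    float x2 = x * x;
    return x * (1.0f - x2 / 6.0f * (1.0f - x2 / 20.0f * (1.0f - x2 / 42.0f)));
}

int main() {
    for (float x = 0.0f; x < 6.3f; x += 1.0f)
        std::printf("x=%4.1f  taylor=% .5f  libm=% .5f\n", x, taylorSin(x), std::sin(x));
}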
|