For my research at the University of Washington, Bothell, I decided to focus on real-time caustic illumination. More specifically, I wanted to dial in on caustic refraction. Generally speaking, caustic illumination is the effect created when light rays pass through reflective or refractive objects and land on a diffuse surface.
For my project, I'd love to give game developers and artists the ability to set a few flags to help create this caustic effect. After doing some research and experimentation, I've been able to create some interesting effects within a simple Unity scene, as shown in the video below.
This rendering technique utilizes separate cameras and render textures to store computational information. Each camera in the scene is responsible for capturing specific data such as positions, light flux, caustic distance, and more. Each camera's view is stored in a texture, which can then be read back later when rendering the receiving object.
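At the core of the data those caustic cameras produce is a per-texel refraction calculation. As a minimal, engine-agnostic sketch (not the actual Unity shader code from the project), here is the vector form of Snell's law that such a pass would evaluate, where `eta` is the ratio of the refractive indices of the two media:

```python
import math

def dot(a, b):
    """Dot product of two 3-vectors."""
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def refract(incident, normal, eta):
    """Refract a unit incident direction through a surface with unit normal.

    eta is n1/n2 (e.g. ~1/1.5 entering glass from air).
    Returns the refracted direction, or None on total internal reflection.
    """
    cos_i = -dot(incident, normal)
    sin2_t = eta * eta * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection: no refracted ray
    cos_t = math.sqrt(1.0 - sin2_t)
    return tuple(eta * incident[i] + (eta * cos_i - cos_t) * normal[i]
                 for i in range(3))

# A ray hitting the surface head-on passes straight through, regardless of eta:
straight = refract((0.0, 0.0, -1.0), (0.0, 0.0, 1.0), 1.0 / 1.5)
print(straight)  # (0.0, 0.0, -1.0)
```

In a render-texture pipeline, a result like this would be computed per texel from the stored normals and positions to find where each refracted ray lands on the receiving surface.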
What I love about this technique so far is that it offers an alternative way to render caustics without ray tracing. This would let game developers and artists render caustic effects without hardware that supports ray tracing.
There's still a lot more to do, but it's nice to see things finally come together!