Depth Map Based Ambient Occlusion Lighting
Update 2004 -
I wrote this article way back in 2000/2001 when Entropy was the state of the art in RenderMan raytracing and PRMan 11 was just on the horizon. Hence my comments about the relative speed of the technique described below. In four years many things change. Not only are machines faster, but the raytracing algorithms have come on in leaps and bounds too. These days I would suggest that ray tracing is the best solution for generating ambient occlusion. I haven't used the techniques on this page for a while now and I see no reason I would again. I'm leaving this tutorial on the net simply as a historical oddity and because many people have been kind enough to post links to it. If you are studying ambient occlusion then this tutorial might still have some value by way of instruction on the reasoning and methodology of ambient occlusion, but beyond that I think its time has passed.
Recently, a new technique for lighting photorealistic scenes has been developed by several VFX companies. It is called ambient occlusion and works basically like this: for any given scene, you render a pass or texture map that determines how much ambient light reaches any given point on a surface. For example, a lot less light illuminates my armpit than the top of my head. This level of ambient light is independent of the actual main light source used to illuminate the scene. Once the ambient occlusion pass has been rendered, the indirect lighting contribution to the scene is calculated by applying an environment map of the scene into which the object is to be integrated. This ambient light is attenuated by the previously rendered ambient occlusion pass; thus the top of my head would receive a great deal of illumination, whilst my armpit would remain darker.
A further development of this technique is the "average light direction vector", usually referred to as a bent normal. This is an additional pre-beauty render pass in which the R, G, and B values of the image do not represent physical colour; instead, they are combined to form a vector. The beauty-pass surface shader uses this vector to "bend" the currently shaded point's normal so that it "looks at" the position on the environment map from which the majority of light arrived, as calculated during the ambient occlusion render pass.
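Because the bent normal has to travel through an ordinary image, a direction vector must be packed into RGB on the way out and unpacked again in the beauty pass. Here is a minimal Python sketch of one common packing convention; the actual encoding used by the shaders in the zip isn't spelled out here, so treat the [-1, 1] to [0, 1] mapping below as an assumption rather than the shaders' exact behaviour:

```python
import math

def encode_normal(n):
    """Pack a unit vector (components in [-1, 1]) into RGB values
    in [0, 1] so it can be stored in an image.  One common
    convention; an assumption here, not the shaders' verified one."""
    return tuple(0.5 * (c + 1.0) for c in n)

def decode_normal(rgb):
    """Invert the packing and re-normalise, since image quantisation
    can slightly denormalise the stored vector."""
    v = [2.0 * c - 1.0 for c in rgb]
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / length for c in v)
```

A vector pointing straight down the Z axis, for instance, round-trips through (0.5, 0.5, 1.0).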
The method I present here utilises both of these techniques. The main difference between my method and most others is that it requires no ray tracing. I have taken this approach for a simple reason: speed. Ray traced ambient occlusion passes take forever to render. By using a depth-map based approach the render times are (in the tests I have done) between 8 and 20 times faster. This assumes that you're re-rendering all the depth-maps every frame, i.e. that your object deforms. If the object is static and, say, only the camera moves, then the speed differences will be even greater, as you will only need to calculate the depth maps once per job.
At this point I should point out that this technique is for PRMan only. You can probably adapt much of it to other renderers, but there are certain specifics in my shaders which will only work in PRMan and are not directly applicable to other RenderMan-compliant renderers. I would also like to thank Hayden Landis of ILM for his fantastic paper at SIGGRAPH 2002 on ILM's ray traced approach; I've used a lot of his ideas in this method. Thanks also to Ken McGaugh for his very kind assistance in showing how depth-maps in PRMan can be used to similar effect. All I've really done is to meld the two techniques, write the shaders and make a few pictures to demonstrate them. These other guys have done the brainwork! Thank you very much, gentlemen.
To use my method you need to download this zip file and unpack it. It contains:
1) light_sphere.mb - a Maya 4.0 file containing a sphere of spotlights with MTORSpotlight shaders attached.
2) occlusion_dmap.sl - A PRMan shader which once compiled you should attach to all objects for the ambient occlusion pass.
3) occlusion_constant.sl - A PRMan shader which should be attached to the backplane for the ambient occlusion pass.
4) dmap_amb_occ.slim - A Slim template which you should install by modifying your slim.ini file. This shading model is attached to the objects in order to apply the ambient occlusion data during the environmental light pass.
OK, let's run through a simple example of how to use these tools.
The Maya scene
This is a screen grab of Maya once you've loaded light_sphere.mb. You want to scale this light sphere until it is comfortably bigger than the objects you wish to light. Check that the lights' cone angles cover the scene, but try to make each cone angle fit as tightly as possible so that every pixel in the shadow map is useful.
Now attach the occlusion_dmap.slo shader to all the geometry in your scene. This shader asks each light whether the shaded point is in shadow from it, and creates an occlusion value based on the answers it gets. In order to do this it utilises a function of the MTOR lights in RAT called "__inShadow". This function is not a standard light function in PRMan, so make sure that any light shader you use supports this output variable. I recommend the MTOR spotlight because I know that it supports this.
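Conceptually, the occlusion shader loops over the dome of spotlights, queries "__inShadow" for each, and derives two outputs: a visibility value (1 = fully unoccluded, which is why the backplane renders white) and the bent normal, the normalised average direction of the unoccluded lights. The following Python sketch shows that accumulation in the abstract; the real work happens in RSL inside occlusion_dmap.sl, and the function and variable names here are mine, not the shader's:

```python
import math

def occlusion_pass_values(light_dirs, in_shadow):
    """light_dirs: unit vectors from the shaded point towards each
    spotlight on the dome.  in_shadow: parallel booleans, the answer
    each light's __inShadow query would give for this point.
    Returns (visibility, bent_normal): visibility is the fraction of
    dome lights that reach the point, and the bent normal is the
    normalised average direction of those unoccluded lights."""
    acc = [0.0, 0.0, 0.0]
    unoccluded = 0
    for d, shadowed in zip(light_dirs, in_shadow):
        if not shadowed:
            unoccluded += 1
            acc = [a + c for a, c in zip(acc, d)]
    visibility = unoccluded / len(light_dirs)
    length = math.sqrt(sum(c * c for c in acc))
    bent = tuple(c / length for c in acc) if length > 0 else (0.0, 0.0, 0.0)
    return visibility, bent
```

With a four-light dome in which two lights are blocked, this yields a visibility of 0.5 and a bent normal halfway between the two unoccluded directions.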
You also need to import the occlusion_constant.slo shader. This should be attached to the standard backplane by naming the shader's appearance "backplane" and switching the radio button for backplanes in the render globals on. This shader makes the background of the ambient occlusion pass white and the bent normal pass black. This is important as it affects the values that these images pass to the final environment-light render.
Finally, you need to add a secondary output image to your PRMan render globals, with "varying color outOcc" as the "mode". I generally render this arbitrary output variable as a TIFF, the same as the main render. This is the ambient occlusion pass; the main image pass creates the bent normal data. By using this additional output image you get both passes you need from a single render, thus saving time.
Using the ship I built for Downside, we get this image as our main image render
This is our additional output image generated from the outOcc variable
You can use these passes for any lighting set up you choose from now on.
We now have the ambient occlusion data and bent normal information required to render the beauty pass. Regardless of how we want the final beauty pass to look, we can reuse these two pre-rendered images. Now let's look at how we apply these passes to create a beauty render.
I tend to find it easier to render three elements to make my final image. One pass is a "texture only" pass, rendered by inserting an ambient light with an intensity of one into the scene and reattaching the objects' regular surface shaders, but with no other illumination.
The texture only pass
The second pass is a key light pass, in which you place your conventional CG lights to create any strongly directional illumination and also any cast shadows, also using the regular surface shaders that you'd use if you were lighting the shot conventionally.
The key light only pass
The third pass is the indirect illumination pass.
For this pass you need to delete the light dome and detach the occlusion_dmap.slo shader from your geometry. From within Slim, make an amb_occ_dmap shading model for each type of surface you have, e.g. matte, shiny, etc. You can modify the shading model's parameters to achieve this. The actual shading model for this template is a basic plastic shader, but that should suffice for most applications.
You also need to specify the latlong environment map you wish to use to do your ambient lighting and also the ambient occlusion and bent normal passes you've just rendered.
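What the shading model does with those inputs can be sketched in a few lines of Python: turn the bent normal into latlong coordinates, sample the environment map there, and attenuate the result by the occlusion value. The axis convention and function names below are illustrative assumptions, not the Slim template's actual code:

```python
import math

def latlong_uv(direction):
    """Convert a unit direction into (u, v) lookup coordinates for a
    latitude-longitude environment map: u in [0, 1) around the
    horizon, v in [0, 1] from top pole to bottom pole.  The axis
    orientation chosen here is an assumption."""
    x, y, z = direction
    u = (math.atan2(x, -z) / (2.0 * math.pi)) % 1.0
    v = math.acos(max(-1.0, min(1.0, y))) / math.pi
    return u, v

def indirect_light(env_lookup, bent_normal, visibility):
    """Sample the environment along the bent normal and attenuate by
    the occlusion pass value (1 = fully visible).  env_lookup is any
    callable taking (u, v) and returning an RGB tuple."""
    u, v = latlong_uv(bent_normal)
    r, g, b = env_lookup(u, v)
    return (r * visibility, g * visibility, b * visibility)
```

With a uniformly white environment, a point that sees half the dome simply comes out 50% grey, which is exactly the behaviour the occlusion pass is meant to produce.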
Render the scene again. It should render like lightning, as there are no additional lighting calculations, just a few look-ups into some image maps.
The indirect illumination pass
Once you've rendered these three layers, combine the indirect illumination pass, flat textured pass, and direct illumination pass in a compositing package. The texture-only pass should be multiplied by the indirect illumination pass to produce a textured, occluded, fill-lit image.
Texture pass multiplied by indirect illumination pass
The direct lighting pass should then be added to this to complete the image.
As before but with the key light pass added
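In per-pixel terms the composite described above is just a multiply and an add, which any compositing package will do natively. A tiny Python sketch of the operation:

```python
def composite(texture, indirect, key):
    """Combine the three passes per pixel exactly as described:
    multiply the texture-only pass by the indirect-illumination
    pass, then add the key-light pass on top."""
    return tuple(t * i + k for t, i, k in zip(texture, indirect, key))
```

So a mid-grey texel lit by a bright-ish indirect pass and a faint key light comes out at roughly the same mid-grey, with the occlusion already baked into the indirect term.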
Of course because you've rendered these elements separately you now have much more control over the final image and how it integrates with the background. You can tweak the intensities of these three passes independently to perfect the blend with the live action.
This is the result of using the above renders and Paul Debevec's St Peter's Basilica light probe to provide the indirect illumination. My direct light pass consisted of one spotlight to provide the cast shadows, as well as to add some modelling light on the image. By roughly compositing in a background made from the same environment map and doing some basic colour correction we get the above result.
Another image produced using the same techniques
Please note that I have not attempted to match camera angles or properly integrate the geometry into these background plates; I just wanted to show that the lighting could be matched with a minimum of fuss. This technique needs some playing with to properly understand the concepts involved, but I believe it can have you creating photoreal images quickly, without lots of time spent tweaking and fiddling with fill lights. It's also massively faster than ray traced ambient occlusion or global illumination. Have a play with the tools and good luck. This is not a beginner's tutorial and I expect anyone trying to use this stuff to be a pretty experienced RenderMan, RAT and Maya user; it isn't "plug 'n' play" yet, I'm afraid! Make sure you resize the bent-normal and ambient occlusion passes to be square using txmake, else the images won't register properly. I hope this is useful to you and gets you working with this exciting new technique!
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 3.0 Unported License.