Capturing and rendering real-time illumination

Real-time global illumination is resource-intensive. It requires identifying the light sources in a scene, evaluating their direct and indirect contributions, tracking the sources and their distribution, and computing their effect on objects in real time. All of these steps are computationally expensive. Even so, people have demonstrated beautiful, realistic scenes with global illumination. One technique is light maps, which precompute the contribution of each light source to different parts of the scene, much as Radiosity does. Another approach is to capture the dominant illumination using Spherical Harmonics or irradiance maps. Other methods, such as Voxel Cone Tracing and Irradiance Volumes, also represent global illumination. All of these methods assume we know the scene well: we know the lights and their positions, so we can apply one of these techniques to compute the illumination distribution. In our case, we know nothing about the user's environment in advance, so I decided to start with an environment map.
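To make the Spherical Harmonics idea concrete, here is a minimal sketch in Python with NumPy of projecting an environment's radiance onto the first nine real SH basis functions (bands 0 to 2), the representation commonly used for dominant-illumination capture. The function names and the Monte Carlo sampling scheme are illustrative assumptions, not the method used in this project:

```python
import numpy as np

def sh9_basis(d):
    """Evaluate the first 9 real spherical-harmonic basis functions
    (bands 0-2) at unit directions d of shape (N, 3); returns (N, 9)."""
    x, y, z = d[:, 0], d[:, 1], d[:, 2]
    return np.stack([
        0.282095 * np.ones_like(x),   # Y_0^0  (constant band)
        0.488603 * y,                 # Y_1^-1
        0.488603 * z,                 # Y_1^0
        0.488603 * x,                 # Y_1^1
        1.092548 * x * y,             # Y_2^-2
        1.092548 * y * z,             # Y_2^-1
        0.315392 * (3 * z**2 - 1),    # Y_2^0
        1.092548 * x * z,             # Y_2^1
        0.546274 * (x**2 - y**2),     # Y_2^2
    ], axis=1)

def project_sh9(dirs, radiance):
    """Monte Carlo projection of per-direction radiance samples onto
    9 SH coefficients; dirs must be uniform samples on the sphere."""
    weight = 4.0 * np.pi / len(dirs)  # solid angle per uniform sample
    return weight * (sh9_basis(dirs) * radiance[:, None]).sum(axis=0)
```

For a constant all-white environment, only the band-0 coefficient survives (it equals 2*sqrt(pi)), which is a quick sanity check that the projection weights are right.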

An environment map represents a particular environment with a set of pictures of it. Environment mapping can be done in several ways; I used cube maps. I captured six images along the positive and negative x, y, and z directions in the real environment and assembled them into a cube map. I then rendered a scene with this cube map and placed an object in it to observe the effect. Constructing the environment map from real captures added a noticeably realistic feeling to the rendering: the map closely mimicked the environment I was standing in, and those surroundings were also reflected on the object in the scene.
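The core lookup behind cube-map reflections is mapping a 3D direction (for example, a reflection vector) to one of the six face images and a texture coordinate on it. A small sketch of that mapping, following the standard OpenGL cube-map convention (face order and s/t derivation); the function name is my own:

```python
# OpenGL face order: 0:+X  1:-X  2:+Y  3:-Y  4:+Z  5:-Z
def dir_to_cubemap(d):
    """Map a 3D direction to (face index, u, v) on a cube map,
    using the standard OpenGL convention; u, v lie in [0, 1]."""
    x, y, z = d
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:            # X axis dominates
        face, ma, sc, tc = (0, ax, -z, -y) if x > 0 else (1, ax, z, -y)
    elif ay >= az:                       # Y axis dominates
        face, ma, sc, tc = (2, ay, x, z) if y > 0 else (3, ay, x, -z)
    else:                                # Z axis dominates
        face, ma, sc, tc = (4, az, x, -y) if z > 0 else (5, az, -x, -y)
    # Remap sc/ma, tc/ma from [-1, 1] to [0, 1] texture space.
    return face, 0.5 * (sc / ma + 1.0), 0.5 * (tc / ma + 1.0)
```

Looking straight down +x lands in the center of face 0, which is an easy way to verify the convention.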

Even after duplicating the user's environment in the rendered scene, the object's appearance was still driven by a static environment. I still had to detect and represent the dynamic illumination components of the scene. I decided to use the front camera to monitor the scene in real time, comparing each pixel of the current frame with the corresponding pixel of the previous frame to find a new dominant illumination source, if any. I then used light probes to add a virtual light to the scene matching the detected source. These pixels are tracked in real time, and the illumination in the virtual scene is simulated according to the location and intensity of the light detected in the real scene. I applied these techniques to mimic the appearance of a copper plate under the illumination conditions in my room, demonstrating both the real-time environment map and the dynamic addition and removal of lights in the scene.
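The frame-differencing step can be sketched in a few lines of NumPy: subtract the previous camera frame from the current one, threshold the pixels that brightened, and report the centroid and average brightness gain of the changed region. The single-channel luminance input, the threshold value, and the function name are assumptions for illustration, not the exact implementation described above:

```python
import numpy as np

def detect_new_light(prev, cur, threshold=0.2):
    """Compare the current frame against the previous one and return
    (row, col, intensity) of the region that newly brightened, or
    None if no pixel brightened past the threshold."""
    diff = cur.astype(np.float64) - prev.astype(np.float64)
    mask = diff > threshold            # pixels that got noticeably brighter
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    intensity = diff[mask].mean()      # average brightness increase
    return rows.mean(), cols.mean(), intensity
```

The returned position and intensity would then drive where the virtual light probe is placed and how strong it is; a symmetric check on `diff < -threshold` would handle lights being removed.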