Global illumination is a computer graphics concept for lighting a scene or rendering an environment. It describes the use of simulated light sources to model light being scattered and absorbed by objects in a scene, creating realistic-looking images. The term global illumination refers to any rendering algorithm that simulates the interaction of light between surfaces, and over the last 20 years several substantially different approaches have emerged, along with a number of variants of each (Birn 2006, p. 108; Dutre et al. 2018). This report focuses on the ray tracing and photon mapping algorithms and discusses how these techniques are implemented, the properties they exploit, and the ways in which they can be applied to real-time applications.
Photon mapping is a concept originally introduced by Jensen (1996) and refers to a two-pass method of global illumination that traces illumination paths from both the light sources and the viewpoint. In the first pass, photons emitted from light sources are traced through the scene and cached in a data structure known as the photon map. As the method has developed over time, two maps are now created during this construction phase: a high-resolution caustics map, used to render caustics directly, and a lower-resolution global map that provides a rough estimate of illumination for use in the rendering stage (Jensen 1996, p. 22; Dutre et al. 2018). More specifically, the first pass emits photons from every light source in the scene and follows the progress of each photon through the scene. These emissions are determined by the type of light source (Jensen 1996, p. 24); for example, a bright, direct source will emit a larger number of photons than an indirect source of light. Each photon travels through the scene until it interacts with a surface, where it is stored. At each interaction, a technique called Russian roulette (as described by Arvo & Kirk 1990) is used to determine whether the photon should continue through reflection, or whether it should cease through absorption (Jensen 1996, p. 23). The photons are then stored in a balanced kd-tree, which allows the rendering pass to locate photons in O(M · log N) time, where M is the number of photons that need to be located and N is the number of photons in the map (Jensen 1996, p. 24). The way in which data points are stored in the tree makes rendering efficient, and memory consumption is kept minimal by this storage choice.
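The emission-and-termination loop described above can be sketched as follows. This is a minimal illustration rather than Jensen's implementation: the scene geometry and photon power bookkeeping are abstracted away, and the function names and the single scalar reflectance value are assumptions made for clarity.

```python
import random

def russian_roulette(reflectance, rng=random.random):
    """Decide whether a photon is reflected or absorbed.

    `reflectance` is the surface's average reflectivity in [0, 1];
    the photon survives (is reflected) with that probability.
    """
    return rng() < reflectance

def trace_photon(power, reflectance, max_bounces=8, rng=random.random):
    """Follow one photon through repeated surface interactions,
    storing it at each hit, until Russian roulette absorbs it.
    Returns the list of stored (bounce, power) records that would
    go into the photon map."""
    stored = []
    for bounce in range(max_bounces):
        stored.append((bounce, power))          # cache the hit in the photon map
        if not russian_roulette(reflectance, rng):
            break                               # absorbed: the path terminates
        # With Russian roulette the surviving photon keeps its full power;
        # the probabilistic termination itself accounts for the reflectance,
        # so no scaling is applied here.
    return stored
```

Note that terminating paths probabilistically, rather than attenuating photon power at every bounce, is what keeps all stored photons at comparable power and the map unbiased.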
The second pass in photon mapping is the rendering pass. As the name suggests, it renders the scene with data points that were obtained during the first pass. The formula adopted for rendering in global illumination algorithms is known as the Rendering Equation and was developed by Kajiya (1986). The equation is as follows:
I(x, x') = g(x, x') [ e(x, x') + ∫_S ρ(x, x', x'') I(x', x'') dx'' ]

I(x, x') = related to the intensity of light passing from point x' to point x
g(x, x') = a "geometry" (visibility) term
e(x, x') = related to the intensity of light emitted from x' to x
ρ(x, x', x'') = related to the intensity of light scattered from x'' to x by a patch of surface at x'
The number of photons created by the light source(s) can make rendering expensive. However, because the photon map is a balanced tree whose paths have been culled through Russian roulette, the rendering pass only needs to average a number of samples taken with Monte Carlo ray tracing. By combining this method with the rendering equation, a highly detailed scene can be created with realistic lighting effects, as shown below in Figure 1.
Accuracy during this second pass is vital, as enough photons must be tracked in the scene to estimate the amount of light or shadow falling on a surface. However, rendering time increases as a greater number of photons are mapped.
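The density estimate at the heart of the rendering pass can be sketched as follows. This is a simplified illustration, assuming photons are stored as position/power pairs: the BRDF, surface normals, and the kd-tree lookup are omitted, and a brute-force nearest-neighbour search stands in for the tree traversal.

```python
import math

def radiance_estimate(photons, x, k=50):
    """Approximate reflected radiance at point `x` by summing the power
    of the k nearest photons over the disc that contains them (the
    density estimate used in photon mapping). `photons` is a list of
    ((px, py, pz), power) records."""
    def dist2(p):
        # squared Euclidean distance from photon position p to x
        return sum((a - b) ** 2 for a, b in zip(p, x))

    nearest = sorted(photons, key=lambda rec: dist2(rec[0]))[:k]
    if not nearest:
        return 0.0
    r2 = dist2(nearest[-1][0])           # squared radius of the bounding disc
    if r2 == 0.0:
        return 0.0
    total_power = sum(power for _, power in nearest)
    return total_power / (math.pi * r2)  # flux density over the disc area
```

This makes the trade-off in the paragraph above concrete: a larger k (more photons) smooths the estimate but requires more photons to be stored and located, increasing rendering time.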
An important aspect of photon mapping to discuss is how the algorithm deals with direct and indirect illumination. Direct illumination of visible surfaces is achieved using regular Monte Carlo sampling and is computed differently depending on whether an accurate or an approximate evaluation is required (Dutre et al. 2018). For accurate computation, information from the photon map is used to locate areas that are either fully illuminated or fully in shadow, in order to avoid casting shadow rays. Shadow rays are only used when the nearest photons in the global photon map contain a mixture of direct-illumination and shadow photons, or when the number of photons located is too low. The approximate evaluation is a radiance estimate taken from the global photon map, in which light source evaluations and shadow rays are not used at all (Jensen 1996, p. 26). Conversely, indirect illumination uses the global photon map to compute radiance at surfaces which are not immediately visible (Dutre et al. 2018).
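The decision of when a shadow ray is actually required can be sketched as a small heuristic. The photon tags and the threshold value below are illustrative assumptions; they stand in for the classification of nearby photons as direct-illumination or shadow photons described above.

```python
def needs_shadow_ray(nearby_photon_tags, min_count=8):
    """Decide whether an explicit shadow ray must be cast at a shading
    point. `nearby_photon_tags` lists the classification ('direct' or
    'shadow') of the nearest photons in the global photon map.
    A ray is needed when too few photons were found, or when the
    region contains a mixture of lit and shadowed photons."""
    if len(nearby_photon_tags) < min_count:
        return True                      # too few photons: fall back to rays
    return len(set(nearby_photon_tags)) > 1  # mixed region: penumbra likely
```

In fully lit or fully shadowed regions this returns False, so the expensive shadow-ray test is skipped, which is precisely the saving the photon-map classification provides.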
Although ray casting had previously been used to generate perspective views and shadows of solids (Appel 1968), recursive ray tracing in computer graphics was first introduced by Whitted (1980) and is arguably the most popular rendering algorithm. Often referred to as 'Whitted-style ray tracing', this algorithm handles reflections and refractions, and computes a radiance value for each individual pixel in the final scene or image by creating paths between the pixel and the light sources (Dutre et al. 2018). Images are rendered by shooting rays into a 3D scene and tracing the path of light from the camera or eye back to the original source, a mechanism known as ray casting. This process is recursive and continues until either the light reflected at a point is too low, or a cut-off such as a maximum number of bounces is reached. This creates a tree of rays which is evaluated to produce a colour (Haines & Shirley 2019, p. 9). In the ray tracing algorithm, shadows can appear as a result of both direct and indirect illumination: direct-illumination shadows are created when visibility of the primary light source is obstructed, whereas indirect-illumination shadows appear when the refractions or reflections of light at scene surfaces are blocked (Boksansky et al. 2019, p. 162).
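The recursion and cut-off criteria described above can be sketched as a toy skeleton. This is not a working renderer: the scene is abstracted to a constant per-hit radiance and a single reflectance factor, where a real Whitted-style tracer would intersect geometry, spawn reflection, refraction, and shadow rays, and shade with surface materials.

```python
def trace(depth, radiance_at_hit, reflect_fraction,
          max_depth=5, min_contrib=0.01):
    """Recursion skeleton for Whitted-style ray tracing: add the local
    radiance at the hit point, then recurse along the reflected ray
    until the contribution is too small or the bounce limit is hit."""
    # Cut-off 1: maximum number of bounces reached.
    # Cut-off 2: accumulated attenuation makes further light negligible.
    if depth >= max_depth or reflect_fraction ** depth < min_contrib:
        return 0.0
    local = radiance_at_hit                      # direct contribution at this hit
    reflected = reflect_fraction * trace(        # contribution from the next bounce
        depth + 1, radiance_at_hit, reflect_fraction, max_depth, min_contrib)
    return local + reflected
```

The two termination tests correspond directly to the cut-offs named in the text: a fixed bounce limit and a minimum-contribution threshold.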
Real-time applications of photon mapping
In recent years, the original concept of photon mapping has been further developed to allow new scenes to be rendered at speeds that were not originally viable for real-time rendering (Hachisuka et al. 2008). A study undertaken by Fleisz (2009) demonstrated that storing the data in a kd-tree was inefficient on the GPU, because the data structure stalled when being handed off to the GPU. It was theorised, and subsequently shown, that introducing spatial hashing increased performance and optimised data transfer to the GPU (Fleisz 2009). This change in access and storage methods allowed photon mapping to progress, and consequently hardware was able to deal with more complex scenes and challenges. For example, studies by Gupte (2011) and Mara et al. (2013) on photon mapping demonstrated that mid-range GPUs were able to render scenes with 500,000 photons within 11.6 milliseconds. This confirmed that real-time photon mapping was both applicable and viable when implementing this algorithm.
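The spatial-hashing idea can be illustrated with a minimal CPU-side sketch. The cell size, record layout, and use of a Python dict in place of a GPU-resident hash table are assumptions made for illustration; the point is that a photon's cell is found with arithmetic alone, with no tree traversal.

```python
def hash_cell(position, cell_size=1.0):
    """Map a 3D photon position to an integer grid cell: the basic
    operation behind spatial hashing. Uses floor division so that
    negative coordinates land in the correct cell."""
    return tuple(int(c // cell_size) for c in position)

def build_photon_hash(photons, cell_size=1.0):
    """Group photon records by grid cell so that neighbourhood lookups
    are O(1) per cell, in contrast with O(log N) kd-tree traversal.
    `photons` is a list of ((x, y, z), power) records."""
    table = {}
    for pos, power in photons:
        table.setdefault(hash_cell(pos, cell_size), []).append((pos, power))
    return table
```

Because each cell is addressed independently, this layout also transfers to the GPU as a flat array of buckets, which is what avoids the hand-off stalls observed with the tree structure.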
Real-time applications of ray tracing
The increase in graphics processing unit (GPU) power over the last few years (e.g. from NVIDIA and AMD), along with improved implementations and support from game engines, has made real-time ray tracing possible (Friedrich et al. 2006). Recent advances in both hardware and software have introduced more efficient data structures built into GPU architectures, and the resulting algorithms, which increase GPU throughput, have made ray tracing a standard to strive for in video game development; to some developers it is a new 'big bang' in rendering that was not previously possible (Haines & Akenine-Möller 2019, p. xvi; Friedrich et al. 2006). Despite this, unlike other global illumination algorithms, ray tracing often requires expensive components, as demonstrated in a study published by Shih et al. (2009), which showed real-time performance utilising GPUs such as the NVIDIA GeForce 8800 GT to implement a ray tracer rendering all scenes. Therefore, whilst possible, real-time ray tracing is not always a viable option.
There are many different methods and algorithms that can be adopted for global illumination. This brief report has discussed two of the major ones, photon mapping and ray tracing, and highlighted the differences between them. Over the coming years, more reasonably priced GPUs will be developed, and more developers will therefore be able to implement better ray tracing techniques. Until then, rasterisation, and in many cases photon mapping, will continue to be the standard in video game development.
Appel, A 1968, ‘Some techniques for shading machine renderings of solids’, in Proceedings of the April 30–May 2, 1968, Spring Joint Computer Conference, pp. 37-45.
Arvo, J & Kirk, D 1990, ‘Particle transport and image synthesis’, Computer Graphics, 24(4), pp. 53-66.
Birn, J 2006, Digital Lighting & Rendering, 2nd edn, New Riders.
Boksansky, J, Wimmer, M & Bittner, J 2019, ‘Chapter 13: Ray traced shadows: Maintaining real-time frame rates’, in E. Haines & T. Akenine-Möller (eds), Ray Tracing Gems: High-Quality and Real-Time Rendering with DXR and Other APIs, Apress, pp. 159-182.
Dutre, P, Bekaert, P & Bala, K 2018, Advanced global illumination, CRC Press.
Fleisz, M 2009, ‘Photon Mapping on the GPU’, Master’s thesis, School of Informatics, University of
Friedrich, H, Günther, J, Dietrich, A, Scherbaum, M, Seidel, H.P & Slusallek, P 2006, ‘Exploring the use of ray tracing for future games’, in Proceedings of the 2006 ACM SIGGRAPH Symposium on Videogames, pp. 41-50.
Gupte, S 2011, Real-Time Photon Mapping on GPU, University of Maryland Baltimore County.
Hachisuka, T, Ogaki, S & Jensen, H.W 2008, ‘Progressive photon mapping’, ACM SIGGRAPH Asia 2008 papers, pp. 1-8.
Haines, E & Akenine-Möller, T (eds) 2019, Ray Tracing Gems: High-Quality and Real-Time Rendering with DXR and Other APIs, Apress.
Haines, E & Shirley, P 2019, ‘Chapter 1: Ray tracing terminology’, in E. Haines & T. Akenine-Möller (eds), Ray Tracing Gems: High-Quality and Real-Time Rendering with DXR and Other APIs, Apress, pp.
Jensen, H.W 1996, ‘Global illumination using photon maps’, Rendering Techniques ’96, pp. 21-30.
Kajiya, J.T 1986, ‘The rendering equation’, in Proceedings of the 13th Annual Conference on Computer Graphics and Interactive Techniques, pp. 143-150.
Mara, M, Luebke, D & McGuire, M 2013, ‘Toward practical real-time photon mapping: Efficient GPU density estimation’, in Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games, pp. 71-78.
Shih, M, Chiu, Y.F, Chen, Y.C & Chang, C.F 2009, ‘Real-time ray tracing with CUDA’, International Conference on Algorithms and Architectures for Parallel Processing, pp. 327-337.
Whitted, T 1980, ‘An improved illumination model for shaded display’, Communications of the ACM, 23(6), pp. 343-349.