Indirect Illumination

I started working on adding indirect lighting to the renderer using path tracing, since this is the most straightforward technique to implement. I am almost certain there are some problems with the implementation at the moment, since it's giving strange results in some areas.
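The core of the indirect gather is choosing a bounce direction over the hemisphere at each hit point. Below is a minimal sketch of cosine-weighted hemisphere sampling with its matching pdf, a common choice for this; the types and names are illustrative, not the renderer's actual ones.

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

const float kPi = 3.14159265358979f;

// Cosine-weighted hemisphere sample about the local +z axis (Malley's
// method): pick a point uniformly on the unit disc, then project it up to
// the hemisphere. Directions near the normal are sampled more often, which
// matches the cosine term in the rendering equation and reduces variance.
Vec3 CosineSampleHemisphere(float u1, float u2) {
    float r = std::sqrt(u1);
    float phi = 2.0f * kPi * u2;
    return { r * std::cos(phi),
             r * std::sin(phi),
             std::sqrt(std::fmax(0.0f, 1.0f - u1)) };
}

// The matching pdf with respect to solid angle: cos(theta) / pi.
float CosineHemispherePdf(float cosTheta) {
    return cosTheta / kPi;
}
```

Dividing the sampled BRDF value by this pdf keeps the Monte Carlo estimator unbiased.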

Below are some images comparing renders with and without indirect lighting. The Cornell box scene was converted from Mitsuba.

Comparison with other rendering engines:

cornellbox.png


Area Lights

I have implemented a few more lights. These belong to a class of lights called area lights.

They have a finite area and are visible to the renderer, so we must be able to intersect them as well as calculate their light contribution. We must also be able to sample uniformly over the surface, retrieve the normal at the sample point, and evaluate the pdf at that point with respect to the solid angle subtended at the shading point.
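The pdf conversion mentioned above, from a density over the light's surface area to one over solid angle at the shading point, is worth writing out. A small sketch with names of my own choosing:

```cpp
#include <cassert>
#include <cmath>

// Convert a pdf with respect to surface area into a pdf with respect to
// solid angle as seen from the shading point:
//     pdf_w = pdf_A * dist^2 / |cos(thetaLight)|
// where thetaLight is the angle between the light's surface normal and the
// direction from the sampled point back towards the shading point.
float AreaPdfToSolidAnglePdf(float pdfArea, float dist, float cosThetaLight) {
    float absCos = std::fabs(cosThetaLight);
    if (absCos < 1e-6f)
        return 0.0f; // light seen edge-on: it subtends no solid angle
    return pdfArea * dist * dist / absCos;
}
```

For example, a unit-area quad sampled uniformly (pdf_A = 1) and seen face-on at distance 2 gives a solid-angle pdf of 4.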

It is possible that there is a bug in the implementation because the back wall appears too bright; there may be a problem with the normals.


Plane

render-plane-area-light.png

Sphere

render-sphere-area-light1.png

Mesh

render-mesh-area-light.png

Disc

render-disc-area-light.png

IES Lights

I have implemented IES lights; these are lights based on measured data stored in the IES file format. These files store the photometric web, which represents the 3D distribution of candela values for a particular luminaire.

Technical-Description-Final

Sites to download IES profiles

Unfortunately there is a bug I cannot seem to resolve: the light does not appear to cast any light down on the floor, even though the shape of the distribution appears correct.

Another problem occurs with data that has sparse angle increments, which causes banding. I probably need to implement a better interpolation algorithm to fix this.
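One way to reduce the banding is to interpolate bilinearly between the four nearest candela measurements instead of snapping to the nearest one. A sketch, assuming a simple row-major grid of the measured values (the real IES data layout is more involved):

```cpp
#include <cassert>
#include <vector>
#include <cmath>

// Bilinearly interpolate between the four nearest measured candela values.
// `grid` is row-major: grid[v * numH + h], with v indexing vertical angles
// and h horizontal angles; tv and th in [0,1] are the fractional positions
// between neighbouring measurements.
float BilinearCandela(const std::vector<float>& grid, int numH,
                      int v0, int h0, float tv, float th) {
    float c00 = grid[v0 * numH + h0];
    float c01 = grid[v0 * numH + h0 + 1];
    float c10 = grid[(v0 + 1) * numH + h0];
    float c11 = grid[(v0 + 1) * numH + h0 + 1];
    float top    = c00 * (1.0f - th) + c01 * th;
    float bottom = c10 * (1.0f - th) + c11 * th;
    return top * (1.0f - tv) + bottom * tv;
}
```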

Below are some bugged images:

ies_bug ies_light1

Update: 10/02/2016

Managed to find the bug and fix it. The problem had to do with not using the shading frame to calculate the cosine.

// Wrong
float costheta = Dot(wi, LocalToWorld * intersection.GetNormal());
// Correct
float costheta = OrthonormalBasis::CosTheta(WorldToLocal * wi);
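For context, in an orthonormal shading frame whose z axis is the shading normal, the cosine falls out as the local z component of the transformed direction, which is what the corrected line relies on. A minimal sketch with assumed names:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// In a shading frame whose z axis is the surface normal, the cosine of the
// angle between a local direction and the normal is just its z component.
float CosTheta(const Vec3& wLocal) { return wLocal.z; }

// Transform a world-space direction into the frame spanned by the
// orthonormal vectors (t, b, n) via a dot product with each axis.
Vec3 WorldToLocal(const Vec3& t, const Vec3& b, const Vec3& n,
                  const Vec3& w) {
    return { w.x * t.x + w.y * t.y + w.z * t.z,
             w.x * b.x + w.y * b.y + w.z * b.z,
             w.x * n.x + w.y * n.y + w.z * n.z };
}
```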

Corrected Images:

A render that uses both IES lights and a dome light in conjunction:

render-ies-envmap.png

 

Importance Sampling – Environment map

I have implemented importance sampling for the environment map, which really helps in reducing the variance that manifests itself as noise in the rendered image.
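The usual approach builds a discrete distribution over the texels' luminance and samples it by inverse-transform sampling (the full 2D version uses a marginal distribution over rows plus a conditional distribution per row). A 1D sketch of the building block, with names of my own:

```cpp
#include <cassert>
#include <vector>
#include <cmath>

// Build a discrete CDF over per-texel luminance so that bright texels are
// picked proportionally more often.
std::vector<float> BuildCdf(const std::vector<float>& luminance) {
    std::vector<float> cdf(luminance.size() + 1, 0.0f);
    for (size_t i = 0; i < luminance.size(); ++i)
        cdf[i + 1] = cdf[i] + luminance[i];
    float total = cdf.back();
    for (float& c : cdf) c /= total; // normalise so cdf.back() == 1
    return cdf;
}

// Map a uniform random number u in [0,1) to a texel index by binary
// search over the CDF (inverse-transform sampling).
int SampleCdf(const std::vector<float>& cdf, float u) {
    int lo = 0, hi = int(cdf.size()) - 1;
    while (lo + 1 < hi) {
        int mid = (lo + hi) / 2;
        if (cdf[mid] <= u) lo = mid; else hi = mid;
    }
    return lo;
}
```

The pdf of the chosen texel (its luminance over the total) then divides the sampled radiance in the estimator.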

importance-sampling-envmap.png
Samples are drawn according to the radiance of the pixels
render-provwash-envmap-importance-sampling.png
A comparison between two images rendered with the same number of samples

References:

http://www.cs.virginia.edu/~gfx/courses/2007/ImageSynthesis/assignments/sampling.html

Physically Based Rendering, Second Edition: From Theory To Implementation

Normal Map and Linear Workflow

Normal Mapping

Normal mapping is a bump-mapping technique for changing the appearance of the shading of an object. It does so by perturbing the shading normals. Each channel of the normal map corresponds to a dimension in 3D space: X, Y, and Z.

The normal map I am using is a tangent-space normal map.
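Decoding a tangent-space normal map texel is a small but easy-to-get-wrong step: each channel is remapped from [0,1] to [-1,1] and the result renormalized, so a flat texel (0.5, 0.5, 1.0) must decode to the unperturbed normal. A sketch:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Decode a tangent-space normal map texel: remap each channel from [0,1]
// to [-1,1], then renormalise. The decoded normal lives in the tangent
// frame and still needs transforming by the surface's TBN basis.
Vec3 DecodeNormal(float r, float g, float b) {
    Vec3 n { 2.0f * r - 1.0f, 2.0f * g - 1.0f, 2.0f * b - 1.0f };
    float len = std::sqrt(n.x * n.x + n.y * n.y + n.z * n.z);
    return { n.x / len, n.y / len, n.z / len };
}
```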

render-normal-map1

Linear Workflow

I had some headaches getting the dome light and normal map to work because I didn't account for the linear workflow.

Linear workflow is a rendering workflow that deals with colour space management throughout the rendering process. It ensures the correctness of shading and lighting.

There's a lot of information on this subject, so I will just post a few links:

http://renderman.pixar.com/view/LinearWorkflow

Understanding Linear Workflow – Working In Floating Point

http://filmicgames.com/archives/299

What this means for me:

  1. Everything should be calculated in linear space.
  2. Images loaded in need to be linearized: the sRGB encoding gamma of 1/2.2 is removed by raising each value to the power of 2.2.
  3. HDRI and normal map images are already linear, so they do not need to be gamma corrected; their gamma should always be left at 1.0.
  4. Once the image has been rendered, we apply a gamma curve of 2.2 back (i.e. raise to the power 1/2.2) when saving it out as an sRGB-encoded image.
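Points 2 and 4 above can be sketched with the simple 2.2 power approximation (the exact sRGB transfer function has a short linear segment near black, which I am ignoring here):

```cpp
#include <cassert>
#include <cmath>

// Decode an sRGB-encoded value in [0,1] to linear light (point 2 above).
float SrgbToLinear(float c) { return std::pow(c, 2.2f); }

// Re-apply the encoding gamma when saving the render (point 4 above).
float LinearToSrgb(float c) { return std::pow(c, 1.0f / 2.2f); }
```

The two functions are inverses, so decoding and re-encoding round-trips a value unchanged.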

 

Lights

Let there be light

Delta light sources:

Point light: An infinitely small light source characterized by its position and radiant intensity.

render-point-light-225samples.png
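The contribution of a point light before the BRDF is applied is just the inverse square law; a one-liner for completeness:

```cpp
#include <cassert>

// Irradiance at distance d from a point light of radiant intensity I:
//     E = I / d^2  (the inverse square law)
float PointLightIrradiance(float intensity, float dist) {
    return intensity / (dist * dist);
}
```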

Spot light: An infinitely small light source characterized by two angles, the inner cone and the outer cone. Its default orientation is pointing down the y-axis.

render-spotlight-64samples.png
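A common way to handle the two cone angles is full intensity inside the inner cone, zero outside the outer cone, and a smooth blend in between. This sketch uses a smoothstep for the blend, which may differ from what my renderer actually does:

```cpp
#include <cassert>
#include <cmath>

// Spot light angular falloff. cosTheta is the cosine of the angle between
// the spot axis and the direction to the shaded point; cosInner/cosOuter
// are the cosines of the inner and outer cone angles (cosInner > cosOuter).
float SpotFalloff(float cosTheta, float cosInner, float cosOuter) {
    if (cosTheta >= cosInner) return 1.0f; // inside the inner cone
    if (cosTheta <= cosOuter) return 0.0f; // outside the outer cone
    float t = (cosTheta - cosOuter) / (cosInner - cosOuter);
    return t * t * (3.0f - 2.0f * t);      // smoothstep blend
}
```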

Infinite light sources:

Distant disc light: An infinitely distant spherical cap characterized by its radiance and the angle subtended from the viewpoint of an object. The sun subtends an angle of about 0.52 to 0.54 degrees as seen from Earth http://er.jsc.nasa.gov/seh/math30.html.

render-distant-disc-light.png
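The solid angle of a spherical cap with half-angle theta is 2π(1 − cos θ); plugging in the sun's half-angle of roughly 0.26 degrees gives only about 6.7 × 10⁻⁵ steradians, which is why such a light is almost never hit by uniformly sampled directions:

```cpp
#include <cassert>
#include <cmath>

const double kPi = 3.14159265358979323846;

// Solid angle of a spherical cap subtending half-angle theta (radians):
//     Omega = 2 * pi * (1 - cos(theta))
// A full sphere (theta = pi) gives 4*pi; the sun's half-angle of about
// 0.265 degrees gives roughly 6.7e-5 steradians.
double CapSolidAngle(double halfAngleRadians) {
    return 2.0 * kPi * (1.0 - std::cos(halfAngleRadians));
}
```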

Dome light: An infinitely distant sphere that envelops the scene with an HDRI texture, characterized by the spherical angles phi (horizontal) and theta (vertical), which correspond to the amount of rotation.

render-provwash-envmap-box.png

render-sun_clouds-material-2048-samples.png render-sun-envmap1.png
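Looking up the HDRI for a given direction amounts to converting the direction to the spherical angles phi and theta and then to equirectangular texture coordinates. A sketch with y as the up axis (the renderer's conventions may differ):

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

const float kPi = 3.14159265f;

// Map a unit direction to equirectangular (u, v) texture coordinates via
// the spherical angles phi (horizontal) and theta (vertical), assuming a
// y-up convention.
void DirectionToUv(const Vec3& d, float& u, float& v) {
    float phi = std::atan2(d.z, d.x); // horizontal angle, in [-pi, pi]
    float theta = std::acos(d.y);     // vertical angle, in [0, pi]
    u = (phi + kPi) / (2.0f * kPi);
    v = theta / kPi;
}
```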

Reflection and Refraction + Fresnel

I had a hard time this week trying to get the shading to work properly. The problem ended up being a bug in an unrelated part of the renderer: I had forgotten to transform the points back to their original places after applying the inverse transform for instanced primitives, resulting in all sorts of strange images when rendering refraction and reflection.
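With reflection and refraction working, the split between the two is governed by the Fresnel term. A common cheap approximation is Schlick's, shown here for a dielectric; this is a sketch of the standard approximation, not necessarily the Fresnel code in my renderer:

```cpp
#include <cassert>
#include <cmath>

// Schlick's approximation to the Fresnel reflectance of a dielectric.
// eta is the relative index of refraction (e.g. 1.33 for air to water);
// cosTheta is the cosine of the angle of incidence.
float SchlickFresnel(float cosTheta, float eta) {
    float r0 = (1.0f - eta) / (1.0f + eta);
    r0 *= r0; // reflectance at normal incidence
    float m = 1.0f - cosTheta;
    return r0 + (1.0f - r0) * m * m * m * m * m;
}
```

At normal incidence with eta = 1.33 this gives about 2% reflectance, rising to 100% at grazing angles.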

render-reflection render-refraction-1.33eta