I have implemented two new materials. The first is a blend material, which adds a lot of power to the material system since you can arbitrarily apply different materials to parts of the same mesh. The second is a two-sided material, which is useful for thin translucent surfaces like paper, lights, and leaves since it allows the back face to receive light.
Below are three examples. The first shows the blend material in action: the left sphere is reflective, while the right sphere uses the blend material with a rough refractive material, using the world map as the mask. White parts of the mask receive 100% of material A, black parts receive 100% of material B, and anything in between receives a proportion of both.
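The masked blend described above is essentially a per-channel lerp. Here is a minimal sketch, assuming a hypothetical `Rgb` response type (the actual material interface in the engine will differ):

```cpp
#include <algorithm>

// Hypothetical colour type standing in for a material's evaluated response.
struct Rgb { float r, g, b; };

// Blend two material responses by a mask value in [0,1].
// mask = 1 -> 100% material A, mask = 0 -> 100% material B,
// matching the white/black convention of the mask texture.
Rgb blend(const Rgb& a, const Rgb& b, float mask) {
    float t = std::clamp(mask, 0.0f, 1.0f);
    return { t * a.r + (1.0f - t) * b.r,
             t * a.g + (1.0f - t) * b.g,
             t * a.b + (1.0f - t) * b.b };
}
```

In practice the mask value would be fetched from the texture at the hit point's UV coordinates before blending.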
For the two-sided material, the parts facing the camera receive material A while the back-facing parts receive material B. An optional translucency flag allows light to pass through the back; it is off by default. The second image shows an open cylinder with a two-sided material. The third image shows a render with a spherical light placed inside a tube; the interesting thing is that most of the light illuminating the scene comes from indirect lighting.
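The front/back decision comes down to the sign of the dot product between the surface normal and the direction toward the viewer. A small sketch of that test, with hypothetical names (`pickSide`, `Side`) not taken from the engine:

```cpp
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

enum class Side { FrontA, BackB };

// 'n' is the geometric normal, 'wo' points from the hit toward the viewer.
// Front-facing relative to the viewer -> material A, otherwise material B.
Side pickSide(const Vec3& n, const Vec3& wo) {
    return dot(n, wo) >= 0.0f ? Side::FrontA : Side::BackB;
}
```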
Having some fun with the rendering engine, I created some 3D programmer “art” models in 3ds Max, UV-mapped them, and exported them as OBJs. I tried my best to model and texture them properly.
These are some of the books I am using for my bachelor's project; I couldn't be bothered modelling the rest of them.
I was surprised by just how wavy and cellular the reflections from these books are in real life. I chose these specific books because their reflections have different degrees of glossiness. Click for full size.
I tried to implement some BRDFs, but kind of failed.
I tried really hard to implement the Ashikhmin-Shirley anisotropic BRDF model, but it was producing all sorts of strange results. I can only conclude that there are probably some nasty bugs in the rendering engine… I spent a week trying to make it work.
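For reference, the anisotropic specular lobe from Ashikhmin and Shirley's paper looks roughly like the sketch below (Fresnel term omitted for brevity). This is my own hedged transcription of the published formula, not the engine's code; `n`, `u`, `v` are the shading normal, tangent, and bitangent, and `nu`/`nv` are the two exponents controlling anisotropy:

```cpp
#include <algorithm>
#include <cmath>

// Ashikhmin-Shirley anisotropic specular term (Fresnel factor left out).
// Inputs are precomputed dot products: nh = n.h, hu = h.u, hv = h.v,
// hwi = h.wi, nwi = n.wi, nwo = n.wo.
float asSpecular(float nh, float hu, float hv, float hwi,
                 float nwi, float nwo, float nu, float nv) {
    const float pi = 3.14159265358979f;
    if (nh <= 0.0f || nwi <= 0.0f || nwo <= 0.0f || hwi <= 0.0f) return 0.0f;
    nh = std::min(nh, 0.9999f);  // avoid division by zero when h aligns with n
    float exponent = (nu * hu * hu + nv * hv * hv) / (1.0f - nh * nh);
    float norm = std::sqrt((nu + 1.0f) * (nv + 1.0f)) / (8.0f * pi);
    return norm * std::pow(nh, exponent) / (hwi * std::max(nwi, nwo));
}
```

Bugs in this model are often down to the tangent frame: if `u` and `v` are not orthonormal with `n`, the exponent term produces exactly the kind of strange results described above.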
I also tried to implement refraction through rough surfaces, following a paper by Walter et al., with mixed success. My implementation doesn't seem to play well with the GGX or Beckmann distributions, so there's definitely a bug somewhere.
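One piece of that paper worth double-checking when debugging is the refraction half-vector, since getting its sign or normalization wrong breaks both GGX and Beckmann equally. A minimal sketch of the half-vector from Walter et al. (2007), with both directions pointing away from the surface:

```cpp
#include <cmath>

struct V3 { float x, y, z; };

// Half-vector for refraction from Walter et al. (2007):
//   h_t = -(eta_i * w_i + eta_o * w_o), normalized.
// eta_i / eta_o are the refractive indices on the incident and
// transmitted sides; w_i / w_o both point away from the surface.
V3 refractionHalfVector(const V3& wi, float etaI, const V3& wo, float etaO) {
    V3 h { -(etaI * wi.x + etaO * wo.x),
           -(etaI * wi.y + etaO * wo.y),
           -(etaI * wi.z + etaO * wo.z) };
    float len = std::sqrt(h.x * h.x + h.y * h.y + h.z * h.z);
    return { h.x / len, h.y / len, h.z / len };
}
```

At normal incidence (straight through the surface) the result should point along the surface normal, which makes a handy sanity check.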
I rendered an image similar to the one in the paper: a glass sphere with the world map etched in.
I started working on adding indirect lighting to the renders using path tracing, since this is the most straightforward technique to implement. I'm almost certain there are some problems with the implementation at the moment, since it's giving strange results in some areas.
Below are some images comparing renders with and without indirect lighting; the Cornell box scene was converted from Mitsuba.
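A common source of subtle path-tracing bugs is the diffuse bounce sampling. Below is a sketch of standard cosine-weighted hemisphere sampling in the local shading frame (z = normal), which is the usual importance-sampling choice for diffuse bounces; this is illustrative, not the engine's actual sampler:

```cpp
#include <algorithm>
#include <cmath>

struct Dir { float x, y, z; };

// Cosine-weighted hemisphere sample in the local frame (z = normal).
// u1, u2 are uniform random numbers in [0,1); the pdf of the returned
// direction is cos(theta) / pi, which cancels the cosine in the
// rendering equation for diffuse surfaces.
Dir sampleCosineHemisphere(float u1, float u2) {
    float r = std::sqrt(u1);                         // radius on the unit disk
    float phi = 2.0f * 3.14159265358979f * u2;
    float x = r * std::cos(phi);
    float y = r * std::sin(phi);
    float z = std::sqrt(std::max(0.0f, 1.0f - u1));  // lift onto the hemisphere
    return { x, y, z };
}
```

If the pdf used in the estimator doesn't exactly match the distribution actually sampled, renders come out too bright or too dark in exactly the patchy way described above.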
I have implemented a few more lights; these belong to a class called area lights.
They have a finite area and are visible to the renderer, so we must be able to intersect them and also calculate their light contribution. We must be able to sample uniformly over the surface, retrieve the normal, and evaluate the pdf at the sample point with respect to the solid angle as seen from a shading point.
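The solid-angle pdf mentioned above is obtained from the area-domain pdf (1/area for uniform sampling) by the standard change of variables. A small sketch of that conversion:

```cpp
#include <cmath>

// Convert an area-domain pdf to a solid-angle pdf as seen from the
// shading point:
//   pdf_w = pdf_A * dist^2 / |cos(theta_light)|
// where theta_light is the angle between the light's normal and the
// direction from the light sample back to the shading point.
float pdfAreaToSolidAngle(float pdfArea, float dist, float cosThetaLight) {
    float c = std::fabs(cosThetaLight);
    if (c < 1e-6f) return 0.0f;  // light seen edge-on: no contribution
    return pdfArea * dist * dist / c;
}
```

Forgetting the cosine term here is a classic cause of walls near a light appearing too bright, so it is one of the first things worth checking for the issue below.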
There may be a bug in the implementation because the back wall appears too bright; possibly there is a problem with the normals.
I have implemented IES lights; these are lights based on measured data stored in the IES file format. These files store the photometric web, which represents the 3D distribution of candela values for a particular luminaire.
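Once the file is parsed, evaluating the light for a given direction means interpolating between the measured samples of the photometric web. A hedged sketch of that lookup, assuming a hypothetical already-parsed candela grid (the IES parsing itself is a separate step and is omitted here):

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Bilinear interpolation into an already-parsed photometric web: a grid
// of candela values indexed by (horizontal angle, vertical angle).
// h and v are normalized coordinates in [0,1] across the grid.
float candelaLookup(const std::vector<std::vector<float>>& grid,
                    float h, float v) {
    std::size_t nh = grid.size(), nv = grid[0].size();
    float fh = h * (nh - 1), fv = v * (nv - 1);
    std::size_t ih = (std::size_t)fh, iv = (std::size_t)fv;
    std::size_t ih1 = std::min(ih + 1, nh - 1);
    std::size_t iv1 = std::min(iv + 1, nv - 1);
    float th = fh - ih, tv = fv - iv;
    // Interpolate along the vertical-angle axis on both rows, then blend rows.
    float a = grid[ih][iv]  * (1.0f - tv) + grid[ih][iv1]  * tv;
    float b = grid[ih1][iv] * (1.0f - tv) + grid[ih1][iv1] * tv;
    return a * (1.0f - th) + b * th;
}
```

A real implementation would first map a world-space direction into the luminaire's angular coordinates before doing this lookup.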