2019 The Lego Movie 2: The Second Part (senior lighting pipeline td: Animal Logic)
2018 Peter Rabbit (senior lighting pipeline td: Animal Logic)
2017 The Lego Ninjago Movie (senior lighting td: Animal Logic)
2017 The LEGO Batman Movie (senior lighting td: Animal Logic)
2016 The Great Wall (lighting td: ILM)
2016 The Master: A Lego Ninjago Short (senior lighting td: Animal Logic)
2015 Hitman: Agent 47 (lighting td: ILM)
2015 Jurassic World (lighting td: ILM)
2015 Strange Magic (3d modeling / lighting td: ILM)
2012 Battleship (technology: ILM)
2011 Star Wars: The Clone Wars (TV Series) (technical assistant / fx td)
2011 Transformers: Dark of the Moon (technology: ILM)

  • Jurassic World
  • Transformers: Dark of the Moon
  • The Great Wall
  • Battleship
  • Hitman: Agent 47
  • Peter Rabbit
  • The Lego Batman Movie
  • The Master: A Lego Short
  • The Lego Ninjago Movie
  • The Lego Movie 2
  • Strange Magic
  • The Clone Wars


The Lego Movie 2

The latest Lego movie introduced various new challenges.

  • The sets were bigger and more complex than in the first movie.
  • Apocalypseburg is covered in a layer of dust and sand that had to hold up in both close-ups and wide shots.
  • Shots with thousands of propagated lights required efficient culling.
  • Massive sets with millions of transparent bricks resulted in expensive refraction and caustics renders that required optimisation.
  • Heavy use of volumes called for more efficient volume rendering techniques.
  • Introduction of fabric, glimmer and new brick types and materials.
  • Each of the different biomes throughout the movie had its own very distinct look that required adjustments to shading, lighting and comp.
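The culling point above can be illustrated in isolation: with thousands of propagated lights, a cheap first pass is to keep only lights whose inverse-square contribution at a shading point clears a threshold. This is a minimal sketch under assumed data layout and threshold, not Glimpse's actual culling logic.

```python
def cull_lights(lights, shading_point, threshold=1e-3):
    """Keep only lights whose inverse-square falloff contribution
    at the shading point is above the threshold.

    Each light is a dict with 'pos' (x, y, z) and 'intensity';
    the layout is illustrative, not a production schema.
    """
    kept = []
    for light in lights:
        dx, dy, dz = (light['pos'][i] - shading_point[i] for i in range(3))
        dist_sq = dx * dx + dy * dy + dz * dz
        # Contribution falls off with the square of the distance.
        if light['intensity'] / max(dist_sq, 1e-6) >= threshold:
            kept.append(light)
    return kept
```

In practice a renderer would cull per bucket or per bounding region rather than per shading point, but the distance-falloff test is the same idea.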




Jurassic World

The gyrosphere chase, at over 150 shots, was the longest sequence of the movie, and the gyrosphere itself was the biggest challenge. A breakdown of the process:

  • First the background and set had to be projected onto proxy geometry. Unfortunately we didn’t always have a clean set; parts of the live-action set and characters often had to be painted out.
  • Next we used a clipping card, aimed at the camera and placed at the largest diameter, to cut the sphere into a front and a back part.
  • Then we rendered the back half of the sphere to refract the projected set.
  • Next we rendered the interior of the gyrosphere, which was partially CG and partially live action but often was replaced completely with the CG version.
  • Cards of the roto’d characters were used to cast shadows where needed.
  • Finally we fed back the pre-comp of all these layers and rendered the front half of the sphere, making sure there were no visible seams.
  • To be able to render all of the above passes overnight we used Katana’s dependencies, including a script to launch Nuke as an external application for automated pre-comps, which were then passed back to the next render pass.
  • The refractions on the ground were cheated with slide maps.
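The overnight dependency chain above is essentially a small pass graph, where each render waits on its inputs. The pass names below are hypothetical stand-ins for the steps in the breakdown; the real setup was wired as Katana render dependencies (with Nuke launched in between for the automated pre-comps), not Python's graphlib.

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical pass names; each maps to the set of passes it depends on.
passes = {
    'set_projection': set(),
    'back_half_refraction': {'set_projection'},
    'interior': set(),
    'shadow_cards': {'interior'},
    'precomp': {'back_half_refraction', 'interior', 'shadow_cards'},
    'front_half_refraction': {'precomp'},
}

# A valid overnight execution order: every pass runs after its inputs.
order = list(TopologicalSorter(passes).static_order())
print(order)
```

Expressing the chain as a graph is what makes the unattended overnight runs possible: the farm can schedule independent passes in parallel and only blocks where a pre-comp is genuinely needed.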

There was a lot of grading of the plates required. At the time we were still using RenderMan 19’s REYES with the raytracing integrator for the refraction, which was quite slow. To get quicker turnarounds we decided to replace the plate with ST maps. This required breaking our physically based shaders and turning off the energy-conservation part; otherwise it shifted the whole ST coordinates.
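The ST-map idea can be shown with a tiny nearest-neighbour lookup: instead of refracting the plate colors, the render outputs (s, t) coordinates per pixel, and the plate is sampled through them downstream. A toy sketch only; `stmap_lookup` is an invented name and a real comp would filter the lookup rather than snap to the nearest pixel.

```python
def stmap_lookup(plate, st_map):
    """Resolve an ST-map render against a plate: each output pixel
    carries (s, t) coordinates in 0-1 UV space pointing into the plate.

    plate:  2D list of pixel values
    st_map: 2D list of (s, t) tuples, same resolution as the output
    """
    h, w = len(plate), len(plate[0])
    out = []
    for row in st_map:
        out_row = []
        for s, t in row:
            # Nearest-neighbour sample, clamped to the plate bounds.
            x = min(int(s * w), w - 1)
            y = min(int(t * h), h - 1)
            out_row.append(plate[y][x])
        out.append(out_row)
    return out
```

The payoff is that a regrade of the plate no longer requires re-rendering the slow refraction pass: only the cheap lookup is redone.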

Furthermore, we encountered a precision limitation in the shader which resulted in black refractions. The only timely solution was to slightly modify the IOR on shot level, and if that didn’t help, to fill the gaps in comp.




Peter Rabbit

Among the challenges on Peter Rabbit was the number of furry creatures. To render them without exceeding memory limits we had to use LODs for the fur depending on the character’s size in frame. The images below had to be rendered for the first trailer. Initially the LODs were set in lighting and required some shader tweaking to match the look of the fully detailed version. Later in the show the process was automated.
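An automated LOD pick driven by a character's size in frame might look like the sketch below: estimate what fraction of the frame the character fills from its height, distance and the camera's field of view, then bucket that into detail levels. The thresholds and LOD names are invented for illustration, not the show's actual values.

```python
import math

def select_fur_lod(char_height, distance, fov_deg,
                   lods=('hero', 'mid', 'low')):
    """Pick a fur LOD from the character's approximate height on screen.

    char_height: world-space height of the character
    distance:    distance from the camera
    fov_deg:     vertical field of view in degrees
    """
    # Height of the visible frame at that distance.
    frame_height = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0)
    coverage = char_height / frame_height  # fraction of frame filled
    if coverage > 0.5:
        return lods[0]   # close-up: full-detail fur
    if coverage > 0.1:
        return lods[1]   # mid-ground: reduced curve count
    return lods[2]       # distant: heavily decimated fur
```

Picking the LOD from screen coverage rather than raw distance is what keeps close-ups on small characters at full detail while still decimating the crowd in wides.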

A lot of work went into the automation of the lighting via lidar scans and HDRIs, which had to be matched with the grading of the plate. It was particularly tricky for interior shots.

Animal Logic’s own renderer Glimpse proved to be extremely capable of handling all of these new tasks. Many of the features for fur rendering and live-action integration were developed while the show was in production, thanks to an incredible team of RnD, TDs and artists.



The Lego Batman Movie

Joining the Lego Batman team relatively late in production, I got the chance to help set up the sequence where the Joker transforms the Wayne mansion into a crazy clown palace with hundreds of animated lights.

We used a combination of incandescent glowy shaders and propagated lights. Because the animation, speed and intensity of the lights were heavily art-directed, we split them into controllable groups that could be connected to a Nuke gizmo and driven from there.

The caustic effect from the big window in the center of the room was done in comp too, via mapping and transforming textures along the pWorld pass.
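The pWorld trick boils down to deriving texture coordinates from the world-position AOV, so a texture can be pinned to the set inside the comp. A minimal planar-projection sketch; `caustic_uv_from_pworld`, the plane basis and the scale are assumptions for illustration, and the real setup did this per pixel in Nuke.

```python
def caustic_uv_from_pworld(pworld, window_origin, u_axis, v_axis, scale):
    """Project a world-position sample onto a window-aligned plane
    to get texture UVs for a comp-driven caustic.

    All vectors are (x, y, z) tuples; u_axis and v_axis are unit
    vectors spanning the plane the caustic texture lives on.
    """
    rel = tuple(pworld[i] - window_origin[i] for i in range(3))
    dot = lambda a, b: sum(a[i] * b[i] for i in range(3))
    # Distance along each plane axis, normalised by the texture scale.
    u = dot(rel, u_axis) / scale
    v = dot(rel, v_axis) / scale
    return u, v
```

Because the UVs come from world positions, the projected caustic sticks to the floor under a moving camera, and the texture can be animated and transformed freely in comp.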

With all the colorful, saturated and animated set parts we still had to keep the characters and action readable. To achieve that we added shaping lights for the different camera angles.




The Lego Ninjago Movie

Another Lego movie, but the art direction was quite different from the others. The world wasn’t just bricks but also plants, sand, rocks, water and even a cat.

We did a lot of prototyping of different looks early on as seen in the two images below from trailer one and two. Same shot with different moods. The shot started as a night scene which can still be seen at the end of one of the trailers minus all the city lights.

The shot starts with the camera facing down at the water, revealing the WW logo. From there the camera pans up; the submarines were initially only meant to be visible as a dark blurry shadow before emerging from the water. The water had to reflect the fog and the island as well as the aircraft. It needed ripples, and had to look deep and dark near the camera but clear and welcoming at the beach. Achieving that required a range of trickery with shaders, temporary transformations, refracted mattes and comp.





The Great Wall

Joining the production of The Great Wall around the time Pixar released RIS, my focus was to evaluate how and whether we could use it to benefit the production. One of the tests was to use the FX fire delivery in lighting as an actual light source.
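Conceptually, using an FX fire cache as a light source means accumulating illumination from its emissive voxel samples. The sketch below treats each sample as a tiny point light with inverse-square falloff; this is a simplification for illustration, and a production integrator such as RIS would instead importance-sample the emissive volume with proper shadowing.

```python
def fire_light_contribution(shading_point, emission_samples):
    """Accumulate direct illumination at a point from a fire cache,
    treating each emissive voxel sample as a tiny point light.

    emission_samples: list of ((x, y, z), emission) pairs;
    the flat-list layout is an assumption for the sketch.
    """
    total = 0.0
    for pos, emission in emission_samples:
        dist_sq = sum((pos[i] - shading_point[i]) ** 2 for i in range(3))
        total += emission / max(dist_sq, 1e-6)  # inverse-square falloff
    return total
```

The appeal of driving the light from the cache itself is that the lighting flickers and moves exactly with the fire, rather than with a hand-animated approximation.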


February 3, 2019