Here at Blender Institute we challenge ourselves to make industry quality films while improving and developing Blender and open source pipeline tools.
One of the artistic goals for Agent 327 Barbershop is to have high quality motion blur. Rendering with motion blur is known to be a technical challenge, and as a result render times are usually very high. This is because we work with:
- characters with advanced shaders
- in indoor environments
- with complex light setups
- and lots of mirrors
The initial test
As soon as animation was ready for one of the shots, Andy prepared it for rendering with the final settings (full resolution, full amount of Branched Path Tracing samples, etc.) and shipped it to the render farm (IT4Innovations, VSB – Technical University of Ostrava).
One of the two render farm configurations used:

intel_cpu24_2xMIC - compiled with intel 2016.03 with patch https://developer.blender.org/D2396 (+ loop over samples moved to loop over pixels), with_cpu_sse=off, qbvh=off (using Xeon Phi coprocessor)
Rendering took a while, and when all frames were completed we produced the following clip:
The graph is made by collecting the render times on the two system configurations (yellow is gcc_node_1, green is intel_cpu24_2xMIC) and overlaying them on the animation, with a time marker matching the current frame.
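An overlay like this starts with per-frame timings collected from the farm's render logs. A minimal sketch of that collection step, assuming a hypothetical log format (real farm logs will differ, so the regex would need adjusting):

```python
import re

# Hypothetical log format: one "frame N rendered in X s" line per
# finished frame. This is an illustration, not the actual farm format.
LINE_RE = re.compile(r"frame\s+(\d+)\s+rendered\s+in\s+([\d.]+)\s*s")

def parse_render_log(text):
    """Return {frame_number: render_time_in_seconds} from a render log."""
    times = {}
    for match in LINE_RE.finditer(text.lower()):
        frame, seconds = int(match.group(1)), float(match.group(2))
        times[frame] = seconds
    return times

# Example: compare the same frames across the two machine configurations
# (the times below are made up for illustration).
log_gcc = "frame 1 rendered in 5400.0 s\nframe 2 rendered in 360000.0 s"
log_mic = "frame 1 rendered in 4100.0 s\nframe 2 rendered in 290000.0 s"

gcc_times = parse_render_log(log_gcc)
mic_times = parse_render_log(log_mic)
for frame in sorted(gcc_times):
    print(frame, gcc_times[frame], mic_times.get(frame))
```

With the two dictionaries keyed by frame number, drawing the curves and the moving time marker is a straightforward plotting step.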
The problem: some frames were taking over 100 hours to render. The solution: fix Cycles!
The first step was to reduce the overall render time of the scene, in order to do more tests and collect measurements more quickly.
Here is a chart comparing the render times of the same animation with and without motion blur. Notice that the y axis uses a logarithmic scale. In some cases a frame would be 100x slower with motion blur.
The slowdown turned out to be mostly due to motion blur, and especially to hair. Who knew!