In the past four months Blender Institute worked with the Google VR team to convert the opening sequence of our latest Llama cartoon into a 360-degree VR experience. Thanks to their support and render power we're happy to now present to you the Caminandes VR demo. If you are on mobile, watch it in the YouTube app.
Thanks to this production we achieved a number of goals:
Made Blender ready for VR production (realtime preview, layout and animation).
Added support for spherical stereo rendering with Cycles.
Improved the rendering workflow for high-resolution image sequences (our delivery resolution was 4096x4096 pixels).
Before we started with VR production, a considerable amount of work had already been done by Dalai Felinto in his Blender-HMD Branch.
Thanks to the efforts of Joey Ferwerda, Julian Eisel and Sergey Sharybin we were able to test realtime VR preview of Blender scenes here in the studio.
Spherical stereo rendering is now available thanks to a patch originally developed by Dalai in his Spherical-Stereo Branch and finished and merged by Sergey.
Since a spherical picture has a decreasing "pixel density" towards the poles, Sergey implemented an option that reduces the number of samples being calculated as we get closer to the poles. This drastically reduced render times.
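The idea can be sketched as follows: in an equirectangular image, each pixel's coverage of the sphere shrinks with the cosine of its latitude, so the per-row sample count can be scaled accordingly. This is an illustrative sketch with hypothetical names, not Cycles's actual implementation:

```python
import math

def samples_for_row(y, height, base_samples, min_fraction=0.1):
    """Scale the per-pixel sample count by the latitude of an
    equirectangular image row (illustrative sketch only).

    Rows near the poles cover far less solid angle per pixel,
    so far fewer samples are needed there.
    """
    # Latitude runs from +pi/2 (top row) to -pi/2 (bottom row).
    latitude = math.pi * (0.5 - (y + 0.5) / height)
    # Pixel density on the sphere falls off with cos(latitude);
    # clamp to a floor so polar rows still get some samples.
    weight = max(math.cos(latitude), min_fraction)
    return max(1, round(base_samples * weight))
```

For a 1024-row image at 256 samples, rows near the equator keep the full sample count while the topmost and bottommost rows drop to roughly a tenth of it, which is where the render-time savings come from.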
In order to render the sequence we used the Flamenco render manager (developed during Cosmos Laundromat) running on the Google Cloud Compute infrastructure.
The final frames (two 4K images per frame) could take up to 8 hours each, so we rendered them in sample chunks using the new resumable render feature in Cycles, and merged the multiple rendering iterations in the compositor once all chunks were completed.
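Merging chunks works because averaged samples can be recombined: the final image is the sample-count-weighted mean of the per-chunk images, which is mathematically the same as rendering all samples in one pass. A minimal sketch of that merge math (hypothetical helper, not actual Blender code, with pixels as flat float lists):

```python
def merge_chunks(chunks):
    """Merge independently rendered sample chunks of one frame.

    Each chunk is (pixels, num_samples), where `pixels` holds values
    already averaged over that chunk's samples. The result is the
    sample-count-weighted mean of all chunks.
    """
    total_samples = sum(samples for _, samples in chunks)
    merged = [0.0] * len(chunks[0][0])
    for pixels, samples in chunks:
        # Weight each chunk by its share of the total samples.
        weight = samples / total_samples
        for i, value in enumerate(pixels):
            merged[i] += weight * value
    return merged
```

For example, a 10-sample chunk and a 30-sample chunk contribute with weights 0.25 and 0.75, so a pixel rendered as 1.0 and 3.0 in the two chunks merges to 2.5.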
While many important VR-related features are now available in the latest Blender and will ship in the upcoming 2.78 release, some features, such as the HMD realtime viewport integration, are still waiting to be merged.