Netflix NextGEN Blender review

NextGEN is a sci-fi 3D animated movie produced by Tangent Animation using the open source software Blender. Netflix bought the international exhibition rights for 30 million dollars, which immediately raises the question: is Blender ready to produce movies in a full pipeline?

Let's go through the full technical review of NextGEN, recently produced with the open source 3D software Blender.



In 2016 I wrote an article examining why it was a good choice to switch to an open source application like Blender. Time has earned the Blender Foundation a place among the other giants of 3D animation, and now, with Netflix backing this amazing story about a girl and a robot, there is not a shadow of a doubt that it's a good time to pick up Blender for your next movie production deal. Click here to see some very important points to consider.



Netflix bought the worldwide distribution rights (except for China) to NextGEN at the 2018 Cannes Film Festival for $30 million. Next Gen premiered on Netflix on September 7 of the same year.

The film is written and directed by Joe Ksander and Kevin R. Adams. This isn't their first project together; the pair worked as a director/writer team on a 2014 short called Gear, which explores themes generally similar to NEXT GEN's. The pair also worked together on the Shane Acker/Tim Burton animated feature 9, with Adams as art director and Ksander as animation director.

But the big surprise for most of the CG community was learning that NEXT GEN was made entirely in the open source animation package Blender, put to use for the film by Tangent Animation of Toronto, Canada.

Jeff Bell, one of the founders of Tangent Animation, was one of the first animators to work on Maya during its R&D phase back in 1994-1998, and that kind of development background is huge for the studio's projects. He made the jump to Blender and is actively pushing new development implementations and custom-made solutions as part of the Blender project.

***** RIGGING ******


Rigging Dojo had a marvelous personal interview with David Hearn and Charles Wardlaw, two amazing Blender artists who tell us what the rigging process for NextGEN was like. Using Blender's link and append functions along with custom libraries, they were able to build a pipeline workflow for Blender.

If you want to listen to the full podcast, the link is here >>
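As a rough illustration of that link/append idea, here is a minimal Python sketch (not Tangent's actual pipeline code) that links a rigged character collection from a shared library file and appends an action for per-shot editing. The file paths and data-block names are hypothetical, and the snippet uses the modern 2.8+ API rather than the 2.78-based branch the studio worked in.

```python
import bpy

# Hypothetical library files and data-block names -- adjust to your own asset library.
CHAR_LIB = "/studio/library/characters/mai_rig.blend"
ACTION_LIB = "/studio/library/actions/walk_cycles.blend"

# Link the character collection so every shot references the one shared rig file.
with bpy.data.libraries.load(CHAR_LIB, link=True) as (data_from, data_to):
    data_to.collections = [c for c in data_from.collections if c == "CH_mai"]

# Instance the linked collection into the current scene via an empty.
for coll in data_to.collections:
    inst = bpy.data.objects.new(coll.name + "_instance", None)
    inst.instance_type = 'COLLECTION'
    inst.instance_collection = coll
    bpy.context.scene.collection.objects.link(inst)

# Append (make a local copy of) actions so animators can edit them per shot.
with bpy.data.libraries.load(ACTION_LIB, link=False) as (data_from, data_to):
    data_to.actions = [a for a in data_from.actions if a.startswith("walk")]
```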

***** PRODUCTION *******


In a post on BlenderNation, Jeff from Tangent Animation writes:

I can confirm Tangent is the primary production facility for this movie and the use of Blender in our pipeline. We’re effectively 100% Blender, other than plugging in apps in a few areas to supplement departmental workflows.

The budget for NEXT GEN was also 5x that of Ozzy, which is why there is such a large difference in the quality of the two movies.

We can already see that Blender can be a huge advantage for producing this kind of content.

******** LIGHTING ***********


Concerning lighting for the movie, Senior Lighting Artist Justin Goran writes:
Early on, motion blur was a big problem, causing huge render times, but thanks to Stefan's implementation of Embree we were finally able to render motion blur with a predictable increase in render times instead of the random spikes we were getting before.

Average memory usage was 60-70 GB (120-140 GB for bigger shots), so this movie would have been impossible to render on GPUs. We used CPUs.

Cycles was used for everything, though our version of Cycles was modified (with Stefan's Embree core, and Cryptomattes, which were beyond valuable for compositing). The version of Blender we used was the studio's own dev version, which I believe was using Blender 2.78 as its base.

Caches were stored in the FX files, nothing fancy there. Dynamics like cloth were baked out into Alembic files.

We rendered in linear space and used Filmic as a LUT.
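To put those settings in concrete terms, here is a small sketch of the equivalent scene configuration in Blender's stock Python API (CPU rendering with Cycles, motion blur, and the Filmic view transform). This is the vanilla API, not the studio's modified build, and the shutter value is just an example.

```python
import bpy

scene = bpy.context.scene

# Render with Cycles on the CPU -- at 60-70 GB per scene, GPU memory was never an option.
scene.render.engine = 'CYCLES'
scene.cycles.device = 'CPU'

# Full 3D motion blur (the Embree-backed build made this affordable).
scene.render.use_motion_blur = True
scene.render.motion_blur_shutter = 0.5  # example value

# Work in linear space and view the result through the Filmic transform.
scene.view_settings.view_transform = 'Filmic'
```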

********* CLEANER RENDERS *******


Stefan Werner integrated Cryptomatte as part of Tangent Animation's core development, helping bring Blender up to production level, among other tools Tangent Animation is creating in-house.

Link: https://vimeo.com/136954966
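Tangent's Cryptomatte support was a custom integration at the time, but the feature has since landed in mainline Blender. As a rough sketch, in recent releases (2.93 and later) the equivalent ID passes can be enabled per view layer like this:

```python
import bpy

view_layer = bpy.context.view_layer

# Cryptomatte ID passes let compositing isolate objects, materials or assets
# without hand-painted masks.
view_layer.use_pass_cryptomatte_object = True
view_layer.use_pass_cryptomatte_material = True
view_layer.use_pass_cryptomatte_asset = True

# Higher levels resolve more overlapping objects per pixel, at some file-size cost.
view_layer.pass_cryptomatte_depth = 6
```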

Embree was used for motion blur. Embree is a set of Intel ray tracing kernel libraries that map ray traversal and intersection directly onto optimized CPU instructions, which helped generate better motion blur in heavy scenes: https://github.com/tangent-animation/embree

Stefan also did some incredible work on volumetric rendering efficiency, adding the Intel Embree core to Cycles and generally improving the Cycles renderer for use on NextGen. Render times were extremely reasonable and manageable, even with full 3D motion blur and in-camera DOF used throughout the film.


********* WORKFLOW INTEGRATION********


Shane Jackson was the Surfacing Supervisor on the film and he wrote:
We created a lot of shader node networks in house, and some custom hair and skin shaders too! The challenges came when we were doing massive sets and needed techniques to cover wide areas while retaining resolution, something we couldn't do with sets of textures from Painter alone.
So long story short, shaders totally in Blender, created with methods that I'm sure you guys have seen, and as much procedural stuff as we could get away with.
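The post doesn't show the actual node networks, but the general pattern of building procedural shaders through Blender's node API looks roughly like the sketch below, with a noise-driven roughness standing in for their custom skin and hair work (2.8+ API, made-up material name):

```python
import bpy

# A simple procedural material: a noise texture driving Principled BSDF roughness.
mat = bpy.data.materials.new(name="proc_surface_demo")
mat.use_nodes = True
nodes = mat.node_tree.nodes
links = mat.node_tree.links

bsdf = nodes["Principled BSDF"]           # created automatically by use_nodes
noise = nodes.new("ShaderNodeTexNoise")    # procedural detail, no painted textures
ramp = nodes.new("ShaderNodeValToRGB")     # remap noise into a usable roughness range

noise.inputs["Scale"].default_value = 25.0
links.new(noise.outputs["Fac"], ramp.inputs["Fac"])
links.new(ramp.outputs["Color"], bsdf.inputs["Roughness"])

# Assign the material to the active object (assumes a mesh is selected).
bpy.context.object.data.materials.append(mat)
```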

********** RENDER TIME *********


The average per frame was 3.76 hours at DCI 2K for NextGen - some sequences rendered a lot more quickly, some took a lot longer (think: water effects via Alembic caches, 20-30 VDBs in a single shot for smoke, fire and explosions, etc.). There are obviously many redos and artistic retakes involved, but if you eliminate those, here's a rough breakdown:

93 minute movie = 133920 frames @ 24 FPS
They rendered 5 different versions of the movie: English mono, English and Mandarin Stereo (for the Chinese market), so this is 669600 frames
Each frame took 3.76 hours on average
One machine would take 2517696 hours, or 104904 days to render the movie. We had around 2500 nodes working on it, which equals roughly 42 days
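A quick sanity check of that arithmetic in Python (the 42-day figure assumes every node is rendering around the clock):

```python
# Back-of-the-envelope check of the render-farm numbers above.
minutes = 93
fps = 24
versions = 5             # the five deliverable versions listed above
hours_per_frame = 3.76
farm_nodes = 2500

frames_one_version = minutes * 60 * fps          # 133,920 frames
frames_total = frames_one_version * versions     # 669,600 frames
machine_hours = frames_total * hours_per_frame   # ~2,517,696 hours
machine_days = machine_hours / 24                # ~104,904 days on one machine
farm_days = machine_days / farm_nodes            # ~42 days on the farm

print(frames_total, round(machine_hours), round(farm_days, 1))
```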

In reality, the Mandarin lipsync was only done for the shots that warranted it for facial animation, which was about 350-ish shots. Those renders were done using an animated render region targeting just the mouth, so they tended to take 1/4 to 1/2 the time of the full renders. These regions were then composited on top of the English renders to replace the lipsync.
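Render regions are a stock Blender feature; a minimal sketch of the idea is below. The border coordinates are placeholders, and in practice animating the region per shot would just mean keyframing these same properties.

```python
import bpy

rd = bpy.context.scene.render

# Restrict rendering to a region around the mouth.
# Coordinates are placeholders, expressed as 0-1 fractions of the frame.
rd.use_border = True
rd.border_min_x = 0.40
rd.border_max_x = 0.60
rd.border_min_y = 0.35
rd.border_max_y = 0.55

# Keep the full frame size so the region composites cleanly over the English render.
rd.use_crop_to_border = False
```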

On compositing, Shane Jackson added:
When we needed to modify a specific pass, we extracted the pass from the beauty pass (which was the main pass used in our composites), modified it, then added it back to the beauty pass. Cryptomattes were crucial for this.

The compositor is slow, but it was fine for our purposes. We have plans to add caching to it, and add a mask channel to nodes that are missing it - there are workarounds, but it’d be nice to be able to use all nodes in the same fashion.
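The extract-modify-add-back pattern he describes maps onto ordinary compositor math nodes. Below is a bare-bones Python sketch of that graph (not Tangent's actual comp setup), using the glossy direct pass as an arbitrary example and a simple gain as the "modify" step:

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True

# Make sure the pass we want to isolate exists on the Render Layers node.
bpy.context.view_layer.use_pass_glossy_direct = True

tree = scene.node_tree
tree.nodes.clear()
links = tree.links

rl = tree.nodes.new("CompositorNodeRLayers")        # beauty + light passes
subtract = tree.nodes.new("CompositorNodeMixRGB")   # beauty minus the extracted pass
subtract.blend_type = 'SUBTRACT'
grade = tree.nodes.new("CompositorNodeMixRGB")      # the "modify" step: a simple gain
grade.blend_type = 'MULTIPLY'
grade.inputs[2].default_value = (1.2, 1.1, 1.0, 1.0)
add_back = tree.nodes.new("CompositorNodeMixRGB")   # put the graded pass back in
add_back.blend_type = 'ADD'
out = tree.nodes.new("CompositorNodeComposite")

links.new(rl.outputs["Image"], subtract.inputs[1])
links.new(rl.outputs["GlossDir"], subtract.inputs[2])
links.new(rl.outputs["GlossDir"], grade.inputs[1])
links.new(subtract.outputs["Image"], add_back.inputs[1])
links.new(grade.outputs["Image"], add_back.inputs[2])
links.new(add_back.outputs["Image"], out.inputs["Image"])
```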


****** CROWD SIMULATION *******


Many shots used an amazing add-on by Sam Wald that brought in rigged characters and instanced them intelligently while applying unique armatures, allowing for randomized and offset action blocks.

The crowd running near the end was largely done with Golaem, a crowd simulation tool for Maya, exported through Alembic. A custom script was used to reassign materials to the Alembic objects when they were imported back in.
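The post doesn't include that script, but the general shape of reassigning materials after an Alembic import looks something like the sketch below; the file path, naming convention and material map are all made up for illustration.

```python
import bpy

# Hypothetical mapping from imported object-name prefixes to library materials.
MATERIAL_MAP = {
    "crowd_robot": "MT_robot_body",
    "crowd_human": "MT_crowd_generic",
}

# Import the baked crowd cache (path is a placeholder).
bpy.ops.wm.alembic_import(filepath="/shots/sq900/crowd_bake.abc")

# Newly imported objects are left selected; walk them and swap in real materials.
for obj in bpy.context.selected_objects:
    if obj.type != 'MESH':
        continue
    for prefix, mat_name in MATERIAL_MAP.items():
        if obj.name.startswith(prefix):
            mat = bpy.data.materials.get(mat_name)
            if mat is not None:
                obj.data.materials.clear()
                obj.data.materials.append(mat)
```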

The robot crowd near the very end was kind of custom. Nothing was giving them the control they needed, so they faked it using Maya's nParticles. That gave them free collision avoidance and allowed them to apply fields to art-direct the movement.

Long story short, if you can use hair instancing for massive crowd shots, do it. Cycles instancing is fantastic when employed correctly.
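As a sketch of that technique in the stock API (2.8+ property names; the 2.78-era branch Tangent used named some of these differently), scattering one crowd mesh across a ground surface as hair instances looks roughly like this, with both object names assumed to exist in the scene:

```python
import bpy

ground = bpy.data.objects["ground"]        # emitter surface (assumed to exist)
agent = bpy.data.objects["crowd_agent"]    # cheap mesh to instance (assumed to exist)

# Add a hair particle system and render each strand as an object instance.
mod = ground.modifiers.new(name="crowd_scatter", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings
settings.type = 'HAIR'
settings.count = 5000
settings.use_advanced_hair = True
settings.render_type = 'OBJECT'
settings.instance_object = agent
settings.particle_size = 1.0
settings.use_rotations = True              # let instances inherit orientation
settings.phase_factor_random = 2.0         # cheap per-instance rotation variation
```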

Are you ready to start using Blender?

I hope you enjoyed this review.
Don't forget to subscribe, hit that bell button for instant notifications, and leave a comment.

What did you like most about NextGEN?
Thanks.
