Anime 360 video animation made in Blender 3D

Blender can create 360º anime videos to be viewed on VR devices, phones and tablets. Many technical aspects are involved, from rigging to composition and cameras. If you want to know some of the key aspects of this production, keep reading; I will briefly cover them in this post. More details will be revealed at the Blender conference I was invited to in May. Subscribe to this blog for updates on the dates.

Character creation


Very early in the design process I received clear, pastel palettes to work with for this anime-stylized character. She has unique features that were described in the scope of the project. The modeling and texturing stage took around 12 days, including cloth folds and interior clothing. I used Blender for the modeling and 3D Coat for texturing. The aim was that she could really "dress" for each animation. There are 22 facial shape animations, ranging from vowel and consonant mouth shapes to facial expressions. Although the character will generally not speak, we'd better be prepared for anything in the future.

Character shading and passes


Coming from other 3D packages (Modo, Maya, Max, Softimage), I needed different render passes for this project. Blender has checkboxes for basic passes like Mist, Normal, Object ID and also Cryptomatte, and it gives you the option to configure an override shader for an entire render layer.

I needed to rebuild the passes I commonly used in other applications as shaders in Blender. Once I figured out how exclusion and masking work in render layers, everything else was just a matter of setting up the correct order of layers and shader overrides. I re-created stylized shadows, a stylized rim light (left and up), a Y vector ramp shader, a normal-compatible pass shader and many other tweaks. In the end it is even possible to build your own Z-depth pass as a shader override. I'll be covering this in depth at the conference (pun intended).
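To make the idea concrete, here is a minimal sketch of the math behind two of those override passes: the Y vector ramp and a hand-rolled Z-depth. The function names and value ranges are my own illustration, not the exact node setups used in the production file.

```python
def y_ramp(normal_y: float) -> float:
    """Y vector ramp: remap a world-space normal's Y component
    from [-1, 1] to a [0, 1] grayscale value (up = bright)."""
    return (normal_y + 1.0) / 2.0

def z_depth(distance: float, near: float, far: float) -> float:
    """Hand-rolled Z-depth pass: linearly remap camera distance
    to [0, 1] between chosen near/far values, clamped."""
    t = (distance - near) / (far - near)
    return min(max(t, 0.0), 1.0)
```

In Blender these would be emission shaders driven by Geometry and Camera Data nodes plus a ramp; writing them out as functions makes the remapping easy to check.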


Environment and props


The development of the entire lunar scenario, with the roller coaster and other mechanical theme-park machines, took around 14 solid days between revisions, modeling and initial UV texturing. The silhouette people had a basic rig that followed a direction of travel, animated through scripted expressions. One of the requests on this production was that the park had to feature fireworks, so I created a quick particle simulation, split it into two sub-simulations and instanced those around the scene.
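The kind of scripted expression that keeps the silhouette people facing their direction of travel boils down to simple trigonometry: derive a yaw angle from the movement vector. A small sketch in plain Python (the function name is mine; in Blender this logic would live in a driver expression):

```python
import math

def facing_yaw(dx: float, dy: float) -> float:
    """Yaw in radians that turns a character to face its direction
    of travel in the XY plane; 0 means facing +X."""
    return math.atan2(dy, dx)
```

Driving the Z rotation of each rig from this kind of expression means the crowd re-orients automatically whenever its animated path changes.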





Animation breakdown


It was crucial to animate the cameras while the action was happening, since this is a complete 360º environment: we are looking around at everything that exists! In a typical 3D animation, if your character is in a medium shot (waist up) you don't need to worry about animating the feet or legs. But since the whole world was moving, I also needed to animate the details outside the camera (people, machines, fireworks)...

It took me 15 straight hours to work out how and why Blender's NLA editor behaves the way it does. It is the most complex behavior I've seen in 3D since chroma keying in Autodesk Flame.

Anime in Blender

Whenever I created an NLA clip, it would not play, it would not add animation, and it would destroy laboriously inserted keyframes; it was all very stressful. I figured a lot of people coming from other 3D packages were suffering the same ordeal, so I broke down the entire workflow in this 8-minute introduction video:

I used to work with Softimage's NLA and its powerful features, and I can say that once you watch the video you can confidently move into Blender's NLA to create clips, loops and many other interesting effects.



Rendering passes and compositing in 360º


Once the animation stage was approved, I set up the File Output node in the compositor; connecting each pass output to its respective layer of a multi-layer .exr file gave excellent, predictable results. Unfortunately, the most anticipated feature, Freestyle (Blender's native outline renderer for objects), didn't work on the 360º rendered frames. It only works with regular perspective cameras.

Once all the .exr sequences were ready, I composited the frames in After Effects using the Extractor plugin. In Extractor's recent versions you point at one channel (R, G or B) and it automatically fills in the other channels of the pass for you. You will see this in detail at the conference, and I will update this post with the slides afterwards.

So how do you "edit" 360º video in a 2D environment? I decided to crop the area of the frame where I knew the main focus was, because the full equirectangular frame is distorted almost as if in polar coordinates. The original video wasn't going to include a voice, but we tried a version with voice, which is the director's cut you see at the video link. The editing took around 4 days between revisions and approvals.
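When picking the crop region, it helps to know where a given view direction lands in the frame. A hedged sketch of that mapping, assuming the standard equirectangular layout (yaw spans the full width, pitch the full height):

```python
def equirect_pixel(yaw_deg: float, pitch_deg: float,
                   width: int, height: int) -> tuple:
    """Map a view direction (yaw in [-180, 180], pitch in [-90, 90])
    to pixel coordinates in an equirectangular frame.
    Center of the frame is yaw 0, pitch 0."""
    x = (yaw_deg / 360.0 + 0.5) * width
    y = (0.5 - pitch_deg / 180.0) * height
    return (x, y)
```

For example, the straight-ahead direction (yaw 0, pitch 0) lands at the exact center of the frame, which is why the main action is usually staged there.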

Uploading the video for VR and 360º view


The platform chosen to display this work is YouTube. There were a couple of technicalities I'd like to point out at the conference concerning YouTube's 360 metadata injector tool. The video file for this 1-minute-20-second piece was almost 1 GB. I tested aspect ratios from 16:9 to cinemascope; in the end, due to time constraints and deadlines, the original frames were rendered at HD resolution.
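As a sanity check on that file size: 1 GB over 1 minute 20 seconds works out to roughly 100 Mbit/s average bitrate, which is why 360º masters get so heavy. The arithmetic, using the figures from this post:

```python
def avg_bitrate_mbps(size_bytes: float, seconds: float) -> float:
    """Average bitrate in megabits per second."""
    return size_bytes * 8 / seconds / 1e6

# ~1 GB over 80 seconds -> about 100 Mbit/s on average.
clip_mbps = avg_bitrate_mbps(1e9, 80)
```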



Thank you for subscribing to my blog. I'll be updating this post with more news, questions, answers and pictures from the conference in the future.

 
