Neill Blomkamp is the celebrated writer and director of District 9, Elysium, and Chappie. In early 2017, he and his brother, producer Mike Blomkamp, founded Oats Studios, a company dedicated to incubating and producing independent films.
They set down roots north of the 49th parallel, in a hangar-sized studio furnished with tattered couches and surrounded by movie props from Blomkamp’s past films.
With the help of VFX supervisor Chris Harvey (Tron Legacy, Zero Dark Thirty, Chappie) and a core group of award-winning designers, artists and engineers, they have been workshopping new filmmaking techniques to serve their sci-fi horror-fantasy story ideas.
Oats is meant to be a sanctum where Blomkamp and his crew can experiment. And to hear him describe it, he wants his audience, not a Hollywood producer, to dictate where the company focuses next.
Oats has already released several free films online. Critically acclaimed shorts – like Rakka, starring Sigourney Weaver, as well as Zygote and Firebase – have been viewed a collective 10 million times on YouTube in three months.
Blomkamp also offers fully rigged and textured 3D models, concept-art booklets, scripts, scores and more on the Steam platform. As he told Wired magazine, “I’m really interested in opening up all the stuff that is usually hidden behind closed doors in filmmaking, so that anyone who feels like they can cut it together in a more interesting way, or who just wants to take a stab at it because they’re learning film editing, can have access to that.”
At the Game Developers Conference in 2016, Unity’s Swedish demo team demonstrated the graphical quality achievable with Unity 5.4 by premiering the first installment of ADAM. According to Chris Harvey, “it blew people away, not only because it was visually fantastic, but the story was super intriguing with a massive cliffhanger.”
ADAM, which went on to win several awards, including a Webby, tells the story of a human whose memory has been erased and who has been imprisoned in a robotic shell. When the hero is expelled from a walled city with a crowd of his fellow prisoners, the newborn cyborg realizes he is exiled in a post-apocalyptic world.
“I was interested in directing several pieces on the story setup in the first ADAM episode, […] the thematic backbone of it is a discussion of where your soul is,” explains Neill Blomkamp.
The second installment, The Mirror, finds our amnesiac hero on a deadly trek from the walled city to a refugee outpost, where he will discover a clue to his identity. ADAM: Episode 3 reveals an opposing tribe of survivors, this one human and heavily armed. Both films hint at what the future might hold for the ADAM universe. They will be released on Oats’ YouTube channel.
To produce the next two ADAM installments, which are around six minutes each, Oats knew they had considerable technical challenges ahead of them. To realize their first-ever CG film “in engine,” they adopted real-time rendering and artist-focused sequencing tools from Unity 2017.1.
In just five months, the Oats team produced in real-time what would normally take close to a year using traditional rendering. “This is the future of animated content,” declares CG supervisor Abhishek Joshi, who was CG lead on Divergent and Game of Thrones. “Coming from offline, ray-traced renders, the speed and interactivity has allowed us complete creative freedom and iteration speed unheard of with a non-RT workflow.”
Oats wanted to make the films look as real as possible, and Blomkamp promised, “with high-definition facial capture, dense polygonal environments, and with lots of characters, it’s going to max out the allotted computational power.” What followed was an intense period of experimentation using new features.
Technical director Jim Spoto recalls, “One of the most ambitious features that we co-developed with Unity was Alembic streams for cloth and face animation. Alembic is a standard for animated geometry cache data – it’s a staple in use at VFX studios, and the richness and fidelity it provides has been crucial.”
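Unity’s actual Alembic importer is not shown in the article, but the core idea of a geometry cache is simple: vertex positions are baked out per sample, and playback just picks the right sample for the current time. The following is a minimal, hypothetical sketch of that lookup (the `VertexCache` class and its names are illustrative, not Unity’s or Alembic’s API):

```python
import bisect

class VertexCache:
    """Toy stand-in for an Alembic-style geometry cache: stores baked
    per-sample vertex positions and returns the sample for a given time."""
    def __init__(self, sample_times, vertex_frames):
        assert len(sample_times) == len(vertex_frames)
        self.times = sample_times    # sorted sample timestamps (seconds)
        self.frames = vertex_frames  # one list of (x, y, z) tuples per sample

    def sample(self, t):
        # Clamp to the cached range, then pick the nearest earlier sample.
        if t <= self.times[0]:
            return self.frames[0]
        if t >= self.times[-1]:
            return self.frames[-1]
        i = bisect.bisect_right(self.times, t) - 1
        return self.frames[i]

# Three samples at 30 fps for a single moving vertex
cache = VertexCache([0.0, 1 / 30, 2 / 30],
                    [[(0.0, 0.0, 0.0)], [(0.1, 0.0, 0.0)], [(0.2, 0.0, 0.0)]])
print(cache.sample(0.04))  # -> [(0.1, 0.0, 0.0)]
```

A real Alembic stream also carries topology, normals, and UVs, and typically interpolates between samples rather than snapping to the nearest one, but the time-indexed lookup above is the heart of cached-geometry playback.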
Coming off big-budget movie productions like Avatar and Star Wars: The Force Awakens, rigging technical director Eric Legare remarked that the Alembic integration allows Unity to fit into any VFX film pipeline. For more technical details, see our blog post on Alembic support in ADAM.
True to his live-action background, Blomkamp insisted on real environments for ADAM. “We shot 30,000 photos,” he says. The crew spent time outside Indio, California, at a decommissioned iron mine, where they found ideal story locations. “We wanted the nuances and authenticity of a real environment that can’t be modeled or manufactured.” Armed with drones and several cameras, they captured their data over a couple of days, then dressed the set with digital props.
This significantly cut down on asset-creation time. The team saw the environments instantly appear in the Editor and could never have modeled that level of detail from scratch in Maya. Not only were they able to start lighting right away, they could also begin propping the set. “Seeing the virtual environment while we did motion capture with actors was really useful as well,” says production designer Richard Simpson. “It was much easier to walk around seeing where the performers should go and such.”
Photogrammetry techniques were also used for cloth. Costume designer Kristin Thurber and digital tailor Sean Frandsen collaborated on creating real costumes, which they scanned in the studio and then piped through Marvelous Designer software, greatly increasing realism.
Using Marvelous, the team recreated the real-world costumes and simulated the physical behavior of the cloth, the same as for a feature film. Having the real physical costumes and reference video, the team then tuned the nuances of the cloth simulation to ensure that it performed correctly.
The resulting cloth simulation was piped out using Alembic to cache it for playback in Unity. The final result is much higher fidelity than would typically be possible in a real-time engine. For more technical details, see our blog post on cloth simulation in ADAM.
“The most difficult aspect has been RT photoreal humans,” Neill admits. Eager to put the engine through its paces, Blomkamp didn’t shy away from tackling one of the biggest challenges in VFX – humans. To achieve this, Chris Harvey engaged the two pillars of realism: shaders and animation.
Subsurface scattering (SSS) shaders are essential for believable skin. SSS is the phenomenon where light penetrates the surface of a translucent object, interacts with it, and then exits from a different location. In materials such as skin, milk, and marble, SSS plays a crucial role in creating a soft translucency.
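The actual Unity 2017.2 SSS implementation is far more involved, but the visual intuition can be sketched with “wrap lighting,” a well-known cheap approximation in which light is allowed to bleed past the day/night terminator of a surface. This is a hypothetical illustration, not the shader code used in ADAM:

```python
def wrap_diffuse(n_dot_l, wrap):
    """Wrap lighting: a cheap stand-in for subsurface scattering that lets
    light 'wrap' past the terminator, softening the lit/unlit boundary.
    wrap = 0.0 gives standard Lambert shading; larger wrap values look
    progressively more translucent."""
    return max(0.0, (n_dot_l + wrap) / (1.0 + wrap))

# At the terminator (N.L = 0) a Lambert surface goes black,
# but a wrapped, skin-like material still receives some light.
print(wrap_diffuse(0.0, 0.0))  # -> 0.0 (hard Lambert falloff)
print(wrap_diffuse(0.0, 0.5))  # -> ~0.333 (soft, translucent-looking falloff)
```

Production SSS shaders instead blur irradiance with a measured diffusion profile per color channel, which is why skin shows its characteristic reddish glow near shadow edges; the sketch above only captures the softened falloff.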
The SSS shaders in ADAM were made by integrating shaders from the upcoming Unity 2017.2 – a brand-new implementation of SSS for Unity’s rendering engine. Both shader and integration code will be made available, so other creators can achieve the same effects.
The other challenge haunting CG artists working on humans is animation. “The uncanny valley occurs when we animate a face but cannot animate the perceptible yet invisible tiny movements of the face,” says Eric Legare. “This creates an uneasy feeling like we are watching something fake or unreal.” To avoid this problem, the Oats crew decided to lose the facial rig.
Blomkamp explains how they captured traditional motion data for the body, then did something quite different for the head: “We scanned the actors’ facial performance by photographing them the way you do an environment, using photogrammetry. This was done in high resolution at 60 frames a second. We ended up with 60 heads, which translated as 60 different meshes deforming in Unity upon playback.” “Sort of like classic Mickey Mouse animation,” he adds.
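The flipbook analogy maps directly to code: playback is just choosing which scanned mesh to display for the current time. Here is a hypothetical sketch using the article’s figures (60 captures per second; the one-second duration and the clamp-at-the-end behavior are assumptions for illustration):

```python
def head_mesh_index(t, fps=60, mesh_count=60):
    """Flipbook-style facial playback: pick which scanned head mesh to
    show at time t. One complete mesh per captured frame -- no rig,
    no bones, no blend shapes, just mesh swapping like cel animation."""
    frame = int(t * fps)
    return min(frame, mesh_count - 1)  # hold the last scan past the end

print(head_mesh_index(0.0))  # -> 0
print(head_mesh_index(0.5))  # -> 30
print(head_mesh_index(2.0))  # -> 59 (clamped to the final scan)
```

In practice the Alembic stream handles this swap (and the interpolation between scans) automatically; the point is that no facial rig ever exists, which is exactly how the team sidestepped the uncanny valley.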
This hyper-realistic facial animation dispenses with rigs, bones and all the trappings of classic 3D animation – it only needed Alembic support.
The Mirror’s hypnotic eyes involved a different setup, as technical director Jim Spoto explains: “It literally wouldn’t have been possible without the new features like the Custom Render Texture functionality.” The team used the Alembic-streamed animation for her face but also had the engine run a custom shader that involved GPU tessellation (via “compute shader”). The vertex animation was generated procedurally on the GPU via the mask (see illustration).
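The exact shader is not reproduced in the article, but mask-driven procedural vertex animation is easy to sketch on the CPU: each vertex is displaced along its normal by a time-varying wave, weighted by a per-vertex mask that confines the motion to the desired region. All names and parameters below are hypothetical; in ADAM this work ran on the GPU via tessellation and a compute shader:

```python
import math

def displace(vertices, normals, mask, t, amplitude=0.05, frequency=8.0):
    """CPU sketch of mask-driven procedural vertex animation: each vertex
    moves along its normal by a sine wave, scaled by a per-vertex mask
    value (0 = locked in place, 1 = full motion)."""
    wave = math.sin(frequency * t)
    out = []
    for (vx, vy, vz), (nx, ny, nz), m in zip(vertices, normals, mask):
        d = amplitude * m * wave
        out.append((vx + nx * d, vy + ny * d, vz + nz * d))
    return out

verts   = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
normals = [(0.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
mask    = [0.0, 1.0]  # only the second vertex is allowed to animate
print(displace(verts, normals, mask, t=0.2))
```

Doing this per vertex on the GPU keeps the Alembic-streamed face animation untouched while layering the procedural effect on top, which is the combination the team describes.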
For more information, see Custom Render Textures in the Unity User Manual.
Another component that helped Oats tell the ADAM story, while ensuring the team’s workflow scaled well, was Timeline, a sequencing tool that was used for animation and scene management.
The studio nested dozens of timelines within a single master timeline, as technical director Mike Ferraro explains: “We broke the film into sequences so each could be worked on simultaneously. Within each sequence we nested timelines to further divide the work – animation, Alembic caches, and FX. It even helped for things like background crowds where a whole group of characters has its animation sequenced in a timeline that’s used as a ‘clip’ in the sequence, making it easy to adjust and offset that group’s overall timing per shot.”
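The per-shot offsetting Ferraro describes comes down to time remapping: a nested timeline plays in its own local time, computed from the master playhead, the clip’s position, and any trim or speed adjustment. This is a hypothetical sketch of that arithmetic, not Unity’s Timeline API:

```python
def local_time(master_t, clip_start, clip_offset=0.0, speed=1.0):
    """Map master-timeline time into a nested timeline's local time.
    clip_start: where the nested clip sits on the master timeline.
    clip_offset: trim into the nested content (lets the same crowd
    timeline start mid-action in a given shot). speed: optional retime.
    Returns None while the playhead is before the clip begins."""
    if master_t < clip_start:
        return None
    return clip_offset + (master_t - clip_start) * speed

# The same background-crowd timeline, reused as a clip in two shots:
print(local_time(12.0, clip_start=10.0))                   # -> 2.0
print(local_time(12.0, clip_start=10.0, clip_offset=5.0))  # -> 7.0
```

Because only `clip_start` and `clip_offset` change per shot, the crowd animation itself is authored once and retimed freely, which is what makes nesting scale across dozens of sequences.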
“Thanks to its real-time rendering capability, it doesn’t even feel like I’m working while I am in-engine,” adds lighting artist Nate Holroyd. For more technical details, see our blog posts on Timeline and Lighting tips & tricks in ADAM.
Blomkamp admits, “I have been obsessed with real-time graphics since I was around 16. It feels like some 21st-century playpen of creativity.” Blomkamp’s background as a VFX artist, and the crew he’s assembled at his studio, means that Oats can compete with most major studios in terms of quality.