Credits

Mama


Status

Released


Release Date

2013


Director

Andy Muschietti


Awards

Nominated – Achievement in Visual Effects, 2014 Canadian Screen Awards


In 2008, Guillermo del Toro discovered the 3-minute short film Mamá, directed and produced, respectively, by Argentine filmmakers Andy and Barbara Muschietti. When the trio began planning a feature-length version, everyone involved knew that Andy’s immense visual appetite (not to mention del Toro’s love of the gruesome) would require some serious visual effects support, not the least of which would be the on-screen realization of the titular character, Mama herself. Whereas Mama’s screen time in the original short was limited to a mere 15 seconds, her role in the feature expanded to over 100 shots, including a backstory reaching 200 years into the past, various physical manifestations, and a frighteningly organic, animated shock of hair that performed almost like a character in its own right.

Mama’s Body

The biggest challenge we had was in determining how to depict Andy’s vision of Mama’s physical presence on screen. Mama needed to float above the ground, crawl along walls, and move in an extremely non-human way. Sometimes she crawled like a spider, and sometimes her entire spine appeared broken, with her upper and lower body impossibly positioned relative to one another. A common audience assumption is that Mama is a completely CG creation, but creating an entirely digital Mama for every shot would have been not only budgetarily impractical but would also have seriously hampered the performances, since the rest of the cast would have had no physical presence to interact with. For the role of Mama, Andy cast Javier Botet, a 7-foot-tall, extremely thin contortionist who also happened to be a great actor. With heavy prosthetic makeup over his entire head and body, Javier brought Mama to life in front of the camera. And while there are certainly a few shots with a fully digital Mama, we were able to take inspiration from Javier’s real performance and apply it to the CG character, something we never would have achieved with a 100% digital creation. The practical performance was further enhanced by pulling wires attached to his limbs to jerk his body unnaturally, and by filming him with the camera upside-down or sideways and letting gravity do the rest of the work. Javier was shot either on the practical set or on bluescreen when further compositing manipulation was required to enhance Mama’s major contortions or to stitch together performances of various body parts from different takes.

Mama’s Hair

The second major challenge was the creation of Mama’s hair. Andy wanted the hair to feel other-worldly, almost as if she were underwater, and his keen eye demanded that we be able to precisely control the flow and direction of each tendril to achieve the desired performance while still keeping it natural and fluid. We initially explored a practical element-based solution, shooting high-frame-rate tests of dozens of long-haired wigs blown by fans, hoping to create a reusable library of hair tendrils that could be composited onto Mama’s scalp. Compared to a CG solution, this approach gave us real (and instantaneous) motion, volume, and texture, but at the expense of other types of control, the most important being the ability to light the hair within the environment of the particular shot. As well, given that Mama would be moving around quite a bit, the hair needed to follow and react to her movements, something we could never anticipate and cover in an element shoot. We were going to need to solve this another way.

Creating a completely digital hair animation and rendering system was by far our greatest technical achievement on the show, requiring months of R&D during preproduction to create artist-friendly tools that gave control over the look, feel, and performance of the hair, right down to the last strand and clump of dandruff. No suitable off-the-shelf software solution existed to do what we were trying to do, so we wrote our own tools to work within SideFX’s Houdini to run hair performance simulations based on geometry tracked to Mama’s head in every shot.

Javier was photographed wearing a skullcap adorned with battery-powered LED tracking markers so they would show up even in the darkest environments. The plates were tracked, and Javier’s head and body were precisely matchmoved and then exported from Maya as an Alembic file. After being imported into Houdini, the geometry was fed to Rapunzel, our proprietary hair creation tool, which procedurally grew the guide curves needed for Mama’s hair, typically between 256 and 512 of them. Each guide curve represented the eventual rendering of 40-50 individual hair strands. Rapunzel also let us quickly paint length, shape, and density attributes to enhance the look as needed. The curves were then passed to Fabio, our proprietary hair simulation tool, which created the underwater-like current in the hair by driving the macroscopic movement of the long strands with a physical fluid simulation. It also allowed the artists to place control curves, represented as kinematic bone chains, on Mama’s scalp that could be animated to influence the simulation output. The guide curves were quickly rendered over the original plate and reviewed with Andy, at which point he could make notes and we could make interactive changes to achieve the desired vision. When the guide hair performance was approved, the curves were passed to lighting to generate the full head of hair, a much more render-intensive and time-consuming process.
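Rapunzel and Fabio are proprietary and their interfaces were never published, so the sketch below is a rough, hypothetical stand-in for the guide-growing step only: it grows a few hundred guide curves from scalp points and normals, each guide standing in for the 40-50 strands generated at render time. Every function name and parameter value here is an illustrative assumption, not the production API.

```python
import numpy as np

# Hypothetical stand-in for the guide-growing step described above: given
# scalp points and normals (e.g. taken from the matchmoved head geometry),
# grow one guide curve per point by stepping outward and letting it droop.
# Every name and parameter here is illustrative, not the production tool.

def grow_guide_curves(scalp_points, scalp_normals,
                      segments=12, segment_length=0.04,
                      droop=0.015, noise=0.005, seed=0):
    """Return guide curves as an array of shape (num_guides, segments + 1, 3)."""
    rng = np.random.default_rng(seed)
    guides = []
    for p, n in zip(scalp_points, scalp_normals):
        pts = [np.asarray(p, dtype=float)]
        direction = np.asarray(n, dtype=float)
        for _ in range(segments):
            # Each step follows the current direction, sags a little under
            # "gravity", and picks up noise so the guides don't look combed.
            direction = direction + np.array([0.0, -droop, 0.0])
            direction = direction + rng.normal(scale=noise, size=3)
            direction = direction / np.linalg.norm(direction)
            pts.append(pts[-1] + direction * segment_length)
        guides.append(pts)
    return np.array(guides)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    points = rng.normal(size=(256, 3)) * 0.1                      # fake scalp
    normals = points / np.linalg.norm(points, axis=1, keepdims=True)
    guides = grow_guide_curves(points, normals)
    print(guides.shape)          # (256, 13, 3) -- one curve per scalp point
    print(guides.shape[0] * 45)  # roughly how many strands render time sees
```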

The hair was lit and rendered using Mantra’s Physically Based Rendering, with the new hair model at its core. It was hooked into Houdini’s hair material, which we further modified, taking advantage of the ramp controls and simplifying the interface, since the PBR approach typically needed fewer parameters to achieve an acceptable look. Raytracing the hair was going to be too slow for our needs, so we took a more traditional approach with spotlights and deep shadow maps, while keeping physically based falloff so that lights behaved predictably within the PBR framework. HDR maps captured on set were sampled for the colours and intensities of any spotlights we used, so while we weren’t using area lights, we did try to keep the workflow as physically based as possible with energy-conserving shaders and high dynamic range lighting. Render times were reasonably fast, with most frames finishing in 15 to 25 minutes, allowing for lots of iterations.
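As a rough sketch of that lighting idea, the snippet below averages a region of an on-set HDR panorama into a spotlight colour and intensity and applies inverse-square falloff. The loading step, region coordinates, and helper names are assumptions for illustration, not the actual Mantra setup.

```python
import numpy as np

# Sketch only: derive a spotlight's colour and intensity from a region of an
# on-set HDR panorama (assumed already loaded into a float numpy array), and
# keep physically based inverse-square falloff so the light behaves
# predictably. Region coordinates and values are made up for illustration.

def light_from_hdr_region(hdr, y0, y1, x0, x1):
    """Average an HDR region into a normalised colour and scalar intensity."""
    rgb = hdr[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)
    intensity = float(rgb.max())
    colour = rgb / intensity if intensity > 0 else rgb
    return colour, intensity

def falloff(intensity, distance_m):
    """Inverse-square falloff for a point-like source."""
    return intensity / max(distance_m, 1e-3) ** 2

if __name__ == "__main__":
    # Stand-in panorama: a bright warm patch on a dim background.
    hdr = np.full((512, 1024, 3), 0.05)
    hdr[200:230, 600:640] = (90.0, 70.0, 45.0)

    colour, intensity = light_from_hdr_region(hdr, 200, 230, 600, 640)
    print("spotlight colour:", colour)
    print("exposure at 3 m :", falloff(intensity, 3.0))
```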

Cocoon

Mama’s cocoon, in which she envelops Lily at the end of the film, was a hybrid of photographic, compositing, and CG cloth simulation techniques. For the early part of the transformation, we shot wire-guided, 300fps cloth elements on bluescreen and composited them together to begin the flower-like metamorphosis. As the cloth began to wrap around Mama, Andy wanted each petal to move with its own specific tentacle-like nature, so flowing CG cloth pieces were successively added into the mix. This culminated in a 100% CG cocoon for the fully animated, completely digital shots in which the cocoon fell down the cliffside, struck a branch, and exploded into a choreographed swarming simulation of thousands of digital moths.
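The moth swarm itself was a bespoke dynamic simulation; purely to give a feel for the kind of rules involved, here is a toy boids-style flocking step in Python. The forces and every parameter value are illustrative assumptions rather than the production setup.

```python
import numpy as np

# A toy flocking step, loosely in the spirit of the moth swarm described
# above. The boids-style rules and all parameter values are illustrative
# assumptions; the production simulation was a bespoke setup.

def swarm_step(pos, vel, target, dt=1.0 / 24.0,
               cohesion=0.5, separation=2.0, attraction=1.5,
               sep_radius=0.2, max_speed=3.0):
    """Advance moth positions and velocities by one frame."""
    to_centre = pos.mean(axis=0) - pos            # pull toward the flock
    to_target = target - pos                      # pull toward the goal
    # Crude separation: push away from the nearest neighbour when it's close.
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + np.eye(len(pos)) * 1e9
    nearest = dist.argmin(axis=1)
    push = np.where(dist.min(axis=1, keepdims=True) < sep_radius,
                    pos - pos[nearest], 0.0)

    accel = cohesion * to_centre + attraction * to_target + separation * push
    vel = vel + accel * dt
    speed = np.maximum(np.linalg.norm(vel, axis=1, keepdims=True), 1e-6)
    vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
    return pos + vel * dt, vel

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.normal(size=(1000, 3))              # a thousand moths
    vel = np.zeros_like(pos)
    for frame in range(48):                       # two seconds at 24 fps
        pos, vel = swarm_step(pos, vel, target=np.array([0.0, 5.0, 0.0]))
    print(pos.mean(axis=0))                       # centroid drifts toward target
```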

Digital Doubles

In addition to the character of Mama, the VFX work on the film also comprised over 300 additional shots of more “invisible” enhancement, including 100% digital doubles of the young girls in the cabin, keyframe-animated by hand to make them crawl down the stairs and up onto the counter and refrigerator. We began with cyberscans of the actresses to give us a starting point from which we modeled their emaciated bodies. Andy was very particular that even in the darkness of the cabin, we would feel that these girls had been extremely malnourished and feral. We pushed the 3D models to the limits of anatomical correctness, thinning their limbs and removing fat and muscle until we achieved the desired silhouettes. CG wreaths of vines were placed atop their heads, their faces were modeled to be extra gaunt and scary, and finally a hair simulation (a modified version of the system created for Mama’s hair) was added to finish off the effect with realistic swaying and motion as the girls moved.

Car Crash

The opening scene of the film, in which Jeffrey’s manic skid sends him and his daughters over the cliff edge, was also a major undertaking. Plates were shot on a mountain road near Quebec City that provided the desired bend in the road, but the background beyond the road was completely replaced with a multiplane matte painting composed of layered cards and displaced tree geometry, all precisely matched to the parallax and motion of the on-set technocrane camera move. A CG Mercedes was built from a cyberscan of the car, animated to create a believable and weighty swerve, spin, skid, and ultimate roll over the side of the mountain, then textured and lit to match the practical version. Snow spray and kick-up effects interacting with the animated car were generated as separate particle systems and then composited together with a layer of falling aerial snow.
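As a quick illustration of why the cards in a multiplane painting have to sit at believable depths, the toy projection below shows how far a near card and a far card slide on screen for the same sideways camera move. The focal length, depths, and the one-metre move are made-up numbers, not values from the shoot.

```python
import numpy as np

# A minimal illustration of multiplane parallax: under a simple pinhole
# camera, a sideways camera move shifts a card's on-screen position in
# inverse proportion to its distance. All numbers here are hypothetical.

def screen_x(point, cam_x, focal_mm=35.0, sensor_mm=36.0, width_px=2048):
    """Horizontal pixel position of a 3D point (metres, camera looking down -z)
    seen from a camera translated sideways to cam_x."""
    x, _, z = point
    ndc = (focal_mm / sensor_mm) * (x - cam_x) / -z
    return (ndc + 0.5) * width_px

near_card = np.array([0.0, 0.0, -10.0])    # tree card ten metres away
far_card  = np.array([0.0, 0.0, -200.0])   # ridge card two hundred metres away

for card, name in [(near_card, "near card"), (far_card, "far card")]:
    shift = screen_x(card, cam_x=1.0) - screen_x(card, cam_x=0.0)
    print(f"{name}: {shift:+.1f} px for a 1 m camera move")
# The near card slides about 20x further than the far one -- the parallax the
# layered cards had to reproduce against the technocrane move.
```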

Digital Environments

The finale of the film on the cliff top was shot almost entirely on bluescreen to be replaced with a fully digital background environment. The handful of shots that looked directly back from the cliff edge were shot at the Boyd Conservation Area north of Toronto. The rest of the sequence – 89 shots – was filmed inside a stage at Pinewood on a 40′ x 36′ setpiece that we digitally extended back towards the forest with a digital matte painting created from elements shot at Boyd, wrapping around a full 360 degrees to cover any angle Andy wanted to shoot. The environment looking out from the cliff was created by stitching together 43 separate 5K tiles shot on the Red Epic from atop a cliff at Rockwood Conservation Area on the Grand River near Guelph. The ultra-high-resolution stitched painting was projected onto the interior of a virtual sphere, which gave us the correct angle and perspective of the background when rendered from a virtual camera derived from the practical camera motion in each take. For shots looking down the cliff, we created a CG rock face using LIDAR of the set piece (about 5′ high) and extended it another 200′.
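As a rough sketch of that sphere projection, the snippet below treats the stitched panorama as a lat-long image on the inside of a huge sphere and, for each pixel of a virtual camera, looks up the texel its view ray points at. The camera model, resolutions, and identity rotation are illustrative assumptions, not the production renderer.

```python
import numpy as np

# Sketch of a sphere-projection lookup: each pixel of the virtual
# (matchmoved) camera fetches the texel of an equirectangular panorama that
# its view ray points at. All values below are illustrative assumptions.

def camera_ray(px, py, width, height, fov_deg=50.0):
    """View-space ray direction for pixel (px, py), camera looking down -z."""
    aspect = width / height
    tan_half = np.tan(np.radians(fov_deg) / 2.0)
    x = (2.0 * (px + 0.5) / width - 1.0) * tan_half * aspect
    y = (1.0 - 2.0 * (py + 0.5) / height) * tan_half
    d = np.array([x, y, -1.0])
    return d / np.linalg.norm(d)

def latlong_lookup(pano, direction):
    """Sample an equirectangular panorama in the given world-space direction."""
    h, w, _ = pano.shape
    x, y, z = direction
    u = 0.5 + np.arctan2(x, -z) / (2.0 * np.pi)          # longitude
    v = 0.5 - np.arcsin(np.clip(y, -1.0, 1.0)) / np.pi   # latitude
    return pano[int(v * (h - 1)), int(u * (w - 1))]

if __name__ == "__main__":
    pano = np.random.default_rng(0).random((1024, 2048, 3))  # stitched tiles
    rotation = np.eye(3)        # per-frame camera rotation from the matchmove
    background = np.empty((108, 192, 3))
    for py in range(108):
        for px in range(192):
            ray = rotation @ camera_ray(px, py, 192, 108)
            background[py, px] = latlong_lookup(pano, ray)
    print(background.shape)     # the rendered backdrop for this camera angle
```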

The environment outside Helvetia – the forest cabin the abandoned girls lived in – was also a digital creation, reusing the same CG cliff, this time viewed from below. The surrounding trees and lake were added as matte paintings using photographed elements from both conservation areas.

Additional Effects

All of the moths in the film are CG, as are the bruising and veining on the walls. In a scene reminiscent of the original short film, an invisible seam-up of nine separate plates culminated in a single shot that runs almost four minutes, contains four separate appearances of Mama with hair, and transitions seamlessly between the ground floor of the practical house location and the sound-stage set of the upstairs floor.