No strings attached: Jim Henson’s Creature Shop delivers digital puppetry with Redshift

Posted: October 24, 2016 by Panos Zompolas

‘Digital puppetry’ might sound like overblown marketing speak – an elaborate way of referring to the task of high-end animators working on fully CG characters, for instance. But it is in fact a distinct approach to animation – and in a vast studio space in the heart of Los Angeles, ‘digital puppetry’ is the most apt way to describe the project being worked on by Steffen Wild and his team of pioneers.

Indeed, something truly inspiring is happening at the Henson Digital Puppetry Studio. Here, the tangible, physically powered act of puppetry has collided head-on with the abstract world of digital VFX, enabling a new way to build on the Company’s heritage while exploring whole new forms of expression.

It’s an astoundingly creative approach, and one that could only be born from the same minds that brought us Fraggle Rock, The Muppets, Labyrinth and The Dark Crystal. Now, those minds are bringing us a new form of cutting-edge puppetry in the Netflix series Word Party, powered by the functionality of Redshift.

Hands-on animation

The lure of traditional puppetry is in its physicality – something that strikes a chord with the human viewer. The Muppets and other shows like it have touched the lives of millions across the world, and the reason is a simple one: the human element of a puppeteer’s hand can bring a sense of life and character to an object that digital work can sometimes lack. There’s a tangible connection between object and human, imbuing the inanimate with all the warmth of a person, while the digital disconnect between a VFX artist’s keyboard and the final output can make capturing a convincing being a significant challenge.

It’s this that got Jim Henson’s Creature Shop thinking: how could they combine both disciplines, complementing the Henson feel of human-driven movement with the flexibility and speed of the digital realm? The Henson Digital Puppetry Studio was the answer, and with Redshift on its side, it was fully equipped to embark on a new project unlike any the company – or indeed the industry – has yet attempted.


The mechanical marionette

“At the Henson Digital Puppetry Studio, the team uses what we call a performance-based animation system,” begins Steffen Wild, visual effects supervisor and one of the brains behind the project. “Instead of having animators just sit in front of computers moving function curves around, we actually have performers and puppeteers working inside a game engine.”

“It is really a blend of motion capture and puppetry,” he continues. “Our puppeteers’ hand controllers are mechanisms that pick up the movements of their hands. That movement is translated into electronic values, and those get fed into what is much like a big game engine to power a character’s animation.”

On Word Party – an educational program for children, now available on Netflix, that stars an entirely digital cast – the mo-cap actors provide the performance of each character’s body, including limbs and hands, but not the face.

The face is instead driven by a puppeteer operating an intricate mechanical puppet rig, manipulating its special controllers to dictate the movements of a character’s face while working in tandem with the mo-cap actor. With dexterous movements, the operator can control the lips, eyelids and jaws of the character, while also providing real-time voiceover.

“What you’re essentially seeing in Word Party is characters performed by two people,” Wild confirms. “There is the body performer, who performs a character from neck to toe, including the fingers, using fiber-optic gloves we developed ourselves so that they can interact with virtual props.

“Then, the puppeteer has total control over the facial expressions and voice of a character; they can freely create and craft expressions using subtle manipulations,” explains Wild. “That means no expressions, quite literally, will ever happen twice.”

The data from the body performer and puppeteer is captured simultaneously and pulled into the aforementioned game engine. From there, the digital characters are displayed in real-time in the studio for the benefit of the performers and crew, before being whisked away by the post team to craft the final product. That’s where Redshift comes in.
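
To make that workflow concrete, here is a minimal sketch in Python of how two simultaneous input streams – body motion capture and a puppeteer’s facial controls – might be merged onto one character rig each frame. Henson’s engine is proprietary and not documented in this article, so every class, field and value below is a hypothetical illustration rather than the studio’s actual data model.

```python
# Illustrative sketch only: the Henson engine is proprietary, so all
# class and field names below are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, Tuple


@dataclass
class BodyFrame:
    """One frame of mo-cap data from the body performer (neck to toe)."""
    joint_rotations: Dict[str, Tuple[float, float, float]]  # joint -> Euler angles
    finger_curl: Dict[str, float]  # finger -> 0.0 (open) .. 1.0 (closed)


@dataclass
class FaceFrame:
    """One frame of puppeteer controller data driving the face."""
    blendshape_weights: Dict[str, float]  # e.g. {"jaw_open": 0.7, "blink_L": 0.0}


@dataclass
class CharacterPose:
    """Combined pose handed to the real-time viewport each frame."""
    joints: Dict[str, Tuple[float, float, float]] = field(default_factory=dict)
    blendshapes: Dict[str, float] = field(default_factory=dict)


def merge_performance(body: BodyFrame, face: FaceFrame) -> CharacterPose:
    """Combine the two live input streams into a single character pose.

    The body performer owns everything from the neck down; the puppeteer
    owns the facial blendshapes, so the two channel sets never collide.
    """
    pose = CharacterPose()
    pose.joints.update(body.joint_rotations)
    # Finger curl values could drive simple rotation channels on the hands.
    for finger, curl in body.finger_curl.items():
        pose.joints[f"{finger}_curl"] = (curl * 90.0, 0.0, 0.0)
    pose.blendshapes.update(face.blendshape_weights)
    return pose
```

An engine built along these lines would call something like merge_performance() every frame with the latest samples from both performers, stream the result to the on-set viewport and record it for the post pipeline.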


No limits

The engine that powers Word Party is a proprietary technology based loosely on game middleware, which Wild calls the “heartbeat” of the Henson Digital Puppetry Studio.

This engine powers the visualization of the performance seen on set, which the team then feeds into a traditional Maya-based post pipeline. At this point, the captured data is fleshed out into the final piece and the power of Redshift is harnessed.
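
For a rough sense of what that final rendering step could look like, the sketch below drives Maya’s command-line renderer with Redshift selected as the render engine. The scene path, camera name and frame range are placeholders – the article does not describe Henson’s actual render setup – and the snippet assumes the Redshift for Maya plug-in is installed and registered under the renderer name “redshift”.

```python
# Hypothetical illustration of a Maya + Redshift batch-render step.
# Paths, camera and frame range are placeholders, not Henson's actual setup.
import subprocess

SCENE = "/projects/word_party/shots/ep01_sh010/anim_publish.mb"  # placeholder path
OUTPUT_DIR = "/projects/word_party/renders/ep01_sh010"           # placeholder path

cmd = [
    "Render",                  # Maya's batch renderer executable
    "-r", "redshift",          # select Redshift instead of Maya's default renderer
    "-s", "1", "-e", "240",    # frame range for the shot
    "-cam", "renderCam",       # camera to render from
    "-rd", OUTPUT_DIR,         # directory where the frames should land
    SCENE,
]

subprocess.run(cmd, check=True)
```

In practice a studio would hand jobs like this to a render-farm manager that splits the frame range across many GPU nodes, which is where Redshift’s per-frame speed gains compound across a whole episode.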

“On past projects we used a variety of other renderers, but we were in a new position with Word Party,” says Wild. “Here, we needed to create detailed, furry characters within the constraints of a television budget and timeline.”

Had the team approached the project with a traditional CPU-based rendering solution, Wild estimates it would have needed some 200 render nodes to meet the demanding timeframes of TV production. All of those render nodes – and all the associated costs – would have pushed the production far beyond budget.

“We had to look for an alternative,” Wild states. “Through research we found Redshift, and the development team there were very open to the idea of supporting us and supporting our production. We made rendering the fur in Redshift a priority, and the team really helped us optimize for that.

“Redshift ultimately gave us a post process that was lean enough, fast enough and worked within our television budget, while at the same time upping the quality of what we could deliver in the television space,” he continues. “We decreased render time by 80 to 90 per cent, without any compromise on the creative or visual output.

“Redshift was a big game changer for us, enabling us to deliver the kind of furry creatures you usually only see in feature films, but on a television budget.”

The party piece

Indeed, the results are quite unlike anything seen in an animated TV series.

“The Jim Henson Company now has sixty years of expertise in traditional puppetry. Word Party takes all of that knowledge, and uses it in a digital space,” says Wild. “Our process stays true to the roots of puppetry, and the Company’s origins, but in doing so creates a new flavor of performance-based animation in a digital space.

“Instead of keyframing, we can do everything live, in the moment and on our stage,” he continues. “Decisions can be made on the fly, just as they are in live action. It allows improvisation and spontaneity to occur in a virtual production, which in turn results in that all-important innovative spark.”

This speed and on-the-fly decision-making are mirrored in the studio’s use of Redshift. Thanks to the huge speed gains in rendering, the Henson Digital Puppetry Studio has also increased its capacity for creativity and iteration, spending more time looking for ways to improve its output.

“It really carries on the tradition of puppetry, where people work together and collaborate, both in terms of the on-set production and in the ability for many people to easily get involved with the post and rendering processes,” Wild offers. “We always strive to keep puppetry innovative. Although Jim Henson is known as a puppeteer, he was primarily an inventor, always exploring new ideas like combining animatronics with puppetry in projects like The Dark Crystal long before the digital age.

“With the Henson Digital Puppetry Studio, and with the support of Redshift, we’re able to continue in the original spirit of Jim Henson, continually finding new ways to explore, grow, and innovate.”