When Adobe announced the Creative Cloud update that included Character Animator, it opened up a ton of new possibilities for using motion capture to make animated characters more lifelike. But using the technology to create a live animated project? Maybe someday, we thought. Then Fox stunned us all Sunday by airing a live animated episode of The Simpsons, proving that Adobe’s Character Animator is far more mature than anyone realized.
“The Simpsons has always pushed the boundaries of what’s next and what’s possible in entertainment. They’re not afraid to take risks. When it came to putting a live segment into their wildly popular program, we couldn’t imagine a better opportunity to show what’s next in technology.” – Van Bedient, senior strategic development manager at Adobe.
Through Character Animator, designers are able to bring 2D animation to life in real time as the voiceover actors act out their lines in front of a webcam. Thanks to facial recognition points, the software is able to capture even the most minute and subtle facial expressions, while the animator uses keystrokes to adjust body motion in kind.
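Character Animator's internals are proprietary, but the workflow described above — webcam face tracking driving the puppet's face while keystrokes trigger body moves — can be sketched in a few lines. Everything below is hypothetical illustration (the `FaceFrame`, `PuppetPose`, and `GESTURE_KEYS` names, and all the numeric mappings, are invented for this sketch), not Adobe's actual API:

```python
# Hypothetical sketch of a face-tracking-to-puppet pipeline; not Adobe's API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class FaceFrame:
    """Normalized measurements from a webcam face tracker (invented for this sketch)."""
    mouth_open: float   # 0.0 = closed, 1.0 = fully open
    brow_raise: float   # eyebrow height above neutral, 0.0-1.0
    head_tilt: float    # -1.0 (left) to +1.0 (right)

@dataclass
class PuppetPose:
    """Parameters fed to a rigged 2D puppet for one frame."""
    mouth_shape: str
    brow_offset_px: int
    head_angle_deg: float
    gesture: Optional[str]

# Keystrokes mapped to pre-rigged body gestures, as the article describes.
GESTURE_KEYS = {"w": "wave", "s": "shrug", "p": "point"}

def drive_puppet(face: FaceFrame, key_pressed: Optional[str] = None) -> PuppetPose:
    """Translate one tracked webcam frame (plus an optional keystroke)
    into pose parameters for the puppet rig."""
    if face.mouth_open > 0.5:
        mouth_shape = "open"
    elif face.mouth_open > 0.15:
        mouth_shape = "mid"
    else:
        mouth_shape = "closed"
    return PuppetPose(
        mouth_shape=mouth_shape,
        brow_offset_px=round(face.brow_raise * 20),  # scale to rig pixels
        head_angle_deg=face.head_tilt * 15.0,        # scale to a modest rotation
        gesture=GESTURE_KEYS.get(key_pressed or ""),
    )
```

In a live setup, a loop like this would run once per captured frame, so the puppet mirrors the performer with no render step in between — which is what makes a live broadcast feasible at all.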
The software was born out of a lunchtime conversation between Adobe engineers about how far After Effects’ motion capture capability had progressed; the group wondered whether the same could be done with animation.
“We realized there was the chance to create a product specifically for rigged animations,” said David Simons, co-creator of After Effects. “We created a live interface so animators could get immediate feedback on their performances, but as more people asked about live broadcasts, we knew we had something special.”
The idea for the live Simpsons episode came from an unlikely inspiration: the live broadcast of Grease that Fox aired in January to record ratings. The Simpsons animators contacted Adobe after hearing about what the software could do, and Adobe provided an early version of Character Animator to get them going.
“I’ve been a fan of The Simpsons since the early 1990s, so, when they contacted us, we jumped at the opportunity to work with them on their first live broadcast,” says Simons.
Technically, the entire episode, dubbed “Simprovised,” was pre-animated as usual, save for the last three minutes, which Fox aired live. Actor Dan Castellaneta donned a motion capture suit and then, while voicing the live version of Homer in the final scene, commented on current events by answering a few questions submitted via Twitter with the hashtag #homerlive.
And the cool part was, he did it during both the East Coast and West Coast broadcasts, so each was unique. They didn’t stop there: they also issued an apology this morning for not having a live version for Europe.
“Traditional animation takes a huge amount of time to do well. It’s not easy to convey emotion and action, and if you design too fast, you risk losing all those great ‘in-between’ moments. Character Animator is a game changer.” – Bill Roberts, senior director of product management DVA for Adobe.
Castellaneta’s live performance drove the realistic lip sync, and Character Animator took care of the rest while the animators triggered body movements from the keyboard. The team behind the long-running Fox comedy used the early version of the software and had been working on the stunt for the last three months, getting more refined and polished with every update of the application.
And as they say, the rest is animation history, and Character Animator could change the way that traditional animation is done forever.
“Adobe Character Animator is the first product I’ve ever worked with where everyone—from CEOs to kids—smiles when they see it. It’s easy and fun to make animations come alive. Creatives now have more ways than ever to transform how audiences view and respond to their characters,” says Roberts.
Hat tip: Adobe