My Little Pony goes live
Lionsgate wanted to promote the release of its new My Little Pony movie in a way that would reach fans directly and leverage the engagement power of social media. AvatarLabs and Adobe had the answer.
As a digital marketing agency, AvatarLabs is experienced in developing campaigns for top-end content producers, from movie studios to broadcasters to the big streaming services. But as the reach of internet video has empowered companies to take control of their own messaging, Avatar’s client base has become increasingly diversified. Mega toy brand Mattel and businesses in sectors as diverse as hospitality and beverages are now regular partners, while the agency’s traditional clients are seeking ever more innovative ways to promote their properties.
AvatarLabs has a dedicated innovation team that constantly explores new ways of using technology to improve its services for clients, and it had been investigating the potential of putting animated characters into a live broadcast.
The team assembled a demo and sent it to selected companies with animated properties, including regular client Lionsgate.
The demo showcased how 2D and 3D animated characters might be fully integrated into a scene and interact with live human performers in a live broadcast. There were several technical challenges to solve, including synchronising and matching footage captured from the Kinect motion capture camera, the depth camera and the RGB camera. The footage was rendered in Unity and integrated with Adobe Character Animator.
“What we were trying to solve with this demo was how to bring these animated properties into the live streaming space,” says JB Fondren, AvatarLabs senior digital producer. “We had already been offering live streaming, but how could we bring our clients’ animated characters into the live space as well?
“Lionsgate likes to take risks,” she continues, “and so do we. It was the perfect match. They had heard of Adobe Character Animator and were logistically trying to solve similar problems. They had the My Little Pony feature film coming up and they awarded us the opportunity to bring our demo to life.”
Adobe Character Animator had been the driver behind last year’s live episode of The Simpsons, as well as the tool underpinning live animations of figures such as Donald Trump and Hillary Clinton on The Late Show with Stephen Colbert, since developed into the Showtime series Our Cartoon President. As far as AvatarLabs is aware, this was the first time the software would be used to create a live event around feature film content.
Character Animator has a simple but highly configurable motion capture feature that captures motion data through something as basic as a webcam and allows an illustrated character to be puppeted in real time. The technology offers a flexibility for live characterisation that would previously have required a huge team and many coding hours.
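As a rough illustration of the idea only (Character Animator’s internals are not scriptable like this, and the names below are hypothetical), the sketch maps one frame of webcam face-tracking data onto puppet parameters: a mouth-open value selects a pre-drawn mouth shape, and the head tilt rotates the head layer.

```python
from dataclasses import dataclass

@dataclass
class FaceFrame:
    """One frame of (hypothetical) webcam face-tracking data."""
    mouth_open: float     # 0.0 = closed, 1.0 = wide open
    head_roll_deg: float  # head tilt reported by the tracker

# Hypothetical mouth drawings an illustrator might prepare for the puppet
MOUTH_SHAPES = ["closed", "small", "open", "wide"]

def drive_puppet(frame: FaceFrame) -> dict:
    """Map tracking values onto puppet parameters for this frame."""
    # Choose a mouth drawing based on how open the performer's mouth is
    idx = min(int(frame.mouth_open * len(MOUTH_SHAPES)), len(MOUTH_SHAPES) - 1)
    return {
        "mouth_layer": MOUTH_SHAPES[idx],
        "head_rotation_deg": frame.head_roll_deg,  # applied to the head layer
    }

if __name__ == "__main__":
    print(drive_puppet(FaceFrame(mouth_open=0.7, head_roll_deg=-4.0)))
```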
Tackling the Ponies
Lionsgate introduced AvatarLabs to My Little Pony animation studio DHX Media. The team were given access to a library of character assets used for the feature film and collaborated with the DHX animation teams to make sure that specific character poses, mouth positions and head movements were authentic to the final feature animation. DHX also supplied short custom animations, which could be keyed in during the streaming event.
Real-time character interaction was not the only element used to create the illusion of a live animated reality. Another was the ability to choose between multiple camera angles on the animated action.
“We wanted it to feel like a live cartoon,” explains James Safechuck, AvatarLabs’ director of innovation and technology. “We wanted to have different virtual cameras and be able to cut around the scene, and camera movements with nice, fluid visuals.”
The team chose to bring the Adobe Character Animator feed into the Unity 3D game engine, which Avatar has used in the past for projects involving mobile games, AR and VR. Unity had recently released a camera system that allowed for creating in-game cameras and cinematics. This would allow the animated characters to inhabit a 3D virtual space which could be captured from different points of view on the fly.
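The article doesn’t detail the camera rig itself, but conceptually the effect amounts to a handful of virtual cameras placed around the 3D set, with an operator cutting between them on the fly. A minimal sketch of that switching logic, written in generic Python rather than Unity’s actual API and using invented camera names and positions, might look like this:

```python
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    """A named viewpoint inside the virtual set (values are illustrative)."""
    name: str
    position: tuple   # (x, y, z) location in the set
    look_at: tuple    # point the camera frames

class CameraSwitcher:
    """Holds several virtual cameras and exposes one 'live' camera at a time."""
    def __init__(self, cameras):
        self.cameras = {cam.name: cam for cam in cameras}
        self.live = next(iter(self.cameras))  # default to the first camera

    def cut_to(self, name: str) -> VirtualCamera:
        # A hard cut: the renderer simply starts drawing from this viewpoint
        self.live = name
        return self.cameras[name]

# Example: a wide master shot plus a close-up for each pony
switcher = CameraSwitcher([
    VirtualCamera("wide",        (0, 2.0, -8), (0, 1, 0)),
    VirtualCamera("twilight_cu", (-1, 1.2, -2), (-1, 1, 0)),
    VirtualCamera("pinkie_cu",   (1, 1.2, -2), (1, 1, 0)),
])
switcher.cut_to("pinkie_cu")  # the stream now shows Pinkie Pie's close-up
```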
Substantial technical rehearsals were run, with Adobe providing support throughout. Adobe technicians helped the AvatarLabs team tweak their delivery workflow to keep the performers’ dialogue in sync with the live animation output.
The Mane Event
Broadcast on Facebook Live in September 2017, the event promised to be a one-of-a-kind opportunity for fans to interact with their favourite My Little Pony characters in anticipation of the upcoming movie release.
“We were finally creating this moment where the fans could comment and have a back and forth live with these animated characters,” says senior digital producer Fondren. “We created a script based on the story of the film, where Twilight Sparkle and Pinkie Pie are getting ready for the Friendship Festival, and they ask the fans for suggestions about how they should do it.”
The event wouldn’t be complete without the characters’ long-time voice actors, Tara Strong (Twilight Sparkle) and Andrea Libman (Pinkie Pie). In the studio, a computer was assigned to each, with Safechuck’s team using cameras feeding into Adobe Character Animator to capture their performance data and facial movements as they delivered both scripted and ad-libbed lines.
Each of the animated performances was then output to a third machine running Unity, which integrated the animation with the 3D set. The footage was mixed on another computer, which sent the final version out to Facebook Live.
Character Animator also allowed animation presets or cycles to be triggered with simple key commands, independent of the motion tracking. The team also employed kinematics to give motion and bounce to details such as the ponies’ manes.
“You can combine all those features together and they work seamlessly, and the character really seems like it’s alive,” says Safechuck.
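As a loose sketch of those two techniques (the key mappings and spring constants are invented for illustration, not taken from the production): a key press simply looks up a pre-built animation cycle, while a basic spring-damper step gives a mane the lag and bounce that follows the head around.

```python
# Hypothetical mapping from operator key presses to pre-built animation cycles
ANIMATION_CYCLES = {
    "w": "wave",
    "j": "jump",
    "g": "giggle",
}

def trigger_cycle(key):
    """Return the canned animation cycle to play for a key press, if any."""
    return ANIMATION_CYCLES.get(key)

def mane_follow_through(mane_pos, mane_vel, head_pos, dt,
                        stiffness=40.0, damping=6.0):
    """One spring-damper step so the mane lags behind and bounces after the head."""
    accel = stiffness * (head_pos - mane_pos) - damping * mane_vel
    mane_vel += accel * dt
    mane_pos += mane_vel * dt
    return mane_pos, mane_vel

# Example: the head snaps to x = 1.0 and the mane catches up over a few frames
pos, vel = 0.0, 0.0
for _ in range(5):
    pos, vel = mane_follow_through(pos, vel, head_pos=1.0, dt=1 / 30)
    print(f"mane x = {pos:.3f}")
print(trigger_cycle("g"))  # -> 'giggle'
```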
The Fans
My Little Pony has a huge fan base not only among children, but also among adults who grew up with the franchise. Even so, the response to the Facebook Live event was beyond what AvatarLabs and Lionsgate had anticipated.
“People were commenting ‘How are you doing that?’ and wondering if it actually was the real actors voicing the characters,” says Fondren. “It was overwhelming – but in a good way.”
The final broadcast was only 15 minutes long, but it drew 70,000 live viewers and an official reach of 120,000 people, resulting in a flood of comments the team struggled to keep up with.
“Fans appreciate any opportunity to directly connect with these characters that they know and love. The outpouring and the attendance was something we were not expecting, even with an established brand like My Little Pony. It’s very apparent that fans are interested in this type of material. They want this and they will show up if you offer it. That’s something we’re taking note of for future executions… It’s about letting the fans have that moment of exposure to the characters.”
“As an agency this was a very strong proof of concept for us,” says AvatarLabs senior digital producer Jason Steinberg. “It’s a very viable product to offer our clients. It’s such a great way of getting an animated product to market so much faster than the traditional way – and because it’s faster, it’s cheaper.”
This article originally appeared in the May 2018 issue of FEED magazine.