Rendering The Impossible

When games engines meet live broadcast, the real and the photorealistic become interchangeable.
Tools that enable broadcasters to create virtual objects that appear as if they’re really in the studio have been available for years, but improvements in fidelity, camera tracking and the fusion of game engine renders with live footage have seen augmented reality go mainstream.
Viva España: sports shows in Spain, including Tot Esport and Tot Futbol, use AR systems from Lisbon-based wTVision.
Miguel Churruca, marketing and communications director at 3D graphics systems developer Brainstorm, explains: “AR is a very useful way of providing in-context information and enhancing live images while improving and simplifying the storytelling. Examples of this can be found in election nights and entertainment and sports events, where a huge amount of data must be shown in-context and in a format that is understandable and appealing to the audience.”
Virtual studios typically broadcast from a green screen set, but AR comes into play where there is a physically built set in the foreground, and augmented graphics and props placed in front of the camera. Some scenarios might have no physical props at all with the presenter interacting solely with graphics.
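Under the hood, both cases reduce to layered compositing: key out the green screen (where there is one), insert the rendered environment behind the presenter, then lay the AR graphics over the top. A minimal sketch of that chain, assuming OpenCV/NumPy and a crude green-dominance matte rather than a production keyer such as Ultimatte:

```python
import cv2
import numpy as np

def composite_ar_frame(camera_frame, virtual_background, ar_overlay, ar_alpha):
    """Composite one virtual-studio frame.

    camera_frame:        BGR studio camera image (green screen behind presenter)
    virtual_background:  BGR rendered virtual set, same size
    ar_overlay:          BGR rendered AR graphics layer, same size
    ar_alpha:            float mask (0..1) for the AR layer, same size
    """
    frame = camera_frame.astype(np.float32)
    b, g, r = cv2.split(frame)

    # Crude green-dominance matte: 1.0 where the green screen shows through.
    # A production keyer is far more sophisticated about spill and edges.
    matte = np.clip((g - np.maximum(r, b)) / 60.0, 0.0, 1.0)[..., None]

    # Replace the green screen with the rendered virtual set...
    keyed = frame * (1.0 - matte) + virtual_background.astype(np.float32) * matte

    # ...then lay the AR graphics over everything, so virtual props
    # can appear in front of the presenter.
    a = ar_alpha[..., None]
    out = keyed * (1.0 - a) + ar_overlay.astype(np.float32) * a
    return out.astype(np.uint8)
```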
“Apart from the quality of the graphics and backgrounds, the most important challenge is the integration and continuity of the whole scene,” says Churruca. “Having tracked cameras, remote locations and graphics moving with perfect integration, perspective matching and full broadcast continuity are essential to providing the audience with a perfect viewing experience.”
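The “perspective matching” Churruca describes means rendering the virtual scene through a camera whose pose and lens mirror the tracked studio camera frame by frame. A toy illustration of the geometry involved, assuming a simple pinhole model and pan/tilt/roll angles as a tracker might report them; axis conventions vary between tracking systems, so this is one plausible choice, not any vendor's actual maths:

```python
import numpy as np

def rotation_from_pan_tilt_roll(pan, tilt, roll):
    """Build a world-to-camera rotation from tracker angles (radians)."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
    Rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])   # tilt
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])   # pan
    return Rz @ Rx @ Ry

def project(point_world, cam_pos, R, focal_px, cx, cy):
    """Project a 3D world point into pixel coordinates (pinhole model)."""
    p_cam = R @ (np.asarray(point_world) - np.asarray(cam_pos))
    if p_cam[2] <= 0:          # behind the camera: nothing to draw
        return None
    u = focal_px * p_cam[0] / p_cam[2] + cx
    v = focal_px * p_cam[1] / p_cam[2] + cy
    return u, v

# If the tracker reports the camera 3m back at head height, panned 10 degrees,
# a virtual prop at head height over the studio origin lands here in a 1920x1080 frame:
R = rotation_from_pan_tilt_roll(np.radians(10), 0.0, 0.0)
print(project([0, 1.5, 0], [0, 1.5, -3.0], R, focal_px=1500, cx=960, cy=540))
```

Re-deriving this projection for every frame, from live tracking data, is what keeps the graphics locked to the set as the camera moves.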
The introduction of games engines, such as Epic’s Unreal Engine or Unity, brought photorealism into the mix. Originally designed to quickly render polygons, textures and lighting in video games, these engines can seriously improve the graphics, animation and physics of conventional broadcast character generators and graphics packages.
Virtual Pop/Stars
Last year a dragon made a virtual appearance as singer Jay Chou performed at the opening ceremony for the League of Legends final at Beijing’s famous Bird’s Nest Stadium. This year, the developer Riot Games wanted to go one better and unveil a virtual pop group singing live alongside their real-world counterparts.
K/DA is a virtual girl group consisting of skins of four popular characters in League of Legends. Their vocals are provided by a cross-continental line-up of flesh-and-blood music stars: US-based Madison Beer and Jaira Burns, who both got their start on YouTube channels, and Miyeon and Soyeon from K-pop girl group (G)I-DLE. It’s a bit like what Gorillaz and Jamie Hewlett have been up to for years, except it’s happening live on a stage in front of thousands of fans.

Riot Games tapped Oslo-based The Future Group (TFG) to bring K/DA to life for the opening ceremony of the League of Legends World Championship Finals at South Korea’s Munhak Stadium. The show would feature the real-life singers performing their K/DA song Pop/Stars onstage with the animated K/DA characters. Riot provided TFG with art direction and models of the K/DA characters. Los Angeles post house Digital Domain supplied the motion capture data for the group, while TFG completed the K/DA facial expressions, hair, clothing, texturing and realistic lighting.

“We didn’t want to make the characters too photorealistic,” says Lawrence Jones, executive creative director at TFG. “They needed to be stylised, yet still believable. That meant getting them to track to camera and having the reflections and shadows change realistically with the environment. It also meant their interaction with the real pop stars onstage had to look convincing.”

All the animation, camera choices and cuts were pre-planned, pre-visualised and entirely driven by timecode to sync with the music. “Frontier is our version of the Unreal Engine which we have made for broadcast and real-time compositing,” says Jones. “It enables us to synchronise the graphics with the live signal frame accurately. It drove the big monitors in the stadium (for fans to view the virtual event live) and it drove the real-world lighting and pyrotechnics.” Three cameras were used, all with tracking data supplied by Stype: a Steadicam, a PTZ cam and a camera on a 40ft jib.

“This methodology is fantastic for narrative-driven AR experiences and especially for elevating live music events,” he says. “The most challenging aspect of AR is executing it for broadcast. Broadcast has such a high-quality visual threshold that the technology has to be perfect. Any glitch in the video not correlating to the CG may be fine for Pokémon Go on a phone, but it will be a showstopper – and not in a good way – for broadcast.”

Over 200 million viewers have watched the event on Twitch and YouTube. “The energy that these visuals created among the crowd live in the stadium was amazing,” he adds. “Being able to see these characters in the real world is awesome.”
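Jones’s point about timecode is the heart of the workflow: every animation curve, camera cut, lighting change and pyro hit is authored against the same running clock as the music, so the system only has to look up what fires on the current frame. A minimal sketch of the idea, assuming 30fps non-drop SMPTE timecode; the cue names are illustrative, not TFG’s Frontier API:

```python
FPS = 30  # assume 30fps non-drop timecode for simplicity

def timecode_to_frame(tc: str) -> int:
    """Convert 'HH:MM:SS:FF' SMPTE timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * FPS + ff

# Pre-planned show: every cue is authored against the master timecode,
# so graphics, lighting and pyro all reference the same clock as the music.
CUES = sorted([
    (timecode_to_frame("00:00:10:00"), "kda_enter_stage"),
    (timecode_to_frame("00:01:02:15"), "cut_to_steadicam"),
    (timecode_to_frame("00:02:40:00"), "pyro_and_light_hit"),
])

def cues_due(prev_frame: int, frame: int):
    """Return cues that fire in the half-open interval (prev_frame, frame]."""
    return [name for f, name in CUES if prev_frame < f <= frame]

# Each incoming video frame carries embedded timecode; fire whatever is due.
last = timecode_to_frame("00:01:02:14")
now = timecode_to_frame("00:01:02:15")
print(cues_due(last, now))   # -> ['cut_to_steadicam']
```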



ESPN has introduced AR to refresh the presentation of its long-running sports discussion show Around the Horn. The new virtual studio environment, created by the DCTI Technology Group using Vizrt graphics and Mo-Sys camera tracking, gives the illusion that the panellists are in the studio with host Tony Reali. Viz Virtual Studio software can manage the tracking data coming in from any tracking system and works in tandem with Viz Engine for rendering. “Augmented reality is something we’ve wanted to try for years,” Reali told Forbes. “The technology of this studio will take the video game element of Around the Horn to the next level while also enhancing the debate and interplay of our panel.”

Since the beginning of this season’s English Premier League, Sky Sports has been using a mobile AR studio for football match presentation on its Super Sunday live double-header and Saturday lunchtime live matches. Sky Sports has worked with AR at its studio base in Osterley for some time, but by moving it out onto the football pitch it hopes to up its game aesthetically, editorially and analytically. A green screen is rigged and de-rigged at each ground inside a standard match-day 5m x 5m presentation box with a real window open to the pitch. Camera tracking for the AR studio is done using Stype’s RedSpy, with keying on Blackmagic Design Ultimatte 12. Environment rendering is in Unreal 4, while editorial graphics are produced using Vizrt and an Ncam plug-in.

Sky is exploring the ability to show AR team formations using player avatars, displaying formations on the floor of the studio in front of the live-action football pundits. Sky Sports head of football Gary Hughes says the AR set initially looked “very CGI” and “not very real” but that it has improved a lot since its inception. “With the amount of CGI and video games out there, people can easily tell what is real and what is not. If there is any mystique to it, and people are asking if it is real or not, then I think you’ve done the right thing with AR.”
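The common plumbing in all of these setups is a per-frame stream of camera pose data from the tracker (Mo-Sys, Stype’s RedSpy, Ncam) into the renderer; many trackers speak variants of the FreeD protocol for this. A simplified, hypothetical packet format to show the shape of that hand-off; real protocols differ in encoding and fields, and render_camera stands in for whatever engine-side camera object is actually being driven:

```python
import socket
import struct

# Hypothetical fixed-size packet: camera id, pan/tilt/roll (degrees),
# x/y/z position (metres), zoom and focus as normalised floats.
# Real tracking protocols (e.g. FreeD) use different encodings and add checksums.
PACKET = struct.Struct("<B3f3f2f")

def serve_tracking(render_camera, port=40000):
    """Listen for tracking packets and push each pose to the virtual camera."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", port))
    while True:
        data, _ = sock.recvfrom(PACKET.size)
        if len(data) != PACKET.size:
            continue  # ignore malformed packets
        cam_id, pan, tilt, roll, x, y, z, zoom, focus = PACKET.unpack(data)
        # Applying the pose every frame is what keeps the CG locked
        # to the real camera as it pans, tilts and zooms.
        render_camera.set_pose(pan=pan, tilt=tilt, roll=roll, position=(x, y, z))
        render_camera.set_lens(zoom=zoom, focus=focus)
```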
