
Esports Production: Shooting the Shooters

This year’s H1Z1 tournament gave broadcast production teams the chance to throw down in a new sector

For some, game streaming conjures up images of someone crouched in front of a webcam in an Ikea-furnished bedroom, playing Minecraft and commentating on it for an audience of dozens. It’s been a while since that has been a fair characterisation. As this year’s H1Z1 Pro League (known as H1PL) proves, esports has thoroughly come of age.

H1Z1, produced by Daybreak Game Company, is a battle royale game that was released to Steam’s early access program in 2015 and is currently available for Windows and PS4. Beginning in April, H1PL streamed matches weekly until the end of the league’s first split in June. The league was produced by Twin Galaxies, a veteran of the gaming world, with streams originating from a facility just off the Las Vegas Strip, using equipment and personnel typical of a major outside broadcast. Eight broadcast cameras covered the action alongside a fixed camera for each of the 75 players, while experienced gamers operated virtual cameras for in-game shots.

Supervising producer Xander Denke notes that the sheer scale “is unusual. We had previously done an hour special for The CW here in the US which was called Fight for the Crown. That revolved around H1Z1 as well. We did a three-day event at TwitchCon 2017 where we took over the entire Long Beach Convention & Entertainment Center arena and had three days of eight hours’ live streaming…So while something of this scale is rare in the esports ecosystem, for us, it was a natural progression to get to this point.”

The technical package was supplied by GQC Entertainment. Marty Meyer was responsible for assembling equipment and crew. “When you think about having over a million dollars of equipment and a network sports team driving it,” she says, “to be doing something to Facebook Live is incredibly interesting, knowing where we are right now in the world of content production.”

She adds that “there’s nothing that separates it from being a broadcast job. We could do any broadcast production with this gear. In fact, our chief engineer Matt Battaglia also works for Fox Sports. He’s a technical supervisor for NFL, and he says, ‘Hell, we broadcast in 720p!’”

“Facebook doesn’t just grab and spit it out. They do another encoding. It’s being encoded twice”

A numbers game

Ensuring good picture quality from the 83-camera shoot was the responsibility of video controller Bob Kertesz, who speaks highly of the Panasonic AK-UC3000 cameras. “They are kind of unusual cameras. It’s a single 35mm-sized chip with the relay optics built into the head so it could take 2/3in B4 lenses. That’s the way the camera was designed. I was anticipating all kinds of relay lens issues with soft corners and flaring and all the usual problems you get with relay adapters and I got nothing. I spent a day evaluating them and I was very impressed with them.”

Five cameras were handheld, with others on a jib and a Technocrane, plus one fitted with a 95:1 box lens to cover the announcers at their desk. Kertesz describes the players as “all facing out, and around the outside edge of the circle there was a two-and-a-half-foot wide platform all the way around where the handheld guys wandered.” He estimates, though, that more than half of the broadcast was made up of gameplay footage. “There was a second room set up with highly experienced gamers and another switcher. They’d preselect and they would send what they felt was important on the preview buses.”

The demand for 1080p60 pictures was unusual, even for sports. “More than half the football games broadcast in [the USA] are at 720p,” Kertesz continues. “They’re on Fox which is 720p. They’re on ESPN – which is Disney – which is 720p. NBC and CBS are 1080i broadcasters. Combined, Fox and ESPN are probably 55 or 60% of games.”

The complication, Kertesz says, is that “everything has to be 3G SDI and it’s not so easy to find a switcher that’ll do that. Everything was 1080p, everything was 3G as a result, and 1080p60 is not normally done. So, everything had to be configured properly and everything had to work at that rate.” A Grass Valley Kayenne vision mixer was used for its 3G support and automation features.
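The arithmetic behind that requirement is straightforward. The back-of-the-envelope sum below (a Python sketch using the nominal SMPTE raster figures, not anything taken from the H1PL truck) shows why a 10-bit 1080p60 signal overflows 1.5G HD-SDI and lands squarely on 3G-SDI’s 2.97Gbps.

```python
# Back-of-the-envelope: serial data rate of a 1080p60 SDI signal.
# Nominal SMPTE 274M raster: 2200 samples x 1125 lines (including blanking),
# 10-bit samples, two multiplexed data streams (luma + chroma, 4:2:2).

samples_per_line = 2200      # total, including horizontal blanking
lines_per_frame  = 1125      # total, including vertical blanking
frames_per_sec   = 60
bits_per_sample  = 10
streams          = 2         # Y and Cb/Cr data streams

bit_rate = (samples_per_line * lines_per_frame * frames_per_sec
            * bits_per_sample * streams)
print(f"1080p60 serial rate: {bit_rate / 1e9:.3f} Gbps")   # ~2.970 Gbps

HD_SDI  = 1.485e9   # SMPTE 292M (1.5G)
THREE_G = 2.970e9   # SMPTE 424M (3G)
print("Fits HD-SDI (1.5G)?", bit_rate <= HD_SDI)    # False: hence 3G everywhere
print("Fits 3G-SDI?", bit_rate <= THREE_G)          # True
```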

The motivation for targeting the higher resolution format, Denke says, comes directly from the fact that the gamer audience is technically savvy and used to watching high-quality images on a large, close-up display. “1080p60 is the gaming standard. When you have your gaming rig and you have it how you want it to be, you want the best possible, and you’re going to run your game at 1080p60.”

Finally, Kertesz mentions smoke: “The game has a mode as you approach the end in order to ensure people can’t just hide until all the other players are dead. I don’t know this, I don’t play the game – but suddenly they’re pumping green smoke into the hall so the attendees can have that visceral experience.” This, Kertesz admits, caused a moment of concern, until he realised that “it was on all the cameras, so I knew it wasn’t me!”

“There’s nothing that separates it from being a broadcast job. We could do any broadcast production with this gear”

Crushing it with colour

Using high-end equipment made certain things easier in an environment heavily populated with LED effects lighting. “The hall was lit to around 4600K,” Kertesz says, “and one of the things I loved about the Panasonic cameras was that I looked at the control panel, and I found the knob that said colour temperature, and I wound it round to 4600K – and the whites fell right in without having to tweak the reds and blues.”

He describes the desired result as “a dramatic look. The blacks were not seriously crushed but they were crushed. It was high contrast, high saturation. Not much detail because the cameras were sharp on their own, but in general there was no peeking into the shadows. Blacks were black. That’s what the DOP and I decided.”
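As a rough numerical illustration of what that grade does to picture content (a sketch only, not the actual paint settings dialled into the Panasonics), crushing the blacks and lifting contrast and saturation looks something like this:

```python
# Illustrative only: the kind of transfer curve Kertesz describes,
# with blacks pulled down, contrast and saturation raised, and
# shadow detail deliberately sacrificed.
import numpy as np

def dramatic_look(rgb, black_crush=0.06, contrast=1.25, saturation=1.3):
    """rgb: float array in [0, 1], shape (..., 3)."""
    # Crush the blacks: anything below the threshold clips to 0,
    # the rest is rescaled so white stays at 1.0.
    out = np.clip((rgb - black_crush) / (1.0 - black_crush), 0.0, 1.0)
    # Increase contrast around mid-grey.
    out = np.clip((out - 0.5) * contrast + 0.5, 0.0, 1.0)
    # Boost saturation by pushing each channel away from its luma.
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip(luma[..., None] + (out - luma[..., None]) * saturation, 0.0, 1.0)

shadow = np.array([[0.04, 0.05, 0.06]])   # near-black detail...
print(dramatic_look(shadow))              # ...goes to [[0. 0. 0.]]: "blacks were black"
```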

The portrait cameras on the players were a different story. “The issue was that because the players would lean in and back, the colours of their faces would change constantly. You can’t override a 24in screen that’s six inches from their faces. That’s what it’s going to be. If they lean in they’re going to turn green and blue and I think that’s going to add a touch of realism to it.”

“In gaming there’s details… in the grass, the trees, I do see the importance of it”

Hitting the target (audience)

The link to Facebook’s servers was managed by encoding supervisor Tom Sullivan. “I supplied the gear and I encoded everything on-site,” he begins. “They gave me an SDI feed with embedded audio. I input that using an AJA Corvid 88 capture card.”

Sullivan selected his equipment with an eye to both convenience and reliability.

“I’ve used encoding machines like AWS Elemental and things like that, and I’ve found that a combination of software and hardware makes the job smoother. I know a lot of people who won’t use software because they say it’s not safe. I use a program called vMix that allows me to see everything and to hear it; if I have to add a corner bug or a logo or a technical difficulties slate or something, I can. It’s very basic, but when I hit the encoding button it goes on my GPU, on hardware.”
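As a sketch of that software-plus-hardware split (using ffmpeg driven from Python rather than vMix itself, and with placeholder file names and settings rather than anything from the H1PL chain), the overlay work stays in software while the H.264 compression is handed off to the GPU:

```python
# Minimal sketch of a software overlay feeding a hardware encode.
# Assumes an ffmpeg build with NVENC support and an NVIDIA GPU;
# input file, graphic and output name are placeholders, not values
# from the production.
import subprocess

cmd = [
    "ffmpeg", "-y",
    "-i", "captured_programme.mov",          # placeholder for the captured SDI feed
    "-i", "corner_bug.png",                  # placeholder corner bug / slate graphic
    "-filter_complex", "overlay=W-w-40:40",  # software step: pin the bug top right
    "-c:v", "h264_nvenc",                    # hardware step: H.264 encode on the GPU
    "-b:v", "12M", "-maxrate", "12M", "-bufsize", "24M",
    "-r", "60", "-pix_fmt", "yuv420p",
    "-c:a", "aac", "-b:a", "160k",
    "programme_out.mp4",
]
subprocess.run(cmd, check=True)
```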

The H1PL streams were encoded as 12 Mbps H.264 packaged as MP4 with AAC audio, though Sullivan is very aware of the changes that happen after the stream leaves his hands.

“A lot of people don’t realise this: What I send out, Facebook doesn’t just grab and spit it out. They do another encoding. It’s being encoded twice. It may look awesome out of you, but you have to be in contact with upstream.”
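One way to see what that second pass costs (again a sketch, with illustrative file names and bitrates rather than Facebook’s actual transcode settings) is to simulate it: re-encode the 12Mbps contribution file at a lower bitrate and measure the second-generation loss with ffmpeg’s PSNR filter.

```python
# Simulate a second-generation encode and quantify the loss.
# File names and the 4 Mbps figure are illustrative assumptions.
import subprocess

# Re-encode the contribution master at a lower distribution bitrate.
subprocess.run([
    "ffmpeg", "-y", "-i", "contribution_12M.mp4",
    "-c:v", "libx264", "-b:v", "4M", "-c:a", "copy",
    "second_generation.mp4",
], check=True)

# Compare the re-encoded file back against the contribution master;
# average PSNR is printed in ffmpeg's log output.
subprocess.run([
    "ffmpeg", "-i", "second_generation.mp4", "-i", "contribution_12M.mp4",
    "-lavfi", "psnr", "-f", "null", "-",
], check=True)
```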

Happily, Sullivan could rely on the full gigabit connectivity of the purpose-built esports arena, and understands the drive for high resolution and frame rate.

“In gaming there’s details…in the grass, the trees, I do see the importance of it.”

Closing the gap

Given all this technology and technique, the gap between conventional broadcast production and the high end of streaming is clearly closing.

“I grew up in broadcast television,” says Marty Meyer, who has worked extensively in both worlds. “The first time we did this was at least a year and a half ago. We did it as a pilot for one of the TV networks; the second time, we did TwitchCon. Now we’ve finished a ten-week run in Vegas. I think this is new territory…We like this space very much, we think it’s a growth space, and we’ve got the hang of it.”

Supervising producer Xander Denke is clearly aware of having done something new.

“It’s flattering when looking at other productions that are going on, to see them doing some of the same things that we were doing. It feels like we were making the right choices and going in the right direction. Our production, unique as it was, will have an effect on future productions all across the esports space.”