Feeling Groovy about live streaming

Whether it be for marketing purposes, as a supplement to traditional broadcasting, to increase access to live events or for fun among a private group, live video streaming is adding another component to the way information is communicated.

But live streaming is not as easy as you might first imagine. Even the simplest content has an aspect of risk. Errors in the stream are difficult to hide given the live nature of the content, and viewers won’t wait around if there’s an interruption in the stream.

Live video streaming is composed of a chain of interlocking parts, including production, connectivity, encoding, content delivery networks (CDNs) and delivery services. These and other elements are increasingly offered as a package by streaming media companies, some of which also provide bespoke online video players and stream analytics.

Production

These days a sub-broadcast-quality stream is simply not acceptable. Video production and live-streaming company Streaming Tank, whose clients include i24News and Eurosport, uses Sony EX3, Sony PMW-300 and Canon EOS C300 cameras for capture – and has access to larger ENG cameras and 4K capabilities for more complex events.

For its bigger productions or locations with poor network signals, Streaming Tank even runs its own OB truck, which offers access to a Dawson Tooway satellite connection as well as integrated connectivity, vision and sound equipment, including Blackmagic Design’s ATEM Television Studio live production switcher and HyperDeck Studio recorder.

Streaming Tank uses a mix of in-house kit and expertise plus external partners and freelancers to put together a video production service to fit the event – from lean single-camera solutions to complex, dynamic shoots required in stadiums, festivals and outdoor events.

Connectivity

Once a video and sound team is in place, some companies may want to utilise stand-alone connectivity solutions to get the on-site video stream from the venue out to the internet.

“In the simplest set-up this means having our own engineers on-site with our encoders connected to a stable broadband connection, but that is not always possible so we work with a number of alternatives,” says Jake Ward, business development director at live-stream specialist Groovy Gecko.

These connectivity alternatives include:

  • Satellite bandwidth: Streaming media producers with expertise in IP-over-satellite can set up an on-site broadband connection good enough to stream your webcast with full redundancy.
  • Satellite/fibre acquisition: When the video signal is already being uplinked to a satellite or transmitted over fibre to BT Tower, producers can bring the signal down into a partner satellite acquisition centre and encode your webcast from there.
  • Mobile multiplexing: For webcasting on the move or in difficult environments, backpacks are the best option. LiveU’s units, for example, bond together multiple 3G, 4G and wireless signals and output a high-quality video stream that can be acquired at the streaming provider’s hub and encoded for your webcast. Smaller, lightweight units, such as the company’s LU200, are light enough for camera operators to wear while moving. More robust models like the LU500 can bond up to eight network connections and, combined with the LiveU Xtender, deliver up to 20Mbps.
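
The bonding idea can be sketched in a few lines of code. The toy scheduler below is purely illustrative – it is not LiveU’s actual protocol – and simply spreads numbered chunks of an encoded stream across whichever bonded link has the most spare capacity, with the hub reordering them on arrival:

```python
# Toy channel-bonding sketch: spread stream chunks over several links,
# then reassemble in sequence order at the receiving hub.
import heapq

def schedule_chunks(chunks, links):
    """Assign each (seq, payload) chunk to the currently least-loaded link.

    `links` maps a link name to its assumed uplink rate in Mbps; a faster
    link accrues load more slowly, so it is handed more chunks.
    """
    heap = [(0.0, name) for name in links]   # (accumulated load, link)
    heapq.heapify(heap)
    assignment = {name: [] for name in links}
    for seq, payload in chunks:
        load, name = heapq.heappop(heap)
        assignment[name].append((seq, payload))
        heapq.heappush(heap, (load + len(payload) / links[name], name))
    return assignment

def reassemble(received):
    """Hub side: merge chunks arriving over all links back into order."""
    return [payload for _, payload in sorted(received)]

links = {"4g_a": 12.0, "4g_b": 8.0, "wifi": 20.0}    # assumed rates, Mbps
chunks = [(i, b"x" * 1316) for i in range(100)]      # MPEG-TS-sized chunks
plan = schedule_chunks(chunks, links)
received = [c for per_link in plan.values() for c in per_link]
assert reassemble(received) == [p for _, p in chunks]
```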

Streaming media producers will also partner with a CDN, or several of them for redundancy, to deliver the live stream anywhere in the world.

“Quite often, we’re working with a production company,” explains Groovy Gecko’s Ward. “They give us a TX, their live output from their camera mix, and then it’s split (for safety reasons) into two or more encoders, which encode that stream into a suitable video format.

“Maybe we’ll add in other interactive elements like live polling on Facebook Live. Then those live streams, once they’re complete, are sent to what’s called a publishing point – that’s on a standard CDN, something like Akamai – and then on to the client’s own page or, more commonly these days, a publishing point on something like Periscope, Facebook Live or YouTube.

“Of course, you can run a very simple low stream off a single server that a company may be hosting, but as soon as that hits a certain number of viewers everything’s going to start to fall apart. From a CDN point of view, we use people like Akamai, which delivers a considerable portion of streaming on the Internet. If that goes down and fails to work we’ve all got much bigger problems.”
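
In practice, the encode-and-publish step Ward describes is often a single FFmpeg process pushing to a CDN ingest URL. Here is a minimal sketch; the input file, publishing point and stream key are all placeholders, since real ingest URLs are issued by the CDN or platform:

```python
# Sketch of encoding a TX feed and pushing it to a publishing point
# over RTMP via FFmpeg. All names below are hypothetical.
import subprocess

PUBLISH_POINT = "rtmp://entrypoint.example-cdn.net/live"  # hypothetical
STREAM_KEY = "demo-stream"                                # hypothetical

cmd = [
    "ffmpeg",
    "-re",                         # pace reading at native frame rate (live)
    "-i", "tx_feed.mp4",           # placeholder for the live TX input
    "-c:v", "libx264",             # encode video as H.264
    "-preset", "veryfast",         # favour encoding speed for live use
    "-b:v", "4000k",               # target video bitrate
    "-c:a", "aac", "-b:a", "128k", # AAC audio
    "-f", "flv",                   # RTMP publishing points expect FLV muxing
    f"{PUBLISH_POINT}/{STREAM_KEY}",
]
subprocess.run(cmd, check=True)    # providers run two of these for redundancy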

How does a CDN work?

CDNs are made up of a large number of server farms around the world, joined together by ultra-fast connections. When a file is uploaded to a local server for on-demand viewing, it is rapidly duplicated across all the CDN’s servers. You can upload a file in London and, once it has replicated, a user in New York will access it from a local server in New York.

This means there are multiple copies of your content on servers around the world, which keeps availability close to 100%. If the servers in London were down, for example, users in London might be served their file from Frankfurt. There might be a negligible drop in performance, but the file would still be available.
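
That behaviour can be modelled in a few lines. The toy example below – with purely illustrative city names, latencies and outage – mirrors the London/Frankfurt scenario: the same file sits on every edge, and a request is served from the nearest edge that is currently healthy.

```python
# Toy model of CDN edge selection with transparent failover.
LATENCY_MS = {                  # user city -> round-trip time to each edge
    "london": {"london": 8, "frankfurt": 22, "new_york": 75},
    "new_york": {"new_york": 9, "london": 74, "frankfurt": 88},
}
HEALTHY = {"london": False, "frankfurt": True, "new_york": True}  # London down

def pick_edge(user_city):
    """Return the nearest healthy edge for a user, or None if all are down."""
    candidates = [(ms, edge) for edge, ms in LATENCY_MS[user_city].items()
                  if HEALTHY[edge]]
    return min(candidates)[1] if candidates else None

# With the London edge down, a London user is quietly served from Frankfurt:
assert pick_edge("london") == "frankfurt"
assert pick_edge("new_york") == "new_york"
```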

One advantage of working through a CDN is redundancy. “You have the output you want to broadcast going into two different encoders then publishing hopefully through two different internet connections to two different places on the CDN,” says Ward. “That means that if something on the CDN goes down and you’re publishing through London, and London has an outage, your signal is still being sent via Bristol, via a different internet connection.

“On CDNs, that seamlessly fails over, and the audience never knows that they’re suddenly accessing a secondary stream – the stream just continues as it was. Facebook and other social platforms only have a primary stream in, so we’ve done a lot of work to create a secondary workflow to enable that. For security purposes, most of the social networks are now looking at adding a primary and secondary stream which will have seamless crossover.

“If a live stream of a major brand goes down, then it’s serious. It really is not just looking at the technical solution, it’s looking at the areas of risk. You have to sit down in a planning meeting from a content point of view and a technical point of view.”
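
A minimal sketch of the watchdog side of that dual-path set-up might look like the following; both health-check URLs are hypothetical placeholders, and a real CDN handles the switch itself:

```python
# Watchdog sketch: poll the primary entry point's health URL and flip to
# the backup path if it stops responding.
import time
import urllib.request

PRIMARY = "https://ingest-london.example-cdn.net/health"   # hypothetical
BACKUP = "https://ingest-bristol.example-cdn.net/health"   # hypothetical

def ingest_alive(url, timeout=2):
    """Treat an HTTP 200 from the entry point's health URL as 'alive'."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:                # covers URLError, timeouts, HTTP errors
        return False

active = PRIMARY
while True:
    if active == PRIMARY and not ingest_alive(PRIMARY):
        active = BACKUP            # viewers keep watching, none the wiser
        print("primary entry point down - switched to backup path")
    time.sleep(5)                  # re-check every few seconds
```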

Most often, the issue with bandwidth is simply making sure there is enough of it to handle a high-quality stream.

“Many clients tend to forget about the importance of a strong Internet connection when it comes to getting live content offsite,” says Ward. “They assume that a strong Internet connection for things like web browsing means that it will be the same for live streaming, but this isn’t the case. They may have a speed of 100Mbps, but when a building full of people is draining the bandwidth, it often gets squeezed considerably lower. We get around this when handling a stream by physically sending an engineer to test a venue’s broadband signal.”
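
That on-site check boils down to measuring sustained upload throughput – the direction that matters for contribution – rather than the headline speed. A rough sketch, assuming a hypothetical test endpoint (engineers would use a dedicated speed-test server in practice):

```python
# Estimate sustained upstream throughput by timing an upload of test data.
import os
import time
import urllib.request

TEST_URL = "https://speedtest.example.net/upload"  # hypothetical endpoint
PAYLOAD = os.urandom(5 * 1024 * 1024)              # 5MB of test data
TARGET_STREAM_MBPS = 5.0                           # assumed contribution bitrate

req = urllib.request.Request(TEST_URL, data=PAYLOAD, method="POST")
start = time.monotonic()
urllib.request.urlopen(req).read()
elapsed = time.monotonic() - start

mbps = len(PAYLOAD) * 8 / elapsed / 1e6
print(f"sustained upload ≈ {mbps:.1f} Mbps")
if mbps < 2 * TARGET_STREAM_MBPS:                  # rule of thumb: ~2x headroom
    print("warning: not enough headroom for a reliable contribution stream")
```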

Then there’s the added worry of the rise of live 360 video in 4K. On the one hand, shooting 360 footage in 4K clearly benefits the medium, increasing quality and therefore the viewer experience, but it also demands more bandwidth. You will want to ensure the average viewer can enjoy a stream even without a 15Mbps connection. Part of this involves serving degraded, lower-bitrate renditions to those who lack the bandwidth to stream 4K.
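
That degrading is what an adaptive-bitrate ladder does: the player picks the highest rendition the measured bandwidth can sustain with some headroom. A minimal sketch, with an illustrative (not standard) ladder:

```python
# Adaptive-bitrate sketch: choose the best rendition for the measured rate.
LADDER = [                       # (label, required Mbps), highest first
    ("4K/2160p", 15.0),
    ("1080p", 6.0),
    ("720p", 3.0),
    ("480p", 1.5),
]

def pick_rendition(measured_mbps, headroom=1.3):
    """Return the best rung the connection can hold with a safety margin."""
    for label, needed in LADDER:
        if measured_mbps >= needed * headroom:
            return label
    return LADDER[-1][0]         # fall back to the lowest rung

assert pick_rendition(25.0) == "4K/2160p"
assert pick_rendition(8.0) == "1080p"    # too little headroom for 4K
assert pick_rendition(1.0) == "480p"
```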

Copyright permissions

Whilst most people can appreciate the importance of getting the right permissions to use copyrighted material, many are not aware of how long the process can take, or of how sensitive social networks are to any form of copyright infringement. Both Facebook and YouTube have sophisticated monitoring systems to detect copyrighted material, and if something isn’t cleared properly, you can bet they will know about it. YouTube operates a ‘three strikes and you’re out’ policy, while Facebook will automatically kill a stream in under ten seconds if it detects any copyrighted material which the streamer does not have the rights to use.

“The problem is, these systems are so sensitive that even a copyrighted piece of music played accidentally could take a stream off air,” says Ward. “I’ve had situations in the past where everything is copyright-cleared, but someone has driven past in a car playing a radio track, and I’ve got a strike on YouTube. Copyright is really a big issue at the moment, often not looked at and not cleared properly by the brands. It takes time. Facebook takes five or six days to clear a music track for use on a stream. If you’re trying to do something really quickly, you may hit problems.”

This article first appeared in the March 2018 issue of FEED magazine.