Posted on May 27, 2021 by Ann-Marie Corvin
As the Covid-19 pandemic forced productions to shut down last year, some companies turned to AI for help in venturing where humans feared to tread
Premier League teams may be fulfilling fixtures in empty stadiums at the moment, but at least fans at home can still watch these games. Amateur sides, niche sports and college teams cannot afford this luxury, yet fans’ desire to follow their favourite teams has actually risen during the pandemic.
It was this untapped need that prompted Swedish streaming giant Solidsport to form a partnership with Dutch AI and live-streaming specialists, Mobile Viewpoint, to increase the number of automated sports broadcasts it offers across northern Europe.
Highly regarded for its bonded cellular-to-live-streaming mobile encoders for 4G and 5G, Mobile Viewpoint has been developing AI solutions under its IQ Video Solutions banner for the past couple of years. These include the IQ Sports Producer (IQSP), a family of automated live sports production systems designed for single-operator use and remote productions.
The system is made up of a panoramic 4K camera that overlooks and captures the whole field of play. AI then detects the movement of the players and the ball, and creates virtual crops from the captured video.
The system is driven by software rather than a camera operator and relies on a debonding server, based at the venue or in the cloud.
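The idea of a software-driven virtual camera can be sketched in a few lines. The code below is illustrative only, not Mobile Viewpoint's implementation: it assumes detections arrive as (x, y) centres in panoramic coordinates, and pans a smoothed HD crop window to follow them.

```python
# Sketch of AI-driven "virtual cropping": a fixed panoramic camera sees the
# whole pitch, and software pans a virtual HD window to follow the action.
# Resolutions, names and the smoothing factor are assumptions for illustration.

PANO_W, PANO_H = 7680, 2160   # panoramic source resolution (assumed)
CROP_W, CROP_H = 1920, 1080   # output "virtual camera" frame

def crop_window(detections, prev_cx, smoothing=0.85):
    """Centre a crop on the mean of detected player/ball positions.

    detections: list of (x, y) centres in panoramic coordinates.
    prev_cx: previous crop centre x, used to damp sudden jumps.
    Returns the crop as (x0, y0, x1, y1).
    """
    if detections:
        target_cx = sum(x for x, _ in detections) / len(detections)
    else:
        target_cx = prev_cx  # no action detected: hold the current shot
    # Exponential smoothing keeps the virtual pan from jittering.
    cx = smoothing * prev_cx + (1 - smoothing) * target_cx
    # Clamp so the crop never leaves the panoramic frame.
    left = min(max(int(cx - CROP_W / 2), 0), PANO_W - CROP_W)
    return left, 0, left + CROP_W, CROP_H
```

A real system would smooth in both axes, vary zoom, and weight the ball more heavily than players, but the principle is the same: the "camera move" is just a moving crop rectangle.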
“The camera misses nothing, so you can either let the AI do everything or just use it for extra streams,” explains Mobile Viewpoint sales manager, Mark Andrews.
Via Mobile Viewpoint’s LinkMatrix cloud-based management platform, other tasks that can be carried out remotely include scheduling, highlights, replays and scoreboard ad overlays. Remote commentary and analytics also form part of this year’s roadmap, according to Andrews.
Solidsport chief executive Tobias Thalbäck says that his company made ten installations across Sweden and Finland throughout 2020 and intends to carry out more than 50 additional fittings across the Nordic region this year.
A Covid-hit Hollywood is exploring the possibility of using Coronavirus-immune, method-acting robots, after it was announced last year that an android named Erica will star in b, a new $70m sci-fi film, set to go into production next year.
Erica is the creation of Professor Hiroshi Ishiguro and Professor Kohei Ogawa, scientists from Japan’s Osaka University. The AI android has been trialled on many projects over the past five years, including work as a news anchor.
Designed to resemble a young woman, the robot is now making its film debut in a live-action thriller about an experiment with human DNA that goes dramatically wrong.
Erica is represented outside Japan by the Los Angeles-based, hi-tech entertainment company, Life Productions Inc. Its founder, Sam Khoze, hopes to build a robotic talent agency.
According to Khoze, a producer on b, training was given to simulate the robot’s motions and emotions through one-on-one sessions, including controlling the speed of its movements, talking through feelings, and coaching character development and body language.
“We also had to manipulate the autonomous algorithms she has for communication and train her to read dialogue without repeating the directions as well,” he explains.
Though Khoze claims that Erica is fully autonomous, he doesn’t shy away from highlighting her limitations. While the AI android boasts impressive facial expressions, she requires so many connections to other devices that her mobility is limited.
In addition, conversations with more than one person prove tricky – even though Erica can track a second person in close proximity.
“When you work with robots, you appreciate humans and our engineering even more. Just picking up an egg for a robot is a difficult task – it takes pages of calculations,” he admits.
Khoze is currently in the process of onboarding other robots to star in the movie, including humanoid musical artist, Alter 3. Created by Ishiguro, along with Mixi Corporation, Alter 3 generates its own motion using neural networks. It has even sold out performances at concert halls with its impressive ability to sing along spontaneously and expressively with – as well as conduct – the orchestra, and is now represented by Life Productions.
The new sci-fi film also aims to include a currently nameless creation that is being developed by Cambridge-based animatronic engineer, Jonny Poole. The AI is being created specifically for acting and Poole describes it, or her, as ‘the Tesla of robots’.
Made from materials designed by Nasa, it is life-sized, lightweight and capable of motion, with a huge number of actuators – more than 600 – in its face. “This allows her to mimic human actions and expressions. Her eyes connect to her social media feed. And she sings if someone is playing guitar,” explains Poole.
“The uniqueness of this project is that everyone wants to see how they play. Some of them might need a digital retouch for their emotional expressions – but no more than was used in other live-action features recently, such as The Irishman.”
But given that even an all-singing, all-dancing robot isn’t capable of doing what a human can do, why not simply use an actor?
Khoze is not convinced. He likens his vision to that adopted by early pioneers who researched physical effects and animatronics. “People are not excited by CG anymore – cinema needs something new,” he counters.
“Hollywood has pushed the envelope with puppetry and animatronics – imagine what all these movies would look like if they were capable of adding sophisticated lifelike models that can harness AI and ML?”
“So far, we’ve used AI cameras in arenas for handball, floorball, basketball and ice hockey, and these sports work well,” notes Thalbäck. “We don’t see any limitations with other sports going forward.”
Andrews agrees, pointing out that Mobile Viewpoint has ensured the system is also trained for less pitch-centric sports, including equestrian events and velodrome cycling.
According to Thalbäck, the pandemic has actually boosted Solidsport’s business overall, since the need for teams and clubs to find new revenue has increased. Features such as IQ TeamStream, which enables clubs to sell subscriptions and run in-stream ads, are proving popular.
The demand for a good streaming solution for sports, regardless of level, has never been greater, says Thalbäck. “This trend has been going on for some years now,” he points out. “The pandemic just sped it up by three to five years.”
Dan Carew-Jones, a post production and workflow consultant at Arrow International, knew that AI and machine learning had potential, but it was only when Covid struck that a use case became clear. During the UK’s first lockdown, Arrow faced a significant challenge: completing 30 hours of unfinished programming. With editors storyboarding the shots they still needed, it was Carew-Jones’ task to find the specific footage that could fill those gaps.
“As they were returning 2000 hours of unused footage – 86,000 video clips – our first idea was to employ a small team of people to start logging the footage. But that was going to take time that we didn’t have,” he admits.
So, Carew-Jones turned to Curio, an AI-driven solution from US-based data solutions outfit, GrayMeta. It is designed to unlock information hidden inside assets such as words, images, logos, sounds, noises, faces and people. It works by taking a couple of frames every second and analysing the content.
Curio is capable of doing this for huge volumes of footage in a relatively short time period. In Arrow’s case, says Carew-Jones, it processed 2000 hours of footage in five days.
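A quick back-of-envelope check shows the scale those figures imply. The numbers below come straight from the article; only the assumption that "a couple of frames every second" means two frames per second is ours.

```python
# Back-of-envelope check on the figures quoted: frames sampled from
# 2000 hours of footage, and the implied sustained analysis rate over
# a five-day processing run.

hours_of_footage = 2_000
sample_rate_fps = 2                       # "a couple of frames every second"
frames_sampled = hours_of_footage * 3600 * sample_rate_fps
print(f"{frames_sampled:,} frames sampled")        # 14,400,000 frames

processing_days = 5
frames_per_second_processed = frames_sampled / (processing_days * 24 * 3600)
print(f"~{frames_per_second_processed:.0f} frames analysed per second")
```

In other words, the system had to sustain analysis of roughly 33 sampled frames per second around the clock – a rate a manual logging team could never approach.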
It helped that Arrow International was already in a strong position to start employing Curio – the company uploads most of its footage to the cloud and it had a solid naming protocol in place for all of its rushes.
Carew-Jones emphasises that machine learning still requires a great level of human assistance: the five loggers, who worked on the initial tagging before Curio took over, were soon redeployed as five searchers.
“The results were not enough on their own – humans are still needed for context for complex requests. There are still going to be false results that need to be curated before they go to the edit,” he says.
For example, Carew-Jones admits the AI thought a shot of a camera tripod was a machine gun. “It’s an organic process,” he says. “The more the system is used, the more accurate the database becomes as those errors are removed.”
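That human-in-the-loop correction cycle can be sketched simply: the AI proposes tags, searchers reject false positives, and the corrections persist so the same mistake is never served again. The clip names and data structures below are invented for illustration, not Curio's.

```python
# Sketch of human-in-the-loop tag curation: AI-proposed tags are kept,
# but curator-confirmed false positives are filtered out of every search.
# All clip IDs and tags here are hypothetical examples.

from collections import defaultdict

ai_tags = {
    "clip_0417": {"machine gun", "field"},     # actually a camera tripod
    "clip_0981": {"machine gun", "soldier"},
}
rejected = defaultdict(set)  # per-clip false positives flagged by searchers

def curate(clip, bad_tag):
    """Record a searcher's correction: bad_tag was a false positive."""
    rejected[clip].add(bad_tag)

def search(tag):
    """Return clips carrying `tag`, excluding curated false positives."""
    return sorted(c for c, tags in ai_tags.items()
                  if tag in tags and tag not in rejected[c])

curate("clip_0417", "machine gun")   # the tripod the AI misread
print(search("machine gun"))         # ['clip_0981']
```

A production system would also feed the corrections back into retraining, which is what makes the database "more accurate as those errors are removed".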
While Arrow did not use Curio’s facial recognition tools in its initial application, it plans on utilising them in the future to help catalogue 400 hours of interview footage, as well as speech recognition tags to search by topic.
The system also required Arrow to embrace integration with APIs (Application Programming Interfaces) – uncharted territory for the firm. “We don’t write the programs or the script, but we need APIs to make the most of the opportunities that the system presents,” says Carew-Jones.
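The kind of API integration Carew-Jones describes usually amounts to composing HTTP requests against a search service. The sketch below is hypothetical – the endpoint and parameter names are invented for illustration, not GrayMeta's actual API.

```python
# Hypothetical sketch of querying an asset-indexing service over a REST API
# for clips matching a tag. The host and parameter names are placeholders.

from urllib.parse import urlencode

BASE_URL = "https://curio.example.com/api/v1/search"  # placeholder endpoint

def build_search_url(tag, max_results=50):
    """Compose a search request URL for clips carrying a given tag."""
    query = urlencode({"tag": tag, "limit": max_results})
    return f"{BASE_URL}?{query}"

url = build_search_url("aeroplane", max_results=10)
print(url)  # https://curio.example.com/api/v1/search?tag=aeroplane&limit=10
```

The point Carew-Jones makes stands regardless of vendor: a firm doesn't need to write the analysis software itself, but it does need enough API fluency to wire search results into its edit workflow.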
Arrow is currently developing custom AI models to open up its archive of specific subjects, including models trained to identify particular types of aeroplane.
Carew-Jones’ advice for firms looking to explore AI’s commercial benefits is not to take a fixed view of what’s possible now, but to consider what can be improved over time. “Objects like animated characters can take time. If Arrow had started off with that, they’d still be stuck six months later. Focus on quick wins first.” He also suggests onboarding senior managers and encouraging them to become tech champions, so they’re more likely to invest in incremental improvements further down the line.
There’s no doubting the AI technology available is pretty impressive. Now is the time to get practical hands-on experience and make the most of it.
This first featured in the Spring 2021 issue of FEED magazine.