Weavr Consortium is a UK-funded project using data and AI to make esports a fully immersive experience
Data is the heart of esports. Video games are, at their core, visualisations of data manipulated in real time by humans. A new UK tech collaboration, the Weavr Consortium, aims to take the mass of data embedded in esports and create a new way of viewing competitive content.
Weavr is a collaboration between six UK companies with expertise in esports, education and entertainment, supported by a grant from UK Research and Innovation. The Weavr platform uses live and historic game data to create mixed-reality experiences for fans of both esports and traditional sports. The aim is to create a kind of data-driven entertainment with greater commercial opportunities for brands and teams, and a much more immersive, interactive experience for viewers.
ESL, the largest esports organisation in the world, has been at the centre of the Weavr project, which kicked off in January this year. The company had been collaborating for some time with researchers at the UK’s University of York, experimenting with using AI to improve the storytelling at ESL tournaments.
“It was cutting-edge stuff at the time. The industry hadn’t seen this sort of thing before,” says James Dean, ESL UK CEO. “Suddenly out pops this opportunity to bid for some funding as part of the Audience of the Future programme.”
Dazzling data: ESL’s James Dean shows off the Weavr app at ESL One Birmingham
The Audience of the Future initiative, run by the UK’s Department for Digital, Culture, Media & Sport (DCMS), brings creative businesses, researchers and tech experts together to create new types of storytelling, with the aim of making the UK more internationally competitive in content creation and of producing culturally impactful experiences. Of a total of £33 million available for research, £16 million was to be split between four “demonstrator” projects in four categories, one of which was sports. ESL put forward its Weavr concept.
“Lo and behold, we won it. We at ESL had never actually won any funding before – or even applied for it, for that matter. Although the University of York was quite attuned to the process.”
Three levels of immersion
Weavr is basically a B2B framework that focuses on enabling three levels of content immersion. The first is sensorial immersion, generated by the basic visual and audio content most of us are familiar with. Sensorial immersion could also include AR or advanced audio through headsets or, in the future, haptics engaging the sense of touch. The second level is cognitive immersion, which involves engaging audiences in data, including game stats, as well as using audience data to provide a better service. The third is social immersion, which is about sharing and communication between fans.
“We take those three levels of immersion, and we combine them all,” explains Dean. “What we’ve seen is if these types of immersion exist in a siloed experience they tend to turn out fairly gimmicky. People don’t want to have ten apps open. They just want one experience. So while we combine those three levels of immersion we also personalise it for each person. We use machine learning and AI to customise the narrative so it has more meaning and choice for those individuals. If we get that right, I think we’ll find that people will be willing to part with money rather than expecting all their content to be free.”
Potential revenue models for a “Weavr’d” sports viewing service might include viewer micro-transactions, the support of a sponsor, or serving as an inducement to buy merchandise. “Because it’s a framework, we can connect any number of technologies or platforms into one systemic experience, no matter how the user wants to engage.”
In addition to the University of York, the other partners in the Weavr Consortium are immersive content studio Rewind, Manchester-based studios dock10, machine-learning specialist Cybula and virtual reality company FocalPoint VR. The first major demonstration of the Weavr project took place over summer 2019. Weavr’s trial mobile app was shown to a stadium full of fans at ESL One Birmingham 2019, a major Dota 2 tournament and esports confab.
It’s a massively challenging and very complex system. It’s got a lot of moving parts
Florian Block is a University of York research fellow and lecturer in Interactive Media and Digital Technology (read FEED’s Genius Interview with Block in our August 2018 issue). His department’s collaboration with ESL has been key to the whole project.
“We realised that in order to produce immersive experiences that essentially happen live, we had to use data as a key driver. But that experience also has to be served up with the meat – the video,” Block says. “We knew blending the two technologies together was a perfect means of engaging viewers in a new way.”
Block explains the design of Weavr’s data technology workflow: “The data produced by games is ingested into a data pipeline. We’ve got a set of several technologies that work in tandem; they aren’t an integrated system, but they have clearly defined protocols and intersections. It’s a really complex system. But at the heart we have our internal databus, so all the different stakeholders who analyse the data get it in real time. Then there are various modules that perform AI and statistical analysis on the data in real time as the match happens. These are systems optimised for a real-time scenario.
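As a rough illustration of the pattern Block describes, the sketch below implements a toy in-process databus in Python. The `DataBus` class, the `kill` topic and the stats module are all invented for this example; Weavr’s actual databus, protocols and analysis modules are not public.

```python
from collections import defaultdict

class DataBus:
    """Minimal in-process publish/subscribe bus (illustrative only)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # Fan each event out to every module listening on this topic.
        for handler in self._subscribers[topic]:
            handler(event)

# Hypothetical analysis module: keeps running kill totals per player.
kill_counts = defaultdict(int)

def stats_module(event):
    kill_counts[event["killer"]] += 1

bus = DataBus()
bus.subscribe("kill", stats_module)

# Simulated live match events flowing through the pipeline.
bus.publish("kill", {"killer": "player_1", "victim": "player_7"})
bus.publish("kill", {"killer": "player_1", "victim": "player_9"})

print(kill_counts["player_1"])  # 2
```

The point of the pattern is the decoupling: any number of AI or statistics modules can subscribe to the same live stream without the game integration knowing they exist.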
“At the end of it sits what we call a narrative engine. That narrative engine converts interesting findings out of the artificial intelligence and data middleware into what we call ‘stories’. They are essentially connected sets of really interesting performances, achievements, strategic set-ups and certain actions that are the key highlights of the match.”
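One simple way to picture a narrative engine of this kind is as a filter-and-group pass over the findings produced upstream. The sketch below is purely illustrative: the findings, the significance scores and the grouping rule are assumptions for the example, not Weavr’s actual logic.

```python
# Hypothetical findings emitted by upstream analysis modules.
findings = [
    {"time": 312, "kind": "triple_kill", "player": "p4", "significance": 0.9},
    {"time": 315, "kind": "objective_taken", "player": "p4", "significance": 0.7},
    {"time": 520, "kind": "item_bought", "player": "p2", "significance": 0.2},
]

def build_stories(findings, threshold=0.5, window=30):
    """Group significant findings that happen close together in time
    into a 'story': a connected set of match highlights."""
    highlights = sorted(
        (f for f in findings if f["significance"] >= threshold),
        key=lambda f: f["time"],
    )
    stories, current = [], []
    for f in highlights:
        # Start a new story when the gap to the last highlight is too big.
        if current and f["time"] - current[-1]["time"] > window:
            stories.append(current)
            current = []
        current.append(f)
    if current:
        stories.append(current)
    return stories

stories = build_stories(findings)
print(len(stories))     # 1
print(len(stories[0]))  # 2 connected highlights
```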
Getting this information-rich viewing experience to work together invisibly in real time is no easy feat. Some elements rely on high-performance statistical analysis, while others are fully powered by AI. For example, neural networks are used to detect simultaneous actions in the game to predict when a highlight moment is going to happen.
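A minimal sketch of that idea: a tiny feedforward network scores a window of concurrent in-game actions as a highlight probability. The feature choices and the (untrained, random) weights here are stand-ins for the example; the real model and its training data are not public.

```python
import numpy as np

rng = np.random.default_rng(0)

# Per-window feature vector: counts of concurrent actions -- hypothetical
# features such as [kills, ability casts, objective damage].
window = np.array([3.0, 5.0, 2.0])

# Untrained single-hidden-layer network with random weights, standing in
# for a trained highlight-prediction model.
W = rng.normal(size=(4, 3))
b = rng.normal(size=4)
v = rng.normal(size=4)

def highlight_score(x):
    hidden = np.tanh(W @ x + b)             # hidden layer
    return 1 / (1 + np.exp(-(v @ hidden)))  # sigmoid output in (0, 1)

score = highlight_score(window)
print(0.0 <= score <= 1.0)  # True
```

In a live setting a model like this would be evaluated every tick, so it can flag a likely highlight slightly before it unfolds on screen.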
AI is also used for detecting different playing styles in order to refine statistics. “In football you have a striker or a goalie,” explains Block, “but in some esports these roles aren’t so clearly defined. What happens is we find different archetypes of play styles and we can then explain to the viewer the significance of different strategies.
“It’s a massively challenging and very complex system. It’s got a lot of moving parts. Everything has to work on real-time data. We can’t exhaustively test all the possibilities that could come.”
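Discovering play-style archetypes of the kind Block describes is classically done with clustering. The toy k-means below, run on invented per-player feature vectors, shows the general shape of such an approach; it is not Weavr’s actual method.

```python
import numpy as np

# Hypothetical per-player features: [aggression, farming, support].
players = np.array([
    [0.9, 0.2, 0.1],
    [0.8, 0.3, 0.2],   # aggressive archetype
    [0.1, 0.9, 0.2],
    [0.2, 0.8, 0.1],   # farming archetype
])

def kmeans(data, k=2, iters=20, seed=0):
    """Tiny k-means: group players into k play-style archetypes."""
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # Assign each player to the nearest centroid.
        dists = np.linalg.norm(data[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned players.
        for j in range(k):
            if (labels == j).any():
                centroids[j] = data[labels == j].mean(axis=0)
    return labels

labels = kmeans(players)
# The two aggressive players share one archetype, the two farmers another.
print(labels[0] == labels[1] and labels[2] == labels[3])  # True
```

Once archetypes are found, a commentator layer can contextualise a statistic ("unusually aggressive for this role") rather than presenting raw numbers.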
The Weavr Consortium now heads to ESL Hamburg at the end of October to show fans a whole new set of Weavr functionality. “We now have to build out a commercial proposition for Weavr,” says Dean, “forming a legal entity that represents all the work that the consortium has done, for the industry to harness what has been brought to fruition and help IP owners to prosper.”
“At the Birmingham demo Weavr actually delivered something that was incredible. Going forward it’s going to be an amazing demonstration of what we believe is one way audiences of the future will engage in live entertainment.”
This article is from the October 2019 issue of FEED magazine