Haptics at the Hermitage
An exhibition at the State Hermitage Museum in Russia shows off the future of 5G and VR
Museums are places we go to experience and examine the past. But an exhibit at the State Hermitage Museum in Saint Petersburg gives visitors a glimpse into the future. The exhibition consists of a 5G trial zone, which showcases innovative applications that are combined with immersive technologies and run on 5G networks. People often regard 5G networks as a cure-all for every type of technological woe – like the hypothetical cold fusion of nuclear energy – that will bring about an IoT (Internet of Things) paradise. However, it’s hard to deny that 5G, when it’s rolled out widely, is going to be a game changer for everything, from latency to reliability and from battery life to data access. When exactly 5G will arrive is a moving target, and will vary from country to country, or region to region within countries. But it will come. And the 5G trial zone at the Hermitage Museum offers a snapshot of what might be possible in the future.
Opened in May 2018 and running until the end of the year, the 5G trial zone is sponsored by Ericsson and Russian digital services provider Rostelecom. The 5G use cases, open to the public, are put on display amid the Hermitage’s 18th-century grandeur and employ a mix of virtual reality, haptic technology and robotics, using a 5G test network in the 3500 MHz frequency band supplied by Rostelecom.
One demonstration, for example, shows the restoration of a work of art – a statue, in this case – using a remote-controlled robotic arm. The arm, gripping a delicate restorer’s brush, is operated remotely. It serves to illustrate how 5G’s high-bandwidth, low-latency characteristics could enable experts to conduct restoration work with a high degree of accuracy from anywhere in the world. It is a window into a future where an expert in London might be able to restore a painting that is in Sydney.
Another use case demonstrates the remote learning possibilities of 5G. In this one, an art master uses a remote-controlled robotic arm to show students precision techniques for restoring works of art. Both demonstrations use a 4K video stream that is transmitted to VR glasses, creating the effect of real presence for both the teacher and the students.
Immersive designing
The exhibition was built by immersive design expert Room One. It was a complex undertaking, requiring communication between the robotic and haptic devices that users (wearing Oculus Rift VR headsets) employ to interact with remote physical objects in real time. The force feedback haptic devices enable delicate manipulation of the remote-controlled robot arm and allow restorers and students to observe and physically feel the sculptures being worked on. This is combined with a 360° 4K video stream transmitted to the virtual reality headsets.
The virtual reality technology was provided by Focal Point VR, who have a wealth of experience in livestreaming VR and 360° content for sport, music, art, education and culture. The company contributed a 360° video camera rig, live stitching and a streaming solution for the 5G trial zone installation.
The team deployed a three-camera VR rig, composed of Blackmagic Micro Studio 4K cameras, explains Paul James, head of production at Focal Point VR. “We used Blackmagic cameras because, at the time we put this together, there were very few cameras out there that used single-cable SDI for 4K,” he says. “We have multiple cameras, so we don’t want quad-SDI coming out of each – that’s just a pain. And the form factor works. They are very small and we can fit them close together. They are well-ventilated, don’t overheat and they run forever. We did some testing with other camera technologies, but they seemed to get to a certain temperature and then just stop working.”
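The article doesn’t give the rig’s lens specifications, but the geometry a three-camera 360° rig implies can be sketched in a few lines. The 185° fisheye figure below is an illustrative assumption, not a detail from Focal Point VR.

```python
# Minimal geometry check for a three-camera 360 rig.
# The lens field-of-view figure is a hypothetical assumption;
# the article does not specify the optics used.

num_cameras = 3
coverage_each = 360 / num_cameras  # each camera's share of the panorama
print(f"Each camera must cover at least {coverage_each:.0f} degrees")  # 120

# Stitching needs overlap between neighbouring views, so each lens must
# see wider than its share. With a hypothetical 185-degree fisheye:
lens_fov = 185
overlap_per_side = (lens_fov - coverage_each) / 2
print(f"Overlap with each neighbour: {overlap_per_side:.1f} degrees")  # 32.5
```

The overlap region is what the live-stitching software blends, which is also why the cameras need to sit as close together as possible – the point James makes about the Blackmagic form factor.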
The camera outputs were live stitched and sent to the Focal Point VR Player, which was running on Oculus Rift headsets. Focal Point’s stitching solution employs Blackmagic 6G-SDI ingest cards in addition to the Focal Point VR live streaming software.
Long-distance relationship: it’s hoped that 5G will enable experts to remotely perform ultra-precise tasks from across the globe, such as restoring works of art
Better connectivity
Despite the theme of the exhibit, 5G could not provide all of the installation’s connectivity. Focal Point VR needed a robust solution to transport the ultra-low-latency, bandwidth-intensive live VR stream. After looking at a number of protocols, including WebRTC, the company went with NewTek’s Network Device Interface (NDI) technology. NDI is a royalty-free software standard that enables video-compatible products to communicate, deliver and receive broadcast-quality video over IP networks.
NDI is a way to get very high bit rates and resolutions without having to lay cable everywhere. Running over a GigE network, the lens-to-headset display latency was less than 200 milliseconds, which outperformed the original brief.
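A back-of-the-envelope calculation shows why a compressed IP video standard like NDI matters on a Gigabit Ethernet link. The NDI bit-rate figure below is an assumption based on its light, roughly visually lossless compression; it is not quoted in the article.

```python
# Why uncompressed 4K cannot ride a GigE network, but NDI can.
# Assumes UHD 4K at 30 fps, 4:2:2 10-bit (20 bits per pixel on average).
width, height, fps, bits_per_pixel = 3840, 2160, 30, 20
uncompressed_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 4K: {uncompressed_bps / 1e9:.2f} Gbit/s")  # ~4.98 Gbit/s

gige_bps = 1e9  # Gigabit Ethernet capacity
print("Fits on GigE uncompressed?", uncompressed_bps < gige_bps)  # False

# NDI's light intra-frame compression brings a 4K stream down to the
# low hundreds of Mbit/s (assumed figure, not from the article).
ndi_bps = 250e6
print("Fits on GigE with NDI?", ndi_bps < gige_bps)  # True
```

The same arithmetic explains the single-cable 6G-SDI ingest choice: keeping each hop within one link’s capacity avoids laying extra cable through an 18th-century museum.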
“We had used other streaming technologies in the past,” explains James, “and knew there was an issue with latency, so that is why we went with NDI… But we have just done exactly the same thing – minus the robot – using 5G for real at an event held by Huawei.”
Live museum and installation projects continue to play a major part in Focal Point’s work. Recently, the company has been working with Royal Holloway, University of London in the UK on a proof of concept for a project that will be rolled out as a full version in a museum. It has also been working with the University of York on a VR/AR project.
“Most of the interest at the moment is coming from university and museum people,” says James. “At the moment they’re in the UK, but we’re not stuck with the UK only – that’s just where we happen to be working right now.”
If 5G makes good on its promise, Focal Point VR will be able to work everywhere from anywhere.
This article originally appeared in the January 2019 issue of FEED magazine.