All the feels: will haptic technology allow us to feel what’s on our screen?

Haptic technology offers the ability to feel things in the digital realm. It has the potential to change how we experience content – and how we make it

Words by Martyn Gates, director of product development, Densitron

The broadcast industry has continuously pioneered improvements in image fidelity and resolution to ensure the best possible quality and broadest accessibility. However, there is one area that is only now catching up, and that’s the area of ‘haptics’, or communicating the sense of touch in a virtual environment. 

Haptic technology is already being deployed in the industrial world. But can the sensation of touch realistically be brought into the broadcast world? Is it possible to add yet another sensory input to our viewing, and edge nearer to a truly ‘immersive’ holy grail of viewer experience? Are ‘the feelies’ of Aldous Huxley’s Brave New World just around the corner?

At the very least, enabling a sense of touch could offer more inclusivity for those whose other senses are missing or less developed – on a par with subtitles and audio description – and add powerful new dimensions for everyone else. Remember, touch is an equal opportunity sense: it enhances the experience for everyone, and in some cases fully communicates it.

Let’s not forget, such devices have been in use in other markets and professions for years – medicine in particular and, more recently, esports and gaming. For once, it’s actually broadcasting – and its AV, VR and AR cousins – that is dragging its sensory feet. But not for much longer.

Beyond texture

What does this have to do with broadcast content, creation and delivery? So far, broadcast content is almost entirely a visual and aural affair. We can feel emotion, but we can’t feel that caterpillar on Countryfile.

Touch isn’t just a sense used to gather information about our environment; it can also influence how we make decisions. (Hello, advertisers!) People may order goods from online shops, but it is still the physical item we can hold in our hands – its tactile feel – that determines whether it is kept or returned. Similar sensations may soon also contribute to whether a programme or series is sampled, watched or abandoned.

In an increasingly interactive world, the emergence of haptic-tactile broadcasting would enable consumers not only to watch and hear content, but also to feel sensations of touch or movement. Haptics can offer an additional sensory dimension to virtual and augmented reality and 3D environments, and is essential to fully experiencing their immersive nature.

Haptics of some description are already included in touch-enabled devices, such as laptops, tablets, mobiles and many other systems that have a touchscreen interface. Most mobile device manufacturers incorporate some form of haptics – often in the form of taps, clicks or buzzes – in most, if not all, of their latest models. Haptic technology has the potential to unlock extra features on the devices that include it.

Haptics are enabled by actuators, which apply forces for touch feedback, and by controllers. When a stimulus is applied, it triggers a mechanical motion in the actuator. The latest generation of actuators offers far more rapid response times, which is what has enabled haptic devices to start being used, for example, in the fields of medicine and surgery. Better to practise on something that feels like a pancreas than on the pancreas itself.
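For the technically curious, the stimulus-to-actuator chain described above can be sketched in a few lines of code. This is purely an illustrative model, not any real device driver or manufacturer API – the class and field names are invented for the example:

```python
# Illustrative sketch of the haptics pipeline: a stimulus (e.g. a tap on a
# touchscreen) is mapped by an actuator to a mechanical motion command.
from dataclasses import dataclass

@dataclass
class Stimulus:
    """An input event, such as a finger tap on a touchscreen."""
    kind: str          # "tap", "click" or "buzz"
    intensity: float   # 0.0 .. 1.0

class Actuator:
    """Converts a stimulus into a motion command.

    Response time matters: newer actuators react within a few
    milliseconds, which is what makes convincing feedback possible.
    """
    def __init__(self, response_time_ms: float):
        self.response_time_ms = response_time_ms

    def drive(self, stimulus: Stimulus) -> dict:
        # Shorter, sharper pulses for taps; longer, softer ones for buzzes.
        duration_ms = {"tap": 10, "click": 20, "buzz": 80}[stimulus.kind]
        return {
            "amplitude": stimulus.intensity,
            "duration_ms": duration_ms,
            "latency_ms": self.response_time_ms,
        }

actuator = Actuator(response_time_ms=2.0)
motion = actuator.drive(Stimulus(kind="tap", intensity=0.8))
print(motion)  # {'amplitude': 0.8, 'duration_ms': 10, 'latency_ms': 2.0}
```

Real systems replace the dictionary with a waveform sent to the actuator hardware, but the shape of the pipeline – event in, tactile response out – is the same.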

Touching the future

But for broadcast and production purposes, the application of haptics – and the far-reaching benefits for delivering a quality experience – is only starting to be acknowledged. For example, miniature, high-resolution cameras that can deliver compelling new perspectives from previously unimaginable positions are now being augmented with devices designed to deliver the haptics of the event as well. For that reason, it is important to ensure the establishment of consistent, acceptable standards for the application of haptic-enabled devices.

SMPTE has been leading from the front in that regard with the 2017 publication of ST 2100-1, Definition and Representation of Haptic-Tactile Essence for Broadcast Production Applications – the standard for the transport of haptic-tactile essence. In short, the ultimate aim is to work towards delivering a high-quality, and ideally immersive, sensory experience, and to ensure its consistency for all.

As can be said about many recent media tech developments, the advent of IP-based broadcasting has been instrumental in enabling such initiatives to take place, and standards to be developed to regulate them to the ultimate benefit of consumers.

In the end, haptic-tactile essence must be tied to audio and display components that can help create new haptic-tactile displays, providing a real sense of operating a mechanical button on what might be a flat surface. This is accomplished in a virtual sense via a membrane-enhanced touchscreen that gives operators of, say, a vision mixer the sensation of actually turning a rotary knob or pressing the right button, offering the comfort of knowing that what was taken for granted as instinct in a mechanical world can also be confirmed – and felt – in a virtual one.

It’s a way not just to bridge a gap between mechanical and virtual, it’s also a way to make them coalesce. Crosspoint switching, channel selection or even rotary controls that feel like the real thing offer tremendous levels of familiarity and comfort, as well as confidence.

I’m not suggesting we’ll all be wearing haptic jumpsuits in the near future. Haptics is, simply put, any tactile feedback to the operator. If you touch a button, you then experience a physical sensation telling you that the button has done what you expected it to. 

And if you touch a puppy’s nose on a screen, part of you expects it to feel cold. Don’t be surprised if we have that option in the near future. When we do, you (if the puppy and the algorithms are correct) will feel the love.

This article originally appeared in the June 2019 issue of FEED magazine.