Declutter your data
Just because you have lots of data doesn’t mean you have lots of insight. Datazoom’s Diane Strutner offers tips for cutting through your video analytics fog
Diane Strutner, CEO and co-founder, Datazoom

2019 is here, and it’s time to prioritise data to improve video operations.
It’s a critical time for content providers. That fabled moment when streaming would eclipse traditional television is upon us. Remember the oft-cited Cisco statistic predicting that, by 2019, 80% of internet traffic would be video?* Well, 2019 is here!
But for all the advancements the video market has made in some areas, there is still catching up to be done in others. As viewers demand an ever-improving experience, there has been a rush to perfect content’s ‘kingdom’, but the data that makes its reign possible has received far less attention.
More than mere jargon, data is truly the fuel for OTT. For some companies, the data-driven revolution has long been underway. In an effort to emulate the likes of Netflix, media companies built their video stacks around a mish-mash of pre-built ‘best of breed’ technologies. This strategy gave managers access to innovative solutions that independently worked well, and could theoretically be tailored to fit the organisation’s unique needs, but omitted the crucial component underpinning Netflix’s success – data.
Netflix realised that in order to scale and operate efficiently, and to maximise the value of their video delivery stack investment, they required end-to-end visibility and the ability to adjust key services and infrastructure. They achieved this by collecting and integrating data within and between these systems, across the technology stack.
The difference between Netflix’s approach to data and the rest of the streaming industry’s lay in their vision for data’s role: supporting the growth and optimisation of an end-to-end network. Even when Netflix purchased outside services, they maintained focus on each service’s ability to be included within an end-to-end system. It was essential that each new service could be monitored and adjusted in the context of all other systems in the network.
This realisation – that while services may be purchased independently, they do not truly operate independently – guided Netflix toward their holistic approach to data. The linchpin of Netflix’s data strategy is their ability to collect data from any one source and then establish context, correlation and causation across any coexisting sources.
“While services may be purchased independently, they do not truly operate independently”
Dumb about data
New entrants trying to catch up with Netflix raised cash and loaded up on independent, best-of-breed technologies. Taking the new service live took precedence, while long-term strategies – like how to gather, use and incorporate data – were deprioritised. The data that was collected was used to monitor individual services, and was often supplied by the vendors themselves. As new technologies were adopted, more data and more dashboards entered the picture, all lacking context, correlation and the ability to determine causation from one service to another.
Data became a way to monitor the performance (and the investment) of each service individually. Each service came with its own dashboard (and the gaps left by services without dashboards were filled by yet more outside vendors, such as QoE analytics for video players), and data was reduced to metrics specific to that one service.
As the streaming industry became more sophisticated, the number of services and dashboards, and the volume of data, grew into today’s debacle: each OTT provider must log into multiple dashboards and view metrics that cross over between mismatched systems, reducing the usefulness of the data for end-to-end optimisation to effectively nothing. What good is knowing that buffering is affecting a video stream if we can’t determine whether the encoder, transcoder, stitcher, CDN, transit network, ISP, third-party embedded service, or a setting on the video player itself caused the issue?
Get focused
Looking into 2019, every company should have a new focus: using data to improve video operations. Here’s how you can start:
Standardise data: Not data roll-up or aggregation, but cleaning. A ‘play’ event from your iOS player will have a different raw data read-out than the same event from your HTML5 player, but they represent the same thing. At Datazoom we have our own Video Data Standard, which you may consider leveraging (see the sketch after this list for what this kind of mapping can look like).
Data must be available in one place: Think data lake. Data is less useful when left in silos.
Data needs to be assembled: How useful is collecting player data, CDN log data, encoding log data and ad server data if you can’t identify the interplay of one service with another? Far less useful than when those sources are aligned so their correlations can be seen.
Data needs to be put to work: Once we have the data properly formatted, in the right place and in the right context, how do we turn it into action? If you’ve purchased outside services, those vendors are your data’s new stakeholders; they must have access to this data in order to adjust their systems on your behalf.
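To make the first and third points more concrete, here is a minimal sketch in Python of what ‘standardise, then align’ can look like. It is an illustration under invented assumptions: the payload shapes, field names and the PlaybackEvent schema are hypothetical, not Datazoom’s Video Data Standard or any real player or CDN format.

```python
# A minimal sketch of standardising and aligning data from two sources.
# Every payload shape and field name below (sessionId, eventName, sid,
# edge, status_code, the PlaybackEvent schema itself) is invented for
# illustration; it is not Datazoom's Video Data Standard or any
# vendor's real format.

from dataclasses import dataclass


@dataclass
class PlaybackEvent:
    """One record in a common, player-agnostic schema."""
    session_id: str
    event: str          # e.g. "play", "pause", "rebuffer"
    timestamp_ms: int
    player: str         # which client emitted the raw event


def normalise_ios(raw: dict) -> PlaybackEvent:
    # Hypothetical iOS payload: {"sessionId": ..., "eventName": "PLAY", "ts": 1546300800000}
    return PlaybackEvent(
        session_id=raw["sessionId"],
        event=raw["eventName"].lower(),
        timestamp_ms=raw["ts"],
        player="ios",
    )


def normalise_html5(raw: dict) -> PlaybackEvent:
    # Hypothetical HTML5 payload: {"sid": ..., "type": "play", "time": 1546300815.2} (seconds)
    return PlaybackEvent(
        session_id=raw["sid"],
        event=raw["type"],
        timestamp_ms=int(raw["time"] * 1000),
        player="html5",
    )


def align_with_cdn(events: list[PlaybackEvent], cdn_logs: list[dict]) -> list[dict]:
    """Join normalised player events to CDN log lines on session ID, so a
    rebuffer can be read next to the edge node that served the segment."""
    cdn_by_session = {log["session_id"]: log for log in cdn_logs}
    joined = []
    for ev in events:
        cdn = cdn_by_session.get(ev.session_id, {})
        joined.append({
            "session_id": ev.session_id,
            "event": ev.event,
            "timestamp_ms": ev.timestamp_ms,
            "player": ev.player,
            "cdn_edge": cdn.get("edge"),
            "cdn_status": cdn.get("status_code"),
        })
    return joined


if __name__ == "__main__":
    events = [
        normalise_ios({"sessionId": "ios-001", "eventName": "PLAY", "ts": 1546300800000}),
        normalise_html5({"sid": "web-002", "type": "rebuffer", "time": 1546300815.2}),
    ]
    cdn_logs = [
        {"session_id": "ios-001", "edge": "lhr-edge-04", "status_code": 200},
        {"session_id": "web-002", "edge": "fra-edge-11", "status_code": 503},
    ]
    for row in align_with_cdn(events, cdn_logs):
        print(row)
```

In practice a mapping like this would live in a collection layer or ETL job feeding the data lake, and the join would usually happen in a warehouse or stream processor rather than in application code, but the principle is the same: translate every source into one schema before trying to correlate anything.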
As OTT enters its next industry-wide iteration, data which ties vendor technologies, departments and business units together will allow video distributors to finally compete, leveraging the technology’s full array of benefits. Otherwise, the business of video will stay more art than science, and the question of how to reliably, and profitably, stream video at scale will remain unanswered.
* “Cisco Visual Networking Index: Forecast and Trends, 2017–2022”: https://www.cisco.com/c/en/us/solutions/collateral/service-provider/visual-networking-index-vni/white-paper-c11-741490.html
This article originally appeared in the January 2019 issue of FEED magazine.