Our White Papers
on Live Streaming
We have strong views on the video streaming industry. Discover our data-driven articles below, written by our experts.
Imagine you want to buy a new car. How would you try to reduce your carbon footprint? The obvious answer can only be going electric. It's the easiest way to do so without giving up too much comfort.
What sounds obvious turns out to be a little more complicated when you actually try to calculate the footprint reduction. Where does your electricity come from? Where were your car and, more importantly, your battery produced? What physical resources were used, and how were they sourced? How much drinkable water was consumed? And so on.
In the media streaming industry, reducing the footprint of our media services is a hot topic. But what is the right approach, and how much does it really reduce the footprint?
A Cost-Efficient Disaster Recovery
Seeing your OTT service fall apart is the worst thing that can happen to any provider; lucky are those who implemented proper disaster recovery schemes in time. But to what extent does the recovery work? How long will your services be affected? Are all channels covered equally? And at what cost does all of this come?
In this white paper, you will see whether and how the cloud can address some or all of these needs, and why traditional software systems with license-based business models may not be the best option for providing reasonable DR at low cost without wasting a lot of idle CPU resources.
Is there really no other way? Well, buckle up and download our white paper!
Transitioning from “Density per CPU” to “Pixel per Dollar”
"How many OTT channels can you fit on a dual Xeon Gold 6140?" is a question I have heard hundreds of times in the past. Why? Because density (i.e., the total number of video streams a given piece of hardware can process in real time) has always been considered a key factor in determining the overall price of a video delivery solution. Hence, many companies have invested a lot of money in making their video delivery chain (and especially the video transcoder) more "efficient" (meaning "reducing the number of CPU cycles while keeping the quality as is") to lower the TCO of their solution.
New software paradigms, born with the cloud, shed new light on this story. CPU has become a resource that can be provisioned anywhere, at any time, and at any price, and modern software architectures can leverage that diversity to dramatically lower the sacrosanct "price per channel".
Let's see how modern software can leverage new development and architecture paradigms to dramatically improve video delivery TCO and transition from a "channel per CPU" to a "pixel per dollar" logic.
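To make the shift in logic concrete, here is a minimal sketch contrasting the two pricing views. All figures (server price, channel density, vCPU price) are hypothetical placeholders for illustration, not numbers from the white paper.

```python
# Illustrative only: every price and density below is a made-up
# placeholder, not a vendor or white-paper figure.

def cost_per_channel_on_prem(server_price, channels_per_server, years=3):
    """Classic "density per CPU" view: amortize a fixed server over
    its lifetime and divide by the channels it can host."""
    hours = years * 365 * 24
    return server_price / (channels_per_server * hours)  # $/channel/hour

def cost_per_channel_cloud(price_per_vcpu_hour, vcpus_per_channel):
    """Cloud "pixel per dollar" view: pay only for the vCPUs a
    channel actually consumes, at the going (e.g. spot) price."""
    return price_per_vcpu_hour * vcpus_per_channel  # $/channel/hour

on_prem = cost_per_channel_on_prem(server_price=15_000, channels_per_server=20)
cloud = cost_per_channel_cloud(price_per_vcpu_hour=0.01, vcpus_per_channel=4)
```

The interesting consequence is that the cloud figure is not fixed: spot prices and per-channel vCPU needs vary over time and region, which is exactly the diversity modern architectures can exploit.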
How Audience Aware Encoding Dramatically Reduces Your Costs Whilst Optimizing the QoE
Video streaming will become the de facto standard for watching live events in the coming years. The target date varies, but there is a consensus that the 2024 Summer Olympics will have more "connected" users than broadcast viewers.
From that perspective, the network (CDN) is key. There are likely to be more than 1 billion concurrent viewers for the 100m final, but for now the streaming ecosystem is far from sustaining such traffic. This bandwidth problem also comes with a cost issue: today, it is more expensive to stream an event than to broadcast it. In this white paper, we will see why "breaking silos" between the distribution network and the network headend is key to improving the user experience whilst lowering the overall cost for the content owner, and how Audience Aware Encoding™ can be used to dynamically optimize the CDN and the headend together.
Why is Kubernetes a great fit for OTT?
Since its creation in 2014, Kubernetes has been widely adopted by many forward-thinking companies for several reasons: its open, rapid and safe development, and its ability to schedule, automate and manage distributed applications on dynamic container infrastructure. This new development paradigm (initiated by Google) is driving an inexorable transformation of how modern applications are built, delivered and deployed in the enterprise.
The video delivery ecosystem has many specifics that have slowed down Kubernetes adoption, but the feature set and maturity of the latest versions pave the way for a new era in deploying and operating streaming services.
The truth about Low Latency in OTT
60 seconds… It took me 60 seconds to understand what had happened, to figure out that France had just scored against Croatia in the 2018 World Cup Final. 60 seconds between the clamor from my neighbors watching TV and me watching the goal on my laptop via my favorite OTT streaming platform. Believe me, it was a never-ending wait!
Why such a difference between OTT live streaming and broadcast TV services? With OTT streaming set to become the de facto standard for watching live events, lagging as much as 60 seconds behind the real action is not acceptable. Customers expect a broadcast-TV-like quality of experience and do not want the action spoiled by viewers on other services.
Before going into the possible solutions, let’s see where this latency comes from.
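As a preview, here is a simplified latency budget for segment-based OTT delivery (HLS/DASH). The figures are typical orders of magnitude chosen for illustration, not measurements from any specific platform.

```python
# Illustrative latency budget for segment-based OTT streaming.
# All durations are hypothetical "typical" values, not measurements.

SEGMENT_DURATION_S = 6.0  # a common legacy HLS/DASH segment length

latency_budget_s = {
    "capture_and_encode": 2.0,        # encoder look-ahead, GOP buffering
    "packaging": SEGMENT_DURATION_S,  # a segment must be complete before publish
    "cdn_propagation": 2.0,           # origin-to-edge transfer and caching
    "player_buffer": 3 * SEGMENT_DURATION_S,  # players commonly buffer ~3 segments
}

total = sum(latency_budget_s.values())
```

Note how the two dominant terms both scale with segment duration, which is why shorter segments (and, further along, chunked transfer) are the usual starting point for low-latency work.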