Overview
Low latency describes a computer network optimized to process a very high volume of data messages with negligible delay. These networks are designed to support operations that need real-time access to rapidly changing data. Latency is the time interval between a stimulus and its response, or, more generally, the delay between a cause and its effect in the system being observed. Latency is ultimately a consequence of the finite velocity at which any physical interaction can propagate.
When Is Low Latency Important?
For most streaming scenarios, the typical 30- to 45-second delay isn’t problematic. Returning to our concert example, it’s irrelevant that the lead guitarist broke a string 36 seconds ago and you’re just now finding out. But for some streaming use cases, latency is a business-critical consideration.
Second-screen experiences: If you’re watching a live event on a second-screen app (such as a sports league or official network app), you’re likely running several seconds behind live TV. While there’s inherent latency for the television broadcast, your second-screen app needs to at least match that same level of latency to deliver a consistent viewing experience.
For example, if you’re watching your alma mater play in a rivalry game, you don’t want your experience spoiled by comments, notifications or even the neighbors next door celebrating the game-winning score before you see it. Spoilers like these make for unhappy fans and dissatisfied (often paying) customers.
Video chat: We’ve all seen televised interviews where the reporter is speaking to someone at a remote location, and the latency in their exchange results in long pauses and the two parties talking over each other. That’s because the latency goes both ways: maybe it takes a full second for the reporter’s question to reach the interviewee, and then another second for the interviewee’s reply to get back to the reporter. That conversation can quickly turn painful. When true immediacy matters, about 150 milliseconds (roughly one-seventh of a second) of latency in each direction is the upper limit. That’s short enough to allow for smooth conversation without awkward pauses.
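If you want a feel for your own connection’s latency, you can measure it directly. The sketch below (TypeScript in a browser) sends a ping to a hypothetical WebSocket echo endpoint and halves the round-trip time to approximate one-way latency; the URL and the echo behavior are assumptions for illustration, not any particular service.

```ts
// Minimal sketch: estimate one-way latency against a hypothetical
// WebSocket echo server. wss://example.com/echo is a placeholder.
const socket = new WebSocket("wss://example.com/echo");

socket.addEventListener("open", () => {
  const sentAt = performance.now();
  socket.send("ping"); // the server is assumed to echo this message back

  socket.addEventListener(
    "message",
    () => {
      const roundTrip = performance.now() - sentAt;
      const oneWay = roundTrip / 2; // rough estimate; assumes symmetric paths
      console.log(`~${oneWay.toFixed(0)} ms one-way`);
      console.log(
        oneWay <= 150
          ? "Within the ~150 ms budget for natural conversation"
          : "Above the ~150 ms budget; expect awkward pauses"
      );
    },
    { once: true }
  );
});
```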
Betting and bidding: Activities such as auctions and sports-track betting are exciting because of their fast pace. And that speed calls for real-time streaming with two-way communication.
For instance, horse-racing tracks have traditionally piped in satellite feeds from other tracks around the world and allowed their patrons to bet on them online. Ultra-low latency streaming eliminates problematic delays, ensuring that everyone has the same opportunity to place their bets in a time-synchronized experience. Similarly, online auctions and trading platforms are big business, and any delay can mean bids or trades aren’t recorded properly. Fractions of a second can mean billions of dollars.
Video game streaming and esports: No one wants to play a game via a streaming service and discover that they’re firing at enemies that are no longer there. On platforms offering direct viewer-to-broadcaster interaction, it’s also important that viewer suggestions and comments reach the streamer in time for them to beat the level.
How Does Low-Latency Streaming Work?
As with most things in life, low-latency streaming involves compromises. You’ll need to balance three factors to find the mix that’s right for you:
- Encoding protocol and device/player compatibility.
- Audience size and geographic distribution.
- Video resolution and complexity.
Apple HLS is among the most widely used streaming protocols due to its reliability, but it’s not suited to true low-latency streaming. As an HTTP-based protocol, HLS streams chunks of data, and video players need a certain number of chunks (typically three) before they start playing. If you’re using the default HLS chunk size (10 seconds), that means you’re already 30 to 45 seconds behind. Customization can cut that significantly, but the smaller you make those chunks, the more buffering your viewers will experience.
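The arithmetic behind those numbers is straightforward: startup latency is roughly the segment duration multiplied by the number of segments the player buffers before playback begins. Here’s a back-of-the-envelope sketch (the three-segment buffer is the conventional default assumed above; real players vary):

```ts
// Rough HLS startup latency: segment duration times the number of
// segments buffered before playback. Real player behavior varies;
// this is the back-of-the-envelope version only.
function hlsStartupLatencySeconds(
  segmentDurationSec: number,
  segmentsBuffered: number = 3 // conventional player default, assumed here
): number {
  return segmentDurationSec * segmentsBuffered;
}

console.log(hlsStartupLatencySeconds(10)); // 30: default 10 s segments
console.log(hlsStartupLatencySeconds(2));  // 6: smaller chunks, more rebuffering risk
```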
RTMP has long been the standard for speedy stream delivery, but more people are moving away from it and implementing alternatives such as WebRTC, SRT, CMAF, QUIC, WebSockets and others. Here’s a look at how various technologies compare:
- RTMP delivers high-quality streams efficiently, but requires a Flash-based or custom player, meaning it is not supported on iOS devices (and soon won’t work in many web browsers).
- WebRTC is growing in popularity as an HTML5-based solution that’s well-suited for creating browser-based applications. WebRTC allows for low-latency delivery in an HTML5-based, Flash-free environment; however, it can’t scale beyond about 1,000 concurrent viewers.
- SRT (Secure Reliable Transport) is becoming popular for use cases involving unstable or unreliable networks. As a UDP-based protocol, SRT is great at delivering high-quality video over long distances, but it suffers from limited player support unless you do a lot of customization.
- Two new protocols, CMAF (Common Media Application Format) and QUIC (Quick UDP Internet Connections), are emerging as alternative options, backed by Akamai and Google. While still in their infancy, both show promise for delivering super-fast video, but we have yet to see how well they scale.
- WebSockets, which is supported by Wowza Streaming Engine™ software, is also an alternative. By creating a direct connection between the server and client, WebSockets allows you to continually push a stream with little chatter between machines, thus reducing bandwidth bloat (a client-side sketch follows this list).
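To illustrate the WebSocket pattern in general terms, the sketch below receives fragmented-MP4 chunks over a WebSocket and feeds them to a video element through Media Source Extensions. The endpoint URL and codec string are assumptions, and this is a generic pattern rather than Wowza’s implementation:

```ts
// Generic pattern: push fMP4 chunks over a WebSocket into Media Source
// Extensions. The URL and codec string are placeholders for illustration.
const video = document.querySelector("video")!;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const buffer = mediaSource.addSourceBuffer(
    'video/mp4; codecs="avc1.64001f, mp4a.40.2"' // assumed codec profile
  );
  const queue: ArrayBuffer[] = [];

  const socket = new WebSocket("wss://example.com/live"); // hypothetical endpoint
  socket.binaryType = "arraybuffer";

  // A SourceBuffer accepts one append at a time, so queue chunks that
  // arrive while a previous append is still in flight.
  socket.addEventListener("message", (event) => {
    queue.push(event.data as ArrayBuffer);
    if (!buffer.updating) buffer.appendBuffer(queue.shift()!);
  });
  buffer.addEventListener("updateend", () => {
    if (queue.length > 0 && !buffer.updating) buffer.appendBuffer(queue.shift()!);
  });
});
```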
—
A low-latency broadcasting server for live events gives you an easy way to scale out your broadcast events.
Cognosys provides hardened, ready-to-run images of the Low Latency Broadcasting Server on Azure.
Deploy your Low Latency Broadcasting Server securely on the Azure Marketplace in one click, backed by the reliable services Cognosys offers and supported by written and video tutorials.
Features
Major Features of the Low Latency Broadcasting Server
Higher throughput: The Low Latency Broadcasting Server supports broadcasting at multiple video qualities simultaneously.
Lower latency: Enables significant reductions in latency in last-mile networks that connect users to the internet.
Azure
Installation Instructions For Ubuntu
Note: How to find PublicDNS in Azure
Step 1) SSH Connection: To connect to the deployed instance, please follow the instructions to connect to an Ubuntu instance on Azure Cloud.
1) Download PuTTY.
2) Connect to the virtual machine using the following SSH credentials:
- Hostname: PublicDNS / IP of machine
- Port : 22
- Username: the username you chose when you created the machine (for example: azureuser)
- Password: the password you chose when you created the machine (see how to reset the password if you do not remember it)
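If you’d rather script the connection than use PuTTY, the Node.js ssh2 library is one option. Below is a minimal sketch; the host, username and password are placeholders to be replaced with the values from your own deployment:

```ts
// Minimal sketch using the ssh2 library (npm install ssh2).
// Host, username and password below are placeholders; substitute the
// values from your own Azure deployment.
import { Client } from "ssh2";

const conn = new Client();
conn
  .on("ready", () => {
    // Run a simple command to confirm the connection works.
    conn.exec("uname -a", (err, stream) => {
      if (err) throw err;
      stream
        .on("data", (data: Buffer) => process.stdout.write(data))
        .on("close", () => conn.end());
    });
  })
  .connect({
    host: "your-vm.region.cloudapp.azure.com", // PublicDNS / IP of machine
    port: 22,
    username: "azureuser",   // username chosen at creation
    password: "yourPassword" // password chosen at creation
  });
```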
Step 2) Other Information:
Default ports:
- Linux machines: SSH port 22. Please configure an ACL to restrict access to the SSH port.
Configure custom inbound and outbound rules using this link