Live Production Software Forums


Archer01  
#1 Posted : Thursday, November 13, 2025 9:53:41 PM(UTC)

Rank: Newbie

Groups: Registered
Joined: 12/16/2019(UTC)
Posts: 9
United States
Location: Indiana

Was thanked: 1 time(s) in 1 post(s)
I’ve been working with SRT links for years in studios and headend environments, and recently I came across something quite interesting that I thought might be useful to share here.

When you feed an unstable SRT input (bitrate spikes, packet loss, jitter) into a gateway and repeat it as UDP/RTP multicast without any remuxing, the multicast output stays surprisingly stable, even when the SRT input is suffering short periods of 8–12% packet loss.

As long as SRT manages to recover the missing packets within the buffer delay, the UDP/RTP output remains clean and unaffected: no added latency, no TS modifications, no glitches.
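The "within the buffer delay" condition above can be sketched with some back-of-the-envelope arithmetic. This is a rough model, not anything SRT-specific: it just assumes each retransmission round costs about one RTT, so the number of recovery attempts is roughly latency / RTT (the usual rule of thumb is latency ≥ 4×RTT on lossy links). All numbers are illustrative, not measurements from my setups.

```python
def recovery_attempts(latency_ms: float, rtt_ms: float) -> int:
    """Approximate number of retransmission rounds that fit in the SRT buffer."""
    return int(latency_ms // rtt_ms)

def survives_loss(latency_ms: float, rtt_ms: float, loss_rate: float) -> bool:
    """True if residual loss after retries drops below an arbitrary ~0.01% target."""
    attempts = recovery_attempts(latency_ms, rtt_ms)
    # Each retry is an independent chance to deliver the packet (simplification).
    residual = loss_rate ** (attempts + 1)
    return residual < 1e-4

# 120 ms latency on a 20 ms RTT link with 10% loss:
print(recovery_attempts(120, 20))    # 6
print(survives_loss(120, 20, 0.10))  # True
```

By this model even 10% loss is recoverable with a modest latency budget, which matches what I'm seeing on the multicast side; it falls apart once the latency window shrinks to only one or two RTTs.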

I tested this across multiple setups (including a multi-NIC on-prem system we use for some cable headends) and got the same result every time: the multicast side receives a perfectly solid stream even while the input SRT looks like it's "fighting for survival".

So I’m curious:

Has anyone else here noticed this kind of behavior?

Do you rely on SRT recovery to protect your multicast distribution layer?

Do you use dedicated SRT→UDP gateways, or feed SRT directly into your decoders?
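For context on the dedicated-gateway option: the open-source SRT package ships with `srt-live-transmit`, which relays the payload without touching the TS. The addresses, port, and latency below are placeholders for illustration, not my actual config.

```shell
# Listen for an SRT caller on port 9000 with a 120 ms latency window,
# and repeat the received payload to a multicast group as plain UDP.
srt-live-transmit \
  "srt://:9000?mode=listener&latency=120" \
  "udp://239.1.1.1:1234"
```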

Interested to hear real-world experiences from people running SRT contribution/distribution in production.