Visualizing Live Stream Synchronization and Metrics in Norsk


Every live streaming workflow relies on synchronization, ensuring that multiple streams of media (audio, video, subtitles, and metadata) are aligned in time. In Norsk, synchronization points aren’t just abstract concepts; they’re real nodes in the workflow that you can inspect visually and measure quantitatively.

The Norsk visualizer makes this visible. Each box represents a stage in your pipeline, and each arrow represents a flow of media between them. Metrics inside the boxes display timing deltas, queue lengths, and reduction counts in real time.

[For an in-depth look at the challenges of dealing with time in live streaming workflows, see the white paper “The Hidden Complexities of Time in Live Streaming.”]

Seeing Sync in Action

Take the example below: a workflow where audio, video, and subtitles are received from subscriptions, validated, and synchronized before being output to HLS variants.

A Norsk workflow where audio, video, and subtitles are received from subscriptions, validated, and synchronized before being output to HLS variants.

In this view, you can see how the audio, video, and subtitle inputs feed into a Queue Until Validated stage. This node holds back data until all inputs have been validated and aligned in context. The Filter With Context and Foreach Group nodes then propagate those synchronized contexts downstream.
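
For orientation, here is a minimal sketch of how a workflow like this is wired up with Norsk's TypeScript SDK, where each node is created and then connected to its upstream sources through subscriptions. The node names and options used here (rtmpServer, hlsAudio, hlsVideo, segmentDurationSeconds) are illustrative assumptions and may differ from the exact API in your SDK version.

```typescript
// Sketch only: wiring audio and video from one input into HLS outputs
// via Norsk's subscription model. Exact node names/options are assumptions.
import { Norsk, selectAudio, selectVideo } from "@norskvideo/norsk-sdk";

async function main() {
  // Connect to a running Norsk instance (default local endpoint assumed)
  const norsk = await Norsk.connect();

  // An input node that publishes audio and video streams from an RTMP source
  const input = await norsk.input.rtmpServer({ id: "rtmpInput" });

  // Separate HLS media outputs for the audio and video renditions
  const audioOut = await norsk.output.hlsAudio({ id: "hlsAudio", segmentDurationSeconds: 4 });
  const videoOut = await norsk.output.hlsVideo({ id: "hlsVideo", segmentDurationSeconds: 4 });

  // Subscriptions connect the boxes you see in the visualizer:
  // each output pulls only the stream type it needs from the input.
  audioOut.subscribe([{ source: input, sourceSelector: selectAudio }]);
  videoOut.subscribe([{ source: input, sourceSelector: selectVideo }]);
}

main().catch(console.error);
```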

Video typically takes longer to process than audio: frames are larger and encoding is far more computationally expensive, so video paths through a workflow usually accumulate more latency than audio paths. When these paths need to come back together at composition nodes or outputs, Norsk provides a Sync node to re-align the video and audio streams. Internally, the Sync node manages a queue for each stream, buffering the faster inputs (typically audio) while waiting for the slower ones (typically video) to catch up.
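
Conceptually, that buffering can be pictured as one queue per stream that only releases media once every stream has data covering a common timestamp. The sketch below is a simplified illustration of the idea, not Norsk's actual implementation:

```typescript
// Simplified illustration of sync-style buffering: hold frames per stream
// and only release them once every stream can cover a common timestamp.
type Frame = { streamId: string; timestampMs: number; payload: Uint8Array };

class SyncBuffer {
  private queues = new Map<string, Frame[]>();

  constructor(streamIds: string[]) {
    for (const id of streamIds) this.queues.set(id, []);
  }

  push(frame: Frame): Frame[] {
    this.queues.get(frame.streamId)?.push(frame);
    return this.drain();
  }

  // Release frames up to the timestamp reached by the slowest stream.
  private drain(): Frame[] {
    const newest = [...this.queues.values()].map(
      (q) => q.at(-1)?.timestampMs ?? -Infinity
    );
    const releaseUpTo = Math.min(...newest); // the slowest stream gates the rest
    const released: Frame[] = [];
    for (const q of this.queues.values()) {
      while (q.length > 0 && q[0].timestampMs <= releaseUpTo) {
        released.push(q.shift()!);
      }
    }
    return released.sort((a, b) => a.timestampMs - b.timestampMs);
  }

  // Queue lengths like the ones surfaced in the visualizer.
  queueLength(streamId: string): number {
    return this.queues.get(streamId)?.length ?? 0;
  }
}
```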

In the visualizer, you can literally see this in the queue lengths:

  • Audio queues tend to be longer when video processing takes more time.
  • Deltas (delta_ms, realtimeDelta, etc.) show the timing offset between streams.
  • Reductions indicate the number of synchronization or timing adjustments that have been made.

These metrics aren’t just for curiosity. They provide operational awareness, allowing you to detect drift, overloads, or bottlenecks before they become user-visible issues.
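
As a sketch of what that operational awareness can look like, the snippet below polls a Prometheus-format metrics endpoint and warns when a timing delta drifts past a threshold. The endpoint URL and metric name are placeholders rather than Norsk's actual names:

```typescript
// Sketch: poll a Prometheus-format /metrics endpoint and warn when a
// sync delta drifts past a threshold. URL and metric name are placeholders.
const METRICS_URL = "http://localhost:6790/metrics"; // placeholder endpoint
const DELTA_METRIC = "norsk_sync_delta_ms";          // placeholder metric name
const MAX_DELTA_MS = 200;

async function checkDrift(): Promise<void> {
  const res = await fetch(METRICS_URL);
  const body = await res.text();

  // Prometheus text format: one "metric_name{labels} value" per line.
  for (const line of body.split("\n")) {
    if (!line.startsWith(DELTA_METRIC)) continue;
    const value = Number(line.trim().split(/\s+/).pop());
    if (Number.isFinite(value) && Math.abs(value) > MAX_DELTA_MS) {
      console.warn(`Sync drift detected: ${line}`);
    }
  }
}

// Poll every 10 seconds.
setInterval(() => checkDrift().catch(console.error), 10_000);
```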

When paired with Prometheus and Grafana, Norsk’s metrics can feed real-time dashboards that visualize synchronization health across your pipeline. From queue depth trends to delta distributions, you can see how timing behaves as workloads scale, helping you tune performance, anticipate trouble, and maintain confidence in your output.
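
Once Prometheus is scraping those metrics, the same data can be pulled programmatically for dashboards or automated checks through Prometheus's HTTP API. A minimal sketch, assuming Prometheus runs at localhost:9090 and using a placeholder metric name:

```typescript
// Sketch: fetch a queue-depth trend from Prometheus's query_range API.
// The Prometheus address and the metric name are assumptions.
const PROMETHEUS = "http://localhost:9090";
const QUERY = "avg_over_time(norsk_sync_queue_length[1m])"; // placeholder metric

async function queueDepthTrend(minutes: number) {
  const end = Math.floor(Date.now() / 1000);
  const start = end - minutes * 60;
  const url =
    `${PROMETHEUS}/api/v1/query_range` +
    `?query=${encodeURIComponent(QUERY)}&start=${start}&end=${end}&step=15`;

  const res = await fetch(url);
  const json = await res.json();

  // Each result series carries labels plus [timestamp, value] pairs,
  // which is exactly what a Grafana panel plots for you.
  for (const series of json.data.result) {
    console.log(series.metric, series.values.slice(-5));
  }
}

queueDepthTrend(30).catch(console.error);
```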

Get in touch or set up a demo to find out more about how Norsk lets you visualize synchronization across your live streaming workflows.

Author

  • Kelvin Kirima is a developer at id3as, proficient in Rust, JavaScript/TypeScript, PostgreSQL, and web technologies such as HTML and CSS.