Example setup of a combined MPEG-DASH and WebRTC distribution

Eyevinn Technology
Feb 22, 2023


A feature branch of DASH-IF’s Reference Player now includes experimental support for switching between an HTTP-based stream and a real-time WebRTC-based stream. Follow this guide if you want to try it out. Everything is based on open source components.

In this blog post we will walk you through an example setup for a hybrid of MPEG-DASH and WebRTC-based distribution. One use case for this type of setup is to use MPEG-DASH distribution as a fallback to real-time WebRTC when the device does not support WebRTC or the network connection is not good enough to sustain a very low-latency stream of high quality.

For the real-time WebRTC-based distribution we will be using WebRTC HTTP Egress Protocol (WHEP) as the setup protocol, and the WHEP URL is included in the MPEG-DASH manifest as an adaptation set.

We will set up two parallel distribution pipelines, using MPEG-TS over SRT as the ingestion protocol for the MPEG-DASH distribution pipeline and WebRTC HTTP Ingest Protocol (WHIP) as the ingestion protocol for the real-time WebRTC distribution pipeline. For playback and switching between the distribution pipelines we will be using a feature branch of the DASH-IF reference player.

To produce a signal to ingest into these pipelines we will be using a camera that outputs RTSP and use ffmpeg to repackage it into an MPEG-TS over SRT stream. We then feed this SRT stream into an SRT-WHIP gateway application that can push to a WHIP endpoint and to an SRT receiver in parallel. It should be noted that the ingest part here is not optimal when it comes to reducing latency, as it contains several latency-adding steps. Glass-to-glass latency in this setup should therefore not be considered a benchmark; the focus is on the distribution pipelines.

Before we start, we will create a docker network that we will connect all containers to.

docker network create -d bridge mpd-whep-net

MPEG-DASH pipeline

For the MPEG-DASH pipeline we use the srt-live-transmit tool included in the SRT library for the SRT protocol negotiation. The MPEG-TS stream is then sent over UDP to Shaka Packager, which creates the MPEG-DASH manifest and the video and audio segments. To serve these files we use a simple web server.

Create a volume to hold the segments and manifests:

docker volume create mpd-whep-media

Start the shaka-packager container:

docker run --rm -d --name=shaka --network=mpd-whep-net \
-v mpd-whep-media:/dash \
google/shaka-packager:v2.5.1 packager \
'in=udp://0.0.0.0:1234,stream=audio,init_segment=/dash/shaka_audio.mp4,segment_template=/dash/shaka_audio_$Number$.m4s' \
'in=udp://0.0.0.0:1234,stream=video,init_segment=/dash/shaka_video.mp4,segment_template=/dash/shaka_video_$Number$.m4s' \
--mpd_output /dash/manifest.mpd \
--base_urls http://localhost:8080/

As Shaka Packager does not set up an SRT receiver we will use the tool included in the SRT library called srt-live-transmit:

docker run --rm -d --name=srtrecv --network=mpd-whep-net \
-p 4141:4141/udp howlowck/srt \
./srt-live-transmit "srt://:4141/?mode=listener" "udp://shaka:1234"

Then we need a simple webserver to serve the MPEG-DASH manifest and media segments:

docker run --rm -d --name=webserver --network=mpd-whep-net \
-v mpd-whep-media:/dash \
-p 8080:5000 \
ghcr.io/patrickdappollonio/docker-http-server:v2 -d /dash --cors

To verify that the MPEG-DASH distribution is correctly set up we can push a test stream with ffmpeg:

ffmpeg -re -i <VIDEOFILE> -c:v libx264 -tune zerolatency -preset ultrafast -c:a aac -f mpegts srt://127.0.0.1:4141

Then point an MPEG-DASH player at http://localhost:8080/manifest.mpd to verify that the MPEG-DASH distribution pipeline is ready.
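While the test stream is running you can also do a quick sanity check from the command line, for example by fetching the manifest and listing the segments written to the shared volume (a minimal sketch, assuming the container and volume names used above):

curl -s http://localhost:8080/manifest.mpd | head

docker run --rm -v mpd-whep-media:/dash alpine ls /dash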

WebRTC pipeline

For the WebRTC pipeline we use a WHIP endpoint service in combination with Symphony Media Bridge as the SFU. The WHIP endpoint is responsible for the SDP transport and negotiation according to the WHIP protocol, and it controls the SFU to set up the RTP connection between the ingest client and the SFU. The WHIP endpoint service also creates the channel on the WHEP endpoint service and establishes a connection between the origin SFU and an egress SFU. Separating the ingest SFU from the egress SFU makes it possible to scale distribution horizontally by adding additional egress SFUs.

Start the origin SFU container:

docker run --rm -d --name=sfu --network=mpd-whep-net \
-p 8280:8280/tcp \
-p 13000:13000/udp \
-e HTTP_PORT=8280 \
-e UDP_PORT=13000 \
-e IPV4_ADDR=127.0.0.1 \
-e API_KEY=example \
eyevinntechnology/wrtc-sfu:v0.4.1

Then start the edge SFU container:

docker run --rm -d --name=sfu-edge --network=mpd-whep-net \
-p 8380:8380/tcp \
-p 12000:12000/udp \
-e HTTP_PORT=8380 \
-e UDP_PORT=12000 \
-e IPV4_ADDR=127.0.0.1 \
-e API_KEY=example \
eyevinntechnology/wrtc-sfu:v0.4.1

The next step is to start the WHIP endpoint service:

docker run --rm -d --name=ingest --network=mpd-whep-net \
-p 8200:8200/tcp \
-e PORT=8200 \
-e EXT_PORT=8200 \
-e ORIGIN_SFU_URL=http://sfu:8280/conferences/ \
-e SFU_API_KEY=example \
-e EDGE_LIST_CONFIG=/etc/edge-list-config.json \
-e HOSTNAME=localhost \
eyevinntechnology/wrtc-origin:v0.4.0

And finally start the WHEP endpoint service:

docker run --rm -d --name=egress --network=mpd-whep-net \
-p 8300:8300/tcp \
-e PORT=8300 \
-e EXT_PORT=8300 \
-e HOSTNAME=localhost \
-e SMB_URL=http://sfu-edge:8380/conferences/ \
-e SMB_API_KEY=example \
eyevinntechnology/wrtc-whep:v0.1.2

You can verify that the WebRTC pipeline is ready using the web WHIP client at https://web.whip.eyevinn.technology/ and the WHIP endpoint http://localhost:8200/api/v2/whip/sfu-broadcaster?channelId=test.

Verify playback with a WHEP client and the WHEP URL http://localhost:8300/whep/channel/test.
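Under the hood the WHEP negotiation is simply an HTTP POST of an SDP offer that returns an SDP answer. If you prefer the command line, here is a minimal sketch, assuming you have a valid SDP offer (normally generated by a browser RTCPeerConnection) saved in offer.sdp:

curl -i -X POST 'http://localhost:8300/whep/channel/test' \
-H 'Content-Type: application/sdp' \
--data-binary @offer.sdp

The endpoint should respond with 201 Created, the SDP answer in the body and a Location header identifying the session.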

Now we have both an MPEG-DASH and WebRTC pipeline up and running. We can then move on to the ingestion part.

Ingest

As mentioned, we will be using ffmpeg to repackage the RTSP stream from the camera into an MPEG-TS over SRT stream that we push to the SRT-WHIP gateway.

First spin up the SRT WHIP gateway application:

docker run --rm -d --name=srtwhip --network=mpd-whep-net \
-p 3000:3000/tcp \
-p 9000-9999:9000-9999/udp \
eyevinntechnology/srt-whip:v0.2.3

Create an SRT-WHIP transmitter via the SRT-WHIP gateway REST API:

curl -X 'POST' \
'http://localhost:3000/api/v1/tx' \
-H 'Content-Type: application/json' \
-d '{
"port": 9995,
"whipUrl": "http://ingest:8200/api/v2/whip/sfu-broadcaster?channelId=srt",
"passThroughUrl": "srt://srtrecv:4141",
"status": "idle"
}'

You can verify that the transmitter has been created in the web UI at http://localhost:3000/ui.
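You can also query the gateway's REST API for the transmitter you just created; note that the exact read endpoint below is an assumption on our part and may differ between versions:

curl 'http://localhost:3000/api/v1/tx/9995'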

Start the transmitter by clicking on the port in the UI or via the REST API:

curl -X 'PUT' \
'http://localhost:3000/api/v1/tx/9995/state' \
-H 'Content-Type: application/json' \
-d '{
"desired": "running"
}'

When the transmitter is running (indicated in red in the UI) you can use ffmpeg to take the RTSP stream from the camera and push it to the SRT-WHIP gateway.

ffmpeg -i "rtsp://<username>:<password>@<rtsp-address>" \
-c:v libx264 -tune zerolatency -preset ultrafast -c:a aac \
-f mpegts "srt://localhost:9995"

Combining the pipelines

These two pipelines are combined with the MPD-WHEP service, a service that produces a new MPEG-DASH manifest based on the URL of the MPEG-DASH manifest and a WHEP URL. This new MPEG-DASH manifest contains audio and video adaptation sets as well as a WHEP (RTP) adaptation set.

To start the MPD-WHEP service run the following command:

docker run --rm -d --name=mpdwhep --network=mpd-whep-net \
-p 8000:8000 \
-e MPD=http://webserver:5000/manifest.mpd \
-e WHEP=http://localhost:8300/whep/channel/srt \
eyevinntechnology/mpd-whep
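Once the service is up you can inspect the combined manifest it produces and look for the additional adaptation set referencing the WHEP URL (a quick check using the port mapping above):

curl -s http://localhost:8000/manifest.mpd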

As WebRTC in MPEG-DASH is still at an early stage there are no players that support it yet, but there is an experimental branch of the DASH-IF reference player (dash.js) that we can use.
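To run the reference player locally, a typical flow looks roughly like the sketch below. We are assuming the feature branch builds the same way as the main dash.js repository, and <feature-branch> is only a placeholder for its name:

git clone https://github.com/Dash-Industry-Forum/dash.js.git
cd dash.js
git checkout <feature-branch>
npm install
npm run start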

Once you have everything up, you’ll be able to play the stream on http://localhost:8000/manifest.mpd and use the quality switcher to switch between MPEG-DASH and WebRTC.

For convenience, all of the above is also available as a docker-compose file.

If you want to extend this example setup into a more production-like setup we are happy to assist. Just drop an email to sales@eyevinn.se and we can tell you more about how we can help. We are vendor-independent video streaming experts providing tech strategy consulting, video software development and open source contributions.
