Extending our WebRTC streaming POC with MPEG-TS support

Eyevinn Technology
3 min read · Apr 27, 2022


In a previous blog post we described our proof-of-concept of an open and standardized way of using WebRTC for real-time low-latency streaming. Among other things, we described how we developed a JavaScript browser library implementing the WebRTC-HTTP Ingestion Protocol (WHIP), a standardized way of ingesting media over WebRTC.

This library made it possible to write a browser application for mobile and desktop computers that lets you broadcast yourself using WebRTC. A demo of this application is available at https://webcast.eyevinn.technology. However, that is a fairly limited use case: you might want to ingest any type of media source, not only the connected web camera.


To address this, my colleague Marcus Spangenberg developed a small application that consumes an MPEG-TS stream and ingests it over WebRTC using the WHIP protocol, a tool that we have now also made available as open source.

The application is written in C++ and uses GStreamer to handle the media pipeline. It is currently limited to consuming H.264 or MPEG-2 video and AAC or PCM audio, and it will transcode to VP8 video and Opus audio. Instructions on how to build the application are available in the GitHub repository, and we have tested it on both Linux/Ubuntu and OSX.

Start the application by specifying which interface and port to listen on for UDP, and the WHIP endpoint to ingest to:

./mpeg-ts-client -a -p 9998 -u https://whip.dev.eyevinn.technology/api/v1/whip/broadcaster

Once it detects that it is receiving a valid MPEG-TS stream with the supported video and audio codecs, it starts the WebRTC negotiation in conformance with the WHIP protocol. To try it out, we can generate an MPEG-TS stream by looping a file using the ffmpeg-based container we have in our toolbox:

docker run --rm -v $PWD:/mnt eyevinntechnology/toolbox-loopts \
--withtc \
--withaudio \
VINN.mp4

The application will then produce log output similar to this:

[2022-04-27 21:06:06.335] Creating pipeline mpegTsAddress, mpegTsPort 9998, mpegTsBufferSize 1000000000 ns
[2022-04-27 21:06:06.612] New pipeline clock
[2022-04-27 21:06:07.613] Dynamic pad created, type video/x-h264
[2022-04-27 21:06:07.616] Dynamic pad created, type audio/mpeg
[2022-04-27 21:06:07.722] onNegotiationNeeded
[2022-04-27 21:06:08.020] onOfferCreated
[2022-04-27 21:06:10.448] Server responded with resource https://whip.dev.eyevinn.technology:443/api/v1/whip/broadcaster/9368aeb1-1ba8-42b2-b322-f5dcf02b6e5e
[2022-04-27 21:06:10.448] Setting local SDP
[2022-04-27 21:06:10.448] Setting remote SDP
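The "Server responded with resource" line in the log above is the heart of WHIP: the client POSTs its SDP offer to the endpoint, and the server replies with the SDP answer in the body and the created resource in the Location header; sending DELETE to that resource later tears the session down. The sketch below illustrates that exchange using only Python's standard library. It is not part of the C++ client, just a minimal illustration of the protocol, and it assumes the server returns an absolute URL in the Location header (the WHIP specification also permits relative ones).

```python
import urllib.request


def whip_ingest(endpoint: str, sdp_offer: str):
    """POST an SDP offer to a WHIP endpoint.

    Returns (sdp_answer, resource_url). The resource URL comes from the
    Location header of the response and identifies the created session.
    Note: this sketch assumes an absolute Location URL; the WHIP spec
    also allows a relative one, which would need to be resolved against
    the endpoint URL.
    """
    req = urllib.request.Request(
        endpoint,
        data=sdp_offer.encode(),
        headers={"Content-Type": "application/sdp"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        sdp_answer = resp.read().decode()
        resource_url = resp.headers["Location"]
    return sdp_answer, resource_url


def whip_teardown(resource_url: str):
    """DELETE the WHIP resource to end the session."""
    req = urllib.request.Request(resource_url, method="DELETE")
    urllib.request.urlopen(req).close()
```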

To play back the WebRTC stream we use our media-server-independent WebRTC player.


If you want to take the feed from an RTSP camera instead, you can use ffmpeg to consume the RTSP feed and remux it to an MPEG-TS stream first.

The ffmpeg command to accomplish this could look something like this:

ffmpeg -i rtsp://<username>:<password>@<ip>/stream1 \
-vcodec copy \
-acodec aac \
-f mpegts


In a similar way, we can take an MPEG-TS stream that is transmitted using SRT (Secure Reliable Transport):

ffmpeg -i srt:// \
-vcodec copy \
-acodec copy \
-f mpegts

Open Source

As mentioned before, all the source code for this is open source, and we welcome you to contribute and give feedback. This WHIP client has mainly been developed by Marcus Spangenberg at Eyevinn Technology.

If you want to know more about this prototype and proof-of-concept, you can contact Jonas Birmé, VP R&D at Eyevinn Technology and author of this blog post.

Eyevinn Technology is a leading specialist in video technology and media distribution, and proud organizer of the yearly Nordic conference Streaming Tech Sweden.
