Streaming GStreamer pipelines via HTTP

In the past, many people have joined the GStreamer IRC channel on Freenode to ask how to stream a GStreamer pipeline to multiple clients via HTTP. Just explaining how to do it, and that it's actually quite easy, might not be that convincing, so here's a small tool that does exactly that. I called it http-launch, and you can get it from GitHub here.

Given a GStreamer pipeline in GstParse syntax (the same as e.g. gst-launch), it will start an HTTP server on port 8080, start the pipeline once the first client connects, and then serve all following clients from that single pipeline with the data it produces.

For example, you could call it like this to serve a WebM stream:

http-launch webmmux streamable=true name=stream   videotestsrc ! vp8enc ! stream.   audiotestsrc ! vorbisenc ! stream.
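
Once it is running, any client that can play WebM over HTTP should be able to connect. For example (assuming the default port above), you could open http://127.0.0.1:8080/ in a browser that supports WebM, or test with playbin:

gst-launch-1.0 playbin uri=http://127.0.0.1:8080/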

Note that this is just a simple example of what you can do with GStreamer and is not meant for production use. Something like gst-streaming-server would be better suited for that, especially once it gets support for HLS/DASH or similar protocols.

Now let’s walk through the most important parts of the code.

The HTTP server

First, a few words about the HTTP server part. Instead of just using libsoup, I implemented a trivial HTTP server with GIO. It is probably not 100% standards compliant or bug-free, but good enough for demonstration purposes :). Also, it should be a good example of how the different network classes of GIO fit together.

The HTTP server is based on a GSocketService, which listens on a specific port for new connections via a GLib main context and notifies us via a signal whenever there is a new connection. These new connections are provided as a GSocketConnection. See line 424 and following, and line 240 and following.
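
As a rough sketch (not the literal http-launch code, just the general shape of such a GIO socket service; the on_incoming callback and its behaviour are assumptions for illustration), this could look like:

#include <gio/gio.h>

static gboolean
on_incoming (GSocketService * service, GSocketConnection * connection,
    GObject * source_object, gpointer user_data)
{
  /* keep the connection alive and start watching it for data */
  g_object_ref (connection);
  /* ... remember the connection and set up non-blocking reading ... */
  return TRUE;
}

int
main (int argc, char **argv)
{
  GSocketService *service = g_socket_service_new ();
  GMainLoop *loop = g_main_loop_new (NULL, FALSE);
  GError *error = NULL;

  if (!g_socket_listener_add_inet_port (G_SOCKET_LISTENER (service), 8080,
          NULL, &error)) {
    g_printerr ("Unable to listen: %s\n", error->message);
    return 1;
  }

  g_signal_connect (service, "incoming", G_CALLBACK (on_incoming), NULL);
  g_socket_service_start (service);
  g_main_loop_run (loop);

  return 0;
}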

In lines 240 and following we start polling the GIOStream of the connection, to be notified whenever new data can be read from the stream. Based on this, non-blocking reading from the connection is implemented in line 188 and following. A similar pattern for non-blocking reading from and writing to a socket is also implemented in GStreamer's GstRTSPConnection.
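
A hedged sketch of that polling pattern, assuming a connection obtained as in the previous sketch and a hypothetical on_read_bytes callback:

static gboolean
on_read_bytes (GObject * pollable, gpointer user_data)
{
  GSocketConnection *connection = user_data;
  GInputStream *in = g_io_stream_get_input_stream (G_IO_STREAM (connection));
  gchar buf[4096];
  GError *error = NULL;
  gssize r;

  r = g_pollable_input_stream_read_nonblocking (G_POLLABLE_INPUT_STREAM (in),
      buf, sizeof (buf), NULL, &error);
  if (r < 0) {
    gboolean would_block =
        g_error_matches (error, G_IO_ERROR, G_IO_ERROR_WOULD_BLOCK);
    g_clear_error (&error);
    return would_block ? G_SOURCE_CONTINUE : G_SOURCE_REMOVE;
  }
  if (r == 0)                   /* connection closed by the client */
    return G_SOURCE_REMOVE;

  /* append buf[0..r) to a per-connection buffer and look for "\r\n\r\n" */
  return G_SOURCE_CONTINUE;
}

/* in the "incoming" handler: */
GInputStream *in = g_io_stream_get_input_stream (G_IO_STREAM (connection));
GSource *source =
    g_pollable_input_stream_create_source (G_POLLABLE_INPUT_STREAM (in), NULL);
g_source_set_callback (source, (GSourceFunc) on_read_bytes, connection, NULL);
g_source_attach (source, NULL);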

Here we trivially read data until a complete HTTP message is received (i.e. “\r\n\r\n” is detected in what we read), which is then parsed with the GLib string functions. Only GET and HEAD requests are handled in very simple ways. The GET request will then lead us to the code that connects this HTTP server with GStreamer.
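
For illustration only (data and len are assumed to be the bytes accumulated so far, and this is nowhere near a compliant HTTP parser), the request-line handling could look roughly like this:

if (g_strstr_len (data, len, "\r\n\r\n")) {
  gchar **lines = g_strsplit (data, "\r\n", -1);
  gchar **tokens = g_strsplit (lines[0], " ", -1);  /* e.g. "GET / HTTP/1.1" */

  if (g_strcmp0 (tokens[0], "GET") == 0) {
    /* write a "HTTP/1.1 200 OK" header and hand the socket to the sink */
  } else if (g_strcmp0 (tokens[0], "HEAD") == 0) {
    /* write only the response header */
  }

  g_strfreev (tokens);
  g_strfreev (lines);
}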

Really, consider using libsoup if you want to implement an HTTP server or client!

The GStreamer pipeline

Now to the GStreamer part of this small application. The actual pipeline is, as explained above, passed via the command line. It is then parsed and properly set up in line 362 and following. For this, GstParse is used, which parses a pipeline string into a real GstBin.
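
A minimal sketch of that step (assuming the pipeline words start at argv[1]; the real code may differ):

/* after gst_init(); argv[1..] holds the pipeline description from the command line */
GError *error = NULL;
GstElement *bin = gst_parse_launchv ((const gchar **) &argv[1], &error);

if (bin == NULL || !GST_IS_BIN (bin)) {
  g_printerr ("invalid pipeline: %s\n",
      error != NULL ? error->message : "not a bin");
  return 1;
}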

As the pipeline string passed to http-launch must not contain a sink element but must end in an element named "stream", we now have to get hold of that element and add our own sink to the bin. We do this by retrieving the "stream" element via gst_bin_get_by_name(), setting up a GstGhostPad that proxies its source pad as a source pad of the bin, and then putting the bin created from the pipeline string together with a sink element into a GstPipeline, where the two are then linked.
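
Roughly, and again only as a hedged sketch continuing from the bin above, this wiring could look like:

GstElement *pipeline, *sink, *stream;
GstPad *srcpad, *ghostpad;

stream = gst_bin_get_by_name (GST_BIN (bin), "stream");
if (stream == NULL) {
  g_printerr ("no element with name \"stream\" found\n");
  return 1;
}

/* expose the muxer's source pad as the source pad of the whole bin */
srcpad = gst_element_get_static_pad (stream, "src");
ghostpad = gst_ghost_pad_new ("src", srcpad);
gst_element_add_pad (GST_ELEMENT (bin), ghostpad);
gst_object_unref (srcpad);
gst_object_unref (stream);

/* put the bin and our sink into a pipeline and link them */
pipeline = gst_pipeline_new (NULL);
sink = gst_element_factory_make ("multisocketsink", NULL);
gst_bin_add_many (GST_BIN (pipeline), bin, sink, NULL);
gst_element_link (bin, sink);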

The sink we are using here is multisocketsink, which sends all data it receives to a set of application-provided GSockets. In line 390 and following we set up some properties on the sink that make sure that newly connected clients start from a keyframe and that the buffering for all clients inside multisocketsink is handled in a sensible way. Instead of letting new clients wait for the next keyframe, we could also explicitly request the pipeline to generate a new keyframe each time a client connects.
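
For example (the exact values here are illustrative, not necessarily what http-launch uses; property names and enum nicknames are as reported by gst-inspect-1.0 multisocketsink):

/* time-based per-client limits, recover and sync on keyframes */
g_object_set (sink,
    "unit-format", GST_FORMAT_TIME,
    "units-max", (gint64) 7 * GST_SECOND,
    "units-soft-max", (gint64) 3 * GST_SECOND,
    "timeout", (guint64) 10 * GST_SECOND,
    NULL);
gst_util_set_object_arg (G_OBJECT (sink), "recover-policy", "keyframe");
gst_util_set_object_arg (G_OBJECT (sink), "sync-method", "latest-keyframe");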

The last missing part is that whenever we have successfully received a GET request from a client, we stop handling reads/writes for that socket ourselves and hand it over to multisocketsink. This is done in line 146. From this point onwards the socket for this client is handled only by multisocketsink. Additionally, we start the pipeline here for the first client that has successfully connected.
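
A hedged sketch of that hand-over (assuming sink is the multisocketsink, connection is the client's GSocketConnection, and started/pipeline are the obvious state from above):

/* after the "HTTP/1.1 200 OK" header has been written to the client */
GSocket *socket = g_socket_connection_get_socket (connection);

/* remove our own GSource for this connection first, then let the sink own it */
g_signal_emit_by_name (sink, "add", socket);

if (!started) {
  started = TRUE;
  if (gst_element_set_state (pipeline, GST_STATE_PLAYING) ==
      GST_STATE_CHANGE_FAILURE)
    g_printerr ("Failed to start pipeline\n");
}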

I hope this showed a bit how one of the lesser known GStreamer elements can be used to stream media to multiple clients, and that GStreamer provides building blocks for almost everything already 😉

117 thoughts on “Streaming GStreamer pipelines via HTTP”

  1. Is it possible, using this, to write a simple app that transcodes any (GStreamer-compatible) media into a format suitable for mobile devices, like MP4 for iOS? I'm thinking of something like the plexmediaserver transcoder.

    1. Yes, you could use encodebin with encoding profiles for that, for example. You can pass it any media and it will convert it to something that complies with the profile you set on encodebin. Note that for (live) streaming MP4 via HTTP you need to use something like DASH due to the header requirements of MP4.
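
      As a hedged illustration of that approach (here a WebM profile with VP8 and Vorbis, purely as an example; the caps and element name are assumptions, not taken from http-launch):

      #include <gst/pbutils/encoding-profile.h>

      GstEncodingContainerProfile *profile;
      GstElement *encodebin;

      profile = gst_encoding_container_profile_new ("webm", NULL,
          gst_caps_new_empty_simple ("video/webm"), NULL);
      gst_encoding_container_profile_add_profile (profile,
          (GstEncodingProfile *) gst_encoding_video_profile_new (
              gst_caps_new_empty_simple ("video/x-vp8"), NULL, NULL, 0));
      gst_encoding_container_profile_add_profile (profile,
          (GstEncodingProfile *) gst_encoding_audio_profile_new (
              gst_caps_new_empty_simple ("audio/x-vorbis"), NULL, NULL, 0));

      encodebin = gst_element_factory_make ("encodebin", NULL);
      g_object_set (encodebin, "profile", profile, NULL);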

      1. Hi Sebastian,
        I need to use MP4 for live streaming.
        Can you please elaborate how DASH could be used with http-launch to be able to live stream MP4 format ?
        Thank you so much !

      2. DASH can’t be used with http-launch as is. You’ll need to serve multiple different files at different locations from the HTTP server: one Manifest and then multiple MP4 fragments.

        You could use this for creating all of that: https://bugzilla.gnome.org/show_bug.cgi?id=668094 , or just plug together the pieces that already exist in GStreamer to create a DASH stream. Then you just need any HTTP server for serving these files.

        Also see HTTP Adaptive Streaming with GStreamer

  2. Hi Slomo:

    Thanks for sharing. I'm new to GStreamer; the following command works for my PC-to-PC streaming, and I really want to know how to change it into an http-launch version because I need PC-to-mobile streaming. BTW, I'm using a Raspberry Pi.

    sender command:
    raspivid -n -t 0 -h 720 -w 1080 -fps 25 -hf -b 2000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=5000

    receiver command
    gst-launch-1.0 -v tcpclientsrc host={Server IP} port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! autovideosink sync=false

    For PC-to-mobiles, I've tried RTSP but failed; gst-rtsp-server needs GStreamer-1.0 > v1.2 (perhaps, I forgot the exact version) but Raspberry Pi only offers the v1.0 package now. Maybe I should compile it myself; hope it won't be too complicated. Some people also use a web server like the nginx-rtmp module, or node.js packaging RTP/TCP into HTTP, to get the job done. However, those approaches were not so easy for me to work out so far. If things could be packaged into HTTP at the GStreamer level, mobile developers could stream so much more easily, because all they need is just a WebView, that's all.

    Thank you for opening the gateway that streams video into people’s pockets.

    Any help would be VERY appreciated!

    1. For gst-rtsp-server you need GStreamer >= 1.2.3, yes. But otherwise this is a very good solution for real time and low latency streaming.

      So in general I would recommend not piping the raspivid output to GStreamer like that. That's never going to work reliably. Instead use a real source element for the camera, e.g. https://github.com/thaytan/gst-rpicamsrc or v4l2src (later RPi firmware should have a v4l2 driver for the camera).

      Otherwise there should be no problem using something like http-launch or any other way of streaming video from the RPi to the network.

    2. If you want to receive the video on an Android mobile you can use this API https://code.google.com/p/gstreamer-java/
      I managed to stream from the raspicam using the same method as you. I'm now just searching for a way to display it in a web browser, so I hope http-launch will help me ^^

      1. Are you sure gstreamer-java works on Android? It uses JNA and last time I checked dalvik did not support that, only JNI. Also gstreamer-java still uses the old and unmaintained GStreamer 0.10 release series.

        However you can use GStreamer just fine on Android if you write the GStreamer code in C and call it from your Java application via JNI.

  3. Hi ! 🙂
    I'm not really used to compiling projects and I couldn't compile http-launch:
    I tried to use these commands :

    ./autogen.sh --prefix=/usr --libdir=/usr/lib/arm-linux-gnueabihf/
    make
    sudo make install

    The first one seems to work.
    But when I run the second one I get errors (so I didn't try the third one):
    $ make
    make all-recursive
    make[1]: Entering directory `/home/pi/http-launch’
    Making all in src
    make[2]: Entering directory `/home/pi/http-launch/src’
    CC http_launch-http-launch.o
    In file included from /usr/include/gstreamer-1.0/gst/gstpad.h:65:0,
    from /usr/include/gstreamer-1.0/gst/gstelement.h:57,
    from /usr/include/gstreamer-1.0/gst/gstbin.h:27,
    from /usr/include/gstreamer-1.0/gst/gst.h:34,
    from http-launch.c:24:
    /usr/include/gstreamer-1.0/gst/gstbuffer.h: In function ‘gst_buffer_ref’:
    /usr/include/gstreamer-1.0/gst/gstbuffer.h:335:10: error: cast increases required alignment of target type [-Werror=cast-align]
    /usr/include/gstreamer-1.0/gst/gstbuffer.h: In function ‘gst_buffer_copy’:
    /usr/include/gstreamer-1.0/gst/gstbuffer.h:372:10: error: cast increases required alignment of target type [-Werror=cast-align]
    In file included from /usr/include/gstreamer-1.0/gst/gstevent.h:182:0,
    from /usr/include/gstreamer-1.0/gst/gstpadtemplate.h:36,
    from /usr/include/gstreamer-1.0/gst/gstpad.h:68,
    from /usr/include/gstreamer-1.0/gst/gstelement.h:57,
    from /usr/include/gstreamer-1.0/gst/gstbin.h:27,
    from /usr/include/gstreamer-1.0/gst/gst.h:34,
    from http-launch.c:24:
    /usr/include/gstreamer-1.0/gst/gstmessage.h: In function ‘gst_message_ref’:
    /usr/include/gstreamer-1.0/gst/gstmessage.h:315:10: error: cast increases required alignment of target type [-Werror=cast-align]
    /usr/include/gstreamer-1.0/gst/gstmessage.h: In function ‘gst_message_copy’:
    /usr/include/gstreamer-1.0/gst/gstmessage.h:353:10: error: cast increases required alignment of target type [-Werror=cast-align]
    In file included from /usr/include/gstreamer-1.0/gst/gstpadtemplate.h:36:0,
    from /usr/include/gstreamer-1.0/gst/gstpad.h:68,
    from /usr/include/gstreamer-1.0/gst/gstelement.h:57,
    from /usr/include/gstreamer-1.0/gst/gstbin.h:27,
    from /usr/include/gstreamer-1.0/gst/gst.h:34,
    from http-launch.c:24:
    /usr/include/gstreamer-1.0/gst/gstevent.h: In function ‘gst_event_ref’:
    /usr/include/gstreamer-1.0/gst/gstevent.h:407:10: error: cast increases required alignment of target type [-Werror=cast-align]
    /usr/include/gstreamer-1.0/gst/gstevent.h: In function ‘gst_event_copy’:
    /usr/include/gstreamer-1.0/gst/gstevent.h:442:10: error: cast increases required alignment of target type [-Werror=cast-align]
    http-launch.c: In function ‘main’:
    http-launch.c:369:33: error: cast increases required alignment of target type [-Werror=cast-align]
    http-launch.c:385:24: error: cast increases required alignment of target type [-Werror=cast-align]
    http-launch.c:400:21: error: cast increases required alignment of target type [-Werror=cast-align]
    cc1: all warnings being treated as errors
    make[2]: *** [http_launch-http-launch.o] Error 1
    make[2]: Leaving directory `/home/pi/http-launch/src’
    make[1]: *** [all-recursive] Error 1
    make[1]: Leaving directory `/home/pi/http-launch’
    make: *** [all] Error 2

    I don’t know if it comes from my way of compiling or something else ?
    Thanks ! 😉

      1. Thanks, it works great !
        I’m using this command to stream from the raspberry camera now :

        http-launch webmmux streamable=true name=stream\
        rpicamsrc bitrate=1000000 !\
        video/x-h264,width=320,height=240,framerate=25/1 !\
        h264parse !\
        rtph264pay config-interval=1 pt=96 !\
        gdppay !\
        stream.

        But I can’t display it on a webbrowser, do you have any idea ?
        This is the way I try to display it (I've changed your code to use port 8001 instead of 8080):

        Your browser does not support the VIDEO tag and/or RTP streams.

      2. Try without rtph264pay and without gdppay, then it should work better at least. You can’t put RTP inside GDP into a WebM container.

      3. That’s what I feared, but when I use this pipeline :
        http-launch webmmux streamable=true name=stream\
        rpicamsrc bitrate=1000000 !\
        video/x-h264,width=320,height=240,framerate=25/1 !\
        h264parse !\
        stream.

        I got this :
        pi@raspberrypi ~ $ ./test-http-launch.sh
        Listening on http://127.0.0.1:8001/
        New connection 172.23.212.118:58153
        Starting to stream to 172.23.212.118:58153
        Starting pipeline
        Error Internal data flow error.
        Removing connection 172.23.212.118:58153

        But when I try this :
        gst-launch-1.0 -v\
        rpicamsrc bitrate=1000000 !\
        video/x-h264,width=320,height=240,framerate=25/1 !\
        h264parse !\
        tcpserversink host=$ip port=5000

        The pipeline launches.
        I don’t really understand why it reacts differently

      4. Well, one is streaming raw h264 via tcpserversink. The other muxes it into webm. And webm does not support h264. Try using matroskamux instead of webmmux 🙂

  4. Hello again !
    Sorry about asking for help again ! ^^
    I would like to know if it is possible to use http-launch at the same time as my TCP pipeline, because my Android application is based on the TCP pipeline and the web interface on http-launch. I tried to use the GStreamer "queue" element, but I don't think I'm using it right, since it tells me "parse." is not found when I launch this script:
    #!/bin/bash

    ip=$(hostname -I)

    http-launch matroskamux streamable=true name=stream\
    rpicamsrc bitrate=1000000 !\
    video/x-h264,width=320,height=240,framerate=25/1 !\
    h264parse name=parse
    parse. !\
    queue !\
    stream.
    parse. !\
    queue !\
    rtph264pay config-interval=1 pt=96 !\
    gdppay !\
    tcpserversink host=$ip port=5000

    I know it may not be the most appropriate place to post my problem, let me know if it bothers you ! 😉

    1. I tried this way because, if I first start the TCP pipeline and then run the HTTP pipeline, when I try to load the video from the HTTP pipeline it shows me this error:

      Listening on http://127.0.0.1:8001/
      New connection 172.23.200.60:59199
      Starting to stream to 172.23.200.60:59199
      Starting pipeline
      mmal: mmal_vc_component_enable: failed to enable component: ENOSPC
      mmal: camera component couldn’t be enabled
      Error Internal data flow error.
      Removing connection 172.23.200.60:59199

      And the program stops.
      Which seems to be caused by the TCP pipeline which already uses the camera.

      1. Yes, you can only access the camera once. What you need to do is to write a proper application with GStreamer that has a single pipeline and with the help of a tee element duplicates the camera output to the HTTP and to the TCP sinks.
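
        For illustration, the single pipeline inside such an application could be shaped roughly like this (untested, element names taken from the commands above; the "stream" branch would end in multisocketsink as in http-launch):

        rpicamsrc bitrate=1000000 ! video/x-h264,width=320,height=240,framerate=25/1 ! h264parse ! tee name=t
        t. ! queue ! matroskamux streamable=true name=stream
        t. ! queue ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink port=5000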

        As mentioned in the blog post, http-launch (just like gst-launch) is just an example. For real usage you will want to include this code into a real application and adapt it for your purposes.

  5. Hello Sebastian,

    Thank you very much for this nice example, something like an http sink is really missing …
    I try to integrate this in an Android app, but it doesn't work.
    http-launch works if I run it from the command line, but it doesn't work integrated into the app.
    The issue is the following:
    the "incoming" on_new_connection callback is properly called, but then nothing happens; the on_read_bytes callback is never called.
    I also had to replace g_socket_listener_add_inet_port with g_socket_listener_add_address, because no connection was possible with the use of g_socket_listener_add_inet_port.

    Is there something special we should think about, or to modify, when integrating inside an Android app ?
    Or something related to network with Android devices ?

    Thanks a lot for any advice !

    1. Sorry, I found it; I was just using the wrong context.
      Is there a way to improve latency, or is it only related to the web browser?
      Thanks again.

  6. If the app is restarted it always returns an “Unable to start http service: Error binding to address: Address already in use” error message, and the service cannot be run.
    I did not find out how to solve this; would you have any idea about this?

    Thanks

    1. This would mean that something else is already using the port you selected for listening on. Just select a different one, or stop the application that listens on that port already.

  7. I'm pretty sure no other application uses the port.
    Running the application works fine the first time.
    Then when the application is stopped the service is stopped too in the code.
    But when starting the application again I get this message … :/

    1. That probably means that the application still runs in the background somehow, or didn't shut down cleanly and the OS didn't reclaim the port yet or something like that. If you have a rooted device you can check with "netstat" on the shell which process is using that port.

  8. Hi,
    I'm trying to get your tool working, with no success. It starts listening (saying "Listening on http://127.0.0.1:8080/") but when I open a browser to http://127.0.0.1:8080, it crashes and prints:
    New connection 127.0.0.1:51524
    Starting to stream to 127.0.0.1:51524
    Starting pipeline
    Error Internal data flow error.
    Removing connection 127.0.0.1:51524

    Any clue (I’m using the pipe you gave as an example) ?

    Thanks

    1. I would need more information. Can you send me a GStreamer debug log via mail? Run http-launch with GST_DEBUG=6 and just pipe all the output into a file.

      1. Thanks for your quick answer !
        I tried again this morning with your pipe, and it is now working (the computer has been rebooted since my last try; it may have changed something). But my pipe is still not working. It is a tcpclientsrc; I'm trying with this pipe:
        http-launch multipartmux name=stream ! tcpclientsrc port=3000 ! stream
        I also tried with this one (very close to the working one) same result :
        http-launch webmmux streamable=true name=stream tcpclientsrc port=3000 ! vp8enc ! stream. audiotestsrc ! vorbisenc ! stream.
        I sent you by mail the debug output of the first command.

        Thanks for your time !

  9. When I try to autogen and then configure (also with --enable-warnings-as-errors=no) I get the following error:
    ./configure: line 2655: syntax error near unexpected token `pic-only’
    ./configure: line 2655: `LT_INIT(pic-only)’

    Not sure what to do?

    1. That sounds like autogen.sh should’ve already shown an error. You either don’t have libtool installed, or a too old version of it.

  10. This seems to be a very common request. Wouldn't it make sense to encapsulate the functionality of http-launch as a GStreamer element? Then HTTP streaming from GStreamer would be trivial.

    gst-launch-1.0 v4l2src ! vp8enc ! webmmux ! httpstreamsink

    Where httpstreamsink is the new element.

  11. Hi,
    So here is my working GStreamer 1.0 source pipeline I have been using for testing etc. However, I need to be able to use it with a web server:

    gst-launch-1.0 v4l2src device=/dev/video1 ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! jpegenc ! udpsink host=192.168.1.54 port=5000

    and this also works but has insane latency:

    http-launch webmmux streamable=true name=stream v4l2src device=/dev/video1 ! vp8enc ! stream.

    However this does not work:

    http-launch webmmux streamable=true name=stream v4l2src device=/dev/video1 ! jpegenc ! stream.
    Listening on http://127.0.0.1:8080/
    New connection 127.0.0.1:59056
    Starting to stream to 127.0.0.1:59056
    Starting pipeline
    Error Internal data flow error.
    Removing connection 127.0.0.1:59056

    As well as this:

    http-launch webmmux streamable=true name=stream v4l2src device=/dev/video1 ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! vp8enc ! stream.
    Listening on http://127.0.0.1:8080/
    New connection 127.0.0.1:58852
    Starting to stream to 127.0.0.1:58852
    Starting pipeline
    Error Internal data flow error.
    Removing connection 127.0.0.1:58852

    and the same for the above, but with jpegenc.

    I need to be able to set the width and height, as well as having a stream that is as close to live as possible. I also will need to be able to connect to it via wireless ap and connect to the webserver from the connected devices. At the moment I am using ubuntu 12.04 on my laptop, but the intended use will be an embedded board running ubuntu 12.04 to stream to android/ios devices. I have used the SDK but we use c# and not java for our apps so I have to resort to a web server.

    1. vp8enc and jpegenc do not support the YUY2 format. Put a videoconvert element in front of them; then it should work.
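
      For example (a hedged adaptation of the WebM pipeline above):

      http-launch webmmux streamable=true name=stream v4l2src device=/dev/video1 ! videoconvert ! vp8enc ! stream.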

      Also after jpegenc you might want to use multipartmux. But that depends on your exact use case.

      1. Thank You very much for your assistance. I do have this working correctly now to the point where I can change the resolution:

        http-launch v4l2src device=/dev/video1 ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! videoconvert ! jpegenc streamable=true name=stream ! .stream

        The above pipeline only outputs 1 jpeg though. I can refresh the page and it will update the jpeg, but that is it. I assume that is because I am encoding it to jpeg. I tried the same with vp8enc and my browser just spits out a bin file at me.

        A few other attempts that just spit out bin files:

        http-launch v4l2src device=/dev/video1 ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! videoconvert ! jpegenc ! multipartmux streamable=true name=stream ! .stream

        http-launch v4l2src device=/dev/video1 ! 'video/x-raw,format=(string)YUY2,width=640,height=480' ! videoconvert ! vp8enc streamable=true name=stream ! .stream

        And I am assuming, based on all the documentation I have read, that multipartmux is expecting a file to output to. Though I am having a hard time figuring out how to incorporate it. I also ran into multifdsink in my research, but I do not think that is the appropriate approach. I have also got gstreamer streaming server up and running, but I can't figure out how to get any sort of pipeline or anything set up with it because of a lack of documentation and no examples anywhere on the net. So I have given up on that, and it has a lot of dependencies so it's a bit big for an embedded board.

        My goal here is to wirelessly stream from a USB camera (microscope) at the highest resolution possible, while maintaining a high frame rate so the microscope can be focused easily.

        P.S. I am not using any audio so I see no reason to mux it. If this is causing me problems I would like to know. Also, your help is very much appreciated and I apologize for my lack of knowledge; I have been working with this for only 2 weeks and have no background in C, just web programming and Linux.

      2. I should also add I get this with some of the pipelines since I added videoconvert

        (http-launch:15133): GStreamer-CRITICAL **: gst_ghost_pad_new: assertion `!gst_pad_is_linked (target)’ failed

        (http-launch:15133): GStreamer-CRITICAL **: gst_element_add_pad: assertion `GST_IS_PAD (pad)’ failed

        (http-launch:15133): GStreamer-CRITICAL **: gst_pad_link_full: assertion `GST_IS_PAD (srcpad)’ failed

        I have not seen this before and I am not sure what it means

      3. For VP8, put it into a WebM container. Otherwise it will not work. Even if you don’t need audio you need a container format around the VP8 stream.

        For the multipartmux problem, I think it’s necessary to somehow tell the browser that this is multipart mime encoded data containing jpeg images. Not sure, will have to be tried and researched 🙂

        For those warnings, that means that you are linking a pad that is already linked, among other things. If you set G_DEBUG=fatal-warnings and run in a debugger, the application will stop at the warning and you can get a backtrace.

  12. Hi guys,
    One question:
    can I receive a UDP stream with GStreamer and re-stream it over HTTP to another PC?
    Can someone help me? Thx

  13. Hi Sebastian, thanks for the fast reply. I just downloaded and built http-launch from your git repo. Where can I find some documentation on writing a correct pipeline?

    I'll explain what I would like to do:

    stream --> eth0=10.0.0.2
    eth1=11.0.0.2 --> dest. pc 11.0.0.3

    I have an H264 unicast UDP stream coming in on eth0 (for example UDP port 12345); I have to restream it to eth1 over HTTP.

    Do I have to do something like this?

    http-launch -v udpsrc port=12345 ! tsdemux ! queue max-size-buffers=0 max-size-time=0 ! mpegvideoparse ! mpeg2dec ! videoconvert ! omxh264enc ! video/x-h264,stream-format=byte-stream,profile=high ! h264parse ! mpegtsmux ! tcpserversink host=11.0.0.3 port=5000

    1. Almost, http-launch does not need a sink in the end. Instead it requires a muxer with the name stream as the last element, which is then connected to the HTTP sink. Just changing your pipeline to “… ! mpegtsmux name=stream” should make it work.

      Note that you try to receive MPEGTS over UDP in your pipeline, and not raw H264. If that’s what you want, ok… but that’s not clear from your description.

  14. Hi, I downloaded http-launch and built it. I tried it with the simple command:
    http-launch webmmux streamable=true name=stream videotestsrc ! vp8enc ! stream. audiotestsrc ! vorbisenc ! stream.

    But I cannot get http-launch to run.
    I have this following error.
    “invalid pipeline: no element “webmmux””

    Am I missing any library?
    I have compiled gstreamer, gstreamer1.0, and gst-plugins-good-0.10.31.

    Please help.

      1. Hi slomo,

        I have removed gstreamer-0.10, and older gst plugins. Now I have gstreamer1.0, and gstreamer1.0-plugins-good.
        When I do this:
        http-launch v4l2src device=/dev/video0 ! 'video/x-raw, format=YUV, width=640, height=480, framerate=30/1' ! jpegenc ! multipartmux ! tcpserversink host=167.67.168.250 port=5000

        I have the following error:
        no element with name “stream” found

        I double checked with this command:
        gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, format=YUV, width=640, height=480, framerate=30/1' ! jpegenc ! multipartmux ! tcpserversink host=167.67.168.250 port=5000

        But I get this error instead:
        WARNING: erroneous pipeline: could not link v4l2src0 to jpegenc0

        I have all the necessary gst-plugins.
        root@siemens-sdt-trunk:~# gst-inspect-1.0
        video4linux2: v4l2src: Video (video4linux2) Source
        video4linux2: v4l2sink: Video (video4linux2) Sink
        video4linux2: v4l2radio: Radio (video4linux2) Tuner
        multipart: multipartdemux: Multipart demuxer
        multipart: multipartmux: Multipart muxer
        coreelements: capsfilter: CapsFilter
        coreelements: fakesrc: Fake Source
        coreelements: fakesink: Fake Sink
        coreelements: fdsrc: Filedescriptor Source
        coreelements: fdsink: Filedescriptor Sink
        coreelements: filesrc: File Source
        coreelements: funnel: Funnel pipe fitting
        coreelements: identity: Identity
        coreelements: input-selector: Input selector
        coreelements: output-selector: Output selector
        coreelements: queue: Queue
        coreelements: queue2: Queue 2
        coreelements: filesink: File Sink
        coreelements: tee: Tee pipe fitting
        coreelements: typefind: TypeFind
        coreelements: multiqueue: MultiQueue
        coreelements: valve: Valve element
        videoconvert: videoconvert: Colorspace converter
        matroska: matroskademux: Matroska demuxer
        matroska: matroskaparse: Matroska parser
        matroska: matroskamux: Matroska muxer
        matroska: webmmux: WebM muxer
        tcp: tcpclientsink: TCP client sink
        tcp: tcpclientsrc: TCP client source
        tcp: tcpserversink: TCP server sink
        tcp: tcpserversrc: TCP server source
        tcp: multifdsink: Multi filedescriptor sink
        tcp: multisocketsink: Multi socket sink
        jpeg: jpegenc: JPEG image encoder
        jpeg: jpegdec: JPEG image decoder
        staticelements: bin: Generic bin
        staticelements: pipeline: Pipeline object

        Can you help if you know what went wrong?
        Thanks a lot!!

      2. The last element in the pipeline has to be a muxer with its name property set to the string "stream". No sink is needed because http-launch provides the sink.

  15. I took out the sink and added the "stream" information as below:
    http-launch webmmux streamable=true name=stream v4l2src device=/dev/video0 ! 'video/x-raw, format=YUV, width=640, height=480, framerate=30/1' ! jpegenc ! multipartmux

    and now I am getting this error:
    Failed to set pipeline to ready

    1. Hi,

      Sorry, I made some mistakes.
      Currently I use this command:
      http-launch v4l2src device=/dev/video0 ! 'video/x-raw, format=YUV, width=640, height=480, framerate=30/1' ! jpegenc ! multipartmux streamable=true name=stream

      And I am getting this result:
      Listening on http://127.0.0.1:8080/

      However, when I create a simple website on my target using the following HTML tags, I fail to see video streaming from my target. (My target is a web server.)

      VIDEO TEST!!

      VIDEO TEST!!

      1. First try whether your pipeline works with gst-launch, e.g. use a normal video sink after the capsfilter or write the JPEG to a file. Your capsfilter is also wrong: there is no "YUV" format, but different YUV variants like YUY2 or I420.
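
        For instance, an untested local check along those lines:

        gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,format=YUY2,width=640,height=480,framerate=30/1 ! videoconvert ! autovideosink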

  16. Hi,
    I tried it with different pipelines to stream a live video; it works great, but there is something I can't understand, and I hope you can clarify it for me 🙂
    When a client connects to the HTTP stream it waits some seconds before it can show the video, and the time it takes to connect is then almost the amount of latency the client will have.
    I noticed that if the input live video doesn't change (doesn't move) then it can take a lot of time before connecting, so in such a case the latency will be huge (with matroskamux it is usually around 5 sec, with mpegtsmux it can be more than 10 or 15 sec!!).
    If the input video changes a lot (moves a lot) then it connects much faster, so when I am "lucky" the latency can be very small.
    I also checked that the input video sends keyframes at regular intervals, so (to me) it doesn't seem to be related to this…
    First I was thinking about sync frames, and then maybe about a buffer or queue that fills faster when the input changes more…?
    And I didn't find any setting on multisocketsink that could change this; none of the properties I tried changed anything.

    Do you have any thoughts to share about this ?
    Thank you very much, I would really appreciate if I can fix this.

    1. I would try analyzing the stream on the receiver as it is received. That way you can detect if it’s missing keyframes or for which reason it can’t start displaying the video sooner.

  17. Hello Slomo,
    May I ask you for a big favor?

    I'm trying http-launch with H264/MP4, but can never get it to play, neither in a page nor in mpv. mpv fails with "Failed to recognize file format." after trying a lot of containers; I suppose I'm using a wrong container setup.

    The command I use is:
    ./http-launch 8080 v4l2src ! "video/x-raw,width=640,height=480" ! x264enc tune=zerolatency ! h264parse ! mp4mux streamable=1 name=stream

    A very similar command with theoraenc and oggmux works for me.

    What could be the problem?
    Thanks a lot!

    1. MP4 is not a streamable container like that. You first have to finish the file before it can be read. Which is what the warning message for which you fixed the crash should’ve told you 😉
      For streaming MP4 you could use the DASH streaming protocol, but that works differently and needs more work than just the simple http-launch.

  18. Hello, would you please explain how I can compile your code for i.MX6 devices (Wandboard)? When I try to compile it, it says no gstreamer-1.0 found, but I have GStreamer 1.0 with all its dependencies.
    Best Regards

  19. Hi, I am running the following: http-launch 3333 webmmux streamable=true name=stream tcpclientsrc port=3000 ! vp8enc ! stream. audiotestsrc ! vorbisenc ! stream
    and it works fine, saying Listening on http://127.0.0.1:3333/
    but from the client side, if I want to see the video (vlc http://127.0.0.1:3333/), the pipeline closes and gives the following messages:
    New connection 127.0.0.1:40987
    Starting to stream to 127.0.0.1:40987
    Starting pipeline
    Failed to start pipeline
    Removing connection 127.0.0.1:40987
    Is anything wrong on the client side? If yes, how can I watch the video?

    1. TCP provides a stream, so you don’t get the raw video one frame per buffer. You can use something like videoparse after tcpclientsrc. Make sure that tcpclientsrc receives raw video in the format you configured and uses exactly the expected stride.

      How do you send the raw video over TCP there?

  20. Hi Slomo,
    Thanks for this great tool!
    I have been trying to use it for MJPEG streaming, as I have seen others try before, using the multipartmux muxer. I didn't have much success at first, but I eventually made some changes to enable MJPEG in http-launch. Where could I submit a patch for it, in case others find it useful?
    Basically, it checks whether the "stream" element is a multipartmux or not, and if it is, it sets the HTTP "Content-type" header accordingly in the 200 response. It works in Firefox and Chrome (the only ones I have tried yet).

    Best regards

      1. I have just sent it to your email, so that you can have a look. The following line should work now:

        http-launch 8080 videotestsrc ! jpegenc ! multipartmux name=stream

        And hopefully no regressions are introduced. 😀

      2. OK, I have just created a fork at github, in case someone finds it useful already:

        https://github.com/ldearquer/http-launch

        with basic MJPEG support. I will add the proper one (notify::caps based) in the following days and do the pull request… and hopefully learn how to use GitHub properly 🙂

  21. Hello. Thank you so much for your tutorial. I always follow your tutorials and the answers you provide on various websites.
    I want to stream the camera from an i.MX6 device and receive it on a webpage. I was able to cross-compile your code for ARM Linux, but I am not able to stream with it.
    We use this pipeline without any problem:
    gst-launch mfw_v4lsrc capture-mode=1 ! video/x-raw-yuv,width=320,height=240,framerate=\(fraction\)5/1 ! ffmpegcolorspace ! jpegenc ! multipartmux ! tcpserversink host=192.168.5.181 port=5000

    When I use just videotestsrc it works! (The only pipeline that works for me is this one:)
    ./Omid-custom-output 8080 videotestsrc ! theoraenc ! oggmux name=stream
    But when I changed it for the camera with this command:
    ./Omid-custom-output 8080 mfw_v4lsrc ! theoraenc ! oggmux name=stream
    it says listening, but on the target it does not show any video, just a blank screen.
    Would you please give me a command with which I can stream the camera from ARM Linux without delay?
    When I use this one as well:

    ./Omid-custom-output 8080 mfw_v4lsrc ! x264enc ! h264parse ! mp4mux streamable=1 name=stream
    nothing happens; it says:
    Starting to stream to 192.168.5.87:54171
    Starting pipeline
    Removing connection 192.168.5.87:54171
    I have tried all the above commands but without any success.
    I really appreciate your help 🙂
    Best regards.

    1. That either means that the mfw_v4lsrc element is misbehaving, or it's the missing videoconvert between it and the video encoder 🙂 Please try adding a videoconvert.

  22. Hello, thanks for the answer. mfw_v4lsrc is working, because all the gst-launch pipelines work for me; my problem is with http-launch, your code. I tried this one also: http-launch 8080 mfw_v4lsrc ! vpuenc codec=6 ! h264parse ! videoconvert ! multipartmux oggmux name=stream
    But same result.
    I want to stream the camera from the Wandboard to a web browser through HTTP.
    Would you please help me if you can 🙂
    Thanks

  23. Hello, I don't know how to make a client embedded in HTML for an experimental website. Which HTML tag should I use, and is it possible to reuse your code to build the server? (I want to stream the input of the microphone of one computer to a webpage on the local network.)

    1. You could use the audio tag, and stream Vorbis via WebM with my application. That should work in most recent web browsers except for Safari and Internet Explorer.

      1. Can you please explain what the first step is to stream Vorbis via WebM with your application? I am new to working with these tools.

      2. Check the example in the article. That’s WebM with VP8 and Vorbis, you can just drop the video part.
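
        That is, something along these lines (derived from the article's example with the video branch dropped; replace audiotestsrc with e.g. pulsesrc ! audioconvert for a real microphone):

        http-launch webmmux streamable=true name=stream audiotestsrc ! vorbisenc ! stream.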

  24. I ran "./autogen.sh" and then it says:

    “configure: error: Package requirements (gstreamer-1.0 >= 1.0.8) were not met:

    No package ‘gstreamer-1.0’ found

    Consider adjusting the PKG_CONFIG_PATH environment variable if you
    installed software in a non-standard prefix.

    Alternatively, you may set the environment variables GST_CFLAGS
    and GST_LIBS to avoid the need to call pkg-config.

    And I have already installed "gstreamer-1.0". Any idea about this error?

      1. I was able to execute "./autogen.sh" (after solving the missing package dependencies). How do you run the "http-launch" command?

  25. I was able to install and use the command 😀 Now I was trying to remap the alsasink to make it streamable, with this line "http-launch 5050 alsasink ! vorbisenc ! oggmux name=stream", and the listener has this label "". Is there a way to check whether the streamer is working?

    I open a webpage and then the log is:

    $ ~/Desktop/http-launch/src$ http-launch 5050 alsasink ! vorbisenc ! oggmux name=stream
    Listening on http://127.0.0.1:5050/
    New connection 141.21.37.31:59867
    New connection 141.21.37.31:59868
    New connection 141.21.37.31:59869
    Starting to stream to 141.21.37.31:59867
    Starting pipeline
    Timeout
    Removing connection 141.21.37.31:59868
    Timeout
    Removing connection 141.21.37.31:59869
    Removing connection 141.21.37.31:59867

    Why could this be happening?

    Thanks beforehand!

    1. Don’t use alsasink, first test with audiotestsrc and once that works test with “alsasrc ! audioconvert”.

      If that still doesn't work, check whether your browser actually supports Ogg/Vorbis, and whether data is sent over the network (e.g. with wireshark).
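
      Put together, that second step might look like this (a hedged sketch based on the commands above):

      http-launch 5050 alsasrc ! audioconvert ! vorbisenc ! oggmux name=stream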

  26. Hi,
    when I use this and then open the webpage:

    http-launch 5050 alsasrc ! audioconvert ! audioresample ! alanwenc ! rtppcmapay ! udpsink ! oggmux name=stream

    I get this log:

    Listening on http://127.0.0.1:5050/
    New connection 141.21.37.31:50591
    New connection 141.21.37.31:50592
    Starting to stream to 141.21.37.31:50591
    Starting pipeline
    Error Internal data flow error.
    Removing connection 141.21.37.31:50591

    What could be wrong in the pipeline connection? Is there a way to see more info about the error?

    Thanks beforehand.

    1. There are a few problems here: first of all a typo, alawenc would be correct. Then you have a udpsink, which can't be connected further downstream; but also you can't really stream RTP over HTTP like this, and oggmux does not accept RTP as input either.

  27. Thanks, just another question: how can I reduce the latency of the audio streaming? My final line is: http-launch 5050 pulsesrc ! audioconvert ! vorbisenc ! oggmux name=stream

    1. You don’t really have control over that as web browsers are going to buffer as much as they want to. If you want low-latency audio streaming in the browser, your only real option here is WebRTC nowadays.

    2. Thanks Seb for this nice sample code which helps me a lot.

      I'm just wondering how I could start playback faster with a browser as client and this pipeline:
      ./http-launch 8080 filesrc location=some.mp3 ! decodebin ! audioconvert ! lamemp3enc name=stream

      The transfer speed is really slow and the playback starts around 7 seconds after the first connection.

      1. That most likely is the buffering that happens inside the browser on the client side. You don’t really have any control over that unfortunately, if it decides to buffer 7 seconds then it does so. If you need lower latency, WebRTC is currently your only standard solution (for web browsers… if you can work with other client applications you have infinitely many possible solutions).

  28. I succeeded in making it start immediately with sync=0 on multisocketsink, but the playback jumps to the end in the middle of the playback. I guess I should handle the data being sent in a better way, such as changing the Content-Length in the HTTP header, or maybe changing the playback speed… What do you think? You were mentioning the use of libsoup; maybe it would handle the HTTP protocol in a better way.

    1. sync=0 would mean that data is sent as fast as possible over the network, instead of real-time. That might be what you want, or not. It will at least allow clients to buffer faster, yes.

      What do you mean with the playback jumps to the end in the middle?

  29. In a 3 min song, the song jumps to the end after 1min30 of playback with sync=0. I suspect the HTTP answer, since the browser seems to behave in a wrong way in the inspector. I'm using Chrome.

  30. Hi,

    Great post! It is really hard to find information about HTTP tunneling and GStreamer out there. I would like to go a little bit further and implement RTSP over HTTP. I noticed some HTTP tunneling functionality has been added to the newer versions of GStreamer. One specific method caught my attention:

    http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gst-rtsp-server/html/GstRTSPServer.html#gst-rtsp-server-transfer-connection

    So my question (from a very simplified point of view) is: assuming I already have a working RTSP server implementation and your HTTP server, should I only use this call to tell the RTSP server API to use the HTTP server socket as the output port?

    In summary, can you help me with a hint on how to use the RTSP server API to stream RTSP over HTTP?

    I really appreciate any help on this.

    Thanks beforehand

    1. IIRC gst-rtsp-server and rtspsrc already support HTTP tunneling of RTSP, if that’s what you mean. Other than that you can probably build some kind of proxy with this application to make a non-HTTP-aware RTSP server work via HTTP.

  31. Hello, I noticed that the pipeline seems to be left running (i.e. top shows http-launch at >80% on my old laptop) when I close the client (browser, VLC…). I am completely new to GStreamer and GIO, so I'm struggling with what to do. I tried something like this in on_client_socket_removed:
    if (started) {
      g_print ("Stopping pipeline\n");
      if (gst_element_set_state (pipeline,
          GST_STATE_READY) == GST_STATE_CHANGE_FAILURE) {
        g_print ("Failed to stop pipeline\n");
        g_main_loop_quit (loop);
      }
      started = FALSE;
    }

    It does not work (it prints "Failed to stop pipeline"), and top still shows full CPU load. How can I pause/stop the pipeline when there are no more clients listening?

      1. Thanks for replying. I had already tried with GST_STATE_NULL, GST_STATE_READY and GST_STATE_PAUSED.

        Afterwards 'gst_element_get_state' returns the wanted state in *pending and GST_STATE_PLAYING in *state. Something is wrong then, I suppose… (on Ubuntu 14.04)

  32. I can't find a way to stop the pipeline from chewing up CPU cycles when no one is listening. multisocketsink seems to be the element that doesn't want to stop.

    Removing connection 127.0.0.1:58126
    Stopping pipeline
    0:00:16.756983908 23003 0x22df5e0 INFO GST_STATES gstbin.c:2227:gst_bin_element_set_state: current PLAYING pending VOID_PENDING, desired next PAUSED
    0:00:16.757052079 23003 0x22df5e0 INFO GST_STATES gstelement.c:2609:gst_element_change_state: have FAILURE change_state return
    0:00:16.757081066 23003 0x22df5e0 INFO GST_STATES gstelement.c:2203:gst_element_abort_state: aborting state from PLAYING to PAUSED
    0:00:16.757114592 23003 0x22df5e0 INFO GST_STATES gstbin.c:2672:gst_bin_change_state_func: child ‘multisocketsink0’ failed to go to state 3(PAUSED)
    0:00:16.757159714 23003 0x22df5e0 INFO GST_STATES gstelement.c:2609:gst_element_change_state: have FAILURE change_state return
    0:00:16.757186884 23003 0x22df5e0 INFO GST_STATES gstelement.c:2203:gst_element_abort_state: aborting state from PLAYING to NULL
    Failed to stop pipeline

  33. Well, adding this line to the end of the “on_client_socket_removed” function:
    gst_element_set_state (pipeline, GST_STATE_NULL)
    , connecting to the stream and then disconnecting does it (using the test pipeline suggested “videotestsrc ! theoraenc ! oggmux name=stream”).

    I actually don't want to file a bug since I have no idea what I'm doing or whether this is supposed to work like that, and digging up that info would probably take a couple of days full time at minimum. Anyway, my use case was more of a "nice to have" toy thing, so I'll get by with killing the process when I'm done.

    1. I didn’t look closer but I think you can’t do it from that signal. That signal is going to be called from a streaming thread of the sink, and you can’t change states there. It would have to be dispatched from there to another thread… like the main thread that runs the main loop.
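
      A hedged sketch of that dispatch, assuming the pipeline variable and main loop from http-launch:

      static gboolean
      stop_pipeline_idle (gpointer user_data)
      {
        GstElement *pipeline = user_data;

        /* runs from the main loop's thread, so a state change is safe here */
        gst_element_set_state (pipeline, GST_STATE_PAUSED);
        return G_SOURCE_REMOVE;
      }

      /* from on_client_socket_removed (called from a streaming thread): */
      g_idle_add (stop_pipeline_idle, pipeline);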

      1. Calling gst_element_set_state (pipeline, GST_STATE_PAUSED) from the main loop works, but GST_STATE_NULL and GST_STATE_READY segfault. Anyway, 'GST_STATE_PAUSED' does what I want: no unnecessary CPU load when nobody watches the stream. Thanks for your help!

  34. Hi, very good post! I have successfully managed to stream H264 video from my device to VLC on my computer. I'm feeding H264 data from my encoder to GStreamer through appsrc, and I'm using GStreamer's RTSP server to do the RTSP handshaking and RTP packetization.

    Now I would like to run the same setup but instead tunnel RTSP/RTP over HTTP. I see that there is function gst_rtsp_server_transfer_connection(). Is this what I should use? Do I need to have a HTTP server running on my device, or can it be solved without? Will the rtsp server automatically do the needed base64 coding?

    Thanks!

    1. gst-rtsp-server should allow to do all that automatically without requiring anything specific from you. Just access the RTSP server via HTTP/RTSP tunneling, e.g. by using GStreamer’s rtspsrc with the rtsph:// URI scheme.

  35. Is it possible to stream raw RGB or YUV from http-launch ?

    Pipeline ->
    ./http-launch 8080 videotestsrc name=stream

    Response ->
    (http-launch:3066): GLib-GObject-WARNING **: invalid cast from ‘GstVideoTestSrc’ to ‘GstBin’

    (http-launch:3066): GStreamer-CRITICAL **: gst_bin_get_by_name: assertion ‘GST_IS_BIN (bin)’ failed
    no element with name “stream” found

    1. With some code changes, yes. Currently it assumes that a GstBin is created, but as you only have a single element this is not the case. Change the code accordingly.

      How do you want to receive the raw video though?

  36. Nice post. How can I do this using a standard web server such as Apache or nginx? Assuming a web server is already bound to port 80 and I want to stream using that same web server, what would be an ideal way to deliver an MPEG-TS live stream over UDP for an HbbTV kind of application?

    1. Multiple options. For example, you could use Apache/nginx as a reverse proxy to just access GStreamer, or you could develop an Apache/nginx module that does it (and then works directly on the socket given to the module). As written, this http-launch tool is just an example and in production you would ideally build something properly around the same concepts 🙂

  37. I'm glad this thread is still active. Can someone give me an update on how to make a live H.264 video-only stream at 5 to 10 fps from a named pipe accessible using RTSP tunneled over HTTP on a Raspberry Pi Zero running Raspbian? For minimal CPU usage, I just want to make the stream available without modification, so it should just be a matter of wrapping the stream in the right protocols.

    1. Use gst-rtsp-server for that, it has support for RTSP-over-HTTP tunneling already and can be fed with a pre-encoded stream.

  38. Thank you. I see it is a library, not an executable. I have no idea how to create such a thing. Where can I get an executable for the RPi that has the features I need, or that is ready to compile?

  39. I am very new to GStreamer. Just following iOS tutorial 3, I am trying my own port using this pipeline: "gst_parse_launch("tcpclientsrc port=5000 host=***.***.**.*** ! h264parse ! avdec_h264 ! videoconvert ! autovideosink tcpclientsrc port=2000 host=***.***.**.*** do-timestamp=true ! application/x-rtp-stream,media=audio,clock-rate=48000,encoding-name=MPA ! rtpstreamdepay ! rtpmpadepay ! decodebin ! audioconvert ! audioresample ! autoaudiosink sync=false", &error);"
    It is returning "unable to build pipeline: no tcpclientsrc", so can you please guide me on how and where I am going wrong?

  40. Thank you a lot for your code. I'm confused when using your code to stream H265 video to an HTTP server. Can your code stream H265 data over HTTP?
    If it can, could you please give a suggestion about how to mux this data for HTTP streaming, or a pipeline to stream H265 video to an HTTP server? Thanks.
