GStreamer and GObject can be used from Python via the python-gi and python-gst modules. This repository is just a trivial Dockerfile that gets you a working GStreamer pipeline from which you can start building and exploring.
It is assumed that you understand how to program in Python. You can see the GStreamer script in python/demo.py.
Installation/prerequisites:
Running the demo:
Once the image comes up you'll see the following being logged:
which indicates that the GStreamer pipeline is producing HLS content. You can now view the video by starting another window and running:
and you should see the video-test-pattern displayed in VLC.
What's Going On?
The docker-ish parts are:
- there's an nginx server that's just doing static file serving (port 8080)
- there's an ubuntu yakkety image with gstreamer, python3-gst, and python3-gi
- there's a very small Python file (python/demo.py) that runs the pipeline
What's a Pipeline?
GStreamer is a pipeline-oriented media processing system. It works by hooking up 'pads' (sources, sinks) on elements and then allowing the low-level framework to do the actual media processing.
The pipeline we're setting up is a trivial HTTP Live Streaming stream. It doesn't do multiple encodes or anything fun like that. It just creates a single playlist that includes:
- h264 video encoding
- aac audio encoding
- mpeg TS muxing and HLS playlist creation
and makes that available in the nginx server's root directory.
We're using the same command-line syntax to set up the pipeline as the gstreamer-tools gst-launch program uses. In this syntax you'll see elements (and caps) joined with ! characters, which act as a media pipe between the elements.
Note
You can also set up the pipeline programmatically by adding individual elements to a pipeline instance.
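To make the syntax concrete, here's a sketch of an HLS pipeline description using the elements this demo is built from. The exact property values and ordering in python/demo.py may differ; treat this string as an illustration, not the repo's actual pipeline:

```python
# Sketch of a gst-launch style pipeline description for HLS output.
# The element names are real GStreamer elements; the specific property
# values here are illustrative assumptions, not necessarily what
# python/demo.py uses.
desc = (
    "videotestsrc pattern=0 "
    "! video/x-raw,framerate=30000/1001,width=720,height=480 "
    "! x264enc bitrate=256 "
    "! mpegtsmux name=muxer "
    "! hlssink "
    "audiotestsrc "
    "! audio/x-raw,rate=48000,channels=1 "
    "! avenc_aac "
    "! muxer. "
)

# With the Python bindings installed (python3-gi + python3-gst), this
# string would be handed to the framework roughly like so:
#
#   from gi.repository import Gst
#   Gst.init(None)
#   pipeline = Gst.parse_launch(desc)
#   pipeline.set_state(Gst.State.PLAYING)
print(desc)
```

Note the two branches: the video branch ends at hlssink, while the audio branch is routed into the muxer by name (the trailing "muxer.").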
You'll note that elements start with an element name:
- videotestsrc
- x264enc
- mpegtsmux
- hlssink
- audiotestsrc
- avenc_aac
and that they can have properties specified with x=y syntax:
- pattern=0
- bitrate=256
- name=muxer
The name property is special in that all elements have names, and you can use the name to get a handle to an element from your pipeline if you want to programmatically manipulate it.
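In the real bindings that lookup is Gst.Bin.get_by_name on the parsed pipeline. As a stand-alone illustration that doesn't require GStreamer installed, here's a tiny (hypothetical) helper that pulls the explicitly assigned names out of a description string:

```python
import re

# Stand-alone illustration (no GStreamer required): extract the
# name=<value> properties from a gst-launch style description.  In a
# real program you would instead call pipeline.get_by_name("muxer")
# on the pipeline returned by Gst.parse_launch.
def element_names(description):
    """Return all explicitly assigned element names in a description."""
    return re.findall(r"\bname=([\w-]+)", description)

desc = "videotestsrc ! x264enc ! mpegtsmux name=muxer ! hlssink"
print(element_names(desc))  # -> ['muxer']
```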
You'll note that at the end of the audio pipeline we include muxer. which is actually saying 'route this into the element named muxer'.
Lastly, you'll note that we have two sections that look like:
- video/x-raw,framerate=30000/1001,width=720,height=480
- audio/x-raw,rate=48000,channels=1
These are caps in GStreamer. Their role is to constrain the data formats being negotiated between the two elements on either side of them, so that two elements which are very flexible in what they can produce and consume can be restricted to produce the format you actually need.
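A caps string is just a media type plus a set of field=value constraints. The sketch below (pure Python, no GStreamer required) splits one apart to show that structure; in the real bindings, Gst.Caps does the parsing and the framework does the negotiation:

```python
# Stand-alone sketch of what a caps string expresses: a media type
# followed by field=value constraints.  This parser is illustrative
# only; Gst.Caps handles this in a real pipeline.
def parse_caps(caps):
    """Split a caps string into its media type and a dict of fields."""
    media_type, *fields = caps.split(",")
    return media_type, dict(f.split("=", 1) for f in fields)

media, fields = parse_caps("video/x-raw,framerate=30000/1001,width=720,height=480")
print(media, fields)
```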
Next Steps
You likely don't want to spend the rest of your day looking at test patterns in VLC, so how do you go about doing more interesting things?
- gst-inspect will let you see what elements are available, and what properties those elements have
- you'll find lots of gst-launch samples on the internet