GStreamer overview

= What is GStreamer =

GStreamer is a library for constructing graphs of media-handling components. The applications it supports range from simple audio/video playback and streaming to complex audio (mixing) and video (non-linear editing) processing. The core function of GStreamer is to provide a framework for plugins, data flow and media type handling/negotiation. It also provides an API to write applications using the various plugins. GStreamer elements are chained via pads (similar to ports in the OpenMAX world) and exchange data between the source pads of a data-generating element and the sink pads of the data-consuming element. Applications can take advantage of advances in codec and filter technology transparently. Developers can add new codecs and filters by writing a simple plugin with a clean, generic interface.
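As a minimal illustration of pad linking, the sketch below chains three stock elements; videotestsrc, ffmpegcolorspace and ximagesink come from the standard GStreamer plugin sets, not from the OMAP-specific ones:

```shell
# Generate a test pattern, convert it to a displayable format, and
# render it to an X window. Each "!" links the source pad of the
# element on its left to the sink pad of the element on its right.
gst-launch videotestsrc ! ffmpegcolorspace ! ximagesink
```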

On OMAP, both GST-OpenMax and GST-Ducati provide elements that interface with the OMAP hardware.

The code for GStreamer-OMAP can be found in the GStreamer-OMAP repository on Gitorious. Check the GStreamer build instructions to build GStreamer from source code.

For general GStreamer development information, check the GStreamer developers info page and the GStreamer official website.

GStreamer Stack
GStreamer provides plugins across the entire filter graph. GStreamer plugins can be classified into:
 * protocols handling
 * sources: for audio and video (involves protocol plugins)
 * formats: parsers, formatters, muxers, demuxers, metadata, subtitles
 * codecs: coders and decoders
 * filters: converters, mixers, effects, ...
 * sinks: for audio and video (involves protocol plugins)
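A single playback pipeline touches most of these categories. The sketch below labels each stage; the element names assume the standard gst-plugins-base/-good sets rather than anything OMAP-specific:

```shell
# source   format (demuxer)   codec (decoder)   filter            sink
gst-launch filesrc location=/home/user/sample.ogg ! oggdemux ! vorbisdec ! audioconvert ! alsasink
```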

TI specifically focuses on Codecs, Sinks and Filters.

OMAP GStreamer modifications
GStreamer for OMAP adds strided video to GStreamer, corresponding to the video format "video/x-raw-yuv-strided". The GStreamer library is patched to work with the strided format, and both GST-Ducati and GST-OpenMax support strided video.

StrideTransform is an element that is able to convert between strided and non-strided video.
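Assuming the element is registered under the name stridetransform and that the patched caps are as described above, a conversion step in a pipeline might look like the following sketch (not a verified pipeline; both the element name and the caps strings are taken from the description above):

```shell
# Convert strided video from an upstream element into ordinary
# non-strided video before handing it to a generic sink.
... ! "video/x-raw-yuv-strided" ! stridetransform ! "video/x-raw-yuv" ! ximagesink
```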

Source, sink and filter elements

 * Video sinks:
   * PVRVideosink is the recommended sink to display video. This element is contained in gst-plugins-bad and depends on the libpvr2d library.
   * XImagesink can be used as a fallback video sink when libpvr2d is not available.

 * Video sources:
   * v4l2src can be used as a video source.

Display webcam video
gst-launch v4l2src device=/dev/video3 ! "video/x-raw-yuv, width=640, height=480, framerate=15/1" ! queue ! ffmpegcolorspace ! pvrvideosink
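When libpvr2d is not available, the same capture pipeline can use the ximagesink fallback mentioned above; a sketch, with ffmpegcolorspace converting the camera output into a format ximagesink accepts:

```shell
gst-launch v4l2src device=/dev/video3 ! "video/x-raw-yuv, width=640, height=480, framerate=15/1" ! queue ! ffmpegcolorspace ! ximagesink
```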

= GST-Ducati =

GST-Ducati is a GStreamer plugin that provides elements to decode and encode video. GST-Ducati uses the hardware-accelerated codecs on IVA-HD via the codec-engine API. It uses the libdce (distributed codec-engine) library to access the codecs on Ducati.

From GLP1.5 onwards, the GST-Ducati encoders and decoders should be used instead of the GST-OpenMax ones.

Resources
GST-Ducati and libdce code can be downloaded from the GStreamer-omap repository. Sources for Debian packages are also available in the Ubuntu-omap repository.

Encoders and decoders elements
The following elements are provided by the GST-Ducati plugin:


 * ducativc1dec: VC1 decoder
 * ducatimpeg2dec: MPEG2 decoder
 * ducatimpeg4dec: MPEG4 decoder
 * ducatih264dec: H264 decoder
 * ducatimpeg4enc: MPEG4 encoder
 * ducatih264enc: H264 encoder

Decoding
GST-Ducati decoders are automatically chosen to play video (use the -v option to see which elements are being used):
 * gst-launch playbin2 uri=file:///home/user/sample.mp4
 * gst-launch filesrc location=/home/user/sample.mp4 ! decodebin2 ! pvrvideosink
 * gst-launch filesrc location=/home/user/sample.mp4 ! decodebin2 ! ximagesink

Example manual pipeline using a gst-ducati decoder and pvrvideosink:
 * gst-launch filesrc location=/home/user/sample.mp4 ! qtdemux ! h264parse ! ducatih264dec ! pvrvideosink

Encoding
Encode webcam video to file:
 * gst-launch v4l2src device=/dev/video3 ! "video/x-raw-yuv, width=640, height=480, framerate=30/1" ! queue ! ffmpegcolorspace ! ducatih264enc ! h264parse ! qtmux ! filesink location=sample.mp4

Reworking an existing pipeline
If you have previously working pipelines using GST-OpenMax encoders and decoders, the following rules of thumb will help make your pipeline work with GST-Ducati and the latest GStreamer-OMAP:


 * Replace omx_h264enc by ducatih264enc ! h264parse.
 * Replace omx_mpeg4enc by ducatimpeg4enc ! mpeg4videoparse.
 * Replace h264parse access-unit=true output-format=byte by h264parse without options.
 * Replace nal2bytestream_h264 element by h264parse.
 * Remove any unsupported parameters. For example, input-buffers and output-buffers are no longer needed for DCE encoders.
 * Adapt values that may have changed. For example, gst-inspect ducatih264enc says that H264 high-profile is now set with profile=100. Formerly with omx_h264enc it was profile=8.
 * Replace video/x-raw-yuv-strided by video/x-raw-yuv.
 * Replace v4l2sink by pvrvideosink.
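Putting several of these rules together, an encoding pipeline that used omx_h264enc could be reworked as follows. This is a sketch; the GST-OpenMax pipeline shown as the starting point is a hypothetical example, not one taken from this page:

```shell
# Before (GST-OpenMax):
gst-launch v4l2src ! queue ! ffmpegcolorspace ! omx_h264enc ! qtmux ! filesink location=sample.mp4

# After (GST-Ducati): omx_h264enc is replaced by
# ducatih264enc ! h264parse, per the rules above.
gst-launch v4l2src ! queue ! ffmpegcolorspace ! ducatih264enc ! h264parse ! qtmux ! filesink location=sample.mp4
```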

More samples wanted
Stream Video from File
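No sample is provided yet; one possible sketch, assuming the file contains H264 video and that the rtph264pay and udpsink elements from gst-plugins-good are available. Host and port values are placeholders:

```shell
# Demux an MP4 file, payload the H264 stream as RTP and send it over UDP.
gst-launch filesrc location=/home/user/sample.mp4 ! qtdemux ! rtph264pay ! udpsink host=192.168.1.100 port=5000
```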

Stream Video from Webcam
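No sample is provided yet; a possible sketch combining the webcam encoding pipeline above with RTP streaming (rtph264pay and udpsink from gst-plugins-good; host and port are placeholders):

```shell
gst-launch v4l2src device=/dev/video3 ! "video/x-raw-yuv, width=640, height=480, framerate=30/1" ! queue ! ffmpegcolorspace ! ducatih264enc ! h264parse ! rtph264pay ! udpsink host=192.168.1.100 port=5000
```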

Play Video Stream
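No sample is provided yet; a possible receiver sketch for the RTP streams above. The caps on udpsrc are required so the depayloader knows the stream type; the values here are typical H264-over-RTP defaults, not values confirmed by this page:

```shell
gst-launch udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! rtph264depay ! h264parse ! ducatih264dec ! pvrvideosink
```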

= GST-OpenMax =

In GST-OpenMax, the GStreamer plugin interfaces with OMX codecs, drivers or transforms depending on the functionality. The TI GST-OMX software stack is as follows.



GST Buffer Flow
This section provides more details on the exact interfaces between a GST plugin (such as GSTOMX_Base_filter) and an OpenMAX component at different stages of the life cycle. The description is based on a generic filter behavior (GSTOMX_Base_filter) that is applicable to video and audio decoders/encoders.



Example TI-OpenMAX filter graph for AV Capture
The following diagram describes a full filter graph and the associations between GST, TI software and hardware components, taking AV capture as an example.



GST-OMX integration issues and impact:

 * GST does NOT adhere to the OpenMAX rule of a 1-to-1 mapping between an OMX buffer header and an OMX data buffer (pBuffer).
 * GST source elements (non-TI elements, originators of bitstreams) can potentially provide a new data buffer every time, which means that OMX buffer headers arriving at OMX elements via an ETB call may carry new pBuffer pointers every time.
 * GST sink elements (the TI V4L2 sink, consumer of raw data: 1080p NV12) maintain a buffer pool through which they recycle data buffers.
 * The buffer pool could have M buffers (the TI V4L2 sink could maintain M tiler buffers) sent to the OMX component using N OMX buffer headers.
 * M can be greater than or equal to N. This implies that even though the N buffers can be mapped once during the OMX_UseBuffer call, new data buffers (M - N) may still appear in OMX buffer headers during an FTB call. This mandates checking for, and if required mapping, these new buffers during each FTB call.
 * OMX decoders currently keep track of reference buffers via OMX buffer headers (assuming a 1-to-1 mapping of OMX buffer header and pBuffer pointer), and will therefore break if GST puts arbitrary buffers into OMX headers.

Impact:

 * The OMX proxy by design performs mapping in the OMX_UseBuffer/OMX_AllocateBuffer calls. It now also needs to check for new data buffers and map them during FTB calls.
 * Ideally, OMX buffers are mapped only once and the pre-mapped addresses are cached. Now, bitstream buffers need to be mapped/unmapped every time.
 * Reference counting in decoders is tied to OMX buffer headers; it now needs to be based on the actual data buffer (the pBuffer pointer).