Wednesday, April 24, 2013

Hardware Accelerated QtMultimedia Backend for Raspberry Pi and OpenGL Shaders on Video

EDIT: This article is completely outdated. Refer to newer posts for properly working builds, build instructions, etc. In particular: "Using POT Builds" and "Build Procedure for PiOmxTextures".

In some previous posts I developed a custom QML component that renders video in a QML scene using hardware-accelerated decoding and rendering, without ever passing the frames through the ARM side. This gave the Raspberry Pi good performance even with 1080p high-profile h264 videos.

Many bugs still need to be fixed and the code should be refactored a little, but it shows that the approach is possible and that it works well. So I decided to move on to the next step: modifying Qt to make it possible to use the "standard" QtMultimedia module to access the same decoding/rendering implementation. This allows better integration with Qt and lets users simply recompile their applications without changing anything in their own code.

On Linux, the QtMultimedia module uses gstreamer to provide multimedia capabilities; unfortunately, gstreamer is not hardware accelerated on the Pi unless you use something like gst-omx.

Thus, I started to look at the QtMultimedia sources in Qt 5 and found, as I was hoping, that the Qt guys have done, as usual, a very good job with the design, providing the classic plugin structure for multimedia backends as well. Unfortunately, also as usual, not much documentation is provided on how to implement a new backend, but it is not that difficult to figure out by looking at the existing implementations.

Design

In the end, I came up with a structure like this: I implemented a new QtMultimedia backend providing minimal MediaPlayer and VideoOutput functionality, leveraging a "library version" of the PiOmxTextures sample code, which in turn uses a "bundled" version of omxplayer implemented with the OpenMAX texture render component as the video sink.
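
Just to give an idea of the shape of a backend, this is roughly what the entry point of a QtMultimedia plugin looks like in Qt 5. The class and file names below are illustrative, not the actual sources of this backend; only the QMediaServiceProviderPlugin interface and the media player service key come from Qt itself:

#include <qmediaserviceproviderplugin.h>

// Illustrative sketch of a QtMultimedia backend plugin entry point.
class OpenMAXILServicePlugin : public QMediaServiceProviderPlugin
{
   Q_OBJECT
   Q_PLUGIN_METADATA(IID "org.qt-project.qt.mediaserviceproviderfactoryinterface/5.0"
                     FILE "openmaxil.json") // JSON metadata lists the supported service keys
public:
   // Qt requests a service for a key such as Q_MEDIASERVICE_MEDIAPLAYER
   // ("org.qt-project.qt.mediaplayer"); the plugin returns its implementation.
   QMediaService* create(const QString& key);
   void release(QMediaService* service);
};

The plugin is then picked up from the plugins/mediaservice directory like any other Qt plugin.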

As said, the Qt guys have done a good job! I had to change almost nothing in the Qt implementation; everything lives inside the plugin (apart from a minimal modification to the texture mapping, which for some reason was upside-down and mirrored).

Results

The result is pretty good: I don't see many differences from the previous custom QML component (the decoding and rendering code is the same, and the QML component is implemented on the exact same principle, so nothing has really changed).
I'm only beginning to play with this and have just tried a couple of things. In the video you can see the "standard" qmlvideo and qmlvideofx examples provided with the Qt sources.

How to Build

Clone the repo somewhere, then use the prepare_openmaxil_backend.sh script in tools. It will compile PiOmxTextures as a shared library and place everything you need in the openmaxil_backend directory. Copy that directory recursively into your Qt source tree at qt_source_tree/qtmultimedia/src/plugins, naming it simply openmaxil. Something along these lines should work (see below).
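
These commands are only a rough sketch: qt_source_tree is a placeholder for your Qt source checkout, and I'm assuming the script leaves the openmaxil_backend directory where you run it; adapt the paths to your setup.

git clone https://github.com/carlonluca/pi.git
cd pi/tools
./prepare_openmaxil_backend.sh
cp -r openmaxil_backend qt_source_tree/qtmultimedia/src/plugins/openmaxil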

Some changes are needed in the Qt tree to make it compile the new backend automatically instead of the gstreamer backend, to fix the texture mapping, and to make the "standard" qmlvideo and qmlvideofx examples work. No real modification to the code is needed: it is sufficient to instantiate the QQuickView those examples use through a specific class definition, which is needed to provide the plugin with the instance of the QQuickWindow containing the media player.
These changes can be applied to the qtmultimedia tree using the patch provided in the tools directory in git. Then build the qtmultimedia module with the usual qmake/make sequence:

path_to_qmake/qmake "CONFIG+=raspberry"
make

You'll find all you need here: https://github.com/carlonluca/pi.

How to Use

After you have built the plugin, you can simply use the "standard" Qt API, i.e. the MediaPlayer and VideoOutput QML elements. The only restriction is that the plugin needs access to a QQuickView in order to reach the render thread of the Qt Scene Graph. This might be an issue, but I've not found another solution yet.

What you have to do is simply provide the QQuickView by defining this exact class in your application:


#include <QQuickView>

// The plugin resolves this exact symbol at runtime to obtain the view.
class RPiQuickView
{
public:
   static QQuickView* getSingleInstance();
};

Q_DECL_EXPORT QQuickView* RPiQuickView::getSingleInstance() {
   static QQuickView instance; // single view instance shared with the plugin
   return &instance;
}


This is needed because the plugin looks for the RPiQuickView::getSingleInstance() symbol, which should be resolvable once the dynamic linker has linked the plugin to the executable. Also, you'll need to add -rdynamic to the linker flags of your application (e.g. QMAKE_LFLAGS += -rdynamic in the .pro file), to ensure the linker adds the symbol to the dynamic symbol table.

This is what I added to the qmlvideo and qmlvideofx examples to make them work. It is of course not elegant, but I couldn't find a better way in a reasonable time.
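
To give an idea of how the pieces fit together, here is a minimal sketch of an application using the backend. The main.qml file name and the video path are hypothetical; the MediaPlayer and VideoOutput elements are simply the standard QtMultimedia QML API:

#include <QGuiApplication>
#include <QUrl>
// RPiQuickView declared/defined as shown above.

int main(int argc, char** argv)
{
   QGuiApplication app(argc, argv);

   // Use the view instance the plugin will also look up.
   QQuickView* view = RPiQuickView::getSingleInstance();
   view->setSource(QUrl("qrc:///main.qml")); // hypothetical QML file
   view->show();

   return app.exec();
}

// main.qml would then contain something like:
//    import QtQuick 2.0
//    import QtMultimedia 5.0
//    Item {
//       MediaPlayer { id: player; source: "file:///home/pi/video.mp4"; autoPlay: true }
//       VideoOutput { source: player; anchors.fill: parent }
//    }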

Of course, you'll have to copy the Qt libraries you built to your Pi, together with libPiOmxTextures.so (unless you built it statically) and the ffmpeg libraries (do not use the ffmpeg libs already on your Pi, as those are likely not to work; use the ones compiled by the compile_ffmpeg.sh script in tools).

What Remains to Be Done

Most of the calls are not implemented; only the minimum needed to get video on the screen. The audio implementation is still missing (although the OMX_MediaProcessor class should be ready to play audio as well), and only the QtQuick side is taken into consideration: I've not had the time to look at the widget implementation yet.

In case you find bugs, report an issue on github. If I find the time, I'll answer.

Edit 6.25.2013

Instantiating the QQuickView through the RPiQuickView class is no longer needed as of commit 30e24106c5dd7a5998d49d7093baef49f332b1d2. I tested this revision with Qt 5.1.1 and everything seems to keep working correctly.

Friday, April 19, 2013

Accelerated Video Decoding and Rendering in QML with Necessitas Qt 4 for Android

Ok, I did this a couple of months ago, but I just realized it might help someone who is currently using Necessitas Qt 4 for some project and still cannot use Qt 5.
This is sample code showing how to create a custom QML component in the Necessitas Qt 4 port that uses hardware acceleration on any Android device with API level 11 or higher. The result is pretty good; you can check the demo I uploaded on YouTube a couple of months ago (the third application shown is the one implemented on Qt 4):



The description says it all: "The third sample code uses a custom QML component written in C++ using a Qt 4.8.2 port for Android (Necessitas). Regular QML animations are then applied to the custom component. The semi-transparent image is a regular QML Image element with alpha set to 0.5."

The code is now available on github: https://github.com/carlonluca/TextureStreaming. The TextureStreaming project can be opened with Qt Creator and run on a device (assuming the API level constraint is met).

Take into consideration that the Qt guys are working on a QtMultimedia backend for Android, which I think should be available in Qt 5.2. You might want to try that as well: http://qt.gitorious.org/qt/qtmultimedia/trees/dev/src/plugins/android.

How it Works

As you can see from the code, a custom QML component is placed in the QML scene. That component instantiates some Java classes through JNI glue code and uses the standard Android MediaPlayer to start decoding video and playing audio. The sink is set to a SurfaceTexture instance, which provides the OpenGL texture that the custom QML component renders in the QML scene. The result is pretty good.
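
For the essence of the trick, here is a rough sketch (names are illustrative, not the actual TextureStreaming sources) of the two pieces that make this work: latching the newest decoded frame from the SurfaceTexture through JNI before each paint, and a GLES2 fragment shader that samples the external texture:

#include <jni.h>

// Per frame, before painting, latch the newest decoded frame into the GL
// texture by invoking SurfaceTexture.updateTexImage() through JNI. This must
// run on the thread owning the EGL context the texture was created in.
void updateVideoFrame(JNIEnv* env, jobject surfaceTexture)
{
   jclass cls = env->GetObjectClass(surfaceTexture);
   jmethodID updateTexImage = env->GetMethodID(cls, "updateTexImage", "()V");
   env->CallVoidMethod(surfaceTexture, updateTexImage);
}

// The texture provided by SurfaceTexture is an external OES texture, so the
// fragment shader must declare samplerExternalOES instead of sampler2D.
static const char* fragmentShaderSrc =
   "#extension GL_OES_EGL_image_external : require\n"
   "precision mediump float;\n"
   "varying vec2 textureCoord;\n"
   "uniform samplerExternalOES videoTexture;\n"
   "void main() {\n"
   "   gl_FragColor = texture2D(videoTexture, textureCoord);\n"
   "}\n";

The custom QML item then simply draws a quad with this shader and the texture id it handed to the SurfaceTexture constructor, and regular QML animations apply to it like to any other item.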