In some previous posts I developed a custom QML component to render video in a QML scene using hardware-accelerated decoding and rendering, without copying frames through the ARM side. This gave good performance on the Raspberry Pi even with 1080p high-profile h264 videos.
Many bugs still need to be fixed and the code should be refactored a little, but it shows that the approach is possible and works well. So I decided to take the next step: modifying Qt so that the "standard" QtMultimedia module can access the same decoding/rendering implementation. This would make it possible to integrate better with Qt and would allow users to simply recompile without changing anything in their own implementation.
The QtMultimedia module uses gstreamer on Linux to provide multimedia capabilities; gstreamer is unfortunately not hardware accelerated on the Pi unless you use something like gst-omx.
Thus, I started to look at the QtMultimedia module sources in Qt5 and found out (as I was hoping) that the Qt guys have done, as usual, a very good job in designing the concept, providing the classic plugin structure for multimedia backends as well. Unfortunately, also as usual, not much documentation is provided on how to implement a new backend, but it is not that difficult anyway once you look at the other implementations.
Design
In the end, I came up with a structure like this: I implemented a new QtMultimedia backend providing the minimal MediaPlayer and VideoOutput functionalities, leveraging a "library version" of the PiOmxTextures sample code, which in turn uses a "bundled" version of omxplayer implemented using the OpenMAX texture render component as a sink for the video. As said, the Qt guys have done a good job! I had to change almost nothing in the Qt implementation; everything is inside the plugin (apart from a minimal modification to the texture mapping, which for some reason was upside-down and inverted).
Results
The result is pretty good: I don't see many differences from the previous custom QML component (the decoding and rendering code is the same and the QML component is implemented using the same exact principle, so nothing really changed). I'm only beginning to play with this a little, and I have just tried a couple of things. In the video you can see the "standard" qmlvideo and qmlvideofx examples provided with the Qt sources.
How to Build
How to Use
What you have to do is simply provide your application's QQuickView through this exact class, which must be included in your application:
#include &lt;QtQuick/QQuickView&gt;

// The plugin looks up this exact symbol at runtime, so the class and
// method names must match.
class RPiQuickView
{
public:
   static QQuickView* getSingleInstance();
};

// Q_DECL_EXPORT keeps the symbol visible so the dynamic linker can
// resolve it when the plugin is loaded.
Q_DECL_EXPORT QQuickView* RPiQuickView::getSingleInstance() {
   static QQuickView instance;
   return &instance;
}
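As a minimal sketch of how the application side might then look (assuming a standard Qt Quick application; the QML file path here is purely hypothetical), the application obtains the shared view instead of instantiating its own:

```cpp
#include <QGuiApplication>
#include <QtQuick/QQuickView>

int main(int argc, char* argv[])
{
    QGuiApplication app(argc, argv);

    // Use the single view instance that the plugin will also look up,
    // instead of creating a QQuickView directly.
    QQuickView* view = RPiQuickView::getSingleInstance();
    view->setSource(QUrl("qrc:/main.qml")); // hypothetical QML file
    view->show();

    return app.exec();
}
```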
This is needed because the plugin will look for the RPiQuickView::getSingleInstance() symbol, which should be resolvable after the dynamic linker has linked the plugin against the executable. Also, you'll need to add -rdynamic to the linker flags of your application, to ensure that the linker adds the symbol to the dynamic symbol table.
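In a qmake-based project, for instance, this could be done with a line like the following in the .pro file (a sketch, assuming you build with qmake and gcc):

```
# Export the executable's symbols so the plugin can resolve
# RPiQuickView::getSingleInstance() at load time.
QMAKE_LFLAGS += -rdynamic
```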