Saturday, February 23, 2013

QML Components for Video Decoding and Rendering POC Code Available

As requested, I've shared the sources of the demos shown in the videos I posted recently. I tested these components with a few videos and they seem to work "reasonably well", including 1080p h264 High Profile with 5.1 audio. The current implementation uses a player class that decodes the data and a surface class that renders it. Rendering the same video on multiple surfaces also seems to work.

Beware that the code is not complete: it is only a proof of concept of how an implementation might look. If you need to use it in production code, you'll have to work on it quite a bit. There are many TODOs left and no tests have been run on the classes. The cleanup code must be completely rewritten, and only the pause/resume/stop commands are implemented at the moment. Also consider going through the relevant code looking for leaks: I didn't pay much attention to that while implementing because I intended to refactor, sorry.

Only 1080p resolution is currently supported; I never even tried anything different, so you'll probably have to look around and see where I hardcoded those values (I was in a hurry :-))
There are also many unused classes in the code; I left them there only because they might be useful for new implementations.

I've started working on other things recently, so I really have little time to spend on this. Still, I can see that many are interested, so I decided that incomplete code is better than no code. I should also say I have no practical need for these components; I only worked on this as a challenge in my spare time. Now that the challenge is gone, I have to admit I've lost some interest and I'm looking for a new one :-D

This is the github URL of the repo (PiOmxTextures is the project directory):

https://github.com/carlonluca/pi

The current implementation of the OMX_MediaProcessor class uses the components implemented in the omxplayer code, with modifications to some of them. The modified sources are placed in the omxplayer_lib directory in the project tree: I chose this structure to make it relatively simple to merge changes from the upstream omxplayer sources.

How to build

To build the project, you'll need a build of the Qt libraries, version 5.0.0 or later. Instructions on how to build Qt can be found around the web; I also wrote a quick article on that if you need it (this is the updated version for 5.0.1).

Once you have your Qt build and Qt Creator set up, you can open the .pro file. You should also have the Raspberry Pi sysroot somewhere on your system, and Qt Creator should know about it. The project has the same dependencies as omxplayer, so you need those as well. I tested only against a specific build of ffmpeg, the one omxplayer was using when I last merged; to compile it you can use this script, which is included in the source tree. Running it and passing the number of compilation threads to use should be sufficient:

git clone https://github.com/carlonluca/pi
cd pi/PiOmxTextures/tools
./compile_ffmpeg.sh n
cd ../../
mkdir PiOmxTextures-build-rasp-release
cd PiOmxTextures-build-rasp-release
path_to_qmake/qmake "DEFINES+=CONFIG_APP" ../PiOmxTextures
make -jn


Note that the sample application renders a file whose path is hardcoded in the QML file; change it if you want to test the sample code.

As I said, this is just a proof of concept and I don't have much time to maintain it. If you want to help fix the code and merge new changes, open a pull request!
I hope this code can be of help to other open source projects! Feel free to leave a comment if you have any! Bye!

Sunday, February 10, 2013

Animations on a Surface Rendering Video on Android

I was recently asked to do a little research on how to implement animations on a video surface in Android, somewhat similarly to what I did in the previous posts on the RPi. It seemed interesting, so I did some investigation.

After reading through the Android documentation, I took a random low-cost Android 4.0 tablet and started analyzing the problem.

The first thing I tried was creating a simple VideoView and applying some regular Android view animations to it. I applied a TranslateAnimation and a ScaleAnimation, but the video geometry didn't change: only a black rectangle representing the view was animated. It seems to be more or less similar to this.
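For reference, this is roughly what that first attempt looked like (the layout id and the file path are just placeholders):

// Inside an Activity's onCreate(), after setContentView(...).
// R.id.videoView and the file path are hypothetical.
final VideoView videoView = (VideoView) findViewById(R.id.videoView);
videoView.setVideoPath("/sdcard/test.mp4");
videoView.start();

// Classic view animation: slide the view 300px to the right.
TranslateAnimation slide = new TranslateAnimation(0f, 300f, 0f, 0f);
slide.setDuration(1000);
slide.setFillAfter(true);
videoView.startAnimation(slide); // here only a black rectangle moved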

I also tried the 3.1 property animation system; with that the video actually moved, but it left a trail behind it. Both of these defects might be related to how video rendering is performed at the lower levels, so they might not show up on other boards.
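The property animation attempt amounts to something like this (a sketch, using the ViewPropertyAnimator available from API level 12):

// Property animation (API 12+): animates the view's translationX property.
// The animation starts automatically, no explicit start() call is needed.
videoView.animate().translationX(300f).setDuration(1000);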

The only other thing I tried before digging into the OpenGL world was to "create" an animation by changing the layout parameters applied to the VideoView. By interpolating the values like a damped harmonic oscillator I got the result shown in the video. A more accurate implementation might give much better results.
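A minimal sketch of the idea, with made-up oscillator constants (assuming the VideoView sits in a FrameLayout):

// Fake the animation by updating the layout parameters on every frame,
// interpolating the horizontal offset like a damped harmonic oscillator.
final FrameLayout.LayoutParams lp =
        (FrameLayout.LayoutParams) videoView.getLayoutParams();
final long t0 = System.currentTimeMillis();
final Handler handler = new Handler();
handler.post(new Runnable() {
    @Override public void run() {
        float t = (System.currentTimeMillis() - t0) / 1000f;
        // x(t) = A*exp(-lambda*t)*cos(omega*t): a decaying oscillation
        lp.leftMargin = (int) (300 * Math.exp(-2.0 * t) * Math.cos(8.0 * t));
        videoView.setLayoutParams(lp); // forces a relayout
        if (t < 3f)
            handler.postDelayed(this, 16); // roughly 60fps
    }
});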

I therefore started to look at something different: starting from API level 11, SurfaceTexture might be the solution to all these needs. By using this class as the surface of the MediaPlayer it is possible to stream the video to an OpenGL texture. This seems to work pretty well (see the video) and it is not difficult to implement if you know OpenGL.
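The core of it looks something like this (a sketch meant to run on the GL thread, e.g. in GLSurfaceView.Renderer.onSurfaceCreated; the path is a placeholder):

// Create the GL texture the video frames will be streamed to.
int[] tex = new int[1];
GLES20.glGenTextures(1, tex, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);

// Wrap it in a SurfaceTexture and hand it to the MediaPlayer.
SurfaceTexture surfaceTexture = new SurfaceTexture(tex[0]);
surfaceTexture.setOnFrameAvailableListener(listener); // schedule a redraw

MediaPlayer player = new MediaPlayer();
try {
    player.setDataSource("/sdcard/test.mp4");
    player.setSurface(new Surface(surfaceTexture));
    player.prepare();
    player.start();
} catch (IOException e) {
    e.printStackTrace();
}

// In onDrawFrame(): call surfaceTexture.updateTexImage(), then sample the
// texture from a fragment shader using a samplerExternalOES uniform.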

Anyway, for simple tasks OpenGL might be overkill, so I looked for Android classes that would let me render the texture without building the entire application in OpenGL. I haven't found a way yet (if you do, please add a comment!), but I started to think that, once again, Qt might be the answer :-)

The third sample application you see in the video is a custom QML item created by rendering, in the Qt OpenGL context, the texture provided by the SurfaceTexture class, which is controlled through JNI. The result is very good. The QML code I used is exactly the same as in the previous posts for the RPi sample application. The Qt port for Android I used is the old Necessitas alpha 4.
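I won't paste the whole Qt integration here, but the Java side that the C++ code drives through JNI boils down to something like this (class and method names are my own invention):

import java.io.IOException;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

// Hypothetical Java-side helper driven from Qt/C++ through JNI. The GL
// texture id is created in the Qt OpenGL context and passed in from C++.
public class VideoTextureBridge {
    private SurfaceTexture surfaceTexture;
    private MediaPlayer player;

    public void start(int glTextureId, String path) throws IOException {
        surfaceTexture = new SurfaceTexture(glTextureId);
        Surface surface = new Surface(surfaceTexture);
        player = new MediaPlayer();
        player.setDataSource(path);
        player.setSurface(surface);
        surface.release(); // the MediaPlayer keeps its own reference
        player.prepare();
        player.start();
    }

    // Called from the Qt render thread before the QML item is drawn.
    public void updateTexture() {
        surfaceTexture.updateTexImage();
    }
}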

EDIT: if API level 14+ is available, it is also possible to render the texture provided by the SurfaceTexture in a TextureView (thanks Tim for pointing this out!): this is the fourth sample in the video.
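Since TextureView behaves like a regular View, the standard animations apply to the video content as well. A minimal sketch (the path is a placeholder):

// TextureView sketch (API 14+): the view owns the SurfaceTexture.
TextureView textureView = new TextureView(this);
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
        MediaPlayer player = new MediaPlayer();
        try {
            player.setDataSource("/sdcard/test.mp4");
            player.setSurface(new Surface(st));
            player.prepare();
            player.start();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }
    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) {}
});
setContentView(textureView);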