Ok, I did this a couple of months ago, but I just realized it might be of help to someone who is currently using Necessitas Qt 4 for some project and still cannot use Qt 5.
This is some sample code which shows how to create a custom QML component with the Qt 4 Necessitas port to get hardware acceleration on Android devices with API level 11 or higher. The result is pretty good; you can check the demo I uploaded on YouTube a couple of months ago (the third application shown is the one implemented on Qt 4):
The description says it all: "The third sample code uses a custom QML component written in C++ using a Qt 4.8.2 port for Android
(Necessitas). Regular QML animations are then applied to the custom component. The semi-transparent image
is a regular QML Image element with alpha set to 0.5."
The code is now available here on GitHub:
https://github.com/carlonluca/TextureStreaming.
The TextureStreaming project can be opened with Qt Creator and run on a device (assuming the API level constraint is met).
As you can see from the code, a custom QML component is created and placed in the QML scene. That component instantiates some Java classes through JNI glue code and uses the standard Android MediaPlayer to start decoding video and playing audio. The sink is set to a SurfaceTexture instance, which provides the OpenGL texture that the custom QML component renders into the QML scene. The result is pretty good.
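To give an idea of what the glue does, here is a minimal Java-side sketch. Note that SurfaceTexture itself exists since API level 11, but the public Surface(SurfaceTexture) constructor and MediaPlayer.setSurface() only appeared in API level 14; class and method names here are illustrative, the actual glue is in the repository:

```java
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

// Illustrative sketch: route MediaPlayer output to an OpenGL texture.
public class TextureStreamer {
    private MediaPlayer mediaPlayer;
    private SurfaceTexture surfaceTexture;

    // texName is an OpenGL texture name generated in the GL context
    // that Qt renders with.
    public void start(String path, int texName) throws java.io.IOException {
        surfaceTexture = new SurfaceTexture(texName);
        Surface surface = new Surface(surfaceTexture); // API 14+
        mediaPlayer = new MediaPlayer();
        mediaPlayer.setDataSource(path);
        mediaPlayer.setSurface(surface); // frames go to the texture, API 14+
        surface.release(); // MediaPlayer keeps its own reference
        mediaPlayer.prepare();
        mediaPlayer.start();
    }
}
```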
I was recently asked to do a little research on how to implement animations on a video surface in Android, somewhat similar to what I did in the previous posts on the RPi. It seemed interesting, so I did some investigation.
After reading a bit of the Android documentation, I took a random low-cost Android 4.0 tablet and started analyzing the problem.
The first thing I tried was creating a simple VideoView and applying some regular Android view animations to it. I applied a TranslateAnimation and a ScaleAnimation, but the video geometry didn't change: only a black rectangle representing the view was animated. It seems to be more or less similar to this.
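For reference, this is roughly the kind of animation I applied (a minimal sketch, not the exact code from my test):

```java
import android.view.animation.TranslateAnimation;
import android.widget.VideoView;

// Helper called from an Activity once the VideoView is playing.
public final class VideoViewAnimations {
    private VideoViewAnimations() {}

    // Slides the view 200 px to the right over one second. On my test
    // tablet only a black rectangle moved; the frames stayed in place.
    public static void slideRight(VideoView videoView) {
        TranslateAnimation translate = new TranslateAnimation(0f, 200f, 0f, 0f);
        translate.setDuration(1000);
        translate.setFillAfter(true);
        videoView.startAnimation(translate);
    }
}
```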
I also tried the newer Android 3.1 animation system; this time the video actually moved, but it left a trail behind it. Both of these defects might be related to how video rendering is performed at the lower levels, so they might not show up on other boards.
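That attempt was based on ViewPropertyAnimator, the animation entry point added in Android 3.1 (View.animate() requires API level 12); again a sketch:

```java
import android.widget.VideoView;

public final class VideoViewPropertyAnimations {
    private VideoViewPropertyAnimations() {}

    // Animates the actual translationX property of the view; the animation
    // starts automatically on the next frame. On my test device the video
    // moved, but left a trail behind it.
    public static void slideRight(VideoView videoView) {
        videoView.animate()
                 .translationX(200f)
                 .setDuration(1000);
    }
}
```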
The only other thing I tried before starting to dig into the OpenGL world was to actually "create" an animation by changing the layout parameters applied to the VideoView. By interpolating the values like a damped harmonic oscillator I got the result shown in the video. With a more accurate implementation you might get much better results.
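A rough sketch of that trick: a Handler periodically rewrites the left margin following x(t) = target - A * exp(-lambda * t) * cos(omega * t). The constants and class name are just illustrative, and the parent layout is assumed to support margins:

```java
import android.os.Handler;
import android.view.ViewGroup.MarginLayoutParams;
import android.widget.VideoView;

public final class LayoutParamsOscillator {
    private static final long FRAME_MS = 16; // ~60 updates per second
    private static final float LAMBDA = 3f;  // damping factor
    private static final float OMEGA = 12f;  // angular frequency (rad/s)

    private LayoutParamsOscillator() {}

    // Call on the UI thread; the parent layout must use MarginLayoutParams.
    public static void slideTo(final VideoView view, final int targetLeftMargin) {
        final MarginLayoutParams lp = (MarginLayoutParams) view.getLayoutParams();
        final float amplitude = targetLeftMargin - lp.leftMargin;
        final Handler handler = new Handler();
        final long startTime = System.currentTimeMillis();

        handler.post(new Runnable() {
            @Override public void run() {
                float t = (System.currentTimeMillis() - startTime) / 1000f;
                float decay = (float) Math.exp(-LAMBDA * t);
                lp.leftMargin = targetLeftMargin
                        - (int) (amplitude * decay * Math.cos(OMEGA * t));
                view.setLayoutParams(lp); // triggers a relayout
                if (decay > 0.01f)
                    handler.postDelayed(this, FRAME_MS);
            }
        });
    }
}
```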
I therefore started to look at something different: starting from API level 11, SurfaceTexture might be the solution to all these needs. By using this class as the surface of the MediaPlayer it is possible to stream the video to an OpenGL texture. This seems to work pretty well (see the video), and it is not difficult to implement if you know OpenGL.
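On the GL side it boils down to creating an external OES texture, attaching a SurfaceTexture to it and latching new frames from the GL thread with updateTexImage(); the fragment shader then samples it with samplerExternalOES. A minimal sketch (class name illustrative):

```java
import android.graphics.SurfaceTexture;
import android.opengl.GLES11Ext;
import android.opengl.GLES20;

public class VideoFrameSource implements SurfaceTexture.OnFrameAvailableListener {
    private SurfaceTexture surfaceTexture;
    private volatile boolean frameAvailable;

    // Call from the GL thread (e.g. GLSurfaceView.Renderer.onSurfaceCreated).
    public SurfaceTexture createTexture() {
        int[] tex = new int[1];
        GLES20.glGenTextures(1, tex, 0);
        // Frames from a SurfaceTexture are external OES textures,
        // not plain GL_TEXTURE_2D ones.
        GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, tex[0]);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
        GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);
        surfaceTexture = new SurfaceTexture(tex[0]);
        surfaceTexture.setOnFrameAvailableListener(this);
        return surfaceTexture; // wrap in a Surface and hand it to MediaPlayer
    }

    @Override public void onFrameAvailable(SurfaceTexture st) {
        frameAvailable = true; // called on an arbitrary thread
    }

    // Call at the start of each frame (e.g. in onDrawFrame).
    public void updateFrameIfAvailable() {
        if (frameAvailable) {
            frameAvailable = false;
            surfaceTexture.updateTexImage(); // latch the newest video frame
        }
    }
}
```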
Anyway, for simple tasks OpenGL might be overkill, so I looked for some Android class that would let me render the texture without building the entire application in OpenGL. I have not found a way yet (if you do, please add a comment!), but I started to think that, once again, Qt might be the answer :-)
The third sample application you see in the video is a custom QML item created by rendering, in the Qt OpenGL context, the texture provided by the SurfaceTexture class, which is controlled through JNI. The result is very good. The QML code I used is exactly the same as the one used in previous posts for the RPi sample application. The Qt port for Android I used is the old Necessitas alpha 4.
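On the Java side the glue is little more than a frame-available listener that pokes the native code; a sketch (the native method and library names are hypothetical, the real glue is in the TextureStreaming repository):

```java
import android.graphics.SurfaceTexture;

public class QmlVideoBridge implements SurfaceTexture.OnFrameAvailableListener {
    static {
        System.loadLibrary("texturestreaming"); // hypothetical library name
    }

    // Implemented in C++ (hypothetical): schedules an update of the custom
    // QML item, which calls updateTexImage() and repaints with the new frame.
    private native void nativeFrameAvailable();

    @Override public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        nativeFrameAvailable();
    }
}
```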
EDIT: If API level 14+ is available, then it is possible to render the texture provided by the SurfaceTexture in a TextureView (thanks to Tim for pointing this out!): this is the fourth sample in the video.
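A minimal sketch of that approach: the TextureView hands you its SurfaceTexture once it is attached to the window, and regular animations applied to the view then move the video content correctly (the video path is illustrative):

```java
import android.app.Activity;
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.os.Bundle;
import android.view.Surface;
import android.view.TextureView;

public class TextureViewActivity extends Activity
        implements TextureView.SurfaceTextureListener {
    private MediaPlayer mediaPlayer;

    @Override protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        TextureView textureView = new TextureView(this);
        textureView.setSurfaceTextureListener(this);
        setContentView(textureView);
    }

    @Override public void onSurfaceTextureAvailable(SurfaceTexture st, int w, int h) {
        try {
            mediaPlayer = new MediaPlayer();
            mediaPlayer.setDataSource("/sdcard/test.mp4"); // illustrative path
            mediaPlayer.setSurface(new Surface(st)); // API 14+
            mediaPlayer.prepare();
            mediaPlayer.start();
        } catch (java.io.IOException e) {
            throw new RuntimeException(e);
        }
    }

    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture st) {
        if (mediaPlayer != null)
            mediaPlayer.release();
        return true;
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}
    @Override public void onSurfaceTextureUpdated(SurfaceTexture st) {}
}
```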