Sunday, February 10, 2013

Animations on a Surface Rendering Video on Android

I was recently asked to do a little research on how to implement animations on a video surface in Android, somewhat similar to what I did in the previous posts on the RPi. It seemed interesting, so I did some investigating.

After reading through Android's documentation, I took a random low-cost Android 4.0 tablet and started analyzing the problem.

The first thing I tried was creating a simple VideoView and applying some regular Android animations to it. I applied a TranslateAnimation and a ScaleAnimation, but the video geometry didn't change: only a black rectangle representing the view was animated. Seems to be more or less similar to this.
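For reference, the attempt looked more or less like this sketch (the view id and animation parameters are illustrative, not taken from the original code):

```java
// Apply a classic view animation to a VideoView.
// On the tablet tested above, only the view's black placeholder moved,
// not the video content itself.
VideoView videoView = (VideoView) findViewById(R.id.video_view); // id is hypothetical
TranslateAnimation anim = new TranslateAnimation(0f, 200f, 0f, 0f);
anim.setDuration(1000);
anim.setFillAfter(true);
videoView.startAnimation(anim);
```

View animations only transform how the view is drawn, which is likely why the video surface, composited separately at a lower level, is left untouched.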

I also tried to use the 3.1 animation system; this time the video actually moved, but it left a trail behind it. Both of these defects might be related to how the video rendering is performed at the lower levels, so they might not appear on other boards.

The only other thing I tried before starting to dig into the OpenGL world was to "create" an animation by changing the layout parameters applied to the VideoView. By interpolating the values like a damped harmonic oscillator I got the result shown in the video. A more careful implementation might give much better results.
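The interpolation can be sketched as a small pure-Java helper; the damping and frequency constants below are illustrative assumptions, not the values used in the original code:

```java
// Damped-harmonic-oscillator interpolation sketch: the value starts at
// `start`, oscillates around `target` and settles there as the
// exponential envelope decays.
public class DampedInterpolator {
    private final double damping;   // decay rate of the envelope (1/s)
    private final double frequency; // oscillation frequency (rad/s)

    public DampedInterpolator(double damping, double frequency) {
        this.damping = damping;
        this.frequency = frequency;
    }

    /** Position at time t (seconds) while moving from start to target. */
    public double valueAt(double start, double target, double t) {
        double envelope = Math.exp(-damping * t);
        return target + (start - target) * envelope * Math.cos(frequency * t);
    }
}
```

On each animation frame you would evaluate `valueAt` with the elapsed time and assign the result to the VideoView's layout parameters (e.g. its left margin) before requesting a layout pass.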

I therefore started to look at something different: starting from API level 11, SurfaceTexture might be the solution to all these needs. By using this class as the surface for a MediaPlayer it is possible to stream the video into an OpenGL texture. This seems to work pretty well (see the video) and is not difficult to implement if you know OpenGL.
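The core of the technique can be sketched as follows; the video path is hypothetical and shader/quad drawing is omitted:

```java
// Stream video frames into an OpenGL ES texture via SurfaceTexture (API 11+).
// The texture must be bound to the GL_TEXTURE_EXTERNAL_OES target and
// sampled with samplerExternalOES in the fragment shader.
int[] textures = new int[1];
GLES20.glGenTextures(1, textures, 0);
GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textures[0]);

SurfaceTexture surfaceTexture = new SurfaceTexture(textures[0]);
surfaceTexture.setOnFrameAvailableListener(new SurfaceTexture.OnFrameAvailableListener() {
    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Schedule a render pass on the GL thread.
        glSurfaceView.requestRender(); // glSurfaceView is assumed to exist
    }
});

MediaPlayer mediaPlayer = new MediaPlayer();
mediaPlayer.setDataSource("/sdcard/video.mp4"); // path is illustrative
mediaPlayer.setSurface(new Surface(surfaceTexture));
mediaPlayer.prepare();
mediaPlayer.start();

// Then, in the renderer's onDrawFrame(), on the GL thread:
//   surfaceTexture.updateTexImage();  // latch the newest video frame
//   ... draw a textured quad, animating its transform as desired
```

Because the video now lives in a texture, any OpenGL transform (translation, scaling, rotation) applies to it cleanly, with none of the artifacts seen with the view-based approaches above.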

Anyway, for simple tasks OpenGL might be overkill, so I looked for Android classes that would let me render the texture without having to build the entire application in OpenGL. I haven't found a way yet (if you do, please add a comment!), but I started to think that, once again, Qt might be the answer :-)

The third sample application you see in the video is a custom QML item created by rendering, in the Qt OpenGL context, the texture provided by the SurfaceTexture class, which is controlled through JNI. The result is very good. The QML code I used is exactly the same as in the previous posts for the RPi sample application. The Qt port for Android I used is the old Necessitas alpha 4.

EDIT: if API level 14+ is available, then it is possible to render the texture provided by the SurfaceTexture in a TextureView (thanks Tim for pointing this out!): this is the fourth sample in the video.
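A minimal sketch of the TextureView approach, assuming a layout containing a TextureView and an illustrative video path:

```java
// Play video into a TextureView (API 14+). Since TextureView behaves as a
// regular View, standard property animations work on the video content.
TextureView textureView = (TextureView) findViewById(R.id.texture_view); // id is hypothetical
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override
    public void onSurfaceTextureAvailable(SurfaceTexture st, int width, int height) {
        MediaPlayer mp = new MediaPlayer();
        try {
            mp.setDataSource("/sdcard/video.mp4"); // path is illustrative
            mp.setSurface(new Surface(st));
            mp.prepare();
            mp.start();
        } catch (IOException e) {
            // handle the error
        }
    }

    @Override
    public boolean onSurfaceTextureDestroyed(SurfaceTexture st) { return true; }

    @Override
    public void onSurfaceTextureSizeChanged(SurfaceTexture st, int w, int h) {}

    @Override
    public void onSurfaceTextureUpdated(SurfaceTexture st) {}
});

// The video now moves with the view, with no black rectangle or trail:
textureView.animate().translationX(200f).setDuration(500).start();
```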


  1. Not sure if this is what you mean, but you can use TextureView to render an OpenGL scene in a standard android View.

    "A TextureView can be used to display a content stream. Such a content stream can for instance be a video or an OpenGL scene.

    Unlike SurfaceView, TextureView does not create a separate window but behaves as a regular View. This key difference allows a TextureView to be moved, transformed, animated, etc. For instance, you can make a TextureView semi-translucent by calling myView.setAlpha(0.5f)."

  2. Do you have the sample code where you used SurfaceTexture?

    1. No, sorry, it was straightforward from the documentation.

    2. Ok. Thanks anyways!!

  3. Here is also an open source project I've found which shows how to render Android views to an OpenGL texture:

  4. Can you please share the source code? Thanks a lot.