Friday, December 7, 2012

Decoding and Rendering Compressed Images with OpenMAX on Raspberry Pi

After building Qt 5.0 on the Raspberry Pi, I turned my attention to hardware acceleration using OpenMAX. This is quite an interesting subject to study. The most interesting element I found among those available for VideoCore is the egl_render component, which should be able to render the output of a decoder component directly into an EGL image bound to a texture.
What interests me most is rendering video into a texture, but I thought that for the moment I could start by simply decoding an image and placing it into an OpenGL texture for rendering. This can be done by combining the image_decode component with the egl_render component; refer to the documentation in the git repository for some more information. A minimal sketch of the idea follows the list of resources below.
Unfortunately, not much documentation is available for this task, but I found a few resources that helped me a lot:
  1. This source code by Matt Ownby: I reused much of his excellent work.
  2. This project by Qt
  3. The textures example from Qt sources
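The core trick, which I picked up from these sources, is to wrap an OpenGL ES texture in an EGLImage with eglCreateImageKHR and then hand that image to egl_render through OMX_UseEGLImage, so that the component renders straight into the texture. Here is a minimal sketch of just that step, not the code of the package: the output port index (221) and all the surrounding setup (ilclient, state transitions, the tunnel from image_decode, the build flags used by the hello_pi examples) are assumptions taken from the VideoCore samples and are omitted here.

// Minimal sketch: wrap an existing GL texture in an EGLImage and hand it
// to the egl_render component as the buffer of its output port.
// Error checking and the whole component setup are omitted.
#define EGL_EGLEXT_PROTOTYPES
#include <EGL/egl.h>
#include <EGL/eglext.h>
#include <GLES2/gl2.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

EGLImageKHR bindTextureToEglRender(
        EGLDisplay display, EGLContext context,
        OMX_HANDLETYPE eglRender,           // handle of the egl_render component
        OMX_BUFFERHEADERTYPE **eglBuffer,   // filled with the output buffer header
        GLuint texture)                     // texture already sized with glTexImage2D
{
    // Wrap the texture in an EGLImage the VideoCore side can write to.
    EGLImageKHR eglImage = eglCreateImageKHR(
            display, context, EGL_GL_TEXTURE_2D_KHR,
            (EGLClientBuffer)texture, 0);

    // Ask egl_render to use the EGLImage as the buffer of its output port
    // (221, if I read the samples correctly): from now on whatever the
    // component renders ends up directly in the texture.
    OMX_UseEGLImage(eglRender, eglBuffer, 221, 0, eglImage);
    return eglImage;
}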
The code I wrote is provided in a package below. I ran some tests comparing the usual way of loading textures:

QGLWidget::bindTexture(QPixmap(fileAbsolutePath), GL_TEXTURE_2D, GL_RGBA);

with the method using the egl_render OpenMAX component. These are the results I got loading six 1920x1080 JPEGs:
    Average time out of 5 runs without OMX: 6750ms;
    Average time out of 5 runs with OMX: 918ms.
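Just to give an idea of how the comparison can be set up, here is a rough sketch; it is not the exact code of the package, and loadWithOMX() is only a placeholder for the OpenMAX-based loader (name and signature are my own):

// Rough benchmark sketch comparing the two loading paths.
#include <QElapsedTimer>
#include <QGLWidget>
#include <QPixmap>
#include <QStringList>
#include <QDebug>

void loadWithOMX(const QString &fileAbsolutePath);   // placeholder for the OMX loader

void benchmarkLoaders(QGLWidget *widget, const QStringList &files)
{
    QElapsedTimer timer;

    // Software path: Qt decodes the JPEG and uploads it as a texture.
    timer.start();
    foreach (const QString &file, files)
        widget->bindTexture(QPixmap(file), GL_TEXTURE_2D, GL_RGBA);
    qDebug() << "Without OMX:" << timer.elapsed() << "ms";

    // Hardware path: image_decode + egl_render through OpenMAX.
    timer.restart();
    foreach (const QString &file, files)
        loadWithOMX(file);
    qDebug() << "With OMX:" << timer.elapsed() << "ms";
}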
Here is the package containing the sources (version 1.0): PiOmxTextures.tar.bz2. The code links against Qt 5.0 and expects six images, specified on the command line by a common prefix. Place the images in the same directory (with no spaces in the path) and name them like:

prefix{0, 1, 2, 3, 4, 5}.jpg

To compile you'll need Qt Creator or at least qmake:

cd PiOmxTextures
your_qmake
make
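
Assuming the six images are named pic0.jpg to pic5.jpg and placed next to the binary, running it should then look something like this (I use the eglfs platform plugin; the exact arguments may differ from version to version):

./PiOmxTextures -platform eglfs pic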


This is a video showing the performance when loading six 1080p JPEGs: in the first run software decoding is used, while the second and third runs are hardware accelerated:
Sorry, the quality is really bad, but the performance can be appreciated anyway. It also seems that, for some reason, the software implementation failed to load one image: you can see a black texture on one side of the cube. The same does not happen with the hardware implementation; I didn't investigate the reason.
This is not ready-to-use code, just some notes that might be useful. I've hardly ever used OpenMAX or OpenGL, so if you have any suggestions or observations, feel free to comment! ;-)

20 comments:

  1. Hi, WOW, fantastic work. Good job. I was wondering if you can help me: I have tried running your code on my Raspberry Pi and I keep getting the following errors:

    bool QGLShaderPrivate::create(): Could not create shader of type 1.
    bool QGLShaderPrivate::create(): Could not create shader of type 2.
    QGLShaderProgram: could not create shader program
    QGLShaderProgram::uniformLocation( texture ): shader program is not linked
    QGLShaderProgram::uniformLocation( matrix ): shader program is not linked

    I am running Qt5 beta 1 on Raspbian Wheezy, any ideas???

    Thank you very much in advance.

  2. Hi. I'm sure I had the same problem once... But I really can't remember what I did. Can you run the textures sample in the Qt sources?

  3. Oh well that's interesting. I was not able to run the 'textures' example. It gave me the following error:

    EGL Error : Could not create the egl surface: error = 0x3000

    I am able to run QQuick 2.0 fine. Could I possibly be missing something from my Qt build?

    Thanks again.

  4. Hi again. I have sorted out the previous problem:

    EGL Error : Could not create the egl surface: error = 0x3000

    And now I get the same error when running the textures example that I do when I run your code:

    bool QGLShaderPrivate::create(): Could not create shader of type 1.
    bool QGLShaderPrivate::create(): Could not create shader of type 2.
    QGLShaderProgram: could not create shader program
    QGLShaderProgram::uniformLocation( texture ): shader program is not linked
    QGLShaderProgram::uniformLocation( matrix ): shader program is not linked

    Can I ask what version of Qt you are running?

    Thank you.

  5. 0x3000 means EGL_SUCCESS :-)
    That is bothersome... I can't remember what I changed when that happened to me... I'm quite sure it was something trivial. I'm running Qt5 beta 1.

  6. I have fixed the 0x3000 problem.

    I am also running Qt5 beta 1, although I'm really tempted to try the new Qt5 rc.

    I have in my plugins directory (/opt/qt5/plugins/platforms/) the libqeglfs.so. Is that correct?

  7. I forgot to mention that OMX reports that it has successfully decoded and loaded all the images.

    I get the shader errors after that happens.

  8. I also forgot to mention that when I first loaded your project into Qt Creator and tried to build it, it complained about an undefined reference to 'clock_gettime', and in order to fix that I added librt.so.1 to the project. Is that correct?

    Many thanks in advance.

  9. It seems something is not ok in your environment... Are you passing the -platform eglfs argument? That might not be necessary anyway.
    Also, try to collect some more information, for instance using http://doc.qt.digia.com/qt/qglshader.html#log. It might give some hint about what is wrong.

  10. I've tried passing the -platform flag already but it doesn't seem to make a difference.

    I've also tried logging but it returns an empty string. The shader fails during creation even before being compiled.

    Are you using the 512MB model or just the standard 256MB one?

  11. Yes, I also remember that empty string. 512MB here. I remember I had to increase the memory assigned to the GPU once, but I don't think that was related to this.

  12. There must be something wrong with my Qt configuration. I have tried running your code on different Pis but I keep getting the same error. I just can't see what I'm missing. I will keep trying. Thanks for all your help.

  13. Hi, I've managed to successfully build and run the example - but I used the 2.0 version of the demo app, from the other blog post: http://thebugfreeblog.blogspot.it/2012/12/decoding-and-rendering-to-texture-h264.html

    - ubuntu 12.10 x64
    - installed git and ia32libs
    - compiled Qt5 from the following Guide: http://qt-project.org/wiki/RaspberryPi_Beginners_guide
    (some modules will fail but it's normal, some of them have been moved to qtbase in the meantime)
    - probably this is Qt5-rc2 by now
    - downloaded the 2.0 version of this app, from the other post on this blog (same deal, but video stream)
    - when building the app, it failed with

    arm-linux-gnueabihf/bin/ld: OMXComponent.o: undefined reference to symbol 'clock_gettime@@GLIBC_2.4'
    /opt/Qt5.rpi.build/gcc-4.7-linaro-rpi-gnueabihf/bin/../lib/gcc/arm-linux-gnueabihf/4.7.2/../../../../arm-linux-gnueabihf/bin/ld: note: 'clock_gettime@@GLIBC_2.4' is defined in DSO /mnt/rasp-pi-rootfs/lib/arm-linux-gnueabihf/librt.so.1 so try adding it to the linker command line
    /mnt/rasp-pi-rootfs/lib/arm-linux-gnueabihf/librt.so.1: could not read symbols: Invalid operation
    collect2: error: ld returned 1 exit status
    make: *** [PiOmxTextures] Error 1

    To fix it, I edited the Makefile and added "-lrt" to the LIBS variable and it built successfully (see the qmake equivalent at the end of this comment).

    - got 6 1920x1080 jpgs from the web, renamed them to pic0.jpg to pic5.jpg
    - placed the jpgs in the app dir, copied the dir to the Raspberry Pi /home/pi/ dir on the SD card
    - ran the app: ./PiOmxTextures -platform eglfs pic /opt/vc/...path-to-test.h264

    And I got the cube running the video, managed to rotate it with the plugged-in usb mouse.
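
    For reference, the equivalent fix directly in the qmake project file should simply be the following line (not tested this way, it is just the standard qmake syntax for adding a library):

    LIBS += -lrt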

  14. I built Qt 5.1.2 on the Raspberry Pi and tried to build and execute PiOmxTextures, but I got the following error message:
    EGL Error: Could not create the egl surface: error = 0x300b

    What went wrong?

    Replies
    1. I doubt this is related to PiOmxTextures. The EGL surface should be created by the Qt platform plugin eglfs. Does any other Qt application work with eglfs?

  15. Nice; I stripped Qt out of your C++ rewrite of hello_jpeg, and it decodes to a GL texture (as per hello_triangle) *very* speedily - thanks for your hard work!

  16. Hi Luca,

    I'm working on rendering a JPEG stream from an IP camera to a texture. I'm trying to adapt your code, but so far with no luck.
    I'm really new to Qt and OpenGL, that's why I'm asking for your help.
    I moved the "glGenTextures(1, &texture)" call to GLWidget::initializeGL() to create only one texture, and then every time a new JPEG is grabbed from the camera I pass it to the loadWithOMX() function.
    In the OpenMAXILTextureLoader::getEGLImage(...) function, for the first picture I call glTexImage2D(...) and for the following pictures I call glTexSubImage2D(...) to overwrite the picture data.
    The first image is displayed, but for the rest I get error 0x3002 EGL_BAD_ACCESS.
    Previously I called glGenTextures(...) for each picture; it worked that way, but I experienced a memory leak and after some pictures the application crashed. Then I tried to modify the code following this link:
    http://stackoverflow.com/questions/11217121/how-to-manage-memory-with-texture-in-opengl
    Am I doing it completely wrong? Please help me find a solution.

    Thank you for your help!
    Attila

  17. Did anyone figure out the error:
    bool QGLShaderPrivate::create(): Could not create shader of type 1.
    bool QGLShaderPrivate::create(): Could not create shader of type 2.
    QGLShaderProgram: could not create shader program

    Please, need help!
