Saturday, June 7, 2014

Video with Animations on Raspberry Pi and A31s Compared

For many use cases, playing video while animations are running is very useful, and it is also considerably stressful for the GPU. I therefore tested a couple of cheap chips with the same Qt/QML sample app and created a small demo video comparing Qt on the A31s and the Raspberry Pi (PiOmxTextures):

In the demo, all the media (video and images) are 720p; the output is 720p on the A31s and 1080p on the Raspberry Pi (16bpp). It is pretty hard to tell from the video, but the Raspberry Pi still seems to come out ahead of the A31s.
Bye! ;-)

29 comments:

  1. Hi Luca,
    Thanks again for your work on Qt and ARM. It's really appreciated.
    I have had a question on my mind for a long time. When I play some video using QtMultimedia, it looks like there is a lack of bpp. For example, in the Big Buck Bunny video, when the sky appears in the background, the blue tones are not uniform. This also occurs with image or glow effects. I thought the problem was that Qt used a 24bpp output, but in your video everything seems fine... and you are using 16bpp.
    What do you think the problem could be?
    I hope my explanation was clear enough.
    Thanks!

    ReplyDelete
    Replies
    1. What backend are you using? gstreamer or PiOmxTextures? For the latter have a look at the comments in the blog. There were similar questions in past articles, like here for instance: http://thebugfreeblog.blogspot.it/2013/02/qml-components-for-video-decoding-and.html.
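      In short, the usual cause is eglfs picking an RGB565 EGL config. Explicitly requesting 8 bits per channel for the surface usually removes the banding. A minimal sketch, not the backend's actual code (the QML path is a placeholder):

```cpp
#include <QGuiApplication>
#include <QQuickView>
#include <QSurfaceFormat>
#include <QUrl>

int main(int argc, char** argv) {
    QGuiApplication app(argc, argv);

    // Ask for 8 bits per channel so eglfs does not fall back to RGB565,
    // which shows visible banding in smooth gradients (e.g. the sky in
    // Big Buck Bunny).
    QSurfaceFormat fmt;
    fmt.setRedBufferSize(8);
    fmt.setGreenBufferSize(8);
    fmt.setBlueBufferSize(8);
    fmt.setAlphaBufferSize(8);

    QQuickView view;
    view.setFormat(fmt);
    view.setSource(QUrl("qrc:/main.qml"));  // placeholder
    view.showFullScreen();
    return app.exec();
}
```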

      Delete
  2. Hi again,
    Thanks for your reply. I found the solution in that post.
    Just one more thing. Now the color is fine, but in video, when the image changes quickly, some horizontal lines appear, as if there were a lack of refresh. Do you think there is still some problem with the surface config? I am not entirely sure, but your video seems fine, doesn't it?
    Just for the record, I'm using 256 MB for the GPU with PiOmxTextures.
    Regards,

    ReplyDelete
    Replies
    1. I started to notice that as well on recent firmwares. I tried to implement a fix, but an issue in the Broadcom API made it difficult to complete the task. I may try again, but it depends on the time I'll have available.

      Delete
    2. Mmm... I was wondering if it could be solved using SwapBehavior (with the double or triple buffer configuration), but as always, things are more complicated than expected.
      Just for testing, I am planning to test your PiOmxTextures on a CubieBoard this weekend. The RPi has a much better GPU, but I only want to check what happens.
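      For reference, the kind of thing I had in mind with SwapBehavior would be along these lines (just a sketch; I have not verified it helps with the tearing):

```cpp
#include <QQuickView>
#include <QSurfaceFormat>

// Sketch: request triple buffering for the view's surface before showing it.
QQuickView view;
QSurfaceFormat fmt = view.format();
fmt.setSwapBehavior(QSurfaceFormat::TripleBuffer);
view.setFormat(fmt);
```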
      Thanks Luca, have a nice weekend!

      Delete
    3. If you do tests on other platforms it would be interesting if you could share the results.
      Thanks!

      Delete
    4. I managed to build your latest PiOmxTextures for the Raspberry Pi and it works wonderfully; trouble is, the Raspberry Pi is still a little slow for the project I am working on. Does anybody know if this will work on the Banana Pi? Also, is there any way to convert my existing Raspberry Pi image over to the Banana? It took a long, long time to work out how to build PiOmxTextures correctly and I don't want to repeat it if it can be avoided!

      Delete
    5. Sorry, I don't know the Banana Pi. If there is a proper OpenMAX library with the same components, something from PiOmxTextures may be portable.

      Delete
    6. Hi everyone,
      If you really want to test it on the Banana Pi, you could try downloading the official image from lemaker.org. There are two Debian-based versions: Bananian and Raspbian for BPi. Then cross-compile Qt for it and do exactly the same as you did for the RPi... I have no idea whether there is any cross-compile tool for the BPi.
      I just want to say that Luca did excellent work with the POC player and the OpenMAX library for Qt, but as he has said before, it's just a proof of concept. The code needs to be improved to get better results.
      My recommendation is to try to compile Qt with GStreamer and gst-omx. I've never managed to make it work properly, but if you do, maybe you can share it with us ;).
      I always wonder how the XBMC team makes video work like a charm in their version for the RPi... but I suppose they use OpenGL for the GUI and omxplayer instead of Qt.
      Regards,

      Delete
    7. Thanks for your replies, guys. Well, I converted my Raspberry image to the Banana Pi, and it boots fine, but when attempting to run qmlscene I am getting the error: * failed to open vchiq instance

      Currently trying to build a dev environment for working with Android, as the Banana Pi runs 4.4 nicely and I am hoping Qt will be accelerated out of the box, so to speak.

      Delete
    8. After your comments, I was curious to know how to make QtMultimedia work on the Banana Pi. From http://www.quora.com/How-does-XBMC-run-on-the-Banana-Pi, you can read the following:
      "The Raspberry Pi's Core IV GPU is significantly different from the Mali GPU used on the Banana Pi. You can't reuse the Pi's hardware acceleration code on the Banana! omxplayer won't work on it."

      It seems there is no GPU support on the Banana Pi yet.

      Delete
    9. Well, I've found one possible solution to your problem: sudo chmod a+rw /dev/vchiq
      It works for me.

      Delete
    10. That's most interesting; I'll give that a whirl when I get a chance. At the moment I am running Android 4.2 on my Banana Pi, and Qt Multimedia actually works quite well without any fiddling; I am unsure whether it is using the GPU or whether it is just the raw power increase over the Raspberry Pi.

      An Android 4.4 image is available for the Banana, which supposedly allows XBMC etc. to use hardware acceleration through the standard Android MediaCodec API; trouble is, the image is rather buggy at the moment!

      I just wish the Raspberry Pi Foundation would consider releasing a more powerful model. The rumor appears to be that this won't happen until 2017!

      Delete
    11. Just tried sudo chmod a+rw /dev/vchiq; unfortunately, it doesn't exist when running my image on the Banana. Upon further research, VCHIQ is a Raspberry Pi-specific interface for communicating with the VideoCore, and even if it did exist in my Banana image it wouldn't work with the Mali :( Going to try replacing qmlscene with a Debian version and see if that makes the problem go away, though I am not hopeful :(
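      As a side note for anyone on the Raspberry Pi itself, where /dev/vchiq does exist: instead of repeating the chmod after every boot, the permission can be made persistent with a udev rule. This is a sketch; the file name is arbitrary, and the video group is the one Raspbian typically uses for GPU access:

```
# /etc/udev/rules.d/10-vchiq.rules
SUBSYSTEM=="vchiq", GROUP="video", MODE="0660"
```

      After adding the rule, add your user to the video group (sudo usermod -aG video <user>) and reboot.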

      Delete
  3. This comment has been removed by the author.

    ReplyDelete
  4. Hi everyone!
    I just want to share with you a way to get an important improvement in the openmaxil lib in Qt.
    I am involved in a personal IPTV project running on Qt. My goal is to use tvheadend to display some TV channels. I highly recommend the MOI+ box: it's an amazing device that allows you to convert (in my case) a DVB-T/T2 signal into a multicast stream.
    On my first try, the results were horrible: the already mentioned horizontal lines on quick color changes and a "wave effect" on fast-moving images were ruining the whole thing.
    Well, the "trick" to remove that is to go straight to the omx_textureprovider.cpp file and change the following:

    GLubyte* pixel = new GLubyte[size.width()*size.height()*4]; --> GLubyte* pixel = new GLubyte[size.width()*size.height()*3];
    and
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, size.width(), size.height(), 0, GL_RGBA, GL_UNSIGNED_BYTE, pixel); --> glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, size.width(), size.height(), 0, GL_RGB, GL_UNSIGNED_BYTE, pixel);

    Regarding the first change, if I am not wrong, the formula here is GLubyte* pixel = new GLubyte[size.width()*size.height()*bytesPerPixel], where the number of bytes per pixel is the bpp divided by 8. So, as I configured my environment to work at 24bpp (3 bytes per pixel), the correct multiplier is 3.
    The second change is related to the first one: as I don't care about the alpha channel, I turned GL_RGBA into GL_RGB.

    I am not a good programmer, just a curious person, so if there is any expert here and I've done something wrong, please let me know.
    Regards,

    PS: Here is the site I used to get my "inspiration":
    https://www.opengl.org/sdk/docs/man/html/glTexImage2D.xhtml

    ReplyDelete
    Replies
    1. Interesting: do you mean the horizontal tearing disappears with this change?

      Delete
    2. Hi Luca!
      Well, I've been trying some different configurations for the IPTV, and it seems to solve the horizontal lines most of the time and definitely improves the waving effect on moving objects. Anyway, the results are not perfect and these changes are just a small improvement.
      Unfortunately, when I try to play the Big Buck Bunny video the horizontal lines still appear. :(
      After an initial inspection, I've realized that there are some similarities between your project and XBMC (like OMXCore, OMXClock, RBP, etc.), so my plan is to update (or at least try to update) the common files to check whether the runtime performance changes... but it is hard to figure out how to join all the pieces of the puzzle.
      If I achieve something interesting I will post it here.

      Delete
    3. Have a look at previous posts about this project: you'll see that PiOmxTextures (the portion that is linked to the Qt backend) is intentionally an integration of omxplayer from XBMC. I removed as much code outside that project as possible and confined it to a subdirectory so that it is possible to integrate changes from omxplayer pretty quickly. Merges were done many times to leverage new implementations in omxplayer (have a look at the commits).

      Delete
    4. Yes, the first thing I thought of was checking the commits. Then, checking both (PiOmxTextures and XBMC), I realized that some important files were implemented in totally different ways, and I ended up with a nice mess in my head. For example, in your backend, RBP.cpp is implemented in a different way. I am an electrical engineer, so I have a huge lack of software knowledge and it will take me a while to understand all this, but I will do it. :)

      Delete
  5. After some intense testing, horizontal lines still appear sometimes; it just seems to be a random issue. Here you have a video:
    https://www.youtube.com/watch?v=0bt2VzSTUys&list=UUUoHcLBYWP--lCN8znifbrw
    Please take into account that there is zero optimization behind this.
    First I should take a look at the channel-switching connection speed and the audio streams. If there is more than one available language, I get these errors:

    [matroska,webm @ 0x1e70e60] Could not find codec parameters for stream 1 (Audio: mp2, 48000 Hz, 2 channels, s16p): unspecified frame size
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    [matroska,webm @ 0x1e70e60] Could not find codec parameters for stream 2 (Audio: mp2, 48000 Hz, 2 channels, s16p): unspecified frame size
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
    [matroska,webm @ 0x1e70e60] Could not find codec parameters for stream 3 (Audio: mp2, 48000 Hz, 1 channels, s16p): unspecified frame size
    Consider increasing the value for the 'analyzeduration' and 'probesize' options
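    For the record, 'analyzeduration' and 'probesize' are standard FFmpeg demuxer options (microseconds and bytes, respectively); raising them when probing the stream usually makes parameter detection succeed, for example (the tvheadend URL is just a placeholder):

```
ffprobe -analyzeduration 10000000 -probesize 10000000 \
        http://tvheadend.local:9981/stream/channelid/1
```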

    ReplyDelete
    Replies
    1. Awesome result :-)
      Those errors seem to say that mp2 is not included in ffmpeg: so are you providing the tvheadend URL to the PiOmxTextures-based Qt backend? Or are you using some other backend?

      Delete
    2. If you're using PiOmxTextures and you used the script I provided to build ffmpeg, then many decoders were disabled; you may want to check that. I never tested DVB-S/2 streams, which typically have many h262 streams... but I see it is an mkv container... are you really getting the live stream, or did you save the recording to a file with tvheadend?
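      For instance, when reconfiguring FFmpeg, the relevant components can be re-enabled explicitly; these are standard configure flags, though the exact component names should be checked against the FFmpeg version in use:

```
./configure --enable-decoder=mp2 --enable-decoder=mpeg2video \
            --enable-decoder=h264 --enable-demuxer=mpegts
```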

      Delete
    3. Everything you see is 100% live TV streaming. I'm really impressed by how well your backend works without too many modifications. There is a little trick behind that video: using tvheadend, I get a list of channels with their HTTP links, which I previously stored in a QMap in Qt and used as a fake PVR plugin. No backends other than yours.
      After a quick review of your code, I saw some comments of yours saying that audio stream selection is a TODO (if I remember correctly), so I guess the problem here is that the backend is not able to select one of them correctly. Keep in mind that the live TV comes from local over-the-air broadcasts, so there is my local language and English audio on the same channel; that is, multiple audio streams per channel.
      About the codecs, I configured the channels to be streamed using MPEG-2 (yes, I had to buy the license) and H.264.
      My plan is to get the video working well, then implement audio selection and subtitles, and then use tvheadend as an add-on for real channel sorting and updating.

      Delete
    4. Yes, I meant "ability to switch"; I don't think I ever tried anything with more than one audio stream anyway.
      As for the tearing effect, as I said in other posts, it is typically related to concurrency issues, but I can't be sure. I started to implement what was needed to fix it and was almost done, but I didn't complete it, so I can't confirm that theory is correct.

      Delete
  6. Hi Luca,
    Sorry for posting so much here!
    Regarding the EGL rendering, is there any reason to use OMX.broadcom.egl_render instead of OMX.broadcom.video_render, apart from the fact that you want to decode images using hardware?
    Regards,

    ReplyDelete
    Replies
    1. Yes, those are totally different components and do different things.

      Delete
  7. OK, now I know where to look. The problem with PiOmxTextures and live TV is called weaving, and it's related to the deinterlacing process.
    If anyone is curious about it, there is a good explanation on this site: http://www.100fps.com/
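    For anyone who wants to see the effect outside the Qt backend, FFmpeg's standard yadif deinterlacing filter removes the weaving, e.g. (file names are placeholders):

```
ffmpeg -i interlaced.ts -vf yadif deinterlaced.mp4
```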
    Regards,

    ReplyDelete