
Thread: Decoder stops filling buffers after few seconds

  1. #1
    Junior Member
    Join Date
    Oct 2011
    Posts
    5

    Decoder stops filling buffers after few seconds

    Hi,
    I am using Tegra implementation of OpenMAX IL.

    I can see the first ~120 frames, but at that point the decoder stops filling output buffers.
    There are pending OMX_FillThisBuffer() requests on buffers I allocated with OMX_AllocateBuffer(), but the FillBufferDone callback that signals a buffer has been filled is never invoked by the decoder.

    However, the decoder keeps emptying my input buffers, and I get EmptyBufferDone callbacks confirming it has finished with them.


    Does anyone here have an idea what I could be doing wrong?

    I allocate the buffers using OMX_AllocateBuffer(), which returns successfully.

    A few details that I hope will help show what I'm doing wrong:

    I am using a video that I've compressed as H.264 (a raw NAL unit sequence, no container).
    I can play the complete video with ffmpeg on this device, and I'm trying to use OpenMAX IL in order to use the decoding hardware.

    The buffers are allocated at the size the input port defined for me (which I got using OMX_GetParameter()), which is around 500,000 bytes.

    If I provide full buffers to OMX_EmptyThisBuffer() I get perfect frames, but as I said, the output stops after ~120 frames.
    If I provide LESS data (while making sure that nOffset=0 and nFilledLen=[bytes_i_provided]),
    the rendering contains artifacts, but much more of the video is shown.
    I'm quite sure that the compressed data I provide is correct, since:
    1. I can view it with ffmpeg on the same device.
    2. I've checked again and again to make sure I provide the compressed data in order.
    3. I only call OMX_EmptyThisBuffer() after a callback has confirmed that the buffer is ready for new data.
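    The buffer submission described above looks roughly like this. This is a minimal, self-contained sketch: buffer_header_t and fill_input_buffer() are simplified stand-ins I made up for the relevant fields of OMX_BUFFERHEADERTYPE and the bookkeeping done before a real OMX_EmptyThisBuffer() call; they are not part of the OpenMAX IL API.

    ```c
    #include <string.h>

    /* Simplified stand-in for the relevant fields of OMX_BUFFERHEADERTYPE. */
    typedef struct {
        unsigned char *pBuffer;    /* data area returned by OMX_AllocateBuffer */
        unsigned int   nAllocLen;  /* capacity reported by the input port      */
        unsigned int   nOffset;    /* start of valid data within pBuffer       */
        unsigned int   nFilledLen; /* number of valid bytes                    */
    } buffer_header_t;

    /* Copy up to nAllocLen bytes of compressed data into the buffer and set
     * the header fields as described in the post: nOffset = 0 and nFilledLen
     * equal to the number of bytes actually provided. Returns the number of
     * bytes consumed so the caller can advance its stream pointer. */
    static unsigned int fill_input_buffer(buffer_header_t *hdr,
                                          const unsigned char *data,
                                          unsigned int len)
    {
        unsigned int n = (len < hdr->nAllocLen) ? len : hdr->nAllocLen;
        memcpy(hdr->pBuffer, data, n);
        hdr->nOffset = 0;
        hdr->nFilledLen = n;
        return n;
    }
    ```

    In the real code the header would then be passed to OMX_EmptyThisBuffer(), and the buffer must not be reused until the EmptyBufferDone callback fires.
    
    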

    I've been stuck on this issue for a few days now; any assistance would be highly appreciated...

    Here's my code:
    http://ideone.com/WhOqg

    I'm aware it's quite a few lines of code; I've tried to clean it up.
    OMX_Decode() (note: it's my own function, a bad choice of prefix...) is where I call FillThisBuffer/EmptyThisBuffer.
    In the callbacks decoder_empty_buffer_done and decoder_fill_buffer_done I unlock the buffers for re-use.

    Perhaps I'm doing something wrong in the initialization?
    Maybe this is timestamp-related?

    Quite lost on this one :/

  2. #2
    Junior Member
    Join Date
    Oct 2011
    Posts
    5

    Re: Decoder stops filling buffers after few seconds

    I think I have a theory about what could be causing this.
    I don't make sure that the buffers I provide contain COMPLETE compressed frames.
    I've read in the specification that not all components support partial frames!

    I hope this is it

  3. #3
    Junior Member
    Join Date
    Oct 2011
    Posts
    5

    Re: Decoder stops filling buffers after few seconds

    OK, solved it!

    It was indeed what I suspected.
    It seems that the NVIDIA Tegra decoder expects full compressed frames.
    (You cannot provide it, for example, 2.5 compressed frames in one EmptyThisBuffer request.)

    If anyone wants reference code, look at the code I posted in the previous message, but make sure you provide complete frame data.
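    As a sketch of the fix: with a raw Annex-B stream like the one in post #1, you can guarantee that each EmptyThisBuffer request ends on a NAL unit boundary by cutting the stream only at start codes. find_next_start_code() below is a hypothetical helper (not part of OpenMAX IL) that returns the offset of the next 00 00 01 / 00 00 00 01 start code; search from just past the current start code to find where the current unit ends.

    ```c
    #include <stddef.h>

    /* Return the offset of the next Annex-B start code (00 00 01 or
     * 00 00 00 01) at or after 'from', or 'len' if none is found. Cutting
     * the stream only at these offsets ensures every buffer handed to the
     * decoder contains whole NAL units. */
    static size_t find_next_start_code(const unsigned char *buf, size_t len,
                                       size_t from)
    {
        for (size_t i = from; i + 3 <= len; i++) {
            if (buf[i] == 0 && buf[i + 1] == 0) {
                if (buf[i + 2] == 1)
                    return i;                          /* 3-byte start code */
                if (i + 4 <= len && buf[i + 2] == 0 && buf[i + 3] == 1)
                    return i;                          /* 4-byte start code */
            }
        }
        return len;
    }
    ```

    The submission loop then copies the bytes between two start-code offsets into one input buffer, so the decoder always receives complete units.
    
    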

    =)

    By the way, for some strange reason I cannot edit my original message to add [Solved] to the title...
