I'm writing an AVI encoder/decoder for Windows that works over custom I/O. It currently uses VFW.
It works great with almost any codec: MPEG-4 ASP (xvid), Cinepak (cvid), Microsoft's ancient ones (MRLE, MSVC), the easy-to-find Intel Indeo ones, etc. Encoding works correctly with all of these, and also with H.264 via the x264vfw library, set to zero latency for simplicity.
Decoding, on the other hand, works as expected for all of the above except H.264. It's weird. Here's what happens, and I'm struggling to understand it (though I'm sure it's my own ignorance):
- If I just ICDecompress from the most recent keyframe up to the frame I want, the output ends up 4 frames behind. The exception is when the target is within 3 frames at or after a keyframe; those come out black. (e.g. frames 0, 1, 2, and 3 are black, and frame 4 actually displays frame 0.)
- If I do this, AND on the last frame of the sequence (i.e., the frame I actually want to show) call ICDecompress exactly 4 additional times (so 5 calls total for that frame), it works, on all frames, keyframes and non-keyframes alike. Anything fewer than 5 calls for that last frame, and it still misbehaves in similar flavors of the first problem.
I know these are clues to what I'm doing wrong, and I've read about B-frames and all that, but I don't know what to do with the information. Besides, I thought zero-latency was supposed to disable B-frames entirely? The encoded video plays just fine in VirtualDub and Windows Media Player, but I can't look at VirtualDub's source code to see what's going on.
Can anyone help me understand what is going on, and how to do this correctly? (I understand it's a hack either way, but this particular hack seems... less trustworthy than others.)
Edited by achild, 17 May 2012 - 11:15 AM.