Video and Gameloop - problem

Is it possible to draw a video that plays at roughly 30 fps and a game loop that runs at 85 fps in a single view? The problem is that my video starts flickering, because it only has 30 frames per second while the game loop runs at 85 fps. I tried to put the Render() call where the video gets a new frame, but that made the gameplay sluggish... The game loop really needs to be faster than 30 fps.

In my application the video gets a new frame in TextureReadyToRender(), where I save it to a texture and then draw that texture in my Render() function, which runs at 85 fps like I mentioned above. It feels like the easiest way to solve this would be to put the video rendering into a different function and let it render at its own pace, but I don't know how to work with two rendering functions that run at different rates.

If I just duplicate another Render() function that fires every time a new frame arrives from the video, wouldn't it conflict with the Render() function I already have, because there always has to be a device.Clear() call that clears the screen? It seems the only way to solve this is to make two layers that work on their own, so the screen could be cleared from each layer separately. Is it even possible to make two separately working devices or something similar?
Some brainstorming in the hope you find it useful:
Quote:Original post by Hoover85
It seems the only way to solve this is to make two layers that work on their own, so the screen could be cleared from each layer separately.

Or don't clear it at all, since you're going to overwrite the data anyway.
Quote:Is it even possible to make two separately working devices or something similar?

Multiple devices will hurt performance really badly, and as far as I know they don't share resources by default.
Quote:Is it possible to draw a video that plays at roughly 30 fps and a game loop that runs at 85 fps in a single view? The problem is that my video starts flickering, because it only has 30 frames per second while the game loop runs at 85 fps. I tried to put the Render() call where the video gets a new frame, but that made the gameplay sluggish... The game loop really needs to be faster than 30 fps.

Yes, by using the two layers you describe above... just swap them, uploading a bit of data to the hidden one in small pieces. When the hidden one is full, swap it just like in double buffering (or blend between them if you feel that's not enough).
Quote:In my application the video gets a new frame in TextureReadyToRender(), where I save it to a texture and then draw that texture in my Render() function, which runs at 85 fps, like I mentioned above. It feels like the easiest way to solve this problem is to put the video rendering into a different function and let it render at its own pace, but I don't know how to work with two rendering functions that run at different rates.
What should the second render function do? Once you've decoded the next frame, just send it piece by piece to the hidden texture. You shouldn't upload the texture in one go (unless it's something like 64x64, which can be handled fine).
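Something like this (untested, and videoFront/videoBack/OnNewVideoFrame are just placeholder names for your own objects) is what I mean by swapping, so Render() only ever sees a completed frame:

Private videoFront As Texture      ' the frame Render() is allowed to draw
Private videoBack As Texture       ' the frame currently being filled
Private newFrameReady As Boolean = False

Private Sub OnNewVideoFrame(ByVal frame As Texture)
    ' Fill the hidden texture here (all at once or piece by piece), then mark it complete.
    videoBack = frame
    newFrameReady = True
End Sub

Private Sub SwapVideoTextures()
    ' Call this at the top of Render(): swap only when a whole frame is ready,
    ' so Render() never draws a half-written texture.
    If newFrameReady Then
        Dim tmp As Texture = videoFront
        videoFront = videoBack
        videoBack = tmp
        newFrameReady = False
    End If
End Sub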
Quote:If I just duplicate another Render() function that fires every time a new frame arrives from the video, wouldn't it conflict with the Render() function I already have, because there always has to be a device.Clear() call that clears the screen?

I don't have the whole picture, but I think the only difference would be that one doesn't upload data to the texture and the other does. Duplicating the whole function seems a bit overkill to me.

Previously "Krohm"

Thanks for your reply Krohm.
Quote:What should the second render function do?

One would render only the video into a rectangle, frame by frame. The other would render all the other stuff I'm going to make.
I don't get it.
Why (how) exactly do you render video to a surface?
Why can't you just take the pixels out of the decoder and feed them to the surface?
Calling this a 'render' operation leaves me somewhat confused.

Previously "Krohm"

Well, first I load the video into a Video object when the program starts, then I call RenderToTexture() on the video, and after that I set up a handler that fires when a new frame starts in the video and copies the frame into a texture object. Then I just put the texture onto a rectangle and draw it in the Render() loop.

It works fine like this, but the flickering is a huge issue... It looks like some of the frames are missing when you watch it, but in reality I think it's because sometimes, while the Render loop is executing, the texture is being replaced at the same time by the handler that copies a new frame into my texture every time the video's frame changes. So it tries to draw an empty texture or something, and that's why the video looks like it's flickering.

Here is part of the code from my Render()-loop:

spriteVideo.Begin(SpriteFlags.None)
' Draw the latest video frame stretched over an 800x600 rectangle
Dim r As Rectangle = New Rectangle(New Point(0, 0), New System.Drawing.Size(800, 600))
spriteVideo.Draw(textureVideo, r, New Vector3(0.0F, 0.0F, 0.0F), New Vector3(0.0F, 0.0F, 0.0F), Color.White)
spriteVideo.End()

textureVideo is the current frame taken from the video.
Since 85/30 is about 3, how about just doing the video update every third frame? Or, to be more sophisticated, have the video rendering keep track of time so that it only updates at 30 fps? 1000/30 = 33.3 milliseconds.
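A rough sketch of the time-based version (untested; frameIntervalMs, lastVideoUpdate and UpdateVideoTextureIfDue are just names I made up) would limit the video texture update to about 30 fps while Render() keeps running at 85 fps:

Private frameIntervalMs As Double = 1000.0 / 30.0     ' ~33.3 ms per video frame
Private lastVideoUpdate As Integer = Environment.TickCount

Private Sub UpdateVideoTextureIfDue()
    ' Call this once per game frame; refresh the video texture only when a full
    ' video-frame interval has elapsed since the last refresh.
    Dim nowTicks As Integer = Environment.TickCount
    If nowTicks - lastVideoUpdate >= frameIntervalMs Then
        lastVideoUpdate = nowTicks
        ' ...copy the latest decoded frame into textureVideo here...
    End If
End Sub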
--------------------------
Most of what I know came from Frank D. Luna's DirectX books
DXnut, that may work, but if the video file's frame rate isn't exactly 3x smaller, the video might start to look funny when it's going too slow or too fast. :/

Hmm... maybe if I make a timer whose interval is exactly the time a single frame is shown. But then I'd have to calculate that time somehow...

How could I calculate it from a Video object in VB.NET? I'd need to get the video file's fps somehow, but after a quick peek I couldn't find any property for that. There was only the length of the video file, which might be of some use for calculating it.
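Maybe I could just measure it myself by counting how often the new-frame handler fires over a second or so. Something like this (untested; frameCount and measureStart are made-up names):

Private frameCount As Integer = 0
Private measureStart As Integer = Environment.TickCount

Private Sub CountVideoFrame()
    ' Call this from the handler that receives each new video frame.
    frameCount += 1
    Dim elapsedMs As Integer = Environment.TickCount - measureStart
    If elapsedMs >= 1000 Then
        Dim fps As Double = frameCount * 1000.0 / elapsedMs
        Console.WriteLine("Estimated video fps: " & fps)   ' e.g. roughly 30 for typical video
        frameCount = 0
        measureStart = Environment.TickCount
    End If
End Sub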
As I've said, I recommend you don't go down this route - AudioVideoPlayback is severely bugged in this respect. I believe the texture frame gets lost/corrupted when not used in the event handler. I don't know for sure but I heard this from MVPs/developers.
Yeah, as I've heard too, AudioVideoPlayback in managed code is buggy, so why don't you use DirectShow for managed code, since you are using VB.NET? Just search Google for "DirectShow for managed code"; it comes with samples on how to use it.

PS: If you can't find DirectShow, send me a message and I'll send it to you as soon as I get home.

Best Regards,
jad_salloum, I don't understand what kind of bug that would be, because basically the only problem I'm having is that I should somehow be able to tell the handler that picks up new textures from the video, "don't try to get a new texture now while I'm rendering".

Maybe it's better if I describe the whole procedure I use in my game, so you have a better idea of the problem.

I have a Render() that runs as fast as the computer can handle (vsync is on, so basically it's 85 fps on my machine). In the Render() function I do all the drawing to the screen, including drawing the texture that always holds the latest frame from the video.

The process of getting new frames into my texture is handled by a handler that fires every time my Video object's frame changes. The frame is saved into a global texture object, and I use that texture in the Render() function.

The problem is that sometimes these two try to do their thing at the SAME time; that's why I should somehow tell the handler not to change the texture while Render() is doing its thing.
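Would a SyncLock around the texture work here? Maybe something like this (untested; textureLock is just a name I made up, and the handler signature is simplified compared to the real TextureReadyToRender event):

Private textureLock As New Object()

Private Sub OnNewVideoFrame(ByVal newFrame As Texture)
    ' The handler must not replace the texture while Render() is drawing it.
    SyncLock textureLock
        textureVideo = newFrame
    End SyncLock
End Sub

' And inside Render():
SyncLock textureLock
    spriteVideo.Draw(textureVideo, r, New Vector3(0.0F, 0.0F, 0.0F), New Vector3(0.0F, 0.0F, 0.0F), Color.White)
End SyncLock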
