Flicker in Direct3D9 video


My application needs to capture and play a live video stream from an Epix imaging board hooked up to a camera. I use an unmanaged Direct3D9 device, with a vertex shader and a pixel shader associated with an Effect.
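For reference, each frame is drawn through the effect with the usual ID3DXEffect begin/pass pattern, roughly like this (simplified sketch; the technique and parameter names are placeholders, not my real ones):

// Assumes the quad's vertex buffer, stream source and declaration are already set.
m_pEffect->SetTechnique( "RenderFrame" );                  // placeholder technique name
m_pEffect->SetTexture( "g_FrameTexture", pFrameTexture );  // placeholder parameter name
UINT passes = 0;
m_pEffect->Begin( &passes, 0 );
for( UINT p = 0; p < passes; ++p )
{
    m_pEffect->BeginPass( p );
    m_pDevice->DrawPrimitive( D3DPT_TRIANGLESTRIP, 0, 2 ); // one textured quad
    m_pEffect->EndPass();
}
m_pEffect->End();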

In my development box, I emulate the imaging board by loading a video from a file into memory and launching a thread which supplies it frame by frame to the application on request. This works and looks great on my development box (Windows 7 x64, two monitors, each hooked up to its own NVIDIA GeForce 210: two monitors, two video cards).
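The simulator thread boils down to something like this (simplified sketch; the event, frame buffer and helper names are placeholders):

void SimulatorThread()
{
    size_t i = 0;
    while( m_bRunning )
    {
        WaitForSingleObject( m_hFrameRequest, INFINITE ); // app asks for the next frame
        DeliverFrame( m_frames[ i ] );                    // hand the decoded frame to the renderer
        i = ( i + 1 ) % m_frames.size();                  // loop over the preloaded video
    }
}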

On our production box, however, the live capture exhibits quite noticeable flicker. The production box has ONE NVIDIA GeForce 210 card with its two outputs hooked to two monitors. Aside from that, the hardware specs are a little below the development box's (Dev has an i7 CPU, 4 GB RAM and Win7 x64, while Prod has an Intel G2020 with 2 GB RAM and Win XP x86).

I tried installing and running PIX on Prod, but it crashes when I exit my app. I tried every configuration I could think of but couldn't capture a single frame. First of all, I want to ascertain whether this is really a performance issue. If it is, I'll keep struggling with PIX or try nvPerfHUD to determine whether the app is CPU or GPU bound.

But I'm at a loss now. How can I tell whether this is really a performance problem, or whether I'm messing up some property of my Direct3D9 device?


// Set up the structure used to create the D3DDevice. We will create a
// windowed device with a z-buffer.
ZeroMemory( &m_d3dpp, sizeof( m_d3dpp ) );
m_d3dpp.Windowed = TRUE;
m_d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD;
m_d3dpp.BackBufferFormat = D3DFMT_UNKNOWN;  // use the current display mode's format
m_d3dpp.EnableAutoDepthStencil = TRUE;      // let Direct3D create and manage the z-buffer
m_d3dpp.AutoDepthStencilFormat = D3DFMT_D16;
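
Since the struct is zeroed, everything I don't set explicitly stays at its default; in particular PresentationInterval stays at D3DPRESENT_INTERVAL_DEFAULT, so Present should be synchronized to the vertical retrace. Device creation itself is just the standard pattern (simplified sketch; m_pD3D, m_hWnd and m_pDevice are placeholder names):

LPDIRECT3D9 m_pD3D = Direct3DCreate9( D3D_SDK_VERSION );
HRESULT hr = m_pD3D->CreateDevice(
    D3DADAPTER_DEFAULT,
    D3DDEVTYPE_HAL,
    m_hWnd,                                // the control's window handle
    D3DCREATE_HARDWARE_VERTEXPROCESSING,
    &m_d3dpp,
    &m_pDevice );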
 

The camera is capturing at 25 FPS. My simulator works as fast as the CPU lets it, and in my dev box it runs at almost 60 FPS.

Edit: I render my images as textured quads; texture size is configurable. If an image is larger than the texture size, more than one quad will be needed.
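The tiling works out to something like this (sketch; textureSize, frameWidth/frameHeight and DrawQuad are placeholder names; e.g. a 1024x768 frame with 512x512 textures needs a 2x2 grid of quads):

const int quadsX = ( frameWidth  + textureSize - 1 ) / textureSize; // ceiling division
const int quadsY = ( frameHeight + textureSize - 1 ) / textureSize;
for( int y = 0; y < quadsY; ++y )
{
    for( int x = 0; x < quadsX; ++x )
    {
        // Each quad covers at most textureSize x textureSize pixels of the frame;
        // the last row/column may be a partial tile.
        const int w = min( textureSize, frameWidth  - x * textureSize );
        const int h = min( textureSize, frameHeight - y * textureSize );
        DrawQuad( x * textureSize, y * textureSize, w, h );
    }
}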

Edit 2: My application uses Windows Forms for its GUI, and the Direct3D part is done as a C++/CLI class which inherits System.Windows.Forms.UserControl and wraps an unmanaged class which does the actual rendering. But I managed to make a smaller, all-unmanaged test which reproduces the problem: it uses plain WinAPI to render inside a window created with CreateWindowEx. And it still flickers...
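The unmanaged repro is essentially a bare Win32 window plus a render loop (skeleton only; window class registration is omitted and RenderFrame is a placeholder for the texture upload and draw calls):

HWND hWnd = CreateWindowEx( 0, TEXT("D3DTestClass"), TEXT("Flicker test"),
                            WS_OVERLAPPEDWINDOW, CW_USEDEFAULT, CW_USEDEFAULT,
                            800, 600, NULL, NULL, hInstance, NULL );
ShowWindow( hWnd, SW_SHOW );

MSG msg = { 0 };
while( msg.message != WM_QUIT )
{
    if( PeekMessage( &msg, NULL, 0, 0, PM_REMOVE ) )
    {
        TranslateMessage( &msg );
        DispatchMessage( &msg );
    }
    else
    {
        RenderFrame();                                   // draw the latest captured frame
        m_pDevice->Present( NULL, NULL, NULL, NULL );    // flip to the screen
    }
}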


I got PIX working now; my test crashed on exit and it seems that messed up PIX's sampling. So now I'm learning how to profile and analyze the results properly, since I only used PIX's "Debug this pixel" feature before. I'll get back with results later.

OK, I ran PIX and here are some results. First, I selected "Statistics for each frame, using counterset" with a custom counter set which displayed FPS and %Processor Time. I got this:

[PIX screenshot: per-frame statistics with the %Processor Time and FPS counters]

Yellow is %Processor Time, red is FPS. So if I understand correctly, virtually all of the frame time is consumed by the CPU. On the other hand, FPS oscillates wildly between 60 and values below 25. That seems consistent with the flicker I experience (it doesn't happen all the time; it fluctuates).

Then I closed the test program and looked at the timeline:

[PIX screenshot: timeline view]

So it would seem I'm CPU bound, which is consistent with %Processor Time sitting at 100%.

First of all, is my analysis correct? Second, why does FPS vary so wildly? And then, how should I go about looking further into this? I thought of enabling draw timing and looking at a frame where instantaneous FPS is low, to see which calls consume more time.
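For the draw timing I have something simple in mind on the CPU side, along these lines (sketch; UploadFrameToTextures and DrawQuads are placeholders for my actual calls):

LARGE_INTEGER freq, t0, t1, t2, t3;
QueryPerformanceFrequency( &freq );

QueryPerformanceCounter( &t0 );
UploadFrameToTextures();                         // copy the captured frame into the texture(s)
QueryPerformanceCounter( &t1 );
DrawQuads();                                     // effect passes + draw calls
QueryPerformanceCounter( &t2 );
m_pDevice->Present( NULL, NULL, NULL, NULL );    // may block waiting for vsync
QueryPerformanceCounter( &t3 );

const double msUpload  = 1000.0 * ( t1.QuadPart - t0.QuadPart ) / freq.QuadPart;
const double msDraw    = 1000.0 * ( t2.QuadPart - t1.QuadPart ) / freq.QuadPart;
const double msPresent = 1000.0 * ( t3.QuadPart - t2.QuadPart ) / freq.QuadPart;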

