
dario_ramos

Member Since 17 Nov 2009
Offline Last Active May 31 2013 02:12 PM

Topics I've Started

Flicker in Direct3D9 video

11 March 2013 - 10:02 AM

My application needs to capture and play a live video stream from an Epix imaging board hooked up to a camera. I use an unmanaged Direct3D9 device, with a vertex shader and a pixel shader associated with an Effect.

 

On my development box, I emulate the imaging board by loading a video from a file into memory and launching a thread which supplies it frame by frame to the application on request. This works and looks great on my development box (Windows 7 x64, two monitors, each hooked up to its own NVIDIA GeForce 210: two monitors, two video cards).

 

On our production box, however, the live capture exhibits quite noticeable flicker. The production box has ONE NVIDIA GeForce 210 card with its two outputs hooked up to two monitors. Aside from that, the hardware specs are a little below the development box's (Dev has an i7 CPU, 4 GB RAM and Win7 x64, while Prod has an Intel G2020 with 2 GB RAM and Win XP x86).

 

I tried installing and running PIX on Prod, but it crashes when I exit my app. I tried every configuration I could think of but couldn't capture a single frame. First of all, I want to ascertain whether this is really a performance issue. If so, I'd struggle on with PIX or try nvPerfHUD to determine whether the app is CPU or GPU bound.

 

But I'm at a loss now. How can I tell whether this is really a performance problem, or whether I'm misconfiguring some property of my Direct3D9 device?

 

// Set up the structure used to create the D3DDevice. We will create a
// device with a zbuffer.
ZeroMemory( &m_d3dpp, sizeof( m_d3dpp ) );
m_d3dpp.Windowed = TRUE;
m_d3dpp.SwapEffect = D3DSWAPEFFECT_DISCARD;
m_d3dpp.BackBufferFormat = D3DFMT_UNKNOWN;
m_d3dpp.EnableAutoDepthStencil = TRUE; // Let Direct3D create and manage z-buffer
m_d3dpp.AutoDepthStencilFormat = D3DFMT_D16;
 

The camera captures at 25 FPS. My simulator runs as fast as the CPU lets it, and on my dev box that is almost 60 FPS.
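For what it's worth, the snippet above never sets PresentationInterval, so it stays at D3DPRESENT_INTERVAL_DEFAULT. Just a guess, not a confirmed fix, but pinning it explicitly is one thing I could rule out:

// Make Present() wait for the vertical retrace explicitly; the unset default
// is nearly equivalent, while D3DPRESENT_INTERVAL_IMMEDIATE (no wait) would
// be the tearing-prone setting.
m_d3dpp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;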

 

Edit: I render my images as textured quads; the texture size is configurable. If an image is larger than the texture size, more than one quad is needed.
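For clarity, the tiling boils down to this (imageWidth, imageHeight and textureSize are hypothetical names):

// Number of quads needed to cover the image, using ceiling division
const int quadsX = ( imageWidth  + textureSize - 1 ) / textureSize;
const int quadsY = ( imageHeight + textureSize - 1 ) / textureSize;
const int totalQuads = quadsX * quadsY;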

 

Edit 2: My application uses Windows Forms for its GUI, and the Direct3D part is done as a C++/CLI class which inherits from System.Windows.Forms.UserControl and wraps an unmanaged class that does the actual rendering. But I managed to make a smaller, all-unmanaged test which reproduces the problem, using plain WinAPI to render inside a window created with the CreateWindowEx function. And it still flickers...
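The skeleton of that test is the standard Win32 idle-rendering loop (a simplified sketch; RenderFrame is a stand-in for my actual drawing and Present() code):

MSG msg = {};
while( msg.message != WM_QUIT ) {
    if( PeekMessage( &msg, NULL, 0, 0, PM_REMOVE ) ) {
        TranslateMessage( &msg );
        DispatchMessage( &msg );
    } else {
        RenderFrame(); // stand-in: draws the latest frame and calls Present()
    }
}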


Setting world matrix on shader not working

07 March 2013 - 09:21 AM

I used to handle the transformation matrices (world, view, projection) at the "device level", so to speak. That is, I used calls like this:

 

m_pd3dDevice->SetTransform( D3DTS_WORLD, pNewVal );
m_pd3dDevice->SetTransform( D3DTS_VIEW, &matView );
m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj ); 

Everything was working perfectly that way. But when I added support for HLSL shader model 3.0, I was forced to define a vertex shader (otherwise my effect wouldn't compile). Until then, I hadn't needed a vertex shader because there was no need for custom transformations at that level. It seems that once you define a vertex shader, you must perform the world-view-projection transformation there; that's how I understood it, correct me if I'm wrong. So my vertex and pixel shaders ended up like this:

 

sampler2D Texture0;
float4x4 g_matProjection;
float4x4 g_matView;
float4x4 g_matWorld;

struct VertexShaderInput{
    float4 Position : POSITION;
    float2 TexCoord : TEXCOORD0;
};
struct VertexShaderOutput {
    float4 Position : POSITION;
    float2 TexCoord : TEXCOORD0;
};
VertexShaderOutput BasicVertexShader( VertexShaderInput input ) {
   VertexShaderOutput output;
   output.Position = mul( input.Position, g_matWorld );
   output.Position = mul( output.Position, g_matView );
   output.Position = mul( output.Position, g_matProjection );
   output.TexCoord = input.TexCoord; //Just pass it along
   return output;
}

struct PixelShaderOutput {
    float4 color : COLOR0;
};
PixelShaderOutput PixelShaderFirstPass( VertexShaderOutput input ) {
    PixelShaderOutput output;
    //Magic happens here
    //We use the tex2D function to sample Texture0; a minimal stand-in for
    //the omitted logic, so the snippet compiles:
    output.color = tex2D( Texture0, input.TexCoord );
    return output;
}

 

To make this work, I replaced the SetTransform calls with calls that set the appropriate effect variable, like so:

 

m_pd3dDevice->SetTransform( D3DTS_WORLD, pNewVal ); // Now  m_pEffect->setParameter( "g_matWorld", pNewVal );
m_pd3dDevice->SetTransform( D3DTS_VIEW, &matView );  //Now m_pEffect->setParameter( "g_matView", &matView );
m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj );  //Now m_pEffect->setParameter( "g_matProjection", &matProj );

void CEffect::setParameter( const string& strName, const D3DXMATRIXA16 * newVal ) {
   if( !m_pEffect ) return;
   // Keep the D3DX calls outside assert() so they aren't compiled out in release builds
   D3DXHANDLE handle = m_pEffect->GetParameterByName( NULL, strName.c_str() );
   assert( handle );
   HRESULT hr = m_pEffect->SetMatrix( handle, newVal );
   assert( SUCCEEDED( hr ) );
}

With this new setup, pan and zoom work, but when I draw my textured quads they all end up in the same position, one on top of another (I verified this using PIX, and I also checked that the world matrix is set right before drawing each quad, and with different values each time). I'm at a loss now. Did I misunderstand something about the way this should be done? The view and projection matrices are set BEFORE calling m_pEffect->Begin(), and the world transform is set between m_pEffect->Begin() and m_pEffect->End(). Could this be the reason?
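To make the ordering concrete, the render loop is structured like this (a sketch; the quad list and draw call are stand-ins for my code):

// Sketch of the ordering described above (quads and drawQuad are stand-ins):
m_pEffect->setParameter( "g_matView", &matView );             // before Begin()
m_pEffect->setParameter( "g_matProjection", &matProj );       // before Begin()

UINT numPasses = 0;
m_pEffect->Begin( &numPasses, 0 );
m_pEffect->BeginPass( 0 );
for( size_t i = 0; i < quads.size(); ++i ) {
    m_pEffect->setParameter( "g_matWorld", &quads[i].world ); // between Begin() and End()
    // Note: the D3DX docs say parameter changes made inside a BeginPass/EndPass
    // block only take effect after ID3DXEffect::CommitChanges() - something
    // I still need to check.
    drawQuad( quads[i] );
}
m_pEffect->EndPass();
m_pEffect->End();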


Present fails randomly on C++ Direct3D9 app

25 April 2012 - 09:11 AM

I have a mixed .NET application (managed and unmanaged code bridged by C++/CLI wrappers) which uses unmanaged Direct3D9 9.0c to render inside a control wrapped in a C++/CLI class which extends System.Windows.Forms.UserControl.

Everything had been working fine for years, but I recently put another instance of this Direct3D9 control inside a new Windows Form. That is, I now have two concurrent instances of the control. I had done that before and it worked, but now I have issues. Perhaps the problem was always there, and because it's related to a race condition, it only started showing up now...

OK, the details: when I try to render an image over a texture on the new instance of the control, sometimes it works and sometimes it doesn't. When it doesn't, nothing is rendered, and from that point onwards all Present calls fail (I log those failures). If I close the form and open it again, it may or may not work. Nasty.

If I use DXGetErrorString and DXGetErrorDescription, all I get is something like:

E_FAIL: Undetermined error

Not really helpful... To make matters worse, I can't reproduce the issue in my development environment; I only see it in a production box.

As far as I know, the main reason Present fails like this is a lost device. But I implemented the usual scheme to handle that (I can post the code if necessary), and it seems the device is not being lost... What else can I try?
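For reference, the scheme I implemented follows the standard pattern (a sketch; the release/recreate helpers are stand-ins for my code):

// Standard lost-device handling (sketch), run each frame before rendering:
HRESULT hr = m_pd3dDevice->TestCooperativeLevel();
if( hr == D3DERR_DEVICELOST ) {
    return; // device lost and not yet resettable; skip this frame
}
if( hr == D3DERR_DEVICENOTRESET ) {
    releaseDefaultPoolResources();   // stand-in: free all D3DPOOL_DEFAULT resources
    if( FAILED( m_pd3dDevice->Reset( &m_d3dpp ) ) )
        return;                      // Reset fails while default-pool resources are alive
    recreateDefaultPoolResources();  // stand-in: recreate them
}
// ...render and Present() as usual...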

Adding 3.0 Shader Model support

30 November 2011 - 07:04 AM

I have an application which renders using Direct3D9, more specifically pixel shaders via the Effect framework. I have an .fx file with techniques for the 2.0, 2.a and 2.b shader models. Each of these techniques has just a pixel shader compiled against the respective model, and I select the best one for the given GPU using the FindNextValidTechnique function. So my techniques look like this:


technique compileFor2_b {
	pass P0 {
		PixelShader = compile ps_2_b ps2FirstPassIgnoringBackground();
	}
}
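For completeness, the technique selection I mentioned looks like this (a sketch):

// Pick the first technique valid on the current device and activate it
D3DXHANDLE hTechnique = NULL;
if( SUCCEEDED( m_pEffect->FindNextValidTechnique( NULL, &hTechnique ) ) )
    m_pEffect->SetTechnique( hTechnique );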


However, if I try to write one of these for 3.0 or above, I get an error saying I must implement a vertex shader. The problem is that, according to what I've read (please correct me if I'm wrong), the view and projection transforms would then have to be applied in the vertex shader. Right now, I'm doing this:


void CD3DDevice::Configure( const CCamera& camera ) {
	D3DXMATRIXA16 matView = camera.calculateViewMatrix();
	m_pd3dDevice->SetTransform( D3DTS_VIEW, &matView );
	D3DXMATRIXA16 matProj = camera.calculateProjectionMatrix();
	m_pd3dDevice->SetTransform( D3DTS_PROJECTION, &matProj );
}


So many questions...

  1. Do I have to delete this method and implement its equivalent in a vertex shader for all techniques?
  2. If the answer to 1 is yes, how do I do that? (My current understanding is sketched after this list.) Or kindly point me to a book or resource; I've written some pixel shaders, but never a vertex shader.
  3. What about cards which support pixel shaders but don't support vertex shaders? How do I provide a fallback mechanism?
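From what I've read, the vertex shader a 3.0 technique demands can be a simple pass-through that applies the world-view-projection transform. This is a sketch of my current understanding, not verified; g_matWorldViewProj would be built from the same matrices I pass to SetTransform today:

// Sketch (unverified): minimal vertex shader so a ps_3_0 technique compiles.
// g_matWorldViewProj is assumed to be world * view * projection.
float4x4 g_matWorldViewProj;

struct VSInput  { float4 Position : POSITION; float2 TexCoord : TEXCOORD0; };
struct VSOutput { float4 Position : POSITION; float2 TexCoord : TEXCOORD0; };

VSOutput PassThroughVS( VSInput input ) {
    VSOutput output;
    output.Position = mul( input.Position, g_matWorldViewProj );
    output.TexCoord = input.TexCoord;
    return output;
}

technique compileFor3_0 {
	pass P0 {
		VertexShader = compile vs_3_0 PassThroughVS();
		PixelShader  = compile ps_3_0 ps2FirstPassIgnoringBackground();
	}
}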

Ignoring redundant SetSamplerState

22 March 2011 - 03:24 PM

Hi,
I use Direct3D9 on Windows XP machines, with Effects and Pixel Shaders (3.0).

I'm getting a lot of these warnings when running with the Debug Runtime. After googling around, I found that most people opt to ignore them, but others say these redundant calls can hurt performance quite a bit. Since I'm currently wringing every ounce of performance out of my app and have run out of ideas, I thought I'd try to solve this.

My current leads:

1) All these warnings appear when I call the Effect's End() method.
2) I read somewhere that calling SetTechnique() on the effect every render cycle can cause this, but that seems to be the usual practice. Is it? I didn't test this because it would require some dangerous refactoring.
3) I also read that using Direct3D state blocks can solve the issue, but the MSDN documentation is kinda vague, as usual. If anyone knows a good tutorial or post on using state blocks to get rid of these warnings, please pass me a link.
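One more lead, related to 1): by default ID3DXEffect::Begin() saves device state and End() restores it, which could be what emits the redundant calls. The documented D3DXFX_DONOTSAVESTATE flag skips that save/restore; a sketch, assuming my app re-establishes whatever state it needs each frame:

// Skip the effect's automatic state save/restore at Begin()/End()
UINT numPasses = 0;
m_pEffect->Begin( &numPasses, D3DXFX_DONOTSAVESTATE );
for( UINT i = 0; i < numPasses; ++i ) {
    m_pEffect->BeginPass( i );
    // ...draw calls...
    m_pEffect->EndPass();
}
m_pEffect->End();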
