Crosire

DX11
[SOLVED] Post Processing BackBuffer using Effects 11

13 posts in this topic

I'm currently trying to add post processing to an existing C++ DirectX 11 application. The task is to apply a dynamic FX shader, loaded from a file, directly to the backbuffer; all other rendering is already done by that point. In short: I have to get a copy of the backbuffer, apply some post-processing effects to it, and write the result back to the buffer before "IDXGISwapChain->Present" is called.

 

This was pretty easy to achieve in DirectX 9; however, I'm running into multiple issues with DirectX 11 now. I only found one example of post processing in D3D11 on MSDN, and it was written as a Windows 8 app, which I cannot build or run on my Windows 7 desktop, and the code did not make much sense to me anyway.

The first issue, the missing effects framework, was solved by using "Effects 11", which can already be downloaded from MSDN.

 

Now I'm a bit confused about how to continue. The basic idea of how it could work is as follows:

  1. Load the effect from a ".fx" HLSL file and compile it using "D3DX11CompileEffectFromFile"
  2. Get a copy of the backbuffer using "IDXGISwapChain->GetBuffer" and save it to an "ID3D11Texture2D" object
  3. Get the description of the texture to obtain the width and height of the screen
  4. Create a new render target and tell the device to use it
  5. Create a fullscreen quad and draw the screen texture onto it (using the loaded effect)
  6. Reset the render target and update the backbuffer with the drawn quad

 

The following code for steps one and two already works:

[spoiler]

1:

ID3DX11Effect *g_pEffect;

...

HRESULT hr;
ID3DBlob *m_pCompileBuffer;
ID3DInclude *m_pInclude = new CCustomInclude();

// Load effect file
hr = D3DX11CompileEffectFromFile(_T("shader.fx"), NULL, m_pInclude, NULL, NULL, device, &g_pEffect, &m_pCompileBuffer);
	
if (FAILED(hr))
{
	if (m_pCompileBuffer && m_pCompileBuffer->GetBufferSize() > 0)
	{
	    Log(1, "Error while compiling shader:\r\n\r\n%s\r\n", (char*)m_pCompileBuffer->GetBufferPointer());
	}
	else
	{
	    Log(1, "Error while loading shader");
	}
}

2 & 3:

ID3D11Texture2D *g_pScreenTexture;

...

// Get the backbuffer texture from the swapchain
hr = swapchain->GetBuffer(0, __uuidof(g_pScreenTexture), (PVOID*)&g_pScreenTexture);

if (SUCCEEDED(hr))
{
	// Query the description for the screen width and height
	D3D11_TEXTURE2D_DESC m_pDesc;
	g_pScreenTexture->GetDesc(&m_pDesc);
}

[/spoiler]
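
Step four is where it gets murky for me. One shortcut I'm considering is to render the quad straight back into the backbuffer through its own render target view, which would make step six unnecessary. Untested sketch (g_pRenderTarget is just a placeholder name):

// Sketch for step 4: create a render target view of the backbuffer
// and bind it, so the post-processed quad ends up in the backbuffer.
ID3D11RenderTargetView *g_pRenderTarget = NULL;
device->CreateRenderTargetView(g_pScreenTexture, NULL, &g_pRenderTarget);
context->OMSetRenderTargets(1, &g_pRenderTarget, NULL);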

 

I'm having problems applying the effect and drawing the fullscreen quad, though...

 

The vertex struct and layout for the quad are declared as follows:

[spoiler]

// Vertex Structure
struct Vertex
{
	public:
		Vertex() { }
		Vertex(float x, float y, float z, float rhw) : pos(x, y, z, rhw) { }
		Vertex(float x, float y, float z, float rhw, float tex1, float tex2) : pos(x, y, z, rhw), tex(tex1, tex2) { }
		Vertex(float x, float y, float z, float rhw, D3DXVECTOR2 tex) : pos(x, y, z, rhw), tex(tex) { }
		Vertex(D3DXVECTOR4 pos, float tex1, float tex2) : pos(pos), tex(tex1, tex2) { }
		Vertex(D3DXVECTOR4 pos, D3DXVECTOR2 tex) : pos(pos), tex(tex) { }
		Vertex(D3DXVECTOR4 pos) : pos(pos) { }

		static const D3D11_INPUT_ELEMENT_DESC layout[];

	private:
		D3DXVECTOR4 pos;
		D3DXVECTOR2 tex;
};

// Vertex Layout
const D3D11_INPUT_ELEMENT_DESC Vertex::layout[] =
{
	{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
	{ "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0 }
};

[/spoiler]

 

The fullscreen quad is set up like this:

[spoiler]

ID3D11Buffer *g_pVertexBuffer;

...

// Create the vertex buffer
Vertex quad[] =
{
	Vertex(D3DXVECTOR4(-1.0f, 1.0f, 0.5f, 1.0f), D3DXVECTOR2(0.0f, 0.0f)),
	Vertex(D3DXVECTOR4(1.0f, 1.0f, 0.5f, 1.0f), D3DXVECTOR2(1.0f, 0.0f)),
	Vertex(D3DXVECTOR4(-1.0f, -1.0f, 0.5f, 1.0f), D3DXVECTOR2(0.0f, 1.0f)),
	Vertex(D3DXVECTOR4(1.0f, -1.0f, 0.5f, 1.0f), D3DXVECTOR2(1.0f, 1.0f))
};

D3D11_BUFFER_DESC m_pVertexDesc = { sizeof(Vertex) * ARRAYSIZE(quad), D3D11_USAGE_DEFAULT, D3D11_BIND_VERTEX_BUFFER, 0, 0 };
D3D11_SUBRESOURCE_DATA m_pVertexData = { quad, 0, 0 }; 
device->CreateBuffer(&m_pVertexDesc, &m_pVertexData, &g_pVertexBuffer);

[/spoiler]

 

Drawing currently looks like this:

[spoiler]

// Update vertex declaration
UINT m_iStrides = sizeof(Vertex);
UINT m_iOffsets = 0;
context->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &m_iStrides, &m_iOffsets);
context->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLESTRIP);

// Begin drawing
D3DX11_TECHNIQUE_DESC m_pTechDesc;
g_pEffect->GetTechniqueByIndex(0)->GetDesc(&m_pTechDesc);

for (UINT iPass = 0; iPass < m_pTechDesc.Passes; iPass++)
{
	g_pEffect->GetTechniqueByIndex(0)->GetPassByIndex(iPass)->Apply(NULL, context);
	context->Draw(4, 0);
}

[/spoiler]
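
For completeness: an input layout has to be created and bound before drawing too; I do this against the first pass's signature, roughly like this (trimmed sketch):

// Sketch: create the input layout against the pass signature and
// bind it (this happens before the Draw call above).
D3DX11_PASS_DESC m_pPassDesc;
g_pEffect->GetTechniqueByIndex(0)->GetPassByIndex(0)->GetDesc(&m_pPassDesc);

ID3D11InputLayout *g_pVertexLayout = NULL;
device->CreateInputLayout(Vertex::layout, ARRAYSIZE(Vertex::layout), m_pPassDesc.pIAInputSignature, m_pPassDesc.IAInputSignatureSize, &g_pVertexLayout);
context->IASetInputLayout(g_pVertexLayout);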

 

I'm clearing the render target to red before drawing the quad, but the screen then only shows red, so sadly it doesn't seem to draw the quad at all...

I don't know if there is any way to check that.

 

Also, a post-processing shader normally requires me to pass the screen texture to it, so it can apply the effects to every pixel. The Effects framework provides "ID3DX11Effect->GetVariableByName" and allows one to set a variable to specific data, but it has no method for setting a texture directly. The nearest thing I found is "Variable->AsShaderResource()->SetResource(&MyResource)", but is creating a shader resource view of the screen texture really the most efficient way to pass it to the pixel shader?

 

I'm sorry for the tons of code in this post, but I found it the easiest way to show what I have so far. In DirectX 9 the steps described above worked without any problems, and no vertex buffer was required there, so the whole thing was shorter and easier to achieve.

I hope somebody has done something similar / post processing in D3D11/D3D10 before and can help me out here; I would really appreciate it.

 

Thank you and cheers,

Crosire

Edited by Crosire

It looks like the declared position format in your vertex layout is too small. You are using a 4-component float position, but only declaring a 3-component format. Since the offset in your layout to the texture coordinates is 16 bytes, this is likely the source of your problems. Switching to DXGI_FORMAT_R32G32B32A32_FLOAT for your position should help.

 

Did you get any output in the debug window? If your shader was expecting a float4, then your vertex layout would not have matched and it should have complained about that... Is the debug layer enabled on the device you are using?
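
For reference, the corrected layout would then look like this (based on the struct you posted):

// 4-component position format, matching the D3DXVECTOR4 member
// and the 16-byte offset to the texture coordinates.
const D3D11_INPUT_ELEMENT_DESC Vertex::layout[] =
{
	{ "POSITION", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
	{ "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT, 0, 16, D3D11_INPUT_PER_VERTEX_DATA, 0 }
};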

Yep, create your device with the D3D11_CREATE_DEVICE_DEBUG flag. We should actually make that a big red blinking sticky in this subforum ;)

The nearest thing I found is "Variable->AsShaderResource()->SetResource(&MyResource)", but is creating a shader resource view of the screen texture really the most efficient way to pass it to the pixel shader?

It's the only way. The non-effect counterpart would be ?SSetShaderResources (in your case PSSetShaderResources), so you need to create a shader resource view of your backbuffer texture.

For this to work you also need to create your swapchain with DXGI_USAGE_SHADER_INPUT, not only with DXGI_USAGE_RENDER_TARGET_OUTPUT. Alternatively you could render your scene first to an offscreen texture created with both D3D11_BIND_RENDER_TARGET and D3D11_BIND_SHADER_RESOURCE.

Be aware when switching targets and resources that you unbind (set NULL) things first: one cannot have a texture bound as a render target and as shader input simultaneously; the (debug) pipeline will complain and undo such an attempt (usually in an unintended way). This is a bit more of a challenge with the effect framework, since you have to find out which slots are currently used, but you can set the slots explicitly with the HLSL register keyword.
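
For illustration, device and swapchain creation with both of these would look roughly like this (a sketch; width, height and hWnd stand in for whatever your application uses):

// Sketch: a swapchain whose backbuffer can also be sampled as a
// shader input, plus the debug layer enabled at device creation.
DXGI_SWAP_CHAIN_DESC scd = { 0 };
scd.BufferDesc.Width = width;
scd.BufferDesc.Height = height;
scd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
scd.SampleDesc.Count = 1;
scd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT | DXGI_USAGE_SHADER_INPUT;
scd.BufferCount = 1;
scd.OutputWindow = hWnd;
scd.Windowed = TRUE;

HRESULT hr = D3D11CreateDeviceAndSwapChain(
	NULL, D3D_DRIVER_TYPE_HARDWARE, NULL,
	D3D11_CREATE_DEVICE_DEBUG, // the big red blinking sticky
	NULL, 0, D3D11_SDK_VERSION,
	&scd, &swapchain, &device, NULL, &context);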

Also: Show your shader code, please.

Thank you for your answers already! I have now set the position layout to a float4 using "DXGI_FORMAT_R32G32B32A32_FLOAT", which makes a bit more sense.

 

I'm used to PS 2.0, so the new HLSL format is a bit strange to me and I'm sure there is a mistake in there.

 

The shader is just a simple monochrome testing effect:

[spoiler]

Texture2D colorTex : register(t0);
SamplerState _sampler { Filter = MIN_MAG_MIP_POINT; AddressU = Clamp; AddressV = Clamp; };

float4 PostProcess_PS(float3 normal : NORMAL, float2 coord : TEXCOORD0) : SV_TARGET
{
    float4 result = colorTex.Sample(_sampler, coord);
    result.rgb = dot(float3(0.18, 0.41, 0.41), result.rgb);
    result.a = 1.0;
    return result;
}

technique10 PostProcess
{
    pass p0
    {
        SetPixelShader(CompileShader(ps_4_0, PostProcess_PS()));
    }
}

[/spoiler]

Edited by Crosire
0

Share this post


Link to post
Share on other sites
Use [tt]technique11[/tt], not [tt]technique10[/tt], in your effect file. This is a nitpick of the effect framework. Also: Where's your vertex shader? You need one. While you're at it, make sure the signatures match: the vertex shader must output the [tt]SV_Position[/tt] system value semantic.

I recommend testing the post process in a separate application (e.g. by loading a screenshot from your app as source) so you can enable the debug layer.

Edited by unbird

Thanks so far, you helped me a lot. The quad is now drawing fine and the shader / effect gets loaded and executed too.

 

FX:

[spoiler]

Texture2D colorTex;
SamplerState colorSampler { AddressU = Clamp; AddressV = Clamp; };

struct VS_INPUT
{
    float4 pos : POSITION;
    float2 txcoord : TEXCOORD0;
};
struct VS_OUTPUT
{
    float4 pos : SV_POSITION;
    float2 txcoord : TEXCOORD0;
};

VS_OUTPUT PostProcess_VS(VS_INPUT IN)
{
    VS_OUTPUT OUT;
	
    OUT.pos = IN.pos;
    OUT.txcoord = IN.txcoord;
	
    return OUT;
}

float4 PostProcess_PS(VS_OUTPUT IN) : SV_TARGET
{
    float4 color;

    //color = colorTex.Sample(colorSampler, IN.txcoord);
    //color.rgb = dot(float3(0.18, 0.41, 0.41), color.rgb);
    color.r = 1.0;
    color.g = 0.0;
    color.b = 0.0;
    color.a = 1.0;

    return color;
}

technique11 PostProcess
{
    pass p0
    {
        SetPixelShader(CompileShader(ps_4_0, PostProcess_PS()));
        SetVertexShader(CompileShader(vs_4_0, PostProcess_VS()));
    }
}

[/spoiler]

 

The full screen gets covered in red as expected now. The only thing left is to pass the screen texture to the pixel shader and let it do its job.

 

However, as soon as I remove the comments from the two lines above and comment out the other color code, I just get a black screen. I tried some different shader code and it's always just black. Lowering the alpha shows the original image a bit through the now-transparent quad, as long as I disable render target clearing.

 

I'm sending the texture to the shader with this code:

ID3D11ShaderResourceView *m_pResource;
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { DXGI_FORMAT_R32_FLOAT, D3D10_SRV_DIMENSION_TEXTURE2D };
m_pResourceDesc.Texture2D.MipLevels = 1;
m_pResourceDesc.Texture2D.MostDetailedMip = 0;
device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &m_pResource);

g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

"g_pScreenTexture11" contains the full screen, checked that via "D3DX11SaveTextureToFile". "g_pEffect11" also links to the effect, the other shader code works perfectly fine. The shader layout is the same as the vertex input layout declared in the C++ code too (as seen earlier), so I'm a bit lost here.

 

Thanks again for all the help so far; it's greatly appreciated. I'm really close to the target now ... just this little thing.

 

Cheers,

Crosire

Good, you narrowed it down somewhat: the sampling produces black; at least that's what I get.
 
Your easiest approach to narrow down the problem further would be to use a graphics debugger (PIX or the VS 2012 graphics debugger). Check the post-VS values, check if the texture is actually bound (the shader resource view, that is), debug pixels, etc.
 
Alternatively:
  • Check the texcoords by outputting float4(IN.txcoord, 0, 1). This should give a black-to-red gradient from left to right and a black-to-green gradient from top to bottom (and yellow at the bottom right).
  • Your sampler is a bit bare for my taste. Really set all the needed values (not sure what the effects framework uses by default); see the snippet below. Or set the sampler explicitly to NULL (you should find the corresponding variable type); this will give you the default sampler. If that gives you something useful, work from those values. For a post-process effect you want point sampling, for example.
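
A fully specified point-clamp sampler could look like this, for instance:

SamplerState colorSampler
{
    Filter = MIN_MAG_MIP_POINT;
    AddressU = Clamp;
    AddressV = Clamp;
    AddressW = Clamp;
};
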
Currently I don't see anything obvious yet ;)

Texcoords are set correctly; I get the expected green/red gradient image with no blue and alpha at 100%.

 

I tried using "D3DX11CreateShaderResourceViewFromFile" to send an existing image file to the shader instead, and that image gets drawn on screen with the effects applied, so the shader is fully working. It just doesn't read the data from the screen texture object (which does contain it; tested) when I send that one to the shader.

 

In code:

This works:

ID3D11ShaderResourceView *m_pResource;
D3DX11CreateShaderResourceViewFromFile(device, L"tex.bmp", NULL, NULL, &m_pResource, NULL);
g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

The code from my previous post does not:

ID3D11ShaderResourceView *m_pResource;
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { DXGI_FORMAT_R32_FLOAT, D3D10_SRV_DIMENSION_TEXTURE2D };
m_pResourceDesc.Texture2D.MipLevels = 1;
m_pResourceDesc.Texture2D.MostDetailedMip = 0;

device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &m_pResource);

g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(m_pResource);

 

After looking at it again, I found some mistakes already:

  • D3D11_SRV_DIMENSION_TEXTURE2D instead of D3D10_SRV_DIMENSION_TEXTURE2D (which evaluates to the same value in the end (4), but it just makes more sense)
  • DXGI_FORMAT_R32G32B32A32_FLOAT instead of DXGI_FORMAT_R32_FLOAT (the image obviously isn't made of red only; I want the full RGBA format here)

Still it doesn't work with the corrected code. Does anybody see another error in here?

 

And I just have to thank unbird again, you pushed me in the right direction :)

This is really strange. Can you force the debug layer through the control panel? Have you already tried a GPU debugger? I'm definitely out of clues...

I think it might be because the screen texture comes straight from the swapchain via "IDXGISwapChain->GetBuffer", as I pass a null texture interface object to it. The backbuffer was probably created without "D3D11_BIND_SHADER_RESOURCE", which makes it unusable for the shader resource view object.

 

I tried a different approach now:

Code which gets executed on startup:

// Create screen resource
D3D11_TEXTURE2D_DESC m_pTextureDesc = { m_pDesc.Width, m_pDesc.Height, 1, 1, DXGI_FORMAT_R32G32B32A32_FLOAT, 1, 0, D3D11_USAGE_DEFAULT, D3D11_BIND_SHADER_RESOURCE, 0, 0 };
device->CreateTexture2D(&m_pTextureDesc, NULL, &g_pScreenTexture11);

// Create shader resource view
D3D11_SHADER_RESOURCE_VIEW_DESC m_pResourceDesc = { m_pTextureDesc.Format, D3D11_SRV_DIMENSION_TEXTURE2D };
device->CreateShaderResourceView(g_pScreenTexture11, &m_pResourceDesc, &g_pScreenView11);

Code which gets executed every frame:

ID3D11Resource *m_pBackBuffer;

hr = swapchain->GetBuffer(0, __uuidof(m_pBackBuffer), (LPVOID*)&m_pBackBuffer);

if (SUCCEEDED(hr) && g_pScreenTexture11)
{
	// Update screen texture
	context->CopyResource(g_pScreenTexture11, m_pBackBuffer);

	// Update effect parameters
	g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(g_pScreenView11);

	...
}

 

I thought the shader view interface only stores a pointer to the screen texture, so it would pick up the updated data when I change the texture object it was initialized with.

 

The code now fails at "context->CopyResource(g_pScreenTexture11, m_pBackBuffer);" though. The texture is just empty.

 

I tried this one too:

ID3D11Resource *tex;
g_pScreenSurface11->GetResource(&tex);
D3DX11SaveTextureToFile(context, g_pScreenTexture11, D3DX11_IFF_BMP, L"tex.bmp");

But it crashes the application at the second line already; I have no idea why.

 

I'm going to try and put together a quick testing program, as I cannot attach PIX to other software with my project applied: PIX tries to hook the DirectX exported functions, which my own hook has already done, so it just crashes.

Edited by Crosire

I thought the shader view interface only stores a pointer to the screen texture, so it would pick up the updated data when I change the texture object it was initialized with.

Nope, the view is tightly bound to the resource; you can't change that after creation.
 

The code now fails at "context->CopyResource(g_pScreenTexture11, m_pBackBuffer);" though. The texture is just empty.

If you look at the docs of that function, you will see you have to check the compatibility of the source and target resource. I already suspected some format problem: is your backbuffer e.g. really 32F?
 

I tried this one too:
*snip*
But it crashes the application at the second line already; I have no idea why.

BMP can't cope with float formats, use DDS instead.

I'm going to try and put together a quick testing program, as I cannot attach PIX to other software with my project applied: PIX tries to hook the DirectX exported functions, which my own hook has already done, so it just crashes.

That leaves bare logging, hopefully. Dump the description of the resources and views in question using GetResource and GetDesc and compare them.
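
Something along these lines (a sketch, reusing your Log function from earlier):

// Sketch: dump both texture descriptions and compare them.
ID3D11Texture2D *m_pBackTex;
swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&m_pBackTex);

D3D11_TEXTURE2D_DESC m_pBackDesc, m_pCopyDesc;
m_pBackTex->GetDesc(&m_pBackDesc);
g_pScreenTexture11->GetDesc(&m_pCopyDesc);

Log(1, "Back: fmt %d, %ux%u, samples %u | Copy: fmt %d, %ux%u, samples %u",
    m_pBackDesc.Format, m_pBackDesc.Width, m_pBackDesc.Height, m_pBackDesc.SampleDesc.Count,
    m_pCopyDesc.Format, m_pCopyDesc.Width, m_pCopyDesc.Height, m_pCopyDesc.SampleDesc.Count);

m_pBackTex->Release();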

If you look at the docs of that function, you will see you have to check the compatibility of the source and target resource. I already suspected some format problem: is your backbuffer e.g. really 32F?

And that is the problem. DirectX 9 had the "StretchRect" function, which did the job; DirectX 11 is missing any real replacement. I don't see how I can copy the backbuffer resource to my texture when it has a different format?!

 

"CopyResource" and "ResolveSubresource" both require compatible formats. If I set up my texture with the same format as the backbuffer, those work of course, but the Shader View is not created properly again (see below).

Whatever I'm trying, either one or the other thing work, but not both together ...

 

BMP can't cope with float formats, use DDS instead.

It's not the save-texture line that makes it crash, it's "GetResource", and that's because the shader resource view object is NULL even though it was created earlier (I got my testing application working and can successfully debug it with the VS 11 graphics debugger).

 

Tried to create it every frame too:

SAFE_RELEASE(g_pScreenSurface11);
device->CreateShaderResourceView(g_pScreenTexture11, NULL, &g_pScreenSurface11);
ID3D11Resource *tex;
g_pScreenSurface11->GetResource(&tex);

Crashes; it is still NULL. It works when I create the texture from a file instead of using the backbuffer, so I assume either the format is the issue or the backbuffer is missing the "D3D11_BIND_SHADER_RESOURCE" flag (which is probably the case).

The solution would be to create a texture with that flag set and copy the backbuffer contents onto it. But all my attempts at that have failed so far, as shown above.

 

I hope I'm not too annoying here :)

Edited by Crosire
Yeah, you need the resource copy, and you need D3D11_BIND_SHADER_RESOURCE on your target texture to be able to sample from it. I don't see where you fail, so I'll just reiterate (sketch after the list):

- Your backbuffer is a given, so D3D11_BIND_SHADER_RESOURCE is not available on it
- Create a compatible texture: same format (and MS type!), same dimensions, this time with D3D11_BIND_SHADER_RESOURCE
- Create the view thereof
- Use it as the source for your postprocess
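
Put together, roughly like this (a sketch; it queries the backbuffer description instead of hardcoding a format, assumes a non-multisampled backbuffer, and the variable names are illustrative):

// Sketch: create a shader-readable copy of the backbuffer once ...
ID3D11Texture2D *backTex = NULL;
swapchain->GetBuffer(0, __uuidof(ID3D11Texture2D), (LPVOID*)&backTex);

D3D11_TEXTURE2D_DESC desc;
backTex->GetDesc(&desc); // same format, dimensions and MS type as the backbuffer
desc.BindFlags = D3D11_BIND_SHADER_RESOURCE;

ID3D11Texture2D *copyTex = NULL;
ID3D11ShaderResourceView *copySRV = NULL;
device->CreateTexture2D(&desc, NULL, &copyTex);
device->CreateShaderResourceView(copyTex, NULL, &copySRV);

// ... then, every frame before drawing the fullscreen quad:
context->CopyResource(copyTex, backTex);
g_pEffect11->GetVariableByName("colorTex")->AsShaderResource()->SetResource(copySRV);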

