
Render To Texture only color, no texture


I am trying to render my scene to a texture and then use that texture to map a cube in my scene (like a portal, hehe).

This is my frame method that first renders to the texture and then renders my scene.

[source lang="cpp"]void RenderFrame(void)
{
RenderToTexture();
// clear the back buffer to a deep blue
devcon->ClearRenderTargetView(backbuffer, D3DXCOLOR(0.0f, 0.2f, 0.4f, 1.0f));
// clear the depth buffer
devcon->ClearDepthStencilView(zbuffer, D3D11_CLEAR_DEPTH, 1.0f, 0);

RenderScene();
// switch the back buffer and the front buffer
swapchain->Present(0, 0);
}[/source]


This is my render-to-texture function:
[source lang="cpp"]void RenderToTexture() {

    // Set the render texture as the render target.
    m_RenderTexture.SetRenderTarget(devcon, zbuffer);

    // Clear the render texture background to green so we can differentiate it
    // from the rest of the normal scene.
    m_RenderTexture.ClearRenderTarget(devcon, zbuffer, 0.0f, 1.0f, 0.0f, 1.0f);

    // Render the scene now; it will draw into the render texture instead of the back buffer.
    RenderScene();

    // Reset the render target back to the original back buffer.
    devcon->OMSetRenderTargets(1, &backbuffer, zbuffer);
}[/source]

And this is my function that renders the scene:

[source lang="cpp"]void RenderScene(void) {


D3DXMATRIX matView, matProjection, matFinal;

// create a view matrix
D3DXMatrixLookAtLH(&matView,
&D3DXVECTOR3(0.0f, 3.0f, 15.0f), // the camera position
&D3DXVECTOR3(0.0f, 0.0f, 0.0f), // the look-at position
&D3DXVECTOR3(0.0f, 1.0f, 0.0f)); // the up direction

// create a projection matrix
D3DXMatrixPerspectiveFovLH(&matProjection,
(FLOAT)D3DXToRadian(45), // field of view
(FLOAT)SCREEN_WIDTH / (FLOAT)SCREEN_HEIGHT, // aspect ratio
1.0f, // near view-plane
100.0f); // far view-plane

// set the various states
devcon->RSSetState(pRS);
devcon->PSSetSamplers(0, 1, &pSS);

devcon->OMSetBlendState(BS_NO_ALPHA, 0, 0xffffffff);
m_model_level.CopyAndSetBuffers(devcon);
m_texture_shader.Render(dev,devcon,matView,matProjection,m_model_level.GetIndexCount(),true,pTexture);

devcon->OMSetBlendState(BS_ALPHA, 0, 0xffffffff);
m_model_cube.SlideUp();
m_model_cube.CopyAndSetBuffers(devcon);
m_color_shader.Render(dev,devcon,matView,matProjection,m_model_cube.GetIndexCount(),true);

devcon->OMSetBlendState(BS_NO_ALPHA, 0, 0xffffffff);
m_model_cube.CopyAndSetBuffers(devcon);
m_texture_shader.Render(dev,devcon,matView,matProjection,m_model_cube.GetIndexCount(),true,m_RenderTexture.GetShaderResourceView());


}[/source]

The problem is that I get the clear color from my RenderToTexture pass, but no texture is displayed on the cube.

Is there a quick way of seeing what the texture looks like? Can I save it to disk like a screenshot?

Here is some code to save the back buffer to a texture file. I guess you can take the usable parts to save any texture to a file.


[source lang="cpp"]// backbufferSurfDesc is assumed to be the D3D10_TEXTURE2D_DESC of the back buffer,
// obtained earlier (e.g. via GetDesc on the back buffer texture).
ID3D10Resource *backbufferRes;
_defaultRenderTargetView->GetResource(&backbufferRes);

D3D10_TEXTURE2D_DESC texDesc;
texDesc.ArraySize = 1;
texDesc.BindFlags = 0;
texDesc.CPUAccessFlags = D3D10_CPU_ACCESS_READ;
texDesc.Format = backbufferSurfDesc.Format;
texDesc.Height = backbufferSurfDesc.Height;
texDesc.Width = backbufferSurfDesc.Width;
texDesc.MipLevels = 1;
texDesc.MiscFlags = 0;
texDesc.SampleDesc = backbufferSurfDesc.SampleDesc;
texDesc.Usage = D3D10_USAGE_STAGING;

ID3D10Texture2D *texture;
HRESULT hr;
V( _device->CreateTexture2D(&texDesc, 0, &texture) );
_device->CopyResource(texture, backbufferRes);
V( D3DX10SaveTextureToFile(texture, D3DX10_IFF_DDS, filename) );
texture->Release();[/source]
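Since your code is D3D11, here is a minimal sketch of the same idea using D3DX11, assuming the devcon context and the m_RenderTexture object from your code. D3DX11SaveTextureToFile does the staging copy and readback internally, so the render target texture can be passed in directly:

[source lang="cpp"]// Sketch only: dump the render texture to a DDS file for inspection.
ID3D11Resource* res = NULL;
m_RenderTexture.GetShaderResourceView()->GetResource(&res);   // grab the underlying texture
HRESULT hr = D3DX11SaveTextureToFile(devcon, res, D3DX11_IFF_DDS, L"rendertexture.dds");
res->Release();[/source]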


Otherwise, have you checked your debug output? Have you run the program with PIX?

Typical errors with render targets include:

- Forgetting to unbind them as textures/render targets before using them elsewhere as a texture or render target.
- Invalid viewport settings.
- An invalid depth/stencil target paired with the render target (since D3D10 they must have exactly the same dimensions); see the sketch below.
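For the last two points, a minimal sketch (textureWidth/textureHeight are assumed to be the render texture's size, and dev/devcon are the device and context from your code; depthTex and rtDepthView are illustrative names):

[source lang="cpp"]// Sketch only: the depth buffer and viewport must match the render texture's dimensions.
D3D11_TEXTURE2D_DESC depthDesc = {};
depthDesc.Width            = textureWidth;          // same size as the render texture
depthDesc.Height           = textureHeight;
depthDesc.MipLevels        = 1;
depthDesc.ArraySize        = 1;
depthDesc.Format           = DXGI_FORMAT_D24_UNORM_S8_UINT;
depthDesc.SampleDesc.Count = 1;
depthDesc.Usage            = D3D11_USAGE_DEFAULT;
depthDesc.BindFlags        = D3D11_BIND_DEPTH_STENCIL;

ID3D11Texture2D* depthTex = NULL;
ID3D11DepthStencilView* rtDepthView = NULL;
dev->CreateTexture2D(&depthDesc, NULL, &depthTex);
dev->CreateDepthStencilView(depthTex, NULL, &rtDepthView);

// A viewport covering the whole render texture.
D3D11_VIEWPORT vp = {};
vp.Width    = (FLOAT)textureWidth;
vp.Height   = (FLOAT)textureHeight;
vp.MaxDepth = 1.0f;
devcon->RSSetViewports(1, &vp);[/source]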

Cheers!

Thank you, I will try to output the texture if I can't get this to work.
This is how I create the render target texture and then make a ShaderResourceView from it:

I think there is just some small error somewhere and the rest is correct. =(

[source lang="cpp"]bool Initialize(ID3D11Device* device, int textureWidth, int textureHeight)
{
D3D11_TEXTURE2D_DESC textureDesc;
HRESULT result;
D3D11_RENDER_TARGET_VIEW_DESC renderTargetViewDesc;
D3D11_SHADER_RESOURCE_VIEW_DESC shaderResourceViewDesc;


// Initialize the render target texture description.
ZeroMemory(&textureDesc, sizeof(textureDesc));

// Setup the render target texture description.
textureDesc.Width = textureWidth;
textureDesc.Height = textureHeight;
textureDesc.MipLevels = 1;
textureDesc.ArraySize = 1;
textureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
textureDesc.SampleDesc.Count = 1;
textureDesc.Usage = D3D11_USAGE_DEFAULT;
textureDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
textureDesc.CPUAccessFlags = 0;
textureDesc.MiscFlags = 0;

// Create the render target texture.
result = device->CreateTexture2D(&textureDesc, NULL, &m_renderTargetTexture);
if(FAILED(result))
{
return false;
}

// Setup the description of the render target view.
renderTargetViewDesc.Format = textureDesc.Format;
renderTargetViewDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
renderTargetViewDesc.Texture2D.MipSlice = 0;

// Create the render target view.
result = device->CreateRenderTargetView(m_renderTargetTexture, &renderTargetViewDesc, &m_renderTargetView);
if(FAILED(result))
{
return false;
}

// Setup the description of the shader resource view.
shaderResourceViewDesc.Format = textureDesc.Format;
shaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
shaderResourceViewDesc.Texture2D.MostDetailedMip = 0;
shaderResourceViewDesc.Texture2D.MipLevels = 1;

// Create the shader resource view.
result = device->CreateShaderResourceView(m_renderTargetTexture, &shaderResourceViewDesc, &m_shaderResourceView);
if(FAILED(result))
{
return false;
}

return true;
}[/source]

And to make things easier, this is my RenderFrame function:

[source lang="cpp"]void RenderFrame(void)
{
devcon->OMSetRenderTargets(1, &m_RenderTexture.m_renderTargetView, zbuffer);
devcon->ClearRenderTargetView(m_RenderTexture.m_renderTargetView, D3DXCOLOR(0.0f, 0.2f, 0.4f, 1.0f));
devcon->ClearDepthStencilView(zbuffer, D3D11_CLEAR_DEPTH, 1.0f, 0);

RenderScene();

devcon->OMSetRenderTargets(1, &backbuffer, zbuffer);
devcon->ClearRenderTargetView(backbuffer, D3DXCOLOR(0.0f, 0.2f, 0.4f, 1.0f));
devcon->ClearDepthStencilView(zbuffer, D3D11_CLEAR_DEPTH, 1.0f, 0);

RenderScene();

swapchain->Present(0, 0);
}[/source] Edited by KurtO

Not really an error, but are you sure you need all the precision of a DXGI_FORMAT_R32G32B32A32_FLOAT target? (Just a reality check.)

Well, another problem related to the render target texture can be found in your "void RenderScene(void)" function.

m_texture_shader.Render(dev,devcon,matView,matProjection,m_model_cube.GetIndexCount(),true,m_RenderTexture.GetShaderResourceView());

Namely, in the line above you are calling that particular function even when rendering to the render target. I think this case is described in the D3D docs (i.e. trying to use the render target as a texture while rendering to it). This may or may not be your problem; I just want to point out some code that has a logical flaw.
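When that happens, D3D11 normally nulls the conflicting binding for you and prints a warning in the debug output. As a minimal sketch, you could also unbind the shader resource slot explicitly before binding the texture as a render target again (this assumes the texture is bound at pixel shader slot 0, which your shader class does not show, so adjust the slot as needed):

[source lang="cpp"]// Sketch only: make sure the render texture is not still bound as a PS resource
// (e.g. from the previous frame) before it becomes the render target again.
ID3D11ShaderResourceView* nullSRV = NULL;
devcon->PSSetShaderResources(0, 1, &nullSRV);   // slot 0 assumed
devcon->OMSetRenderTargets(1, &m_RenderTexture.m_renderTargetView, zbuffer);[/source]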

Cheers!

Yeah, I see what you mean: I can't use the GetShaderResourceView result while I am still rendering to the render texture in the first pass.
I gave RenderScene a texture parameter, but still no luck. The clear color is there, but not the texture...

[source lang="cpp"]void RenderFrame(void)
{
devcon->OMSetRenderTargets(1, &m_RenderTexture.m_renderTargetView, zbuffer);
devcon->ClearRenderTargetView(m_RenderTexture.m_renderTargetView, D3DXCOLOR(0.0f, 0.2f, 0.4f, 1.0f));
devcon->ClearDepthStencilView(zbuffer, D3D11_CLEAR_DEPTH, 1.0f, 0);

RenderScene(pTexture);

devcon->OMSetRenderTargets(1, &backbuffer, zbuffer);
devcon->ClearRenderTargetView(backbuffer, D3DXCOLOR(0.0f, 0.2f, 0.4f, 1.0f));
devcon->ClearDepthStencilView(zbuffer, D3D11_CLEAR_DEPTH, 1.0f, 0);

RenderScene(m_RenderTexture.GetShaderResourceView());
// switch the back buffer and the front buffer
swapchain->Present(0, 0);
}[/source]

Well, that leaves only a few options: first, enable debugging and check your debug output for any D3D-related errors. After that, run PIX and take a frame capture to analyze what is going on.
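In case it helps, enabling the debug layer is just a matter of passing an extra flag at device creation. A minimal sketch, assuming the device is created with D3D11CreateDeviceAndSwapChain and reusing the swapchain/dev/devcon names from your code (scd is assumed to be your DXGI_SWAP_CHAIN_DESC):

[source lang="cpp"]// Sketch only: D3D11_CREATE_DEVICE_DEBUG makes D3D print errors and warnings
// to the debugger output window (requires the SDK debug layer to be installed).
UINT flags = 0;
#ifdef _DEBUG
flags |= D3D11_CREATE_DEVICE_DEBUG;
#endif
D3D11CreateDeviceAndSwapChain(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL,
                              flags, NULL, 0, D3D11_SDK_VERSION,
                              &scd, &swapchain, &dev, NULL, &devcon);[/source]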

Cheers!

WOW, PIX is a really cool tool for DirectX!

Now I can see how my first pass is rendering, and it is rendering a capture of my scene. So far so good.
I guess that something must be wrong with the init of the class that connects my first buffer to a ShaderResourceView,
because only the background color is coming into my shader.

Is there a way to look at variables in PIX?

It must be something wrong with my connection, because I saved the texture I want to capture to disk, and it also showed only green.

So this is the situation:

In PIX, the render-to-texture pass is rendering correctly, but I can't get its contents into a texture; only the clear color is saved.

So, is there a way of checking any of these variables?

[source lang="cpp"]bool Initialize(ID3D11Device* device, int textureWidth, int textureHeight)
{
D3D11_TEXTURE2D_DESC textureDesc;
HRESULT result;
D3D11_RENDER_TARGET_VIEW_DESC renderTargetViewDesc;
D3D11_SHADER_RESOURCE_VIEW_DESC shaderResourceViewDesc;


// Initialize the render target texture description.
ZeroMemory(&textureDesc, sizeof(textureDesc));

// Setup the render target texture description.
textureDesc.Width = textureWidth;
textureDesc.Height = textureHeight;
textureDesc.MipLevels = 1;
textureDesc.ArraySize = 1;
textureDesc.Format = DXGI_FORMAT_R32G32B32A32_FLOAT;
textureDesc.SampleDesc.Count = 1;
textureDesc.Usage = D3D11_USAGE_DEFAULT;
textureDesc.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
textureDesc.CPUAccessFlags = 0;
textureDesc.MiscFlags = 0;

// Create the render target texture.
result = device->CreateTexture2D(&textureDesc, NULL, &m_renderTargetTexture);
if(FAILED(result))
{
return false;
}

// Setup the description of the render target view.
renderTargetViewDesc.Format = textureDesc.Format;
renderTargetViewDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2D;
renderTargetViewDesc.Texture2D.MipSlice = 0;

// Create the render target view.
result = device->CreateRenderTargetView(m_renderTargetTexture, &renderTargetViewDesc, &m_renderTargetView);
if(FAILED(result))
{
return false;
}

// Setup the description of the shader resource view.
shaderResourceViewDesc.Format = textureDesc.Format;
shaderResourceViewDesc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
shaderResourceViewDesc.Texture2D.MostDetailedMip = 0;
shaderResourceViewDesc.Texture2D.MipLevels = 1;

// Create the shader resource view.
result = device->CreateShaderResourceView(m_renderTargetTexture, &shaderResourceViewDesc, &m_shaderResourceView);
if(FAILED(result))
{
return false;
}

return true;
}[/source]

When I change my depth buffer description (D3D11_DEPTH_STENCIL_VIEW_DESC) from DXGI_FORMAT_D32_FLOAT to DXGI_FORMAT_D24_UNORM_S8_UINT
it is working, but then the depth testing/culling is broken: my triangles now show through other meshes.

I have no idea which format to use.

Any tips?

The difference between those two formats is:

DXGI_FORMAT_D32_FLOAT specifies a depth buffer where the depth is stored as a 32-bit floating-point value.

DXGI_FORMAT_D24_UNORM_S8_UINT specifies that the depth is stored as a 24-bit unsigned normalized value. The remaining 8 bits (the _S8_UINT part) store the stencil value as an 8-bit unsigned integer.
You'll need to have another look at how you set up the depth-stencil state and how you create your graphics device and back buffer, to make sure you have depth testing AND stencil testing enabled along with back-face culling.
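As a minimal sketch of what a depth-stencil state with depth and stencil testing enabled could look like for a DXGI_FORMAT_D24_UNORM_S8_UINT buffer (dsDesc and dsState are illustrative names; dev and devcon are your device and context):

[source lang="cpp"]// Sketch only: depth-stencil state with depth and stencil testing enabled.
D3D11_DEPTH_STENCIL_DESC dsDesc = {};
dsDesc.DepthEnable      = TRUE;
dsDesc.DepthWriteMask   = D3D11_DEPTH_WRITE_MASK_ALL;
dsDesc.DepthFunc        = D3D11_COMPARISON_LESS;
dsDesc.StencilEnable    = TRUE;
dsDesc.StencilReadMask  = D3D11_DEFAULT_STENCIL_READ_MASK;
dsDesc.StencilWriteMask = D3D11_DEFAULT_STENCIL_WRITE_MASK;
dsDesc.FrontFace.StencilFunc        = D3D11_COMPARISON_ALWAYS;
dsDesc.FrontFace.StencilPassOp      = D3D11_STENCIL_OP_KEEP;
dsDesc.FrontFace.StencilFailOp      = D3D11_STENCIL_OP_KEEP;
dsDesc.FrontFace.StencilDepthFailOp = D3D11_STENCIL_OP_KEEP;
dsDesc.BackFace = dsDesc.FrontFace;

ID3D11DepthStencilState* dsState = NULL;
dev->CreateDepthStencilState(&dsDesc, &dsState);
devcon->OMSetDepthStencilState(dsState, 1);[/source]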

These are some good tutorials with excellent explanations of the parameters that should help you get to grips with setting this stuff up:
http://www.directxtutorial.com/Tutorial11/tutorials.aspx
This one is also a more complete example but a bit harder to follow.
http://www.rastertek.com/tutdx11.html

Hope this helps!
