
HomerSimpson

Member
  • Content Count: 7
  • Joined
  • Last visited

Community Reputation: 193 Neutral

About HomerSimpson
  • Rank: Newbie
  1. The y coordinate in the sample is always either greater than 1.0f or less than -1.0f, so the triangles would not be visible (see the clipping sketch after this post list). I am going to bed now, it is 3 AM here in Germany ^^
  2. It is definitely not correct if it outputs the same data on all slices when it is not supposed to... Are you sure that you are using a different projection for every slice? (See the per-slice sketch after this post list.)
  3. HomerSimpson

    Blurring in HLSL

    You need another texture, since you cannot read from and write to the same texture at the same time (in that case). As far as I know, that is the only way... Edit: Maybe it would also work if you synchronize all threads after reading and only do the writing afterwards.
  4. HomerSimpson

    Blurring in HLSL

    You could just create another texture of the same size as the shadow texture and set it as the render target. Then draw the shadow texture using that blur shader; the rendered texture then contains the blurred shadow texture. (See the blur-pass sketch after this post list.)
  5. Well, turns out I'm just stupid. The blend state was incorrect, but was magically working on the back buffer... Can anyone maybe explain to me why? I am relatively new to Direct3D, so be nice to me.

        blendDesc.RenderTarget[0].SrcBlend = D3D11_BLEND_SRC_ALPHA;
        blendDesc.RenderTarget[0].DestBlend = D3D11_BLEND_INV_SRC_ALPHA;
        blendDesc.RenderTarget[0].BlendOp = D3D11_BLEND_OP_ADD;
        blendDesc.RenderTarget[0].SrcBlendAlpha = D3D11_BLEND_ONE;
        blendDesc.RenderTarget[0].DestBlendAlpha = D3D11_BLEND_ZERO; // changed to: D3D11_BLEND_INV_SRC_ALPHA;
        blendDesc.RenderTarget[0].BlendOpAlpha = D3D11_BLEND_OP_ADD;

    Is that correct now? Thanks, and sorry that I wasted your time.
  6. So I have set up a render target with its texture created identically to the back buffer of the swap chain, except for the BindFlags, which are set to Render_Target as well as Shader_Resource:

        DXGI_FORMAT format = DXGI_FORMAT_R8G8B8A8_UNORM;

        D3D11_TEXTURE2D_DESC td;
        ZeroMemory(&td, sizeof(td));
        td.Width = width;
        td.Height = height;
        td.MipLevels = 1;
        td.ArraySize = 1;
        td.Format = format;
        td.SampleDesc.Count = 1;
        td.SampleDesc.Quality = 0;
        td.Usage = D3D11_USAGE_DEFAULT;
        td.BindFlags = D3D11_BIND_RENDER_TARGET | D3D11_BIND_SHADER_RESOURCE;
        td.CPUAccessFlags = 0;
        td.MiscFlags = 0;

        hr = g_pd3dDevice->CreateTexture2D(&td, 0, &pRenderTargetTexture);
        if (FAILED(hr)) return hr;

        hr = g_pd3dDevice->CreateRenderTargetView(pRenderTargetTexture, NULL, &pRenderTargetView);
        if (FAILED(hr)) return hr;

    When I am rendering onto the back buffer of the swap chain, the alpha blending works perfectly. But as soon as I change the render target to the one above, it is not applied anymore; only AlphaToCoverage still works...

    Do you have any idea whether that is related to the BindFlags, or am I missing something?

    Thanks.
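Clipping sketch for post 1: a minimal illustration of why y values outside the [-1, 1] range hide the triangles, under the assumption that the sample passes positions straight through to the rasterizer as clip-space coordinates with w = 1. The Vertex struct and the example values are hypothetical, not taken from the original thread.

    // Hypothetical pass-through case: positions reach the rasterizer as clip-space
    // coordinates with w = 1, so they are effectively NDC already.
    struct Vertex { float x, y, z; };

    bool IsInsideClipRange(const Vertex& v)
    {
        return v.x >= -1.0f && v.x <= 1.0f &&
               v.y >= -1.0f && v.y <= 1.0f &&
               v.z >=  0.0f && v.z <= 1.0f;   // Direct3D clip-space z range
    }

    // A vertex such as { 0.0f, 1.5f, 0.5f } fails the y test, so triangles using
    // it are clipped (partially or entirely) and never appear on screen.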
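Per-slice sketch for post 2: a minimal sketch of rendering each slice of a Texture2DArray with its own projection. The names pSliceTexture, pCameraCB, sliceViewProj, the slice format and the assumption that the constant buffer holds exactly one 4x4 matrix are all illustrative, not from the thread.

    #include <d3d11.h>
    #include <DirectXMath.h>

    // Render the scene once per array slice, each time with a different view-projection.
    void RenderAllSlices(ID3D11Device* pDevice, ID3D11DeviceContext* pContext,
                         ID3D11Texture2D* pSliceTexture, ID3D11Buffer* pCameraCB,
                         const DirectX::XMFLOAT4X4* sliceViewProj, UINT sliceCount)
    {
        for (UINT slice = 0; slice < sliceCount; ++slice)
        {
            // A render target view that targets exactly one slice of the array.
            D3D11_RENDER_TARGET_VIEW_DESC rtvDesc = {};
            rtvDesc.Format = DXGI_FORMAT_R32_FLOAT;             // assumed slice format
            rtvDesc.ViewDimension = D3D11_RTV_DIMENSION_TEXTURE2DARRAY;
            rtvDesc.Texture2DArray.MipSlice = 0;
            rtvDesc.Texture2DArray.FirstArraySlice = slice;
            rtvDesc.Texture2DArray.ArraySize = 1;

            ID3D11RenderTargetView* pSliceRTV = nullptr;
            if (FAILED(pDevice->CreateRenderTargetView(pSliceTexture, &rtvDesc, &pSliceRTV)))
                return;

            // Upload a *different* view-projection matrix for this slice
            // (pCameraCB is assumed to be a DEFAULT-usage buffer of one matrix).
            pContext->UpdateSubresource(pCameraCB, 0, nullptr, &sliceViewProj[slice], 0, 0);

            pContext->OMSetRenderTargets(1, &pSliceRTV, nullptr);
            // ... draw the scene here using the constant buffer updated above ...

            pSliceRTV->Release();
        }
    }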
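Blur-pass sketch for posts 3 and 4: a minimal sketch of the extra-texture approach, assuming the shadow map is readable through pShadowSRV, the second same-sized texture is writable through pBlurTargetRTV, and pFullscreenVS, pBlurPS and pLinearSampler are a full-screen-triangle vertex shader, the blur pixel shader and a sampler state. All of these names are illustrative assumptions, not from the thread.

    #include <d3d11.h>

    // One blur pass: read the shadow texture, write the blurred result into a
    // second texture of the same size (the two cannot be the same resource).
    void BlurShadowTexture(ID3D11DeviceContext* pContext,
                           ID3D11ShaderResourceView* pShadowSRV,
                           ID3D11RenderTargetView* pBlurTargetRTV,
                           ID3D11VertexShader* pFullscreenVS,
                           ID3D11PixelShader* pBlurPS,
                           ID3D11SamplerState* pLinearSampler)
    {
        // Bind the second texture as the render target first, so the shadow
        // texture is no longer bound for writing anywhere.
        pContext->OMSetRenderTargets(1, &pBlurTargetRTV, nullptr);

        // Now the shadow texture can safely be read in the pixel shader.
        pContext->PSSetShaderResources(0, 1, &pShadowSRV);
        pContext->PSSetSamplers(0, 1, &pLinearSampler);

        pContext->VSSetShader(pFullscreenVS, nullptr, 0);
        pContext->PSSetShader(pBlurPS, nullptr, 0);

        // Full-screen triangle generated in the vertex shader from SV_VertexID,
        // so no vertex buffer or input layout is needed.
        pContext->IASetInputLayout(nullptr);
        pContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        pContext->Draw(3, 0);

        // Unbind the SRV again so the shadow texture can be used as a render
        // target next frame without a binding conflict.
        ID3D11ShaderResourceView* nullSRV = nullptr;
        pContext->PSSetShaderResources(0, 1, &nullSRV);
    }

Afterwards the blur target texture holds the blurred shadow map and can be bound as a shader resource for the lighting pass; for a separable blur, the same pass can be run twice, swapping the read and write textures between the horizontal and vertical steps.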