


Topics I've Started

Sampling the depth buffer in a shader in DX11

16 November 2014 - 07:50 AM



I want to sample the depth buffer in a screen space effect in a shader that draws a full screen quad, using the back buffer and depth buffer as input. Basically I want to adjust the alpha for every pixel based on its distance from the camera, so nearby objects get almost masked out while far away objects have alphas closer to 1.0. I create my depth-stencil buffer and view like this:

D3D11_TEXTURE2D_DESC depthStencilDesc;

depthStencilDesc.Width = mRenderWindow->getClientAreaWidth();
depthStencilDesc.Height = mRenderWindow->getClientAreaHeight();
depthStencilDesc.MipLevels = 1;
depthStencilDesc.ArraySize = 1;
depthStencilDesc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
depthStencilDesc.SampleDesc.Count = 1;
depthStencilDesc.SampleDesc.Quality = 0;
depthStencilDesc.Usage = D3D11_USAGE_DEFAULT;
depthStencilDesc.BindFlags = D3D11_BIND_DEPTH_STENCIL;
depthStencilDesc.CPUAccessFlags = 0; 
depthStencilDesc.MiscFlags = 0;

VERIFY(mD3DDevice->CreateTexture2D(&depthStencilDesc, 0, &mDepthStencilBuffer));
VERIFY(mD3DDevice->CreateDepthStencilView(mDepthStencilBuffer, 0, &mDepthStencilView));

I have a few questions.


1. Since I don't need to write to the depth buffer as I read from it in a shader, do I need to make any changes to the creation code above, or is it enough for me to unbind the depth buffer before binding it as a shader resource view?


2. Can I bind the mDepthStencilView as a shader resource view directly? Up to this point I have not sampled the depth buffer directly, only let the API use it in the depth test, so using it as an explicit shader input is new to me.


3. Since the format is DXGI_FORMAT_D24_UNORM_S8_UINT, that means there are 24 bits for the depth, right? How do I turn those into a floating-point depth value between 0 and 1 in the pixel shader? Is there a special sampler I need to create, or do I sample the xyz values and somehow combine those into a single value using bitwise operations in the shader?
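For reference, once I can read the raw [0,1] depth value, the math I have in mind is something like this C++ sketch: convert the nonlinear depth back to a linear view-space distance, then map that to an alpha. This assumes a standard perspective projection with the given near/far clip planes; the helper names are my own (untested):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical helper: map a nonlinear [0,1] depth-buffer value back to a
// linear view-space distance, assuming a standard perspective projection
// with the given near and far clip planes.
float LinearizeDepth(float d, float nearZ, float farZ)
{
    return (nearZ * farZ) / (farZ - d * (farZ - nearZ));
}

// Map the linear distance to an alpha: ~0 close to the camera, ~1 far away.
float AlphaFromDistance(float viewZ, float nearZ, float farZ)
{
    float a = (viewZ - nearZ) / (farZ - nearZ);
    return a < 0.0f ? 0.0f : (a > 1.0f ? 1.0f : a);
}
```

The same two formulas would live in the pixel shader; the C++ form is just to sanity-check the math.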


Thanks for the help!

"Expanding" a bezier curve to a "pipe"

29 October 2014 - 01:04 PM

My plan is to make a game where the level is sort of a half-pipe. The shape of the level (or course as I call it) is given by a cubic bezier curve. I can generate nice bezier curves to define the course layout, but I am a little unsure of how I would "expand" the curve into a 3D pipe so that I can generate physics and graphics meshes out of the lower half of that pipe.


The course consists of a number of bezier curves where the start tangent is mirrored from the end tangent of the previous curve, so that the curves form a nice continuous curve. I would like to specify a number of points along a segment (e.g. 10 or so) at which I want to generate "edge loops". These will then be the vertices of my level meshes.


Any ideas how I should go about generating the edge loop points? It is very important that the inner surface of the pipe remains smooth.
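To illustrate what I mean, here is a rough C++ sketch of how I currently imagine generating one edge loop: evaluate the curve point and tangent, build a frame from the tangent and a fixed world up, and place vertices on a circle in the perpendicular plane. The helper names and the fixed up vector are my own assumptions; a parallel-transport frame would probably be needed to keep the loops from twisting on steep sections:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 add(Vec3 a, Vec3 b) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 mul(Vec3 a, float s) { return { a.x * s, a.y * s, a.z * s }; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}
static Vec3 normalize(Vec3 v) {
    float len = std::sqrt(v.x*v.x + v.y*v.y + v.z*v.z);
    return mul(v, 1.0f / len);
}

// Point on a cubic Bezier at parameter t.
Vec3 BezierPoint(Vec3 p0, Vec3 p1, Vec3 p2, Vec3 p3, float t)
{
    float u = 1.0f - t;
    return add(add(mul(p0, u*u*u), mul(p1, 3*u*u*t)),
               add(mul(p2, 3*u*t*t), mul(p3, t*t*t)));
}

// First derivative of the same curve (the tangent direction).
Vec3 BezierTangent(Vec3 p0, Vec3 p1, Vec3 p2, Vec3 p3, float t)
{
    float u = 1.0f - t;
    return add(add(mul(add(p1, mul(p0, -1.0f)), 3*u*u),
                   mul(add(p2, mul(p1, -1.0f)), 6*u*t)),
               mul(add(p3, mul(p2, -1.0f)), 3*t*t));
}

// One edge loop: 'sides' vertices of radius r around the curve point,
// in the plane perpendicular to the tangent. Uses a fixed world up,
// so it assumes the tangent never becomes vertical.
std::vector<Vec3> EdgeLoop(Vec3 p0, Vec3 p1, Vec3 p2, Vec3 p3,
                           float t, float r, int sides)
{
    Vec3 center  = BezierPoint(p0, p1, p2, p3, t);
    Vec3 tangent = normalize(BezierTangent(p0, p1, p2, p3, t));
    Vec3 up      = { 0.0f, 1.0f, 0.0f };
    Vec3 side    = normalize(cross(up, tangent));
    Vec3 normal  = cross(tangent, side);

    std::vector<Vec3> loop;
    for (int i = 0; i < sides; ++i) {
        float a = 2.0f * 3.14159265f * i / sides;
        loop.push_back(add(center, add(mul(side,   r * std::cos(a)),
                                       mul(normal, r * std::sin(a)))));
    }
    return loop;
}
```

For the half-pipe I would presumably only keep the lower half of each loop's angles.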



Looking for cheap AO baking solutions

17 October 2014 - 01:24 AM



We have a project where a game level is built from tiles. We are using Unity but we don't export levels as Unity scenes, but instead we bundle the tile sets with the game and create new levels on the fly (basically info about where the tiles are and how they are rotated) and send the level data across the network to the game.


The game needs to run on fairly low-end mobile devices so we cannot afford SSAO or anything like that, but the levels would look so much nicer with some sort of AO. I am now roaming the net, looking for cheap AO or pseudo-AO solutions that we could use. When exporting the levels we have access to a lot of power (for instance I calculate occlusion culling data using compute shaders) and we can do some heavy lifting on the tools side, but the runtime needs to be dirt cheap.


Anything come to mind? You are free to suggest anything that even remotely looks like proper SSAO. Thanks!
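To give an idea of the level of cheapness I am after: since the levels are tile-based, I could imagine baking a crude per-tile AO factor from neighbor occupancy at export time and multiplying it into the lighting at runtime. A toy C++ sketch of that idea (the grid layout and the darkening step are made-up assumptions, not anything we have implemented):

```cpp
#include <cassert>
#include <vector>

// Toy tool-side pseudo-AO for a tile grid: each tile's AO factor is derived
// from how many of its four neighbors are occupied. The result would be
// baked (e.g. into vertex colors) and multiplied into lighting at runtime.
std::vector<float> BakeTileAO(const std::vector<int>& occupied, int w, int h)
{
    std::vector<float> ao(w * h, 1.0f);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            int n = 0;
            if (x > 0     && occupied[y * w + (x - 1)]) ++n;
            if (x < w - 1 && occupied[y * w + (x + 1)]) ++n;
            if (y > 0     && occupied[(y - 1) * w + x]) ++n;
            if (y < h - 1 && occupied[(y + 1) * w + x]) ++n;
            ao[y * w + x] = 1.0f - 0.15f * n;   // darken per occupied neighbor
        }
    }
    return ao;
}
```

Anything in that spirit, or smarter, is welcome.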

Find distinct colors in texture, using a Compute Shader

10 October 2014 - 11:58 PM



I have a scenario where I need to find the distinct colors in a large number of textures. Currently I am speeding this up by splitting each texture into four parts and handing off each part to one CPU thread. However, a coworker suggested I could do it on the GPU with a Compute Shader. Now, I have some understanding of compute shaders, but I am still wondering how this would be done. E.g., to find the distinct colors on the CPU I keep a list of "already encountered" colors that I fill up as I go. Can this be done in a Compute Shader using shared memory within the thread group? Also, I am not quite sure how to split up the work so that I get the best performance out of the Compute Shader.


Has anyone done something like this, or do you have any good online resources for me? I am using Unity, so I should have access to most of the DX11 Compute Shader features.



How do I change the rotation mode in Maya (LT) to world space?

29 May 2014 - 03:38 AM

The Maya docs say that the rotation tool can operate in different spaces:




Currently I can rotate my model in local space, but I would like to rotate it in world space, and I cannot figure out how and where to change the rotation mode.


I tried googling but with little success. Can you help a poor guy out? Thanks!


EDIT: Never mind, I found that double-clicking the tool icon opens the tool options menu.