
carkey

Member Since 13 Sep 2013
Offline Last Active Sep 18 2013 07:26 AM

Topics I've Started

CopySubresourceRegion copying wrong part on some devices

17 September 2013 - 04:01 AM

I'm working on a project where I want to use GPU picking. It's basically all set up fine, assigning unique colors to objects etc., but my current problem is that when I grab the 1x1 texture, it isn't taken from where the pointer is for some reason, but from about -250 pixels away in the x and y.

I start by rendering the scene to my own ID3D11RenderTargetView and then call GetResource() on the view to get its underlying ID3D11Resource.
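
For reference, that step looks something like this (a minimal sketch; the variable names are just illustrative):

ID3D11Resource* textureFromRenderTargetView = nullptr;
renderTargetView->GetResource(&textureFromRenderTargetView);
// ... use it as the CopySubresourceRegion() source, then Release() it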

 

This all works fine, and I know it is capturing correctly because I'm using the DirectXTK function SaveWICTextureToFile() to save this resource off to a PNG, which I can open on the desktop and it looks fine.

 

I then create a D3D11_TEXTURE2D_DESC of height 1px and width 1px and call CreateTexture2D() to create this texture.

I then create a D3D11_BOX whose left is at mouse position X and top is at mouse position Y and whose width and height are both 1.
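
For reference, the box is set up roughly like this (a sketch; mouseX and mouseY are assumed to be the cursor position in render-target pixels, and note that right and bottom are exclusive):

D3D11_BOX box1x1;
box1x1.left = mouseX;
box1x1.top = mouseY;
box1x1.front = 0;
box1x1.right = mouseX + 1;
box1x1.bottom = mouseY + 1;
box1x1.back = 1;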

I then call CopySubresourceRegion() like so:

 

context->CopySubresourceRegion(texture1x1, 0, 0, 0, 0, textureFromRenderTargetView, 0, &box1x1);

 

I then look at the pixel RGBA value but it's always wrong.
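
For completeness, I read the pixel back roughly like this (a sketch; context is the immediate device context):

D3D11_MAPPED_SUBRESOURCE mapped;
if (SUCCEEDED(context->Map(texture1x1, 0, D3D11_MAP_READ, 0, &mapped)))
{
    // DXGI_FORMAT_B8G8R8A8_UNORM stores the channels as B, G, R, A
    const BYTE* pixel = (const BYTE*)mapped.pData;
    BYTE b = pixel[0], g = pixel[1], r = pixel[2], a = pixel[3];
    context->Unmap(texture1x1, 0);
}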

 

I thought I might be able to debug it by creating a larger texture and seeing where it thinks the "mouse pointer" is:

So I changed the box to be 200x200 pixels rather than 1x1 and used the DirectXTK function to save it to a PNG.

 

If I compare the original buffer texture PNG and the new 200x200 PNG, there is something weird going on. The 200x200 image's top-left corner is nowhere near where the mouse pointer is; it is off by about -250 in the X and Y axes. Even weirder, the 200x200 image seems to have been scaled up: when I overlay it onto the original buffer texture PNG, the objects are definitely larger.

Does anyone know what is going on here and what I can do to solve it?

 

Below is the D3D11_TEXTURE2D_DESC I use for the 1x1 texture, along with its D3D11_SUBRESOURCE_DATA:

 

 

D3D11_TEXTURE2D_DESC desc;

desc.Width = 1;
desc.Height = 1;
desc.MipLevels = 1;
desc.ArraySize = 1;
desc.SampleDesc.Count = 1;
desc.SampleDesc.Quality = 0;
desc.Usage = D3D11_USAGE_STAGING; // staging so the CPU can read it back
desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM; // must match the source texture's format for CopySubresourceRegion
desc.BindFlags = 0; // staging textures can't be bound to the pipeline
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ | D3D11_CPU_ACCESS_WRITE;
desc.MiscFlags = 0;

D3D11_SUBRESOURCE_DATA subData;
BYTE buff[4] = {0, 0, 0, 0}; // one B8G8R8A8 pixel is 4 bytes
subData.pSysMem = (void *)buff;
subData.SysMemPitch = 4; // bytes per row
subData.SysMemSlicePitch = 4;

device->CreateTexture2D(&desc, &subData, &texture1x1);

 

I then create the box to have a 1 pixel height and width, starting at the pointer position.

 

The strange thing is, this works fine on some devices but not others. On my desktop and laptop it works fine, but on my (Windows) tablet it has this strange 200-pixel offset. Could it be something to do with the screen resolution or DPI scaling?

 

I really can't work out what's going on with this one particular device (which is the target device).

Any ideas?

 

Thanks for your time.

 

P.S. Disclaimer: this is a cross-post from Stack Overflow, but I put that question up a week ago and have had no response, so I thought I'd try here.

 


Scaling instance data individually in vertex shader

16 September 2013 - 05:54 AM

Hi,

 

I'm trying to scale instanced geometry individually in the vertex shader with DirectX and HLSL. I have an input layout that takes a bool, and if it is true I want to scale that instance.
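
For reference, the per-instance flag is declared in the input layout along these lines (a sketch; the semantic name and slot are just illustrative, and the bool is passed as a 32-bit uint since there is no one-bit vertex format):

{ "SCALEFLAG", 0, DXGI_FORMAT_R32_UINT, 1, D3D11_APPEND_ALIGNED_ELEMENT, D3D11_INPUT_PER_INSTANCE_DATA, 1 },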

 

The bool goes across to the shader fine but the scaling seems to be wrong.

 

Basically, I have a model matrix in the shader, and before I do:

position = mul(input.position, model);
position = mul(position, view);
position = mul(position, projection);

I first check whether this bool is true; if so, I do:

model = mul(model, scaleMatrix);

scaleMatrix is just a simple scale matrix where the scale values are 0.5.
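
For reference, I build it on the CPU side with DirectXMath and upload it through a constant buffer, roughly like this:

XMMATRIX scaleMatrix = XMMatrixScaling(0.5f, 0.5f, 0.5f);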

 

My problem is that it doesn't seem to be scaling the instance about its own origin (which is what I want); it is scaling it about some other origin, and I don't know why.

 

Any ideas?

 

I don't have much experience with instancing so any help would be greatly appreciated.

 

Thanks.


Trying to write a smoothing algorithm for meshes

13 September 2013 - 10:14 AM

Hi all,

 

I'm working on a small project and I've got to a point where I want to be able to smooth the meshes I've got.

 

I've had a quick Google around and found something called Laplacian smoothing (http://en.wikipedia.org/wiki/Laplacian_smoothing), but I'm not completely sure how it works in practice.
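
From what I can tell, the basic idea is that each pass moves every vertex to the average of its neighboring vertices. Below is a minimal sketch of one pass, assuming the mesh is stored as a vertex array with a precomputed neighbor list (the names and layout are just illustrative):

#include <vector>

struct Float3 { float x, y, z; };

// One pass of basic Laplacian smoothing: each vertex is replaced by the
// average of the vertices that share an edge with it.
void LaplacianSmooth(std::vector<Float3>& verts,
                     const std::vector<std::vector<int>>& neighbors)
{
    std::vector<Float3> result = verts; // read old positions, write new ones
    for (size_t i = 0; i < verts.size(); ++i)
    {
        if (neighbors[i].empty())
            continue; // isolated vertex: leave it where it is
        Float3 sum = { 0.0f, 0.0f, 0.0f };
        for (int n : neighbors[i])
        {
            sum.x += verts[n].x;
            sum.y += verts[n].y;
            sum.z += verts[n].z;
        }
        float inv = 1.0f / (float)neighbors[i].size();
        result[i].x = sum.x * inv;
        result[i].y = sum.y * inv;
        result[i].z = sum.z * inv;
    }
    verts = result;
}

Is that roughly right, and is it a reasonable approach?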

 

Does anyone have any good resources on what smoothing algorithms are out there, their pros/cons, etc.?

 

Thanks for your time.

