Reading texture pixel data



#1 wh1sp3rik   Members   -  Reputation: 248

Posted 10 August 2012 - 12:56 PM

Hello,

I need to read the pixels from a texture, but it just doesn't work for me.

ID3D11Resource * texture;
// No D3DX11_IMAGE_LOAD_INFO is passed here (pLoadInfo and pPump are both NULL).
D3DX11CreateTextureFromFile(Engine::GetInstance()->GetDevice(),
                            L"./Binary/Media/Textures/height.dds",
                            NULL, NULL, &texture, NULL);

D3D11_MAPPED_SUBRESOURCE mr;
Engine::GetInstance()->GetContext()->Map( texture, 0, D3D11_MAP_READ, 0, &mr );
Engine::GetInstance()->GetContext()->Unmap( texture, 0 );

After mapping, I don't get a valid pointer to the pixel data; mr.pData is just null.

The DDS texture contains a single 16-bit floating-point channel, which is used as the heightmap for a terrain.
What am I doing wrong?

Thank you very much.

Edited by wh1sp3rik, 10 August 2012 - 12:57 PM.

DirectX 11, C++


#2 Ripiz   Members   -  Reputation: 529

Posted 10 August 2012 - 01:30 PM

You need to set D3DX11_IMAGE_LOAD_INFO::CpuAccessFlags to D3D11_CPU_ACCESS_READ and pass the structure as the third argument. You'll also need to set the structure's other members; look into what values they should have for your case.

More info:
http://msdn.microsof...6(v=vs.85).aspx
http://msdn.microsof...7(v=vs.85).aspx
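
For what it's worth, a rough sketch of what that call might look like, reusing the file path from the original post; the remaining members still need values that are compatible with CPU reads, which MJP's reply below spells out:

D3DX11_IMAGE_LOAD_INFO loadInfo;                    // constructor defaults every member to D3DX11_DEFAULT
loadInfo.CpuAccessFlags = D3D11_CPU_ACCESS_READ;    // allow the CPU to read the mapped data
// ... Usage, BindFlags, MipLevels etc. also need CPU-read-compatible values (see the links above).

ID3D11Resource * texture = NULL;
HRESULT hr = D3DX11CreateTextureFromFile(
    Engine::GetInstance()->GetDevice(),
    L"./Binary/Media/Textures/height.dds",
    &loadInfo,                                      // the load-info structure goes in as the 3rd argument
    NULL, &texture, NULL);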

Edited by Ripiz, 10 August 2012 - 01:32 PM.


#3 wh1sp3rik   Members   -  Reputation: 248

Posted 10 August 2012 - 02:44 PM

I see :) I missed that parameter, thank you.

DirectX 11, C++


#4 MJP   Moderators   -  Reputation: 10632

Posted 10 August 2012 - 03:03 PM

Not all resources can be accessed by the CPU. It depends on the flags you passed when you created the resource, specifically the "Usage" and "CPUAccessFlags" members. There's a chart here that shows the valid combinations of these flags.

The D3DX texture loader will load your texture with D3D11_USAGE_IMMUTABLE, which gives you the fastest possible GPU performance for reading the texture. If you don't need to access the texture on the GPU, you can instead pass a D3DX11_IMAGE_LOAD_INFO structure through the pLoadInfo parameter and specify that you want D3D11_USAGE_STAGING and D3D11_CPU_ACCESS_READ. Then you should be able to Map your texture and read it on the CPU. If you also need to read the texture on the GPU, load a separate version of the texture that uses D3D11_USAGE_IMMUTABLE.
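
A minimal sketch of that approach, assuming the heightmap really is a single-mip DXGI_FORMAT_R16_FLOAT texture; here device and context stand in for Engine::GetInstance()->GetDevice() and GetContext(), and the half-to-float conversion uses XMConvertHalfToFloat (DirectXPackedVector.h in the Windows 8 SDK; the older DirectX SDK has the same function in xnamath.h):

#include <cstdint>
#include <d3d11.h>
#include <d3dx11.h>
#include <DirectXPackedVector.h>    // XMConvertHalfToFloat

// Load the DDS straight into a STAGING texture that the CPU is allowed to read.
D3DX11_IMAGE_LOAD_INFO loadInfo;
loadInfo.Usage          = D3D11_USAGE_STAGING;
loadInfo.BindFlags      = 0;                        // staging resources can't be bound to the pipeline
loadInfo.CpuAccessFlags = D3D11_CPU_ACCESS_READ;
loadInfo.MipLevels      = 1;

ID3D11Resource * resource = NULL;
HRESULT hr = D3DX11CreateTextureFromFile(device, L"./Binary/Media/Textures/height.dds",
                                         &loadInfo, NULL, &resource, NULL);
if (FAILED(hr)) { /* handle the error */ }

// Ask the texture for its dimensions.
ID3D11Texture2D * tex2d = NULL;
resource->QueryInterface(__uuidof(ID3D11Texture2D), (void**)&tex2d);
D3D11_TEXTURE2D_DESC desc;
tex2d->GetDesc(&desc);

// Map subresource 0 and walk the rows. RowPitch can be larger than Width * 2 bytes,
// so step row by row instead of treating the data as one tightly packed array.
D3D11_MAPPED_SUBRESOURCE mr;
hr = context->Map(resource, 0, D3D11_MAP_READ, 0, &mr);
if (SUCCEEDED(hr))
{
    const BYTE * rowStart = (const BYTE *)mr.pData;
    for (UINT y = 0; y < desc.Height; ++y)
    {
        const uint16_t * row = (const uint16_t *)(rowStart + y * mr.RowPitch);
        for (UINT x = 0; x < desc.Width; ++x)
        {
            // Convert the 16-bit half to a regular float height value.
            float height = DirectX::PackedVector::XMConvertHalfToFloat(row[x]);
            // ... store 'height' into the terrain heightfield ...
        }
    }
    context->Unmap(resource, 0);
}

tex2d->Release();
resource->Release();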

If you enable the debug version of the runtime, you'll get error messages in the debug output telling you when you make a mistake like this. You can turn it on by passing the D3D11_CREATE_DEVICE_DEBUG flag when creating your device.
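
As a sketch, turning that flag on at device creation might look something like this (the driver type and feature-level arguments here are just typical defaults, not something from the thread):

UINT flags = 0;
#if defined(_DEBUG)
flags |= D3D11_CREATE_DEVICE_DEBUG;    // routes validation errors/warnings to the debug output
#endif

ID3D11Device * device = NULL;
ID3D11DeviceContext * context = NULL;
HRESULT hr = D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, flags,
                               NULL, 0, D3D11_SDK_VERSION,
                               &device, NULL, &context);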



