ferr

XNA 3.0, GetData from RenderTarget2D


I'm trying to create a List<> of Texture2D's from RenderTarget2D.GetTexture() so that I can [do something with all of the textures] after the scene has finished rendering. There is a shallow copy issue with simply adding the texture to the list, so I'm thinking using Texture2D.GetData<> is the only way to deep copy it over. There is a problem with this, though, from an MS article:
Quote:
On Windows, GetData and SetData will fail if Texture2D.ResourceManagementMode is ResourceManagementMode.Manual and the format cannot be used as a render target.
Is the second part of the sentence conditional on the first or are they just randomly saying that RenderTarget2D textures are incompatible with GetData? I'm getting "The type you are using for T in this method is an invalid size for this resource." when trying to call GetData. On top of that they removed ResourceManagementMode from XNA with the release of 2.0, and apparently replaced it with TextureUsage and BufferUsage.
PresentationParameters pp = device.PresentationParameters;
renderTarget = new RenderTarget2D(device, pp.BackBufferWidth, pp.BackBufferHeight, 1, device.DisplayMode.Format);
...
Texture2D deepCopyTexture = null; // creation code for this texture not shown (SetData below would need a real texture)
Texture2D originalTexture = renderTarget.GetTexture(); //RenderTarget2D
Color[] textureData = new Color[originalTexture.Width * originalTexture.Height];
originalTexture.GetData<Color>(textureData);
deepCopyTexture.SetData<Color>(textureData);

I messed around a lot with the type GetData uses, to see if it's something as simple as that, but it's always the same error. I noticed this in one of the GetData articles:
Quote:
An InvalidOperationException is thrown if an attempt is made to modify (for example, calls to the GetData or SetData methods) a resource that is currently set on a graphics device.
I call device.SetRenderTarget(0, null); before GetData so I don't think that's the issue. [Edited by - ferr on July 29, 2008 11:05:39 PM]

Quote:
Original post by ferr
Quote:
On Windows, GetData and SetData will fail if Texture2D.ResourceManagementMode is ResourceManagementMode.Manual and the format cannot be used as a render target.

Is the second part of the sentence conditional on the first or are they just randomly saying that RenderTarget2D textures are incompatible with GetData? I'm getting "The type you are using for T in this method is an invalid size for this resource." when trying to call GetData.


ResourceManagementMode.Manual means the surface is in the default memory pool, which makes locking (needed for Get/SetData) tricky. A render target surface is always created in the default memory pool IIRC, so I think these remarks are basically equivalent.

I *think* RenderTarget2D.GetTexture() uses IDirect3DDevice9::GetRenderTargetData under the hood though, which moves the texture data into the system memory pool so it can be locked. That's a bit of speculation, but from my experience it seems to work that way. At least you can happily render the returned texture onto the render target again, which would fail if you were rendering the target onto itself.

As for your error, I don't know why the Color struct doesn't work (seeing as it's basically a uint), but you could try using an int/uint as the type for the Get/SetData() calls and the temporary array. Assuming the texture format you're using is 32 bits per pixel, any 32-bit value type like int will work. It's also possible to extract color data from these ints, but I'll spare you that mess since it looks like you don't need it [smile] So to recap, this should work:


int[] textureData = new int[originalTexture.Width * originalTexture.Height];
originalTexture.GetData<int>(textureData);
deepCopyTexture.SetData<int>(textureData);
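
(And if you do end up needing the color channels, the 'mess' is only a few bit shifts. Just a sketch, assuming SurfaceFormat.Color maps to an A8R8G8B8 layout, so blue sits in the low byte of each packed int; other formats would need different shifts.)

// Unpack one of the 32-bit values back into its channels (A8R8G8B8 layout assumed).
int packed = textureData[0];
byte b = (byte)(packed & 0xFF);
byte g = (byte)((packed >> 8) & 0xFF);
byte r = (byte)((packed >> 16) & 0xFF);
byte a = (byte)((packed >> 24) & 0xFF);
Color color = new Color(r, g, b, a);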



Quote:

On top of that they removed ResourceManagementMode from XNA with the release of 2.0, and apparently replaced it with TextureUsage and BufferUsage.


Yeah, that bugged me too for a bit, but they rolled in strongly typed replacements like RenderTarget2D for example. It deviates a bit from D3D, but I think it is indeed more logical to have a strongly typed RenderTarget2D which is guaranteed to be a render target, rather than working with some generic surface object which happens to have some creation flag set.


Quote:
Noticed this in one of the GetData articles:
Quote:
An InvalidOperationException is thrown if an attempt is made to modify (for example, calls to the GetData or SetData methods) a resource that is currently set on a graphics device.

I call device.SetRenderTarget(0, null); before GetData so I don't think that's the issue.


Actually, I ran into this issue just yesterday, but I think it's unrelated to your problem.

After I set a texture as a shader parameter (constant), this error kept popping up whenever I tried to use Get/SetData(). Setting the shader parameter to null didn't seem to unbind the texture, so how are we supposed to unbind textures from the device? The only thing I could think of was to temporarily bind another texture to the shader, but that's not exactly elegant. Would anyone have any ideas on this?
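
Thinking out loud, the only other thing I can come up with is clearing the device's sampler slots directly instead of going through the effect. Just an untested sketch, and slot 0 plus the texture/array names are assumptions; whichever sampler the effect actually bound is the one that would need clearing:

// Unbind the texture from the device's sampler slot so GetData/SetData can lock it.
GraphicsDevice.Textures[0] = null;
someTexture.GetData<int>(data);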

Quote:
As for your error, I don't know why the Color struct doesn't work (seeing as it's basically a uint), but you could try using an int/uint as the type for the Get/SetData() calls and the temporary array. Assuming the texture format you're using is 32 bits per pixel, any 32-bit value type like int will work. It's also possible to extract color data from these ints, but I'll spare you that mess since it looks like you don't need it. So to recap, this should work:

int[] textureData = new int[originalTexture.Width * originalTexture.Height];
originalTexture.GetData<int>(textureData);
deepCopyTexture.SetData<int>(textureData);

Thanks for your help. I had tried using GetData with Color, uint, and Int32. I'll mess around with other types later to see if that's the problem. The thing that bugs me is that there are a handful of tutorials out there showing off how to use GetData (although I can't find a single one that is getting data from a RenderTarget2D's texture) and they use either Color, uint, or Int32.

I actually ran into a post (which I couldn't find further confirmation for) that said something to the effect that XNA stores texture color as 8-byte data and that you would need to supply an array to GetData with a size of Height*Width/2 (didn't work). Here's the thread: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=981013&SiteID=1

Seems there's something about the texture you retrieve from RenderTarget2D that is not easily compatible with GetData.

Quote:
Original post by remigius

After I set a texture as a shader parameter (constant), this error kept popping up whenever I tried to use Get/SetData(). Setting the shader parameter to null didn't seem to unbind the texture, so how are we supposed to unbind textures from the device? The only thing I could think of was to temporarily bind another texture to the shader, but that's not exactly elegant. Would anyone have any ideas on this?


Did you call CommitChanges after setting the effect parameter? An Effect won't actually change any device state until you call that or BeginPass.
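
For reference, the usual flow looks something like this; the parameter and texture names are just placeholders. Nothing reaches the device until BeginPass, or CommitChanges if you change a parameter while a pass is already active:

effect.Parameters["SceneTexture"].SetValue(someTexture); // placeholder parameter name

effect.Begin();
foreach (EffectPass pass in effect.CurrentTechnique.Passes)
{
    pass.Begin();
    // ...draw calls go here...
    // If a parameter is changed at this point, push it to the device with:
    // effect.CommitChanges();
    pass.End();
}
effect.End();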


Quote:
Original post by ferr
I actually ran into a post (which I couldn't find further confirmation for) that said something to the effect that XNA stores texture color as 8-byte data and that you would need to supply an array to GetData with a size of Height*Width/2 (didn't work). Here's the thread: http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=981013&SiteID=1

Seems there's something about the texture you retrieve from RenderTarget2D that is not easily compatible with GetData.


You *could* store the data in 8 bytes (instead of the 4-byte int), but that's probably even more counter-intuitive than using an int in the first place. The key here is that XNA doesn't care what format you put your data in, as long as there's enough room. You could, for example, also use an array of Height*Width*4 bytes. But with the default SurfaceFormat.Color using 32 bits per pixel, a 32-bit int is probably the most logical choice. I can imagine 8 bytes being useful for more exotic formats, but with that array size of Height*Width/2 I think it's just abusing the leniency of XNA.
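
To make that concrete, both of these should be accepted for a plain SurfaceFormat.Color texture, since the total byte count works out the same (just a sketch reusing the originalTexture from your snippet):

// 4 bytes per pixel either way; XNA only checks that the array's total size matches.
int[] asInts = new int[originalTexture.Width * originalTexture.Height];
originalTexture.GetData<int>(asInts);

byte[] asBytes = new byte[originalTexture.Width * originalTexture.Height * 4];
originalTexture.GetData<byte>(asBytes);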

Anyway, I wrote up a quick test and the code below seems to be working. Your snippet seems to be missing the creation code for deepCopyTexture, could you post that?


RenderTarget2D rt2d = new RenderTarget2D(GraphicsDevice, 800, 600, 1, SurfaceFormat.Color);
GraphicsDevice.SetRenderTarget(0, rt2d);
GraphicsDevice.Clear(Color.Red);
GraphicsDevice.SetRenderTarget(0, null);

Texture2D t2d = rt2d.GetTexture();

int[] data = new int[800 * 600];
t2d.GetData<int>(data);
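
If you want the full deep copy from there, something like this should do it (a sketch; the new texture just needs the same dimensions and a compatible 32-bit format):

// Push the copied data into a brand new texture of the same size and format.
Texture2D copy = new Texture2D(GraphicsDevice, 800, 600, 1, TextureUsage.None, SurfaceFormat.Color);
copy.SetData<int>(data);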


Quote:
Original post by MJP

Did you call CommitChanges after setting the effect parameter? An Effect won't actually change any device state until you call that or BeginPass.


Yep, I actually called it twice for good measure [smile]

It seems XNA is a little particular about what is set on the device anyway... With the above example, calling rt2d.GetTexture() failed with the same error when I hadn't set the rt2d on the device *at all* yet. After setting it and 'resolving' it by setting the device rendertarget to null, it did work as expected.

Quote:
Original post by remigius


Anyway, I wrote up a quick test and the code below seems to be working. Your snippet seems to be missing the creation code for deepCopyTexture, could you post that?


RenderTarget2D rt2d = new RenderTarget2D(GraphicsDevice, 800, 600, 1, SurfaceFormat.Color);
GraphicsDevice.SetRenderTarget(0, rt2d);
GraphicsDevice.Clear(Color.Red);
GraphicsDevice.SetRenderTarget(0, null);

Texture2D t2d = rt2d.GetTexture();

int[] data = new int[800 * 600];
t2d.GetData<int>(data);

As for deepCopyTexture's creation code, I planned to just cross that bridge when I got there. You've got GetData working with the texture from a RenderTarget2D, which was my problem; I'll test out that code with SurfaceFormat.Color when I can get back to my code.

I ended up using something that seems a little simpler, though. I'm just creating a Texture from device.ResolveBackBuffer() and adding the returned Texture to my list. It seems to work fine for what I'm trying to do.
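
In case it helps anyone else, it boils down to roughly this; sceneTextures is just a placeholder name for the List<Texture2D> from my first post:

// Grab the current back buffer into a ResolveTexture2D (which derives from Texture2D).
PresentationParameters pp = device.PresentationParameters;
ResolveTexture2D resolved = new ResolveTexture2D(device, pp.BackBufferWidth, pp.BackBufferHeight, 1, pp.BackBufferFormat);

device.ResolveBackBuffer(resolved); // copies the back buffer contents into the texture
sceneTextures.Add(resolved);        // works since ResolveTexture2D is a Texture2D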
