
Direct3D9 Invalid Texture Format


No replies to this topic

#1 Programmdude   Members   -  Reputation: 125

Posted 19 August 2012 - 01:18 AM

I have a project with three rendering backends (D3D9, D3D10 and GL2), and currently a sample project where I render a single texture. I know, groundbreaking stuff I'm doing.
Via DX10 I use the DXGI_FORMAT_R8G8B8A8_UNORM format, which works perfectly fine. Via GL2 I use GL_RGBA, which also works perfectly fine.
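For context, the DX10 path boils down to roughly this (a simplified sketch, not the actual project code; width, height, pixels and device stand in for the real variables):

#include <d3d10.h>

// Simplified sketch; width, height, pixels and device are placeholders.
D3D10_TEXTURE2D_DESC desc = {};
desc.Width            = width;
desc.Height           = height;
desc.MipLevels        = 1;
desc.ArraySize        = 1;
desc.Format           = DXGI_FORMAT_R8G8B8A8_UNORM; // bytes in memory: R, G, B, A
desc.SampleDesc.Count = 1;
desc.Usage            = D3D10_USAGE_IMMUTABLE;
desc.BindFlags        = D3D10_BIND_SHADER_RESOURCE;

D3D10_SUBRESOURCE_DATA init = {};
init.pSysMem     = pixels;      // tightly packed RGBA8 data
init.SysMemPitch = width * 4;

ID3D10Texture2D* tex = nullptr;
HRESULT hr = device->CreateTexture2D(&desc, &init, &tex);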
However, with DX9 I can either use D3DFMT_A8R8G8B8, which gives me reversed colors (b = r, r = b), or D3DFMT_A8B8G8R8, which crashes. According to MSDN, A8B8G8R8 should be identical in layout to the DXGI format:
http://msdn.microsof...4(v=vs.85).aspx
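The D3D9 call that triggers the error is essentially this (again a simplified sketch; the identifiers are placeholders for the real code):

#include <d3d9.h>

// Simplified sketch of the failing call; identifiers are placeholders.
IDirect3DTexture9* tex = nullptr;
HRESULT hr = device->CreateTexture(
    width, height,
    1,                  // mip levels
    0,                  // usage
    D3DFMT_A8B8G8R8,    // fails; D3DFMT_A8R8G8B8 works but swaps R and B
    D3DPOOL_MANAGED,
    &tex,
    NULL);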

The debug error messages are:
Direct3D9: (ERROR) :Invalid format specified for texture
Direct3D9: (ERROR) :Failure trying to create a texture

The code that produces the pixel data is the same for every backend, and so is pretty much every other bit of code.

CheckDeviceFormat() also fails for A8B8G8R8, so my question is: why does the exact same format work in DX10 and OpenGL but not in DX9? I have searched Google and MSDN and cannot find anything that implies A8B8G8R8 is a bad, evil format to work with.
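The only workaround I can think of is to probe the format up front and, when A8B8G8R8 is rejected, fall back to A8R8G8B8 and swap the red and blue bytes while copying the pixels in. A rough, untested sketch (the helper names are my own):

#include <d3d9.h>
#include <cstddef>
#include <cstdint>

// Ask the runtime whether fmt is usable as a plain texture on the
// default adapter (assuming a D3DFMT_X8R8G8B8 display mode).
bool SupportsTextureFormat(IDirect3D9* d3d, D3DFORMAT fmt)
{
    return SUCCEEDED(d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, D3DFMT_X8R8G8B8,
        0, D3DRTYPE_TEXTURE, fmt));
}

// If A8B8G8R8 is unsupported, copy the RGBA source into a locked
// A8R8G8B8 texture with the red and blue channels swapped
// (A8R8G8B8 is stored as B, G, R, A bytes in memory).
void CopyRgbaToBgra(uint8_t* dst, const uint8_t* src, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; ++i, dst += 4, src += 4)
    {
        dst[0] = src[2]; // B
        dst[1] = src[1]; // G
        dst[2] = src[0]; // R
        dst[3] = src[3]; // A
    }
}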
