Alex F

Bad performance of D3DX10CreateTextureFromMemory vs. D3DXCreateTextureFromFileInMemoryEx


I am translating an existing program from DX9 to DX10. The DX9 code:

D3DXCreateTextureFromFileInMemoryEx(
    g_pd3dDevice,      // LPDIRECT3DDEVICE9
    g_pMemoryBmpFile,  // BYTE*
    g_nFileSize,
    D3DX_DEFAULT,      // Width (taken from the file)
    D3DX_DEFAULT,      // Height (taken from the file)
    1,                 // MipLevels
    0,                 // Usage
    D3DFMT_UNKNOWN,    // Format (taken from the file)
    D3DPOOL_MANAGED,
    D3DX_DEFAULT,      // Filter
    D3DX_DEFAULT,      // MipFilter
    0,                 // ColorKey (disabled)
    NULL,              // pSrcInfo
    NULL,              // pPalette
    &g_pTexture );     // LPDIRECT3DTEXTURE9

Execution time: about 2 ms on my computer.
DX10 code:

D3DX10CreateTextureFromMemory(
    m_pDevice,               // ID3D10Device*
    m_pMemoryBitmapFile,     // BYTE*
    m_nMemoryBitmapFileSize,
    NULL,                    // D3DX10_IMAGE_LOAD_INFO* (defaults)
    NULL,                    // ID3DX10ThreadPump* (synchronous load)
    &pRes,                   // ID3D10Resource*
    NULL);                   // HRESULT* (per-load result, unused)

Execution time: 40-60 ms! The in-memory file is the same: a grayscale image in BMP format, 1024×1024. What is happening in DX10? Is it possible to create a texture from memory with the same performance as in DX9?

That time difference seems extreme, but you should be aware that resource creation in D3D10 is generally expected to be slower than in D3D9, owing to validation being done at creation time rather than at run time. The tradeoff, of course, is better run-time performance (assuming you get everything else right).

kubera - thanks, I will try the .DDS format. Speaking of .DDS files, what is the difference between P8 (an 8-bit color-indexed format) and A8 (an 8-bit single-channel format)? I need an 8 bpp grayscale format, so that every pixel in an image has a value from 0 to 255 and maps to an RGB color from (0,0,0) to (255,255,255).

mhagain - I am talking about runtime performance. My program creates textures dynamically at runtime for 2D animation.

Ah, OK. Rather than creating textures dynamically at runtime you should create them once at load time and then update them at runtime - see http://developer.amd.com/assets/GDC_2008_Ultimate%20Graphics%20Performance%20for%20DirectX%2010%20Hardware.pdf (slide 6 is the relevant one).
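The create-once / update-at-runtime pattern described above might look roughly like the sketch below. The D3D10 calls are shown in a comment because they need a live device; the names (`pTex`, `framePixels`) are illustrative, not from the original post. The part that is easy to get wrong is the row copy: `Map` returns a `RowPitch` that is usually wider than `width * bytes-per-pixel`, so a single `memcpy` of the whole image is not safe.

```cpp
#include <cstddef>
#include <cstdint>
#include <cstring>

// Copy a tightly packed source image into a destination buffer whose
// rows are dstPitch bytes apart. D3D10_MAPPED_TEXTURE2D::RowPitch is
// typically larger than rowBytes, hence the per-row copy.
void CopyRowsWithPitch(uint8_t* dst, size_t dstPitch,
                       const uint8_t* src, size_t rowBytes, size_t rows)
{
    for (size_t y = 0; y < rows; ++y)
        std::memcpy(dst + y * dstPitch, src + y * rowBytes, rowBytes);
}

/* Sketch of the D3D10 side (assumed API usage, untested here):

   // Once, at startup: a dynamic texture the CPU can rewrite.
   D3D10_TEXTURE2D_DESC desc = {};
   desc.Width = 1024; desc.Height = 1024;
   desc.MipLevels = 1; desc.ArraySize = 1;
   desc.Format = DXGI_FORMAT_R8_UNORM;        // 8-bit grayscale
   desc.SampleDesc.Count = 1;
   desc.Usage = D3D10_USAGE_DYNAMIC;
   desc.BindFlags = D3D10_BIND_SHADER_RESOURCE;
   desc.CPUAccessFlags = D3D10_CPU_ACCESS_WRITE;
   m_pDevice->CreateTexture2D(&desc, NULL, &pTex);

   // Every frame: overwrite the pixels, no resource creation.
   D3D10_MAPPED_TEXTURE2D mapped;
   pTex->Map(D3D10CalcSubresource(0, 0, 1),
             D3D10_MAP_WRITE_DISCARD, 0, &mapped);
   CopyRowsWithPitch((uint8_t*)mapped.pData, mapped.RowPitch,
                     framePixels, 1024, 1024);
   pTex->Unmap(D3D10CalcSubresource(0, 0, 1));
*/
```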

That is impossible in my case: the animation frames' contents are unknown at build time and at load time.

I do not know the indexed format; maybe it is a legacy one.
A8 is an alpha-only format.
You would need L8 or similar for grayscale.

P.S.
You might also consider generating mipmaps in the .DDS file.
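As a side note, D3D10 itself has no L8 luminance format; a common substitute (an assumption here, not from the thread) is `DXGI_FORMAT_R8_UNORM`, replicating the red channel in the pixel shader (e.g. `tex.rrr`). If four-channel data is needed on the CPU side instead, the grayscale-to-RGB mapping the original poster describes is simple to write out:

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Expand an 8-bit grayscale image to RGBA so that pixel value v
// becomes the color (v, v, v, 255) -- the mapping described above.
// Note this quadruples memory; keeping the data as R8_UNORM and
// swizzling in the shader is usually preferable.
std::vector<uint8_t> ExpandGrayToRGBA(const uint8_t* gray,
                                      size_t width, size_t height)
{
    std::vector<uint8_t> rgba(width * height * 4);
    for (size_t i = 0; i < width * height; ++i) {
        rgba[4 * i + 0] = gray[i]; // R
        rgba[4 * i + 1] = gray[i]; // G
        rgba[4 * i + 2] = gray[i]; // B
        rgba[4 * i + 3] = 255;     // A (opaque)
    }
    return rgba;
}
```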

I'm afraid you're just going to have to deal with this. Switching to a different format is extremely unlikely to help as your bottleneck isn't the amount of data you must send to the GPU but resource creation itself. Resource creation in D3D10/11 is slower than in D3D9 and the only way around it is to just not create resources at runtime. This is what MS advise, this is what the hardware vendors advise, and there is no magic button that will make it go away.

So: if you have a single texture and you know the size in advance, create it during startup and then update its contents at runtime.
If you have multiple textures and you don't know their sizes in advance, create a pool of textures during startup, then pick the most appropriate fit and update it at runtime.
If you can, pre-load as much as possible during startup.

But you need to get away from the "create resources at runtime" mentality as it's just not performant under D3D10/11 owing to the architecture changes.
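The pool idea above can be sketched as plain selection logic; `PooledTex` and its fields are hypothetical names, and the actual D3D resource handle is elided. The policy shown is best-fit: pick the smallest free pooled texture at least as large as the request, so wasted area is minimized.

```cpp
#include <cstddef>
#include <vector>

// A pooled texture slot: dimensions plus whatever resource handle the
// app keeps (an ID3D10Texture2D* in the real program; omitted here).
struct PooledTex {
    size_t width;
    size_t height;
    bool   inUse;
};

// Best-fit search: smallest free texture that still fits the request.
// Returns an index into the pool, or -1 if nothing fits (the caller
// could then grow the pool at a convenient moment rather than
// mid-frame).
int PickBestFit(const std::vector<PooledTex>& pool,
                size_t width, size_t height)
{
    int best = -1;
    size_t bestArea = (size_t)-1;
    for (size_t i = 0; i < pool.size(); ++i) {
        const PooledTex& t = pool[i];
        if (t.inUse || t.width < width || t.height < height)
            continue;
        size_t area = t.width * t.height;
        if (area < bestArea) {
            bestArea = area;
            best = (int)i;
        }
    }
    return best;
}
```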
