Create a texture of size 4096x4096

Started by akira32
7 comments, last by ET3D 16 years, 3 months ago
My graphics card is an NVIDIA GeForce 7900 GT with 256MB of memory. I want to create a texture of size 4096x4096, but when I call CreateDepthStencilSurface it returns the error D3DERR_OUTOFVIDEOMEMORY. How do I solve this problem?

#define SHADOW_MAP_SIZE		4096
#define SHADOW_MAP_FORMAT	D3DFMT_R32F

			// setup shadow map objects: a single-mip R32F render target
			// plus a matching depth-stencil surface
			V_RETURN(pd3dDevice->CreateTexture(
				SHADOW_MAP_SIZE,
				SHADOW_MAP_SIZE,
				1,						// mip levels
				D3DUSAGE_RENDERTARGET,
				SHADOW_MAP_FORMAT,
				D3DPOOL_DEFAULT,
				&m_pShadowMapTex,
				NULL
				));

			V_RETURN(m_pShadowMapTex->GetSurfaceLevel(0, &m_pShadowMapSurf));

			V_RETURN(pd3dDevice->CreateDepthStencilSurface(
				SHADOW_MAP_SIZE,
				SHADOW_MAP_SIZE,
				D3DFMT_D24S8,
				D3DMULTISAMPLE_NONE,
				0,						// multisample quality
				TRUE,					// Discard
				&m_pShadowMapZ,
				NULL
				));


akira32 編程之家 Yahoo http://tw.myblog.yahoo.com/akira32-akira32
Are you using the debug runtimes? Any relevant debug output?
Have you created any more resources before this?
What size is your backbuffer?
How many backbuffers do you have (i.e. what's the value of your present params BackBufferCount member)?
Have you created a depth stencil surface automatically (EnableAutoDepthStencil member of present params)?
Have you checked your card caps to make sure it actually supports 4096x4096 textures (although I think all recent NVidia cards do)?
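For that last question, a minimal sketch of the caps check (reusing pd3dDevice and SHADOW_MAP_SIZE from the code you posted; error handling omitted):

    D3DCAPS9 caps;
    if (SUCCEEDED(pd3dDevice->GetDeviceCaps(&caps)))
    {
        // The device reports its maximum supported texture dimensions here.
        if (caps.MaxTextureWidth  < SHADOW_MAP_SIZE ||
            caps.MaxTextureHeight < SHADOW_MAP_SIZE)
        {
            // 4096x4096 textures are not supported on this device.
            return E_FAIL;
        }
    }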
Every GeForce starting from the 6xxx series supports 4096^2 textures.

Your code shows two allocations: one render target and one depth buffer. Did I understand you correctly that the render target allocation succeeds but the depth stencil allocation does not? If so, maybe you are indeed out of video memory: a 4096^2 R32F texture eats some 64MB of video memory, and a depth stencil surface of that size eats *at least* another 64MB. If you allocated some other resources and/or a high-resolution backbuffer before this, the 256MB of video memory might be exhausted, just like the error code says.
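The arithmetic behind those figures, as a rough sketch (real usage will be somewhat higher because of driver padding and bookkeeping):

    // Both formats are 32 bits (4 bytes) per texel.
    const unsigned int texels         = 4096u * 4096u;               // 16,777,216 texels
    const unsigned int renderTargetMB = texels * 4 / (1024 * 1024);  // 64 MB for the R32F target
    const unsigned int depthStencilMB = texels * 4 / (1024 * 1024);  // 64 MB for the D24S8 surface
    // 128 MB of the card's 256 MB is spoken for before anything else is allocated.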
----------
Gonna try that "Indie" stuff I keep hearing about. Let's start with Splatter.
Quote: Original post by Schrompf
Every GeForce starting from the 6xxx series supports 4096^2 textures.

Your code shows two allocations: one render target and one depth buffer. Did I understand you correctly that the render target allocation succeeds but the depth stencil allocation does not? If so, maybe you are indeed out of video memory: a 4096^2 R32F texture eats some 64MB of video memory, and a depth stencil surface of that size eats *at least* another 64MB. If you allocated some other resources and/or a high-resolution backbuffer before this, the 256MB of video memory might be exhausted, just like the error code says.
Oh, I didn't notice that.

If you already have a swap chain, you have a frontbuffer and a backbuffer. If they're both 4096x4096, then that's 128MB gone. The depth-stencil buffer is another 64MB and the render target texture is 64MB. That's all your 256MB gone in just pixel data, and the card will need to keep some more memory aside for bookkeeping information.
Thank you, Evil Steve!

Are you using the debug runtimes?
No

Any relevant debug output?
CreateDepthStencilSurface returns the HRESULT of D3DERR_OUTOFVIDEOMEMORY

Have you created any more resources before this?
CreateTexture with 4096x4096

What size is your backbuffer?
800x600

How many backbuffers do you have (I.e. what's the value of your present params BackBufferCount member)?
2

Have you created a depth stencil surface automatically (EnableAutoDepthStencil member of present params)?
0

Have you checked your card caps to make sure it actually supports 4096x4096 textures (although I think all recent NVidia cards do)?
Yes, my graphics card supports a maximum texture size of 4096x4096.

I think that
4096x4096 texture with 32-bits costs 64MB
and Create DepthStencil costs 64MB
Back Buffer (800x600) *2 costs 18.6MB
===========================================
146.6MB
akira32 編程之家 Yahoo http://tw.myblog.yahoo.com/akira32-akira32
You might consider not using a shadow map that big anyway. If your view distance is at all far, multiple smaller shadow maps would look nicer and save a lot of memory.
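As a purely illustrative sketch of that idea (the count and resolution here are arbitrary, not a recommendation), splitting the one 4096x4096 map into four 1024x1024 maps drops the render target cost from 64MB to 16MB:

    // Hypothetical: four 1024x1024 R32F shadow maps instead of one 4096x4096 map.
    const UINT kNumShadowMaps = 4;
    const UINT kShadowMapSize = 1024;
    IDirect3DTexture9* shadowMaps[kNumShadowMaps] = { NULL };

    for (UINT i = 0; i < kNumShadowMaps; ++i)
    {
        V_RETURN(pd3dDevice->CreateTexture(
            kShadowMapSize, kShadowMapSize, 1,
            D3DUSAGE_RENDERTARGET, SHADOW_MAP_FORMAT,
            D3DPOOL_DEFAULT, &shadowMaps[i], NULL));
    }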
Quote:Original post by akira32
Are you using the debug runtimes?
No
Why not? Writing DX code without the debug runtimes is like compiling your code and not testing it at all. Just because it works on your card doesn't mean it'll work on other cards (and vice versa). The debug runtimes help with that.
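For what it's worth, the debug runtime itself is selected in the DirectX Control Panel rather than in code; the one code-side tweak that sometimes helps alongside it (optional, not required) is defining D3D_DEBUG_INFO before including the header, so the D3D9 structures carry extra debug fields:

    // Optional: must appear before d3d9.h in every file that includes it.
    #define D3D_DEBUG_INFO
    #include <d3d9.h>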

Quote:Original post by akira32
Have you created any more resources before this?
CreateTexture with 4096x4096
Is that in addition to the one in the source you posted, or is it referring to that one?

Quote:Original post by akira32
What size is your backbuffer?
800x600

How many backbuffers do you have (I.e. what's the value of your present params BackBufferCount member)?
2

Have you created a depth stencil surface automatically (EnableAutoDepthStencil member of present params)?
0

Have you checked your card caps to make sure it actually supports 4096x4096 textures (although I think all recent NVidia cards do)?
Yes, my graphics card supports a maximum texture size of 4096x4096.

I think that
4096x4096 texture with 32-bits costs 64MB
and Create DepthStencil costs 64MB
Back Buffer (800x600) *2 costs 18.6MB
===========================================
146.6MB
Close. You have 2 backbuffers and one frontbuffer (i.e. you're currently triple buffering), and 800x600 at 32bpp is just over 1.8MB, although the graphics card may pad that to a power of two internally. If it allocates 1024x1024, that's 4MB per buffer, so the 3 buffers come to 12MB.

Although that still doesn't explain why it's saying it's running out of video memory. The debug runtimes might tell you more; perhaps there's some other error, and "out of video memory" is just the closest error code D3D has to describe it.
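One thing worth logging just before the failing call (only a rough sketch, since the value is an estimate rounded to the nearest MB) is how much texture memory the device thinks is still available:

    // Approximate free video/texture memory as seen by the runtime
    // (needs <cstdio> and <windows.h> for the logging part).
    UINT availableBytes = pd3dDevice->GetAvailableTextureMem();
    char msg[64];
    sprintf_s(msg, "Approx. available texture memory: %u MB\n", availableBytes / (1024 * 1024));
    OutputDebugStringA(msg);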
Well, that was interesting. I was running my own code and got D3DERR_OUTOFVIDEOMEMORY; it turned out to be because I had GLQuake running in the background. You don't have any other 3D apps running, do you?

And to go on about how good the debug runtimes are again, this is my debug output:
Quote:
Direct3D9: (ERROR) :Failed to create driver surface
Direct3D9: (ERROR) :Failed to initialize primary swapchain
Direct3D9: (ERROR) :Failed to initialize Framework Device. CreateDevice Failed.

D3D9 Helper: IDirect3D9::CreateDevice failed: D3DERR_OUTOFVIDEOMEMORY
I'd suggest creating just the depth stencil surface and seeing if it causes a problem by itself. If it does, that might mean a different limitation is being hit, maybe something to do with fast Z processing.
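A minimal isolation test along those lines might look like this (assuming a freshly created device with nothing else allocated):

    // Create only the 4096x4096 depth-stencil surface and nothing else.
    IDirect3DSurface9* pDepthOnly = NULL;
    HRESULT hr = pd3dDevice->CreateDepthStencilSurface(
        4096, 4096,
        D3DFMT_D24S8,
        D3DMULTISAMPLE_NONE, 0,
        TRUE,               // Discard
        &pDepthOnly,
        NULL);
    if (FAILED(hr))
    {
        // Still failing with nothing else allocated points to something
        // other than simple memory pressure.
    }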

