textures and video cards

7 comments, last by Nik02 13 years, 6 months ago
1) How can I check to see what's the largest texture my graphics card can support?
2) What does the video card do if I try to create/use a texture that is bigger than it can support?
Depends on what library you use.
Are you referring to #1 or both?

Either OpenGL or DirectX.
1: glGetIntegerv with GL_MAX_TEXTURE_SIZE for 1D, 2D and cubemap textures, or GL_MAX_3D_TEXTURE_SIZE for 3D textures.
2: The glTexImage call is effectively ignored (as if you had never made it, with respect to the OpenGL state vector), and the GL_INVALID_VALUE error bit is set.
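As a sketch, the query and the error check described above might look like this (assuming a valid GL context is already current; the function name is illustrative):

```cpp
// Sketch: query texture size limits in OpenGL and detect an oversized
// texture request. Assumes a valid, current GL context.
#include <GL/gl.h>
#include <cstdio>

void QueryTextureLimits()
{
    GLint maxTexSize = 0;
    GLint max3DTexSize = 0;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTexSize);       // 1D/2D/cubemap
    glGetIntegerv(GL_MAX_3D_TEXTURE_SIZE, &max3DTexSize);  // 3D textures
    std::printf("Max 2D texture: %d, max 3D texture: %d\n",
                maxTexSize, max3DTexSize);

    // After an oversized glTexImage2D call, the error bit would be set:
    if (glGetError() == GL_INVALID_VALUE)
        std::printf("Texture dimensions exceeded the implementation limit\n");
}
```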
Thanks. Can anyone answer those with regards to Direct3D?
You can check using the Caps Viewer utility that comes with the DirectX SDK. I couldn't tell you how to do it in-program, though, if that's what you wanted.
2: In Direct3D, you can check the return value of the resource creation methods to determine whether or not the request to create the texture succeeded. If the return value indicates a failure, the texture interface pointer that you pass to the function should be considered invalid (though the methods shouldn't modify it in case of failure).
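A minimal sketch of that check in D3D9 (assuming a valid IDirect3DDevice9 pointer; the format and pool choices here are illustrative):

```cpp
// Sketch: check the return value of a D3D9 texture creation call.
// If creation fails, the returned pointer must not be used.
#include <d3d9.h>

IDirect3DTexture9* CreateCheckedTexture(IDirect3DDevice9* device,
                                        UINT width, UINT height)
{
    IDirect3DTexture9* texture = NULL;
    HRESULT hr = device->CreateTexture(width, height, 1, 0,
                                       D3DFMT_A8R8G8B8, D3DPOOL_MANAGED,
                                       &texture, NULL);
    if (FAILED(hr))
        return NULL; // creation failed; the pointer is not valid
    return texture;
}
```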

It is always best to check the capabilities of the hardware at runtime instead of assuming anything. All versions of D3D have methods of querying the capabilities of a graphics device.

In D3D9: IDirect3D9::GetDeviceCaps will get you a structure from which you can determine the maximum texture dimensions, among other things. The dimensions vary greatly across different hardware, but most cards that have D3D9-level drivers available at all will support at least 1024x1024.
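A sketch of the D3D9 caps query (assuming a valid IDirect3D9 pointer; the adapter and device-type choices are illustrative):

```cpp
// Sketch: check whether the default adapter supports a given 2D texture
// size, using the D3DCAPS9 structure returned by GetDeviceCaps.
#include <d3d9.h>

bool TextureSizeSupported(IDirect3D9* d3d, UINT width, UINT height)
{
    D3DCAPS9 caps = {};
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return false;
    return width  <= caps.MaxTextureWidth &&
           height <= caps.MaxTextureHeight;
}
```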

In D3D10: There are fixed limits. 2D texture maximum dimensions are 8192x8192.

In D3D11: There are fixed limits per device feature level; with D3D11 hardware, the maximum 2D texture dimensions are 16384x16384.
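The fixed limits above can be expressed as a simple mapping from feature level to guaranteed maximum 2D texture dimension. This sketch defines its own enum for illustration; in real code you would switch on the D3D_FEATURE_LEVEL value and use the SDK constants (e.g. D3D11_REQ_TEXTURE2D_U_OR_V_DIMENSION, which is 16384):

```cpp
// Sketch: guaranteed maximum 2D texture dimension per D3D feature level.
// The enum is defined locally for illustration; the numeric limits match
// the fixed values in the D3D specification.
#include <cassert>

enum class FeatureLevel { FL9_1, FL9_2, FL9_3, FL10_0, FL10_1, FL11_0 };

unsigned MaxTexture2DDimension(FeatureLevel level)
{
    switch (level)
    {
    case FeatureLevel::FL9_1:
    case FeatureLevel::FL9_2:  return 2048;
    case FeatureLevel::FL9_3:  return 4096;
    case FeatureLevel::FL10_0:
    case FeatureLevel::FL10_1: return 8192;
    case FeatureLevel::FL11_0: return 16384;
    }
    return 0;
}
```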

Niko Suni

Quote:
In D3D10: There are fixed limits. 2D texture maximum dimensions are 8192x8192.

In D3D11: There are fixed limits per device feature level; with D3D11 hardware, the maximum 2D texture dimensions are 16384x16384.


Thanks, Nik02. Where did you get those numbers from? I'm assuming that if I try to create a texture that's bigger than the limit the API call will just return some sort of error message, or will it actually go through and I'll just get some unexpected rendering issue?
The numbers are defined in the D3D headers, as per the API specifications. It is possible that the hardware could actually support bigger dimensions, but the API itself doesn't. Therefore, creating textures larger than the published limits is simply not supported.

I don't remember off the top of my head whether it is the API or the driver that enforces the maximum resource sizes in practice. In D3D9, it was always the driver's responsibility to report the correct values via the caps. However, since in D3D10 and D3D11 the resource management is virtualized through DXGI, it is entirely possible that the limits are strictly imposed by that system itself.

Since the video memory is indeed virtualized with DXGI, the actual resource limits - as supported by the hardware - are not very relevant from the application developer's perspective anyway.

That said, you do get an HRESULT with value E_OUTOFMEMORY from the texture creation methods if the system cannot allocate memory for the resource for any reason. Further, you can inspect the D3D debug stream to dig up the cause. In general, it is good practice to handle any error from these methods by using the FAILED macro, because you don't have a valid texture anyway if any failure is indicated.
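A sketch of that error handling in D3D11 (assuming a valid ID3D11Device pointer and a filled-in D3D11_TEXTURE2D_DESC; the function name is illustrative):

```cpp
// Sketch: handle texture-creation failure with the FAILED macro and
// check specifically for E_OUTOFMEMORY.
#include <d3d11.h>

ID3D11Texture2D* TryCreateTexture(ID3D11Device* device,
                                  const D3D11_TEXTURE2D_DESC& desc)
{
    ID3D11Texture2D* texture = NULL;
    HRESULT hr = device->CreateTexture2D(&desc, NULL, &texture);
    if (FAILED(hr))
    {
        if (hr == E_OUTOFMEMORY)
        {
            // The system could not allocate memory for the resource;
            // inspect the D3D debug layer output for details.
        }
        return NULL; // no valid texture on any failure
    }
    return texture;
}
```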

Niko Suni

This topic is closed to new replies.
