Texture Sizes In 2005 And Beyond

Started by
12 comments, last by GameDev.net 19 years, 4 months ago
I'm working on several projects for 2005 and I'm trying to "guesstimate" the largest texture size I can use. I have two dev machines: a low-end machine with a GeForce2 MX/MX 400 (around 4 years old), and a new computer I just bought with an onboard Intel 82845G Graphics Controller. (I design 2D projects, where larger textures let you pack more sprites/cells per texture.) Both cards can handle up to 64 MB of video memory, and both can handle texture sizes up to 2048x2048.

As far as end-user compatibility goes, what percentage of users do you think could handle these texture sizes? My guesses are in ( ):

2048x2048 (20%)
1024x1024 (40%)
512x512 (60%)
256x256 (80%)
128x128 (90+%)

I'd LOVE to use 1024x1024 textures for my projects, but based on my guesstimate above, that would severely limit my program's compatibility. Opinions please, thanks!
>=8192x>=8192 (<5%)
4096x4096 (30%)
2048x2048 (50%)
1024x1024 (80%)
512x512 (80%)
256x256 (99.99%)
128x128 (99.99%)
My guess, based on the fact that even onboard controllers handle up to 1k², and 2k² has been the standard since the original GeForce.
IIRC my old GeForce 3 ate up to 4k² textures.
Given that 2005 is about two weeks away, I'd say the same as today.
But why not use a dynamic system that sets everything up based on the hardware the program is running on?
Is there any 3d card that can't do 256x256?
AFAIK even the first 3dfx cards could do those. Before that, the only usable card was the Matrox Mystique, and I doubt you'll find anyone who still has one.

If you code a safety path that works with 256x256, you should get it running on every accelerating card out there.

Fruny: Ftagn! Ia! Ia! std::time_put_byname! Mglui naflftagn std::codecvt eY'ha-nthlei!,char,mbstate_t>

Valve's Source Engine Survey. Granted, this isn't for 2005, but it's a good indication of what's currently in use. It's useful to you in that, at least for the mass market, you can expect what is popular at the high end now to become more mainstream over the next year or so.

Although, pure speculation: you won't get anything higher than 4k*4k for a while yet. The next logical step up, an 8k*8k surface with a 32-bit pixel format, consumes a meaty 256 MB uncompressed... and only a couple of high-end chipsets have that much RAM in total... [grin]

Your "love" for 1024x1024 is probably fairly safe, though. At least the top 10 cards in the survey I linked should support it already, and there's no good reason why any newly released card would not!

hth
Jack

<hr align="left" width="25%" />
Jack Hoxley <small>[</small><small> Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]</small>

I'm using 1024x1024, and it's handled by the 10 machines of my colleagues and friends, even the ones that only have an early 32 MB GeForce2.
I don't expect to run my app on older computers; the frame rate would be too low anyway.
Delphi::Athena
I think you'll have to worry about total memory usage just as much as texture size. When I recently added a few 1024x1024 textures to my latest version, I noticed the game could no longer run on my backup dev machine, a laptop with a 16 MB GeForce2 Go. I had to include alternate textures that dropped all the way back down to 256x256 before everything would load again without lag. What I do now is check the total video memory at initialization, and if it's low, below 32 MB (or 64 MB for my next project), that's when I start cutting back and shrinking the textures dynamically.
VideoWorks Software | http://www.3dpoolpro.com
Thanks for the replies, a real eye-opener! 1024x1024 pretty much lets me do whatever I want, so I think I'll stick with that. ;)
Hooray for oversized textures: now our videogames can require multiple DVDs of game data! Yippee!

I have bad memories of logging on to Unreal Tournament servers and being faced with 20 minutes of downloads, for skins, of all things. Lossy texture compression and procedural textures are the future.

On that subject, are there any hardware architectures that allow for varying level of detail in the texture map, i.e. storing it in a quadtree instead of a flat bitmap (not just mipmapping)?
-- Single player is masturbation.
Quote: "now our videogames can require multiple DVDs of game data! Yippee!"
Just think of the fun/chaos (delete as appropriate) we can have when Blu-ray/HD-DVD with 20-30 GB of storage rolls out! [evil]

Quote: "are there any hardware architectures that allow for varying level-of-detail in the texturemap"
I don't think there are, short of maybe some high-end workstations. I remember reading that Pixar's deep shadow maps (ideal for rendering hair) only work on textures with variable texel size (essentially what you'd want), and as such couldn't be ported to consumer-level hardware.

hth
Jack


