
Loading Uncompressed Textures


#1 Vincent_M   Members   -  Reputation: 616


Posted 12 August 2014 - 04:20 PM

My new engine's editor allows me to tweak my textures' properties on a per-platform basis: filtering, wrap modes, max size (so my artists can make high-res textures), optional mipmap generation, etc. I will also allow the user to determine which format the texture data will be stored as (32-bit RGBA, 16-bit RGBA_5551/RGB_565, 8-bit alpha, PVRTC 2/4-bit via the provided compression tools, etc.). Then I'll write its properties and its pixel data to my own custom texture format. The textures are loaded in-editor using the FreeImage library. Would this be a good way to go about it, or should I consider compression-encoding non-PVRTC textures? That would require extra data pools to stream in temporary compressed data so it can be uncompressed, leading to longer load times than a single fread(). Textures can get big quickly: a single 1024x1024 texture takes up 4 MB of disk space if stored uncompressed, plus its small header data. I could get around that by putting all my files into zip archives. Does a custom file format sound like a good idea?

 

My texture system extends beyond typical 2D textures as well. For example, I'll have a cube-map editor that'll put all my cube maps together into a single "atlas" of sorts. I'll also have a 3D texture editor and 1D texture support. The idea is that the data is serialized into file formats that my engine can readily load up with only a handful of fread() calls. They're large because I'm banking on uncompressed texture support (except for PVRTC), but that's manageable. Of course, loading zips from an APK on Android will prove annoying, as I'll have to copy all of my zips from the APK (which is itself a zip archive) into a cached directory.
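Since the goal above is loading with a handful of fread() calls, here is a minimal C sketch of what such a custom format's loader could look like. The header fields, the struct name, and the magic value are all invented for illustration; a real format would also need to pin down endianness and struct padding.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical header layout -- field names and sizes are illustrative. */
typedef struct {
    uint32_t magic;      /* e.g. "TEX1" packed into an int, to sanity-check the file */
    uint32_t width;
    uint32_t height;
    uint32_t format;     /* enum: RGBA8888, RGBA5551, RGB565, A8, PVRTC... */
    uint32_t mipLevels;  /* 1 if no mipmaps were generated */
    uint32_t dataSize;   /* total byte size of all pixel data that follows */
} TextureHeader;

/* Load the whole texture with two fread() calls: header, then pixels. */
unsigned char *load_texture(const char *path, TextureHeader *out)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;

    if (fread(out, sizeof *out, 1, f) != 1) { fclose(f); return NULL; }

    unsigned char *pixels = malloc(out->dataSize);
    if (pixels && fread(pixels, 1, out->dataSize, f) != out->dataSize) {
        free(pixels);
        pixels = NULL;
    }
    fclose(f);
    return pixels;  /* ready to hand to the GPU upload call, no decoding step */
}
```

The lack of any decode step between fread() and the GPU upload is the whole appeal of a native format.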

 




#2 frob   Moderators   -  Reputation: 20131


Posted 12 August 2014 - 04:54 PM

> 1) The textures are loaded in-editor using the FreeImage library. Would this be a good way to go about it, or should I consider compression-encoding non-PVRTC textures?
>
> 2) Does a custom file format sound like a good idea?

 

1) As a general-purpose engine, supporting more image types is generally a good idea. The typical workflow for artists is to use Photoshop images (.psd files) for their work and export textures as one of the compressed formats supported by cards, such as DXT1 or DXT5. These formats are usually supported directly by the card, so no further processing is necessary. Some people are tempted to support JPEG and GIF and similar formats; those are great for the Internet or places where size is more important than fidelity, not so much in games. Uncompressed raw images are something of a last resort, used for compatibility when all else has failed.
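For a concrete sense of the savings: DXT formats encode independent 4x4 pixel blocks (8 bytes per block for DXT1, 16 for DXT5), so compressed sizes are easy to compute. A small sketch:

```c
#include <stddef.h>

/* Compressed size in bytes for one mip level. DXT/S3TC formats encode
 * 4x4 pixel blocks: 8 bytes per block for DXT1, 16 for DXT5.
 * Partial blocks at the edges round up to a full block. */
size_t dxt_size(unsigned w, unsigned h, int dxt5)
{
    size_t blocks = ((w + 3) / 4) * (size_t)((h + 3) / 4);
    return blocks * (dxt5 ? 16 : 8);
}
```

For the 1024x1024 example earlier in the thread, that is 512 KB in DXT1 or 1 MB in DXT5, versus 4 MB uncompressed.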

 

2) Files are how you store data. If you need to store data, your options are fairly limited: either rely on someone else's format, use a markup language (XML, YAML, whatever), or make up your own format. Those are the only options. If you decide to make your own format, a frequent suggestion is to make it exactly match the layout used in memory, so you don't need to parse it; you can load it directly and set your pointers to offsets within the data.
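A sketch of that "match the in-memory layout" approach, assuming a format whose internal references are stored as byte offsets from the start of the file; the helper names here are made up:

```c
#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

/* Read an entire file into memory with one fread(). Internal references
 * are byte offsets from the start of the blob, so "parsing" is just
 * pointer arithmetic -- no per-field deserialization. */
unsigned char *load_blob(const char *path, long *size_out)
{
    FILE *f = fopen(path, "rb");
    if (!f) return NULL;

    fseek(f, 0, SEEK_END);
    long size = ftell(f);
    fseek(f, 0, SEEK_SET);

    unsigned char *blob = malloc((size_t)size);
    if (blob && fread(blob, 1, (size_t)size, f) != (size_t)size) {
        free(blob);
        blob = NULL;
    }
    fclose(f);
    if (size_out) *size_out = size;
    return blob;
}

/* Turn a stored offset back into a live pointer inside the loaded blob. */
static void *blob_ptr(unsigned char *base, uint32_t offset)
{
    return base + offset;
}
```

The trade-off is that the on-disk layout is now coupled to the compiler's struct layout, which is why such formats usually fix endianness and alignment explicitly.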


Check out my personal indie blog at bryanwagstaff.com.

#3 SeanMiddleditch   Members   -  Reputation: 5070


Posted 12 August 2014 - 05:58 PM

It's not uncommon for games to have two build modes, one for the final shipping game executable and one for developers (and content creators), where the former can only load a small handful of useful types (PNG for UI, DXTC for textures, for example) and the latter can load anything.

In yet other engines, there's a separate tool for conversion of images and the like, and the game will (in editor mode) invoke that utility on demand when asked to load unconverted resources or when a resource setting is changed.

Yet other engines just have separate game builds and editor builds.

#4 Vincent_M   Members   -  Reputation: 616


Posted 12 August 2014 - 06:43 PM

 

 

frob said:

> 1) As a general-purpose engine, supporting more image types is generally a good idea. The typical workflow for artists is to use Photoshop images (.psd files) for their work and export textures as one of the compressed formats supported by cards, such as DXT1 or DXT5. These formats are usually supported directly by the card, so no further processing is necessary. Some people are tempted to support JPEG and GIF and similar formats; those are great for the Internet or places where size is more important than fidelity, not so much in games. Uncompressed raw images are something of a last resort, used for compatibility when all else has failed.

You bring up a good point about DXT1/5, etc. I've always stored my textures as PNGs, then used an image library to load the data. The thing is, I'm starting to think the release version of my game doesn't need a code base as extensive as FreeImage on mobile/embedded platforms just to load images, when it could have some sort of serialized, unified pipeline instead. The format would match the data structures' in-game layout, so loading will be simple.

 

 

 

SeanMiddleditch said:

> It's not uncommon for games to have two build modes, one for the final shipping game executable and one for developers (and content creators), where the former can only load a small handful of useful types (PNG for UI, DXTC for textures, for example) and the latter can load anything.
>
> In yet other engines, there's a separate tool for conversion of images and the like, and the game will (in editor mode) invoke that utility on demand when asked to load unconverted resources or when a resource setting is changed.
>
> Yet other engines just have separate game builds and editor builds.

This is along the lines of what I'd like to go with. The users would import their textures into the editor, and the Inspector would provide image-format options based on the deployment platform. For example, if it's iOS, you could compress to PVRTC 2- or 4-bit by selecting it from the format combo box, and the PVRTC command-line tool would run in the background at deployment time (as needed) to spit out the converted data.



#5 Ohforf sake   Members   -  Reputation: 1788


Posted 13 August 2014 - 09:23 AM

frob said:

> The typical workflow for artists is to use Photoshop images (psd files) for their work and export textures as one of the compressed formats supported by cards, such as DXT1 or DXT5 or whatever.

This is a very bad idea. Don't let your artists produce or export to the final format. Ever.

It will all start innocently: the artists commit the DXT-compressed textures and you are happy. Then you realize that all textures are compressed as DXT5 instead of DXT1, even those without an alpha channel, because DXT5 has twice the bitrate and thus MUST HAVE twice the quality. You ask them to remedy the mistake, but going through all the textures in the repo and re-exporting them manually is too much work and apparently too error-prone.
At this point, you can forget any effort to get an automatic export running, because the DXT-compressed versions and the (presumed) source material have somehow gotten out of sync. And there are multiple copies of the source material, and no one really knows which ones are the most recent.

And then you consider changing the way the textures are compressed, or even the compression format itself, but your only viable source material is the already-lossy DXT1/5-compressed textures.
 

SeanMiddleditch said:

> In yet other engines, there's a separate tool for conversion of images and the like, and the game will (in editor mode) invoke that utility on demand when asked to load unconverted resources or when a resource setting is changed.


This is what you want to have. Have the artists commit the textures in a losslessly compressed form (TIFF, PNG, ...), and optionally annotate them so you can differentiate between diffuse maps, normal maps, etc. Then write an automatic content-processing tool that compresses the textures for the individual platforms. Whenever you change your mind about whether and how the textures should be compressed for a specific platform, or whenever you add a new platform, all you need to do is adapt the content-processing tool and rerun it overnight.

The same obviously also holds for other assets, audio files for example.
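To make the idea concrete, here is a hedged C sketch of how such a processing tool might choose an encoder invocation per platform. PVRTexTool is a real Imagination CLI encoder, but the flags shown, the "texconv" name, and the platform strings are all placeholders, not real command syntax:

```c
#include <stdio.h>
#include <string.h>

/* Map a source texture and target platform to a compressor command line.
 * Tool names and flags are illustrative stand-ins for whatever CLI
 * encoders the real pipeline uses. Returns 1 if the command fit. */
int build_command(char *out, size_t cap, const char *src, const char *platform)
{
    if (strcmp(platform, "ios") == 0)
        return snprintf(out, cap, "PVRTexTool -f PVRTC4 -i %s", src) < (int)cap;
    /* default: desktop DXT path */
    return snprintf(out, cap, "texconv -f DXT5 %s", src) < (int)cap;
}
```

The real tool would walk the asset tree, compare timestamps against the outputs, and pass each out-of-date source command through system() or a job queue during the overnight run.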

#6 frob   Moderators   -  Reputation: 20131


Posted 13 August 2014 - 08:00 PM

 

Ohforf sake said:

> > The typical workflow for artists is to use Photoshop images (psd files) for their work and export textures as one of the compressed formats supported by cards, such as DXT1 or DXT5 or whatever.
>
> This is a very bad idea. Don't let your artists produce or export to the final format. Ever.
>
> It will all start innocently: the artists commit the DXT-compressed textures and you are happy. Then you realize that all textures are compressed as DXT5 instead of DXT1, even those without an alpha channel, because DXT5 has twice the bitrate and thus MUST HAVE twice the quality. You ask them to remedy the mistake, but going through all the textures in the repo and re-exporting them manually is too much work and apparently too error-prone.
>
> At this point, you can forget any effort to get an automatic export running, because the DXT-compressed versions and the (presumed) source material have somehow gotten out of sync. And there are multiple copies of the source material, and no one really knows which ones are the most recent.
>
> And then you consider changing the way the textures are compressed, or even the compression format itself, but your only viable source material is the already-lossy DXT1/5-compressed textures.
 

 

It depends on your workflow and the discipline that is kept in the office.

 

Everywhere I've been, the art budget was fixed in stone in advance. The maximum size of each model, both meshes and textures, is specified and included in the acceptance criteria. If a single asset is over budget, it is well known, because it shows up on the feature dashboard that shows the metrics of all the assets; it then needs to be approved by the art lead, the art director, the feature designer, the tech lead, and the project manager. When it happens it is usually just "this is an important model" followed by "yup", "yup", "yup", "go ahead".

 

Since the asset sizes show up in several metrics and are reviewed daily, it would be fairly hard for an artist to slip in a 4 MB DXT file without at least one person noticing.

 

If your art process is such that nothing gets reviewed and there is no accountability, sure, I'll agree it is a bad thing. But that is bad because of a lack of policy and a lack of discipline.



#7 Hodgman   Moderators   -  Reputation: 29304


Posted 14 August 2014 - 12:16 AM

Ohforf sake said:

> Have the artists commit the textures in a losslessly compressed form (TIFF, PNG, ...), and optionally annotate them so you can differentiate between diffuse maps, normal maps, etc. Then write an automatic content-processing tool that compresses the textures for the individual platforms. Whenever you change your mind about whether and how the textures should be compressed for a specific platform, or whenever you add a new platform, all you need to do is adapt the content-processing tool and rerun it overnight.

^This.
The data should be 'compiled' just like the code is, via an automated build system. I personally use RGB PNGs for almost every texture (with each logical bit of data, diffuse/alpha/roughness/etc., in its own file instead of packed together), which are then automatically compiled into DXT/etc. If an artist saves out a new PNG, the build system on their PC immediately detects the change, converts it to DDS in the background, and, if the game is running, sends a packet to the game telling it to reload the file.
The game always works with optimized formats, and the connection in the middle is mostly invisible automation that the artists need not worry about.

 

If there are special conversion requirements, then these can be expressed in the build rules -- e.g. textures named "*_raw.*" are always uncompressed, or textures named "*_nrm.*" use a special normal-map compression scheme...
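Filename-pattern rules like these could be sketched as a simple suffix lookup inside the build tool; the patterns and format names here are illustrative:

```c
#include <string.h>

/* Pick a target format from the source filename, mirroring build rules
 * like "*_raw.*" stays uncompressed and "*_nrm.*" gets a normal-map
 * scheme. Pattern strings and format names are illustrative. */
const char *choose_format(const char *filename)
{
    if (strstr(filename, "_raw.")) return "RGBA8";  /* uncompressed */
    if (strstr(filename, "_nrm.")) return "BC5";    /* two-channel normals */
    return "DXT5";                                  /* general default */
}
```

A plugin-based build system would make this table data-driven rather than hard-coded, so new rules do not require recompiling the tool.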

This lets you keep the game/engine extremely lightweight, instead of having tonnes of extra loading/processing baggage in the code-base... and instead you've got a plugin-based build system where you can keep adding new formats as required.

This is extremely important when you want to support a new platform -- e.g. you suddenly require big-endian files as well as little-endian ones, or suddenly require PVRTC as well as DXT-format textures!
It's also extremely important as your requirements change. In my engine, if the graphics programmer suddenly decides that they want to move the translucency data out of the 'diffuse' texture's Alpha channel, and instead move it into the Red channel of an auxiliary texture (maybe along with roughness and specular-mask in Green and Blue), then they just have to edit a small configuration file, which results in the textures being automatically rebuilt.
If your artists are exporting DXT files by hand, then they'll have to re-export every texture, manually rearranging the channels as they go. Same for model formats, etc...
You never want to hear the line "We could do that, but it would require a re-export..."!

 

If you're going by strict asset budgets, the build system can spit out the nice reports that you need, or even automatically refuse to create infringing files.


Edited by Hodgman, 14 August 2014 - 12:20 AM.


#8 SmkViper   Members   -  Reputation: 614


Posted 14 August 2014 - 09:33 AM

Another strike against having art export DXT files themselves is that there are a few newer block-compression formats that most art tools can't even read, let alone write, but which are important for getting better compression or fewer artifacts out of your files (for example: BC5, BC6, and BC7). Heck, your artist is not going to want to sit there for several minutes with their machine completely tied up (as in, the mouse is unresponsive) just to compress a single BC7 file.

Edited by SmkViper, 14 August 2014 - 09:34 AM.


#9 Vincent_M   Members   -  Reputation: 616


Posted 14 August 2014 - 05:27 PM

OK, so it sounds like using engine-processed data would be a better way to go than typical image formats such as PNG or JPEG: with those, you miss out on hardware compression, load times can be higher, and extra memory is needed for temporary pools. One thing though: I wouldn't see us hardware-compressing algorithmic textures, such as normal maps, since their data needs to be as accurate as possible.



#10 SmkViper   Members   -  Reputation: 614


Posted 15 August 2014 - 08:21 AM

Vincent_M said:

> OK, so it sounds like using engine-processed data would be a better way to go than typical image formats such as PNG or JPEG: with those, you miss out on hardware compression, load times can be higher, and extra memory is needed for temporary pools. One thing though: I wouldn't see us hardware-compressing algorithmic textures, such as normal maps, since their data needs to be as accurate as possible.


BC5 is specifically designed to compress normal maps with a minimal amount of artifacting. It avoids the obvious blockiness you get at lower resolutions with older DXT formats, while being smaller because it only stores two channels.

#11 Hodgman   Moderators   -  Reputation: 29304


Posted 16 August 2014 - 12:01 AM

^As well as that, the old-school solution was to throw out the Z channel of your normal maps (because you can reconstruct it from the other two) and then put the X/Y channels into the Green and Alpha channels of a DXT5 texture (with nothing in Red/Blue).
This gives better quality than naively using RGB, because DXT5 compresses the RGB and Alpha data separately, so by using G and A you make sure there's no cross-talk during compression. The alternative way to achieve the same memory savings is to halve the resolution in both width and height (1/4 the pixels).

If you're targeting hardware without much memory, then compression or low resolution is a requirement. We used config files in our build system to swap between the two to compare quality, and also we let the artists specify a choice per asset if required.




