mattropo1is

Member Since 14 Jun 2007

Posts I've Made

In Topic: Mapping a DXT1 compressed texture

19 December 2012 - 03:14 PM

Thanks guys.

Yes, I can confirm that if you just lock DXT-compressed textures, the data you get back is still compressed.

As for D3DXLoadSurfaceFromSurface(), you might also find (as I did) that D3DX11LoadTextureFromTexture() actually works better and is easier to use. Yes, it is deprecated for Win8, but it was fine for our purposes. There is also the newer MS DirectXTex library for doing these operations, and it is compatible with Win8.
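For reference, here's a minimal sketch of that kind of conversion with DirectXTex (the function name, file path, and trimmed error handling are just illustrative, not the code we shipped):

#include "DirectXTex.h"
using namespace DirectX;

// Load a BC1/DXT1-compressed .dds and decompress it to plain RGBA8 on the CPU,
// rather than mapping the compressed resource and getting raw block data back.
HRESULT DecompressDxt1ToRgba8(const wchar_t* path, ScratchImage& rgba8)
{
    TexMetadata meta;
    ScratchImage compressed;
    HRESULT hr = LoadFromDDSFile(path, DDS_FLAGS_NONE, &meta, compressed);
    if (FAILED(hr))
        return hr;

    // Decompress() converts all mips/array slices to the requested uncompressed format.
    return Decompress(compressed.GetImages(), compressed.GetImageCount(), meta,
                      DXGI_FORMAT_R8G8B8A8_UNORM, rgba8);
}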

Thanks for the help!

In Topic: D3DX11CreateShaderResourceViewFromFile fails to load sRGB textures correctly?

09 January 2012 - 05:38 PM

Well, after much investigation and experimentation, it comes down to much of what has already been said:
-The internal DX texture functions do NOT seem to understand or honor sRGB/gamma metadata in png/jpg/dds files. They pretty much just load everything, assuming it is non-sRGB, into plain UNORM internal formats.
-The nVidia Photoshop DDS plugin doesn't seem to mark files as sRGB properly. Sadly, this doesn't really matter, since the flag probably isn't honored anyway.
-You will have to have some sort of meta-data file, or a policy, that tells you which files need to be created with an sRGB texture surface/buffer.
-Make sure your content generation tools are outputting to sRGB. Adobe has a color space setting for this. Some tools do NOT have sRGB output. Make sure you can get your favorite content into the sRGB format you want before you make the jump.
-If you wish to use the DX texture loading routines, you need to set the sRGB load flags in the D3DX11_IMAGE_LOAD_INFO you pass to them (i.e. loadInfo.Filter = D3DX11_FILTER_NONE | D3DX11_FILTER_SRGB_IN | D3DX11_FILTER_SRGB_OUT). This performs the correct interpretation of the data.
-Both your back buffer AND your textures need to be a DXGI_FORMAT_xxxx_SRGB format type (see the swap-chain sketch after this list).
-There are very few DXGI_FORMAT_xxxx_SRGB formats. Only 6 to be exact, and 4 are BCx compressed types. If you're not very careful, odds are very good you'll pay for a painful load-time conversion, because most of the old formats are ARGB while the SRGB formats are RGBA. If you're looking for single-channel formats, you'll be sorely disappointed.
-It's useful at first to change your pixel shader to do a pow(2.2) on each sample (using the old textures) to see what you SHOULD be getting. Then convert the textures, make the format and loading changes, and compare the results. They should look nearly the same.
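For the back-buffer point above, here's a minimal sketch of the relevant part of swap-chain creation (window handle and dimensions are placeholders; the Format line is what matters):

DXGI_SWAP_CHAIN_DESC sd = {};
sd.BufferCount = 1;
sd.BufferDesc.Width = 1280;                                 // placeholder size
sd.BufferDesc.Height = 720;
sd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;     // sRGB back buffer
sd.BufferDesc.RefreshRate.Numerator = 60;
sd.BufferDesc.RefreshRate.Denominator = 1;
sd.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
sd.OutputWindow = hWnd;                                     // placeholder HWND
sd.SampleDesc.Count = 1;
sd.Windowed = TRUE;
// ...pass &sd to D3D11CreateDeviceAndSwapChain as usual; the render target view
// created from this back buffer will then be written with gamma correction applied.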

It certainly does appear that at some point you'll likely want to come up with your own texture format, or rely on an extra meta-data file, to specify whether an image is sRGB (diffuse maps, etc.) or not (normal maps, data maps, etc.).

Thanks for the help.

In Topic: D3DX11CreateShaderResourceViewFromFile fails to load sRGB textures correctly?

06 December 2011 - 07:18 PM

... The loader will properly handle DDS files, so you also have the option of creating a preprocessing pipeline that converts and compresses your textures in advance so that you don't need to do anything fancy at load time (or using a plugin for your content creation software that will save as DDS).


I have Photoshop CS5 and used the nVidia DDS plugin to save the file. I had Photoshop set to sRGB space, but when I went to save with the nVidia DDS plugin, it didn't show any obvious checkbox for creating an sRGB DDS texture. I saved it anyway, and D3DX11CreateShaderResourceViewFromFile still created the non-sRGB UNORM texture.

Have you gotten D3DX11CreateShaderResourceViewFromFile to properly return an sRGB texture? What was your tool/process for creating the sRGB texture?
M

In Topic: D3DX11CreateShaderResourceViewFromFile fails to load sRGB textures correctly?

05 December 2011 - 06:01 PM

Yeah, it won't load as an sRGB format by default. You have to instruct the function to load it using an sRGB format (and not convert it) using the loadInfo parameter. Something like this should work:

D3DX11_IMAGE_LOAD_INFO loadInfo;
loadInfo.Width = D3DX11_DEFAULT;
loadInfo.Height = D3DX11_DEFAULT;
loadInfo.Depth = D3DX11_DEFAULT;
loadInfo.FirstMipLevel = D3DX11_DEFAULT;
loadInfo.MipLevels = D3DX11_DEFAULT;
loadInfo.Usage = D3D11_USAGE_IMMUTABLE;
loadInfo.BindFlags = D3D11_BIND_SHADER_RESOURCE;
loadInfo.CpuAccessFlags = 0;
loadInfo.MiscFlags = 0;
loadInfo.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;   // force an sRGB resource format
loadInfo.Filter = D3DX11_FILTER_NONE | D3DX11_FILTER_SRGB_IN | D3DX11_FILTER_SRGB_OUT;
loadInfo.MipFilter = D3DX11_DEFAULT;
loadInfo.pSrcInfo = NULL;
D3DX11CreateShaderResourceViewFromFile(pd3dDevice, pFilename, &loadInfo, NULL, &pTextureResourceView, NULL);


Thanks. Yes, this does get the desired result for this file, but the problem is that this code *forces* the jpg into sRGB space (it does look correct). But what I want is for D3DX11CreateShaderResourceViewFromFile to create the *correct* resource view based on what the texture file *says* it is.

If the texture is not sRGB (i.e. a normal map, albedo, etc.), then it should load as RGBA8_UNORM. If it IS saved as sRGB, then I'd want the RGBA8_UNORM_SRGB version...
Any ideas on that one?
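One idea I've been toying with (just a sketch, not verified end-to-end): query the image info first and pick the load parameters from the format the file reports. This only helps for DDS files that actually carry an *_SRGB DXGI format; png/jpg don't store that, so they'd still need a metadata or naming policy:

// Sketch: decide sRGB vs. linear from what the file itself reports. Only DDS files
// with a DX10 header can actually report an *_SRGB format.
D3DX11_IMAGE_INFO info;
HRESULT hr = D3DX11GetImageInfoFromFile(pFilename, NULL, &info, NULL);
if (FAILED(hr)) { /* handle error */ }

D3DX11_IMAGE_LOAD_INFO loadInfo;               // constructor fills everything with D3DX11_DEFAULT
bool isSrgb = (info.Format == DXGI_FORMAT_R8G8B8A8_UNORM_SRGB ||
               info.Format == DXGI_FORMAT_BC1_UNORM_SRGB ||
               info.Format == DXGI_FORMAT_BC2_UNORM_SRGB ||
               info.Format == DXGI_FORMAT_BC3_UNORM_SRGB);
if (isSrgb)
{
    loadInfo.Format = info.Format;             // keep the file's sRGB format
    loadInfo.Filter = D3DX11_FILTER_NONE | D3DX11_FILTER_SRGB_IN | D3DX11_FILTER_SRGB_OUT;
}
// otherwise leave loadInfo at defaults and load as plain UNORM
D3DX11CreateShaderResourceViewFromFile(pd3dDevice, pFilename, &loadInfo, NULL, &pTextureResourceView, NULL);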

In Topic: Memory leak reported by Vis Studio CRT report that's before main starts

27 October 2011 - 12:17 AM

Maybe I'm just confused (or sleep deprived!) but I don't see how this is a bug necessarily. You allocate memory pre-main() via invoking string constructors, via your static object. So if you check for leaks at the end of main(), you're not going to see that memory freed. You have to check for leaks after static destruction occurs, which is the purpose of the _CrtSetDbgFlag() call; if you don't get a report at program exit, it means you have no leaks - which, in this case, is exactly what I'd expect, after you've seen that result from other tools.
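A minimal sketch of the setup being described (the Config struct and string contents here are made up; this is just the standard CRT debug-heap pattern, not the actual library code):

#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>
#include <string>

struct Config
{
    std::string name;
    Config() : name("long enough to defeat the small-string optimization and hit the heap") {}
};
static Config g_config;   // constructed before main(), destroyed after main() returns

int main()
{
    // Calling _CrtDumpMemoryLeaks() here would report g_config's string buffer as a
    // "leak", because static destructors haven't run yet. Instead, ask the CRT to do
    // its leak check at process exit, after static destruction:
    _CrtSetDbgFlag(_CRTDBG_ALLOC_MEM_DF | _CRTDBG_LEAK_CHECK_DF);
    return 0;
}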


In almost every other case, I would say you're right. But the problem is that this is an infrastructure library that many people will be using. I can imagine people using the CRT leak detection tools, seeing those two 'leaks', and assuming I'd been too lazy to clean up a trivial memory leak. It's a professionalism issue IMHO. My other motivation is that if you don't REALLY know what is causing the leak, you could be masking much bigger problems later. Because I root-caused it, changed the code, and that legitimately fixed it, I can NOW say with 100% confidence that it wasn't a real bug. Before, when I didn't know for certain, I was acting on faith. And in my experience, faith tends to blow up on you at 1am the night of code freeze. :)

Further, the MS documentation EXPLICITLY calls out the STL containers as having had this problem fixed. This is one case where it's clearly not fixed. I'll be submitting a bug on it tomorrow.

That, and it's just good nerdy fun to sort these things out. Now I can run all the configurations of my program without a peep from the leak reporter and say "it doesn't leak" without the caveat of "well, besides this leak that's not really a leak". :)

Thanks guys for all the help!
