mattropo1is

Member Since 14 Jun 2007
Offline Last Active Mar 10 2013 12:28 PM
-----

Topics I've Started

Mapping a DXT1 compressed texture

06 December 2012 - 08:06 PM

Hi guys,

I have some textures that were created as DXT1 compressed. I'd like to map them and get the data back out. To that end, I create a staging resource, copy the original texture into it, and then map the staging resource. This works just fine for non-DXT-compressed textures, and the data all looks correct with the test textures I used.
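
For reference, the readback path I'm describing looks roughly like this (a minimal sketch; pDevice, pContext, and pSource are placeholder names for my actual device, immediate context, and source texture):

D3D11_TEXTURE2D_DESC desc;
pSource->GetDesc( &desc );
desc.Usage          = D3D11_USAGE_STAGING;     // CPU-readable copy target
desc.BindFlags      = 0;
desc.CPUAccessFlags = D3D11_CPU_ACCESS_READ;
desc.MiscFlags      = 0;

ID3D11Texture2D* pStaging = NULL;
HRESULT hr = pDevice->CreateTexture2D( &desc, NULL, &pStaging );
if( SUCCEEDED( hr ) )
{
    pContext->CopyResource( pStaging, pSource );

    D3D11_MAPPED_SUBRESOURCE mapped;
    if( SUCCEEDED( pContext->Map( pStaging, 0, D3D11_MAP_READ, 0, &mapped ) ) )
    {
        // mapped.pData / mapped.RowPitch are what I'm inspecting with the test textures
        pContext->Unmap( pStaging, 0 );
    }
    pStaging->Release();
}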

Questions:
1. When I map a DXT1 compressed texture, am I seeing the compressed bits?
2. If they are not the compressed bits, what format are they in (RGB8)?
3. If they ARE the compressed bits:
3a. Can I give the staging resource a different format, such as RGB8, so that mapping it returns the uncompressed bits (i.e. use the copy as a decompressor)?
3b. If 3a isn't possible, is there a library with a software decompressor that I could use to get the raw bits back out? (My understanding of the block layout is sketched below.)
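
For 3b, my understanding of the DXT1/BC1 block layout is roughly the following. This is only an illustrative sketch of decoding a single block, and I'd rather use an existing library than hand-roll it:

// Illustrative sketch only: decode one 8-byte DXT1/BC1 block into 16 RGBA8 pixels.
void DecodeBC1Block( const unsigned char* block, unsigned char out[16][4] )
{
    const unsigned short c0 = block[0] | ( block[1] << 8 );   // RGB565 endpoint 0
    const unsigned short c1 = block[2] | ( block[3] << 8 );   // RGB565 endpoint 1

    unsigned char palette[4][4];
    // Expand the two RGB565 endpoints to 8 bits per channel.
    for( int e = 0; e < 2; ++e )
    {
        const unsigned short c = ( e == 0 ) ? c0 : c1;
        palette[e][0] = (unsigned char)( ( ( c >> 11 ) & 0x1F ) * 255 / 31 );
        palette[e][1] = (unsigned char)( ( ( c >> 5  ) & 0x3F ) * 255 / 63 );
        palette[e][2] = (unsigned char)( (   c         & 0x1F ) * 255 / 31 );
        palette[e][3] = 255;
    }

    // c0 > c1 selects 4-color mode (two interpolated colors);
    // otherwise 3-color mode (one midpoint color plus transparent black).
    for( int ch = 0; ch < 3; ++ch )
    {
        if( c0 > c1 )
        {
            palette[2][ch] = (unsigned char)( ( 2 * palette[0][ch] + palette[1][ch] ) / 3 );
            palette[3][ch] = (unsigned char)( ( palette[0][ch] + 2 * palette[1][ch] ) / 3 );
        }
        else
        {
            palette[2][ch] = (unsigned char)( ( palette[0][ch] + palette[1][ch] ) / 2 );
            palette[3][ch] = 0;
        }
    }
    palette[2][3] = 255;
    palette[3][3] = ( c0 > c1 ) ? 255 : 0;

    // The remaining 4 bytes are sixteen 2-bit palette indices, first pixel in the low bits.
    const unsigned int indices = block[4] | ( block[5] << 8 ) | ( block[6] << 16 ) | ( (unsigned int)block[7] << 24 );
    for( int i = 0; i < 16; ++i )
    {
        const unsigned int idx = ( indices >> ( 2 * i ) ) & 0x3;
        for( int ch = 0; ch < 4; ++ch )
            out[i][ch] = palette[idx][ch];
    }
}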

Thanks in advance

D3DX11CreateShaderResourceViewFromFile fails to load sRGB textures correctly?

05 December 2011 - 05:07 PM

I've got a block of code that's loading what I'm pretty sure is an sRGB jpg (the file's properties say it is encoded in the sRGB color format, and I selected the sRGB profile when saving it in Photoshop) that I wish to use as a texture.

 D3D11_SHADER_RESOURCE_VIEW_DESC desc;
 hResult = D3DX11CreateShaderResourceViewFromFile( pd3dDevice, pFilename, NULL, NULL, &pTextureResourceView, NULL );
 pTextureResourceView->GetDesc( &desc );
 // desc.Format is what I'm checking

When I run the above code on my texture, desc.Format always comes back as DXGI_FORMAT_R8G8B8A8_UNORM, not DXGI_FORMAT_R8G8B8A8_UNORM_SRGB as I would expect. Further, when I use the texture it's washed out (too bright), as if it had been loaded as plain linear RGB. If I do a pow(sampleColor, 2.2) in the pixel shader, then the color looks right.
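
If it matters, my assumption is that the fix involves forcing the format through a D3DX11_IMAGE_LOAD_INFO, roughly like this (a sketch I haven't verified; only the Format field is changed from the defaults):

 D3DX11_IMAGE_LOAD_INFO loadInfo;
 loadInfo.Format = DXGI_FORMAT_R8G8B8A8_UNORM_SRGB;   // force the sRGB variant
 hResult = D3DX11CreateShaderResourceViewFromFile( pd3dDevice, pFilename, &loadInfo, NULL, &pTextureResourceView, NULL );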

Am I missing something in how I load the texture to get it recognized as sRGB?

Memory leak reported by the Visual Studio CRT before main starts

26 October 2011 - 02:44 PM

Hi guys.

I am cleaning up code and have got my leaks down to just two 16-byte leaks (x64 build). Visual Studio's CRT memory leak detector gives me the allocation numbers, and I've had great luck cleaning up all the other leaks by passing those numbers to _CrtSetBreakAlloc(). With these two leaks, however, the breakpoint is never hit and the program just starts up. So, right at the top of wWinMain, I do this:


int WINAPI wWinMain( HINSTANCE hInstance, HINSTANCE hPrevInstance, LPWSTR lpCmdLine, int nCmdShow )
{
    int* pi = new int();   // deliberate leak, used to check the allocation numbering
    return 0;
}
which obviously leaks, and that leak gets reported as allocation #261. However, the two reported leaks are allocations #259 and #260, which means they happen before main starts. Looking at the dumped data, I previously had five additional leaks of static global std::string objects that also showed up as allocated before main started. Since the dump included the memory contents and I could read the strings, I was able to track those down. But with these two, I can't get a file/line reference and they appear to be holding pointer data (possibly?).

I'm using the standard
#define DBG_NEW new ( _NORMAL_BLOCK , __FILE__ , __LINE__ )
trick to retrieve file/line numbers in the output dump, but these two allocations aren't picked up. My guess is that allocations made during static initialization don't go through my redefined new()?
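
For completeness, the rest of my instrumentation boils down to something like this (a sketch; EnableLeakChecking is just an illustrative helper name, and 259 is the allocation number from the report above):

#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

void EnableLeakChecking()                     // called first thing in wWinMain
{
    // Dump any outstanding CRT allocations automatically at process exit.
    _CrtSetDbgFlag( _CrtSetDbgFlag( _CRTDBG_REPORT_FLAG ) | _CRTDBG_LEAK_CHECK_DF );
    // Break into the debugger when this allocation number is handed out.
    _CrtSetBreakAlloc( 259 );
}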

Is there some cool trick of the CRT memory functionality that could report back where these allocations are happening? Am I just not instrumenting the right file(s) to get the line numbers reported (I have a lot of files in this solution), or do static allocations like these simply not get line numbers?

Thanks in advance

Backfacing lines

23 October 2011 - 02:20 PM

I'm drawing a lot of lines (think a vector-based drawing of objects):

m_pDevice->IASetVertexBuffers( 0, 1, &pMesh->m_pVertexBuffer, &stride, &offset );
m_pDevice->IASetIndexBuffer( pMesh->m_pIndexBuffer, DXGI_FORMAT_R32_UINT, 0 );
m_pDevice->IASetPrimitiveTopology( D3D10_PRIMITIVE_TOPOLOGY_LINELIST );

The thing is, I'd like to be able to draw the 'backfacing' line segments in a different, muted color. I know the obvious solution would be to sort the segments and build two index buffers, one for each color, but that requires rebuilding the index lists every frame. Is there a better way?
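
For reference, the rebuild-every-frame approach I'd like to avoid looks roughly like this (a sketch; Segment, segments, and cameraDir are placeholder names for my own data):

std::vector<UINT> frontIndices, backIndices;
for( size_t s = 0; s < segments.size(); ++s )
{
    const Segment& seg = segments[s];
    // Treat a segment as front-facing if the face it belongs to faces the camera.
    const float facing = seg.faceNormal.x * cameraDir.x
                       + seg.faceNormal.y * cameraDir.y
                       + seg.faceNormal.z * cameraDir.z;
    std::vector<UINT>& dst = ( facing < 0.0f ) ? frontIndices : backIndices;
    dst.push_back( seg.i0 );
    dst.push_back( seg.i1 );
}
// frontIndices/backIndices then get copied into two dynamic index buffers and
// drawn with two DrawIndexed calls, each with its own line color.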

Thanks in advance

LPCSTR's and unicode/wstring's

29 August 2011 - 11:02 AM

There are a number of structs in DirectX 9/10/11 that still take LPCSTRs, e.g. the vertex layout structure:
D3D11_INPUT_ELEMENT_DESC -> LPCSTR SemanticName
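
For example, a typical input layout declaration ends up using plain char literals for the semantic names (an illustrative sketch, not my actual layout):

D3D11_INPUT_ELEMENT_DESC layout[] =
{
    // SemanticName is a narrow LPCSTR even in a UNICODE build, so no L"..." here.
    { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0,  D3D11_INPUT_PER_VERTEX_DATA, 0 },
    { "TEXCOORD", 0, DXGI_FORMAT_R32G32_FLOAT,    0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
};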

Digging down the #defines all the way to WinNT.h (with UNICODE enabled in my code), I find that LPCSTR is still defined as:
typedef char CHAR;
typedef __nullterminated CONST CHAR *LPCSTR;

Since I'm compiling with UNICODE enabled on a Win7 box and using std::wstring/TCHARs, this seems like the wrong type to be casting to. The question is: can you pass wstrings/Unicode TCHARs into the DirectX API, or do I need to keep these particular strings char*/std::string based?

Thanks in advance
