RGBA to XRGB (or ARGB) macro incorrect?

Started by
5 comments, last by sirlemonhead 16 years, 2 months ago
Hi, I've got texture data in 32-bit RGBA format. The alpha is unused. I've got DirectX making 16-bit D3DFMT_R5G6B5 textures from this data fine, using this macro:

#define RGB16(r, g, b) ( ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3))

I want to get 32-bit texture loading working, with the D3DFMT_X8R8G8B8 or D3DFMT_A8R8G8B8 format. Either will do. I initially tried this macro:

//#define RGB32 (r << 24) | (g << 16) | (b << 8) | a);

and what it does is give me a completely blue-tinted image. There are no other colours; every shade of colour is represented by blue only. So I tried this macro:

#define RGBA_MAKE(r, g, b, a) ((D3DCOLOR) (((a) << 24) | ((r) << 16) | ((g) << 8) | (b)))

Same deal. So I looked through the DirectX header files and found these:

#define D3DCOLOR_ARGB(a,r,g,b) ((D3DCOLOR)((((a)&0xff)<<24)|(((r)&0xff)<<16)|(((g)&0xff)<<8)|((b)&0xff)))
#define D3DCOLOR_RGBA(r,g,b,a) D3DCOLOR_ARGB(a,r,g,b)

I use D3DCOLOR_RGBA, which 'maps' to D3DCOLOR_ARGB. Same deal: blue images.

I am correctly changing my back buffer format when I switch colour depth, and I've got D3DX outputting image files so I can see that the textures are blue without any render code affecting them. I can't use D3DX to load the textures in the first place.

I store my 32-bit source data in unsigned char pointers. I use an unsigned short pointer for the 16-bit destination data, and an unsigned char pointer for the 32-bit destination data. Here's my code for writing to the D3D texture:
[source code="cpp"]

unsigned char *destPtr, *srcPtr;

srcPtr = (unsigned char *)tex->buf; // 32 bit RGBA format

for (int y = 0; y < original_height; y++)
{
	destPtr = ((unsigned char *)lock.pBits) + y * lock.Pitch;

	for (int x = 0; x < original_width; x++)
	{
		// >> 3 for red and blue in a 16 bit texture, 2 for green
		*destPtr = D3DCOLOR_RGBA(srcPtr[0], srcPtr[1], srcPtr[2], 0xff);

		destPtr += 4;
		srcPtr += 4;
	}
}
[/source]

I've tried 255 as the alpha value in the macro, and I've tried 0. I've also tried different pointer increment values for destPtr, as I've had that problem before. I'm assuming my macros aren't suitable?
destPtr is a pointer to unsigned char, but the macros return a 32-bit integer and you increment both pointers by 4. Basically you're only storing the lowest byte of the 32-bit value, which is why everything becomes blue.
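To illustrate what's happening (a rough sketch using the D3DCOLOR_RGBA macro from the headers quoted above; the real destination is locked texture memory, here it's just a local array):

[source code="cpp"]
unsigned char pixel[4] = { 0, 0, 0, 0 };        // stand-in for one destination texel

// D3DCOLOR_ARGB packs the channels as 0xAARRGGBB, so blue sits in the lowest byte
D3DCOLOR c = D3DCOLOR_RGBA(200, 100, 50, 0xff); // 0xFFC86432

pixel[0] = (unsigned char)c; // what *destPtr = ... does: only the low byte (blue, 0x32) survives
// pixel[1], pixel[2] and pixel[3] are never written, so the green, red and
// X/alpha bytes of every texel keep whatever was already there -> a blue-only image.
[/source]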
Can you explain the pointer incrementation to me? I had assumed you increment the unsigned char* src pointer by 4 because (according to sizeof() ) the size of that is 4 bytes, and 4 bytes per pixel.

But I don't get why an unsigned short *, which would be 2 bytes, only increments by 1 in my 16-bit texture write loop?

What other data types would you recommend I use?

I'm trying to get my head around this stuff, but it's taking a LONG time. Unfortunately I'm getting a lot of it working by trial and error.
*destPtr = D3DCOLOR_RGBA(srcPtr[0], srcPtr[1], srcPtr[2], 0xff);

Because destPtr is a char pointer, you're only writing back a single byte.

Change it to:

*(D3DCOLOR*)destPtr = D3DCOLOR_RGBA(srcPtr[0], srcPtr[1], srcPtr[2], 0xff);
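For reference, here's roughly how the inner loop from the original post looks with that single change applied (a sketch, assuming the same lock/pitch setup and a 32-bit destination format such as D3DFMT_X8R8G8B8):

[source code="cpp"]
for (int y = 0; y < original_height; y++)
{
	// lock.Pitch is in bytes, so compute the row start on the byte pointer
	destPtr = ((unsigned char *)lock.pBits) + y * lock.Pitch;

	for (int x = 0; x < original_width; x++)
	{
		// write the whole 32-bit D3DCOLOR, not just its low byte
		*(D3DCOLOR *)destPtr = D3DCOLOR_RGBA(srcPtr[0], srcPtr[1], srcPtr[2], 0xff);

		destPtr += 4; // destPtr is still a char pointer, so advance 4 bytes per pixel
		srcPtr += 4;
	}
}
[/source]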
Quote:Original post by sirlemonhead
Can you explain the pointer incrementation to me? I had assumed you increment the unsigned char* src pointer by 4 because (according to sizeof() ) the size of that is 4 bytes, and 4 bytes per pixel.


Well, you should increment by 4, but your reasoning is wrong. The reason to increment by 4 is that you're using a 32-bit texture, so there are 4 bytes per pixel. The size of a char itself is 1 byte; the 4 that sizeof() gave you is the size of the pointer (of any type), which stores an address and happens to be 4 bytes on a 32-bit build.
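A tiny example of that distinction (the pointer size shown is an assumption about the build; it's 4 bytes on a 32-bit target and 8 on a 64-bit one):

[source code="cpp"]
#include <cstdio>

int main()
{
    unsigned char *p = 0;
    std::printf("sizeof(unsigned char)   = %u\n", (unsigned)sizeof(unsigned char)); // 1: the pointee
    std::printf("sizeof(unsigned char *) = %u\n", (unsigned)sizeof(p));             // 4 or 8: the pointer itself
    return 0;
}
[/source]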

To clarify some more on my above post, *destPtr is the same as destPtr[0]. destPtr[1], [2], and [3] go unused. You should either make destPtr an integer pointer and do:

*destPtr = D3DCOLOR_RGBA(srcPtr[0], srcPtr[1], srcPtr[2], 0xff);
destPtr++;
srcPtr+=4;

or just copy each byte manually:

// assumes destPtr is RGBA and srcPtr is ARGB.
destPtr[0] = srcPtr[1]; // r
destPtr[1] = srcPtr[2]; // g
destPtr[2] = srcPtr[3]; // b
destPtr[3] = srcPtr[0]; // a
destPtr+=4;
srcPtr+=4;
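If you take the integer-pointer route, one detail worth keeping in mind is that lock.Pitch is in bytes, so it's easiest to compute each row start on a byte pointer and then cast. A minimal sketch of that variant, assuming an RGBA byte order in the source and a D3DFMT_X8R8G8B8 destination:

[source code="cpp"]
unsigned char *src = (unsigned char *)tex->buf; // R, G, B, A per source pixel

for (int y = 0; y < original_height; y++)
{
	// Pitch is in bytes, so do the row arithmetic on a byte pointer before casting
	D3DCOLOR *dest = (D3DCOLOR *)((unsigned char *)lock.pBits + y * lock.Pitch);

	for (int x = 0; x < original_width; x++)
	{
		*dest++ = D3DCOLOR_ARGB(0xff, src[0], src[1], src[2]); // X, R, G, B
		src += 4;
	}
}
[/source]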

Quote:Original post by sirlemonhead
But I don't get why an unsigned short *, which would be 2 bytes, only increments by 1 in my 16-bit texture write loop?


When you increment a pointer, you actually increment the address by the size of the type. Since a short is 2 bytes, incrementing a short pointer will add 2 to the address.
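In code terms (a trivial, self-contained illustration of the scaling):

[source code="cpp"]
unsigned short buf[4];
unsigned short *p = buf;

p++; // the address advances by sizeof(unsigned short) == 2 bytes
// (char *)p - (char *)buf == 2
[/source]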

Be aware that converting between 32-bit and 16-bit is going to be more involved than the above examples.
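For what it's worth, "more involved" mostly means the channels have to be packed or expanded bit-wise rather than copied byte for byte. A rough sketch of both directions for R5G6B5 (the pack direction is what the RGB16 macro above already does; bit replication on the way back up is one common choice, not the only one):

[source code="cpp"]
// 32-bit channels -> 16-bit R5G6B5
unsigned short pack565(unsigned char r, unsigned char g, unsigned char b)
{
    return (unsigned short)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// 16-bit R5G6B5 -> 32-bit channels, replicating the top bits into the low bits
void unpack565(unsigned short c, unsigned char &r, unsigned char &g, unsigned char &b)
{
    r = (unsigned char)(((c >> 11) & 0x1f) << 3); r |= r >> 5;
    g = (unsigned char)(((c >> 5)  & 0x3f) << 2); g |= g >> 6;
    b = (unsigned char)(( c        & 0x1f) << 3); b |= b >> 5;
}
[/source]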
Would it ever be advisable to hard-code your increment value, or should you increment by, for example, destPtr += sizeof(short)?

"Be aware that converting between 32-bit and 16-bit is going to be more involved then the above examples."

Could you tell me a bit more about that? I've got 16-bit textures being created fine from 32-bit data, and the output looks perfect. I just have trouble going from 32-bit to 32-bit with different colour ordering.
by "make destPtr an integer pointer" I assume you mean do "unsigned int *destPtr;" unsigned or not? would you not increment by 4 then rather than 1 if you increment by the size of the type?

I tried keeping the unsigned char, and doing

destPtr[0] = srcPtr[3]; // r
destPtr[1] = srcPtr[0]; // g
destPtr[2] = srcPtr[1]; // b
destPtr[3] = srcPtr[2]; // a

(srcPtr is RGBA and destPtr is XRGB)

but now I get blue on my images where I should be getting black, and green where I should be getting red. At least I have white now.
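That symptom fits the byte order of D3DFMT_X8R8G8B8 in memory: on a little-endian machine, texel byte 0 is blue, byte 1 green, byte 2 red and byte 3 the unused X. With the mapping above, the source alpha (presumably 0xff) lands in the blue byte and red lands in the green byte, which is exactly blue-where-black and green-where-red. A hedged guess at the byte-for-byte mapping that would match, assuming the source really is R, G, B, A in memory:

[source code="cpp"]
// source:             srcPtr[0]=R, srcPtr[1]=G, srcPtr[2]=B, srcPtr[3]=A (unused)
// X8R8G8B8 in memory: destPtr[0]=B, destPtr[1]=G, destPtr[2]=R, destPtr[3]=X
destPtr[0] = srcPtr[2]; // blue
destPtr[1] = srcPtr[1]; // green
destPtr[2] = srcPtr[0]; // red
destPtr[3] = 0xff;      // unused X byte
destPtr += 4;
srcPtr += 4;
[/source]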
I just tried Jan's method, and that worked perfectly. Is that just casting the output of the macro to a DWORD?

This topic is closed to new replies.
