I managed to get something up and running: my code now generates an environment class with a list of environment "chunks", and each chunk stores where it is placed and which texture it uses.
But then an issue arose: all the textures came out the same. I was like, wtf, this isn't right. Oluseyi thought it was my memcpy, so I spent an hour or so trying to track it down with him. No luck. (Thanks anyway, Oluseyi.)
I sat and thought and looked and debugged for a further hour and realised it was down to DX. Tricking me!!!!! God damn.
HRESULT WINAPI D3DXCreateTextureFromFileEx(LPDIRECT3DDEVICE9 pDevice, ...);
This function takes a D3DXIMAGE_INFO *pSrcInfo parameter, which the SDK documents as:
[in, out] Pointer to a D3DXIMAGE_INFO structure to be filled in with a description of the data in the source image file, or NULL
A fsking DESCRIPTION OF THE DATA IN THE IMAGE FILE. OK, so I forgot about this for a while.
An hour later, after calling these few functions, it turned out that the actual surface level of the texture is automatically scaled down to fit the graphics card. It would have been nice if the data contained in the D3DXIMAGE_INFO had been correct; then I wouldn't have spent five hours on this one damn bug.
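In hindsight, the scaling most likely happens because many cards only support power-of-two texture dimensions (or cap the maximum size), so D3DX quietly rounds the requested size to fit. A rough sketch of that rounding, as a helper of my own (not a D3DX function):

```cpp
// Round a texture dimension up to the next power of two -- roughly what
// D3DX does by default when the device can't handle arbitrary sizes.
// (Illustrative helper only, not part of the D3DX API.)
unsigned NextPow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;      // double until we reach or pass n
    return p;
}
```

So a 640x480 bitmap would silently become a 1024x512 texture on such a card, which is exactly the kind of mismatch that bit me.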
God damn Microsoft.
On a lighter note, can anyone suggest a way to disable this auto-scaling, or maybe a way of loading an "A1R5G5B5" texture? There is no documentation on this pixel format as a bitmap, but for some reason Windows and everything else can load it. (By the way, I have to load this texture type.)
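For anyone hitting the same thing, here is an untested sketch of what I suspect might stop the resizing: ask D3DX for the file's real dimensions first via D3DXGetImageInfoFromFile, then request exactly that size, no filtering, and an explicit D3DFMT_A1R5G5B5 format. Whether it succeeds presumably still depends on the card's caps, and "texture.bmp" is a placeholder filename:

```cpp
// Untested sketch: query the image's real size, then request exactly that
// size and format so D3DX has no reason to rescale or convert.
D3DXIMAGE_INFO info;
D3DXGetImageInfoFromFile("texture.bmp", &info);

LPDIRECT3DTEXTURE9 pTexture = NULL;
HRESULT hr = D3DXCreateTextureFromFileEx(
    pDevice,                // the device
    "texture.bmp",          // placeholder path
    info.Width,             // explicit width: no rounding
    info.Height,            // explicit height: no rounding
    1,                      // a single mip level
    0,                      // no special usage
    D3DFMT_A1R5G5B5,        // force the 16-bit 1-5-5-5 format
    D3DPOOL_MANAGED,
    D3DX_FILTER_NONE,       // no resampling on load
    D3DX_FILTER_NONE,       // no resampling for mips
    0,                      // no colour key
    NULL,                   // don't need the info again
    NULL,                   // no palette
    &pTexture);
```

Afterwards, IDirect3DTexture9::GetLevelDesc on level 0 should be able to confirm whether the surface really kept the requested size and format.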
On that note, I must sleep. My arms hurt, my fingers hurt, my eyes hurt, and I'm annoyed; DirectX let me down.