Milkies

LPDIRECT3DTEXTURE9 questions :)


Hi all, I want to load images to use as textures via LPDIRECT3DTEXTURE9. Once I've used D3DXCreateTextureFromFile to load the texture into the LPDIRECT3DTEXTURE9 object, is there a way to determine the height and width of the texture? Also, if the image I've loaded is in fact a grid of separate images, is there a way to use LPDIRECT3DTEXTURE9 to show only a specific image within the loaded texture? I can see that LPDIRECT3DTEXTURE9 has "Height" and "Width" members, but there doesn't seem to be a member function to retrieve them. Thanks

Use D3DXGetImageInfoFromFile() to get image information (including height and width) directly from the file before loading, or use IDirect3DTexture9::GetSurfaceLevel(0) followed by IDirect3DSurface9::GetDesc() to retrieve the dimensions of the texture after it has been loaded. I mention both because it's quite possible the D3DX loaders will resize images [wink]
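
Roughly, both approaches look like this. This is only a sketch: "pDevice", "texture.png" and the function name are placeholders, and error handling is kept to SUCCEEDED() checks.

#include <windows.h>
#include <d3d9.h>
#include <d3dx9.h>

void QueryTextureSize(LPDIRECT3DDEVICE9 pDevice)
{
    // Option 1: read the dimensions straight from the file, before loading.
    D3DXIMAGE_INFO info;
    if (SUCCEEDED(D3DXGetImageInfoFromFile(TEXT("texture.png"), &info)))
    {
        UINT fileWidth  = info.Width;   // size as stored in the file
        UINT fileHeight = info.Height;
    }

    // Option 2: ask the loaded texture itself, in case D3DX resized it
    // (e.g. rounded the dimensions up to a power of two).
    LPDIRECT3DTEXTURE9 pTexture = NULL;
    if (SUCCEEDED(D3DXCreateTextureFromFile(pDevice, TEXT("texture.png"), &pTexture)))
    {
        LPDIRECT3DSURFACE9 pSurface = NULL;
        if (SUCCEEDED(pTexture->GetSurfaceLevel(0, &pSurface)))
        {
            D3DSURFACE_DESC desc;
            pSurface->GetDesc(&desc);
            UINT texWidth  = desc.Width;   // size of the texture in memory
            UINT texHeight = desc.Height;
            pSurface->Release();
        }
        pTexture->Release();
    }
}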

Quote:
if the image I've loaded is in fact a grid of separate images, is there a way to use LPDIRECT3DTEXTURE9 to show only a specific image within the loaded texture?
You don't use LPDIRECT3DTEXTURE9 for this. Think of your texture as just a big array of data, the same as an array in normal C/C++. It's the geometry that defines which part of that data is shown, via its texture coordinates. When creating the geometry you use to render the texture, just set the texture coordinates to cover a subset (e.g. 0.25f->0.50f) of the texture, as in the sketch below.
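
A rough illustration of that idea. The pre-transformed vertex format, the 256x256 on-screen size and the "pDevice"/"pTexture" names are example choices for this sketch, not anything D3D requires.

#include <d3d9.h>

struct SpriteVertex
{
    float x, y, z, rhw;  // screen-space position (D3DFVF_XYZRHW)
    float u, v;          // texture coordinates   (D3DFVF_TEX1)
};
const DWORD SPRITE_FVF = D3DFVF_XYZRHW | D3DFVF_TEX1;

void DrawTextureSubset(LPDIRECT3DDEVICE9 pDevice, LPDIRECT3DTEXTURE9 pTexture)
{
    // A 256x256 quad whose UVs cover only the 0.25 -> 0.50 region of the texture.
    SpriteVertex quad[4] =
    {
        //    x       y      z     rhw    u      v
        {   0.0f,   0.0f,  0.0f,  1.0f, 0.25f, 0.25f },  // top left
        { 256.0f,   0.0f,  0.0f,  1.0f, 0.50f, 0.25f },  // top right
        {   0.0f, 256.0f,  0.0f,  1.0f, 0.25f, 0.50f },  // bottom left
        { 256.0f, 256.0f,  0.0f,  1.0f, 0.50f, 0.50f },  // bottom right
    };

    pDevice->SetTexture(0, pTexture);
    pDevice->SetFVF(SPRITE_FVF);
    pDevice->DrawPrimitiveUP(D3DPT_TRIANGLESTRIP, 2, quad, sizeof(SpriteVertex));
}

Changing only the four UV values makes the same quad show a different region of the texture; the texture object itself never changes.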

Quote:
I can see that LPDIRECT3DTEXTURE9 has "Height" and "Width" members, but there doesn't seem to be a member function to retrieve them.
If these are the members added by D3D_DEBUG_INFO then you need to be VERY careful: they only exist in debug builds running against the debug runtime. They're strictly for debugging purposes, and relying on them in production code is a very easy way to enter a world of compiler and runtime errors!

hth
Jack

Use the functions on the texture object itself to get its size.

D3DSURFACE_DESC desc;
pTex->GetLevelDesc(0, &desc); // Level 0 is the top-most (full-size) mip level.
UINT wid = desc.Width;
UINT hei = desc.Height;


Jollyjeffers answered how to use subimages, but his answer isn't the clearest (and neither is mine, I'm sure). As he said, you don't call something on the texture to use a subimage, much like in C++ you don't do anything to an array to refer to only part of it; it's how you refer to the image that chooses which subimage you see.

In 3D, you have vertices that make up your object. The vertices have texture coordinates that indicate which part of the texture appears at that point in space. Texture coordinates are usually 2D values from (0,0) to (1,1), referred to as U and V.

u = (texture x coord)/(texture width);
v = (texture y coord)/(texture height);

In simple quad (2D square, or sprite) examples you'll see the UVs set to (0,0), (1,0), (0,1), and (1,1), which correspond to the top-left, top-right, bottom-left, and bottom-right corners of the texture.
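
A tiny helper along the lines of those formulas (the function name and the example numbers are mine, purely for illustration):

// Converts a pixel position inside the texture into a (u, v) pair.
void PixelToUV(float pixelX, float pixelY,
               float texWidth, float texHeight,
               float* u, float* v)
{
    *u = pixelX / texWidth;
    *v = pixelY / texHeight;
}

// e.g. for a 256x256 texture, PixelToUV(128.0f, 64.0f, 256.0f, 256.0f, &u, &v)
// gives u = 0.5f, v = 0.25f.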

Thanks for the help guys :-)

Let me just confirm this grid image thing. Say I load an image that I want to use as a grid (a 2x2 grid, with all four sub-images the same size, all rectangles). If I wanted to select the top-left image in the grid, my U and V values would look like this:

top-left corner = 0.0, 0.0
top-right corner = 0.5, 0.0
bottom-right corner = 0.5, 0.5
bottom-left corner = 0.0, 0.5

Is this correct, or have I just completely misunderstood? Cheers

You got it. There's one further level of complexity, regarding how the texture gets sampled and filtered during rendering.

In D3D9 and under, these UVs will be on the corner of the pixel in the texture (texel).

In D3D10 and OpenGL these UVs will be on the center of the pixel in the texture.

I don't have the time to explain what that means for you right now (my daughter is desperately trying to get my attention), but I'll try to remember to come back to this thread, or maybe someone else will explain it for you.

Ok, so what's the difference between the two? Let's use a 3x3 texture, for the sake of being able to include it here.

+---+---+---+
|0,0|1,0|2,0|
+---+---+---+
|0,1|1,1|2,1|
+---+---+---+
|0,2|1,2|2,2|
+---+---+---+

So, in D3D9 and earlier, a UV of (0,0) is the corner of the first texel, marked with an asterisk:

*---+---+---+
|   |   |   |
+---+---+---+
|   |   |   |
+---+---+---+
|   |   |   |
+---+---+---+

In D3D10 and OpenGL, a UV of (0,0) is the center of the first texel, marked with an asterisk:

+---+---+---+
| * |   |   |
+---+---+---+
|   |   |   |
+---+---+---+
|   |   |   |
+---+---+---+

Let's assume we're drawing the image unscaled, as a 3x3 quad. The difference is that if filtering is enabled (which it usually is), OpenGL/D3D10 would fetch just that one texel, since we're sampling exactly at its center, while D3D9 would blend between adjacent texels, since we're sampling halfway between them. This appears blurry, and along the top and left edges of the sprite we'd see a 50% blend with the bottom and right edges (if wrapping is enabled).

For most models you'd never notice. For sprites, HUDs, fonts, etc., it's very important. For your texture grid/texture atlas it means you'll see blurring and bleeding between sub-images. To counter this, the UVs should be offset by half a texel.

U = (texture x coord + 0.5) / (tex width);
V = (texture y coord + 0.5) / (tex height);
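
As a sketch, that correction wrapped into a pair of helpers; the names and the 2x2/256x256 example in the comments are mine, purely illustrative.

// Offsets a texel position by half a texel before normalizing, as above.
float TexelToU(float texelX, float texWidth)  { return (texelX + 0.5f) / texWidth;  }
float TexelToV(float texelY, float texHeight) { return (texelY + 0.5f) / texHeight; }

// e.g. the top-left cell of a 2x2 grid in a 256x256 atlas: instead of UVs
// running exactly 0.0 -> 0.5, use
//   u0 = TexelToU(0.0f, 256.0f);    v0 = TexelToV(0.0f, 256.0f);    // ~0.00195
//   u1 = TexelToU(128.0f, 256.0f);  v1 = TexelToV(128.0f, 256.0f);  // ~0.50195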

Another source of bleeding between tiles occurs when scaling down, if the tiles aren't a power-of-two size. Most mipmap filters simply average a 2x2 block of texels. If your tiles were, say, 8x8, your mip chain for a 2-by-1 set of tiles would look like this:

aaaaaaaabbbbbbbb aaaabbbb aabb ab x
aaaaaaaabbbbbbbb aaaabbbb aabb
aaaaaaaabbbbbbbb aaaabbbb
aaaaaaaabbbbbbbb aaaabbbb
aaaaaaaabbbbbbbb
aaaaaaaabbbbbbbb
aaaaaaaabbbbbbbb
aaaaaaaabbbbbbbb

The last few mip levels in your texture atlas will be a fairly useless random assortment of colors, as the tiles bleed together.

Now let's say your tiles were 6x4; the errors would show up sooner:

aaaaaabbbbbb aaabbb axb x
aaaaaabbbbbb aaabbb
aaaaaabbbbbb
aaaaaabbbbbb

Guest Anonymous Poster
Quote:
In D3D9 and under, these UVs will be on the corner of the pixel in the texture (texel).
In D3D10 and OpenGL these UVs will be on the center of the pixel in the texture.



Please tell me that's a sick joke! They're going to move the UV coords?

Quote:
Original post by Anonymous Poster
Please tell me that's a sick joke! They're going to move the UV coords?

Hardware vendors probably consider the current setup a sick joke, as do game companies targeting both types of systems. For a while, hardware will still need to support both modes for DX9 applications, but eventually it will get dropped, I'm sure.

I was surprised to see it changing, but it's a good change.

@ Anonymous Poster: Any more trolling and you can kiss your limited GDNet privileges goodbye. Don't forget moderators have x-ray vision and are perfectly capable of knowing what you're up to...



D3D10 (or it might just be DXGI/WDDM) defines "default" and "legacy" modes now, legacy mode being for D3D9 compatibility. Contrary to Namethatnobodyelsetook, I'd expect it to be around for quite a while, given that Vista's GUI is based on D3D9 - that is, unless D3D9Ex also uses the new coordinate system.

Quote:
In D3D9 and under, these UVs will be on the corner of the pixel in the texture (texel).

In D3D10 and OpenGL these UVs will be on the center of the pixel in the texture.
Are you sure? I thought it was the other way around and the specification I've got seems to concur:
Quote:
[default mode] defines the origin as the upper-left corner of the RenderTarget
...
[legacy mode] defines the origin as the center of the upper-left pixel in the RenderTarget.
It has been a long day though, so maybe I'm just being dense and mis-reading what you've posted [smile]

Cheers,
Jack

When I say I expect it will eventually be dropped, I don't mean "in a year or two". It'll be a while before nobody cares about DX9 compatibility, but that day will come, and hardware will be freed of one more silly DX vs. OpenGL burden.

What you've found seems to imply a change to how vertex positions are interpreted, which I was unaware of. I'm talking about UV positions. DX9 and under definitely require the half-texel offset in order to render sprites correctly, while OpenGL does not. I was certain that D3D changed their UV handling... I'll have to dig through the docs to find a reference.

If both the pixel locations for rasterization are changing, and the UV handling is changing, perhaps the half texel offset will still end up being needed.

Whatever happens, it should only take a minor tweak to make your app work the new way.

The documentation I've got specifically refers to it being useful for rendering screen-aligned quads, so I assumed it was about rasterization/UVs rather than position...

Given that it's screen-aligned quads that cause all the trouble addressed in our favourite "Directly mapping texels to pixels" article it makes sense [grin]

Jack

I can't find anything in the D3D10 documents that are public, but in the old D3D10 Functional spec the MVPs got (D3D10 Functional Spec v1.04\D3D10FunctionalSpec.htm), I see what you're talking about...

The UVs didn't move. A UV of (0,0) is still the corner of the texel, but since the rasterization point moved, a quad starting at (0,0) won't be rasterized there; it will be rasterized at (0.5, 0.5), at which point the UVs will have been interpolated to the texel center, and it all works out.

So yeah, UVs didn't move in D3D10, but the rasterization rules changed such that it's effectively the same thing. Don't add half a texel in D3D10 or OpenGL.

Quote:
Original post by Namethatnobodyelsetook
I can't find anything in the D3D10 documents that are public
Indeed, I can't find it in the two main public sources (here or here).

Quote:
Original post by Namethatnobodyelsetook
the rasterization rules changed such that it's effectively the same thing. Don't add half a texel in D3D10 or OpenGL.
If only I'd earned £1 for every post where I mentioned the 'directly mapping texels to pixels' article I'd be a millionaire. If they've now made it redundant then my chances of being a millionaire are much diminished [sad]

Jack

