mipmap bias problem


This texture stage state setting is supposed to heavily blur my scene:

float bias = 3.0f;
mydev->SetTextureStageState(0, D3DTSS_MIPMAPLODBIAS, *((LPDWORD)&bias));

Well, it doesn't (at all)! Am I forgetting something here?

By the way, a question: what is D3DTSS_MIPMAPLODBIAS? Could I set up what resolutions my mipmaps should have? That's what I want to do. Could anybody please explain this texture stage state to me?

Sorry for using your topic for another question, I couldn't help it.

No need to be sorry :-)

A mipmapped texture is really a bunch of surfaces:
big, not so big, small, even smaller, and so forth,
ideally down to the 1x1 pixel image.
Hence the term mipmap chain.

By default DirectX picks the appropriate mip level
depending on how near or far your textured primitive is,
thus improving performance (and also eliminating ugly
distortion).

The aforementioned texture stage state will skew
the default choice of mip level by the selected number,
thus sort of blurring all your textures.

Positive numbers for blurring,
negative for sharpening.

The ugly cast of the parameter is required because
the bias is a float, but SetTextureStageState
demands a DWORD.

I tried calling GetTextureStageState to see if the value was
really set, and yes, it was!

This should be plain and simple, only it has no effect at
all over here. I'm at a loss. Heeeelp!!

Please try the same thing and tell me if it worked.
It really should!

I've set D3DTSS_MIPFILTER/MINFILTER/MAGFILTER all to D3DTEXF_POINT
(also tried D3DTEXF_LINEAR)
and D3DTSS_TEXTURETRANSFORMFLAGS to D3DTTFF_COUNT2.

Shouldn't that be all?

For float values in render states, D3D uses DWORDs too; you just have to write them another way. Here is the explanation for the D3DRS_ALPHAREF render state, which takes a "float" too:

D3DRS_ALPHAREF
Value specifying a reference alpha value against which pixels are tested when alpha testing is enabled. This can be a 16:16 fixed point value (D3DFIXED) ranging from 0 to 1, inclusive, where 1.0 is represented as &H00010000. The default value is 0.0.

Perhaps this helps...

Ah, sorry for not responding.

No, your idea can't be the issue here,
as I tested whether my value was set with
a call to GetRenderState. It is set properly!

I still haven't been able to get any effect from
this &%¤ bias state.

However, my texture->SetLOD call just started working
after I created my texture with the D3DPOOL_MANAGED pool type
(I used to use D3DPOOL_DEFAULT), like this:

D3DXCreateTexture(device, width, height, miplevels, 0, D3DFMT_A8R8G8B8, D3DPOOL_MANAGED, &texture);

D3DPOOL_DEFAULT is no good for multiple reasons, perhaps the most
important being that the surfaces cannot be locked :|

But setting the level of detail (LOD) with texture->SetLOD(3),
for instance, really creates a nice blurring effect :-)

and somewhat happy is better than all sad ...
