grainy lighting on simple cubes

Started by Michael Kron · 10 comments, last by krum 18 years, 8 months ago
Good mornin', all. New question for you guys today, but it's a bit more subjective, I'm afraid. I've been following one of the many "beginner's DirectX" tutorials out there, and currently I'm on the obligatory "texture map and light a cube" portion. It seems to work fine, except for one difference: the author's code produces a cube that looks smoothly lit across the faces, whereas mine looks grainy. My cube has very clearly defined "lighting layers" as vertices get farther from the light source. I'm using the same lighting parameters as the author, and the same texture map.

The only real difference I can think of is that my cube is 2x2x2 with a light source at a distance of 5, whereas his cube is 15x15x15 with a light source at a distance of 20. I adjusted my viewing position and my light attenuation/range to compensate for my smaller cube. Could this be the reason my lighting looks grainy? Or are there other settings that tend to produce really layered-looking shading like that? I didn't notice any settings for color depth or shading preferences in the author's code.

Thanks for any help anyone can give!

Mike
I've seen this caused by one of two things:
1) specifying lighting colors greater than 1.0f, or
2) using normals which aren't normalized.
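
For point 1, here's a rough sketch of a point light setup with every color channel kept in [0, 1] -- this assumes Direct3D 9 with d3dx9.h included, and a device pointer I'm calling g_pDevice, so rename things to match your own code:

D3DLIGHT9 light;
ZeroMemory(&light, sizeof(light));
light.Type         = D3DLIGHT_POINT;
light.Diffuse.r    = 1.0f;   // keep each channel at or below 1.0f
light.Diffuse.g    = 1.0f;
light.Diffuse.b    = 1.0f;
light.Position     = D3DXVECTOR3(0.0f, 0.0f, -5.0f);
light.Range        = 100.0f;
light.Attenuation0 = 1.0f;   // constant attenuation only
g_pDevice->SetLight(0, &light);
g_pDevice->LightEnable(0, TRUE);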

Hope this helps :).
Have you normalized your vertex normals? Unnormalized normals may cause incorrect lighting results. There is a function in D3DX to do this, D3DXVec3Normalize(). Note that you only have to do this once, when initializing your cube.
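For example (assuming your vertex struct has a D3DXVECTOR3 member I'm calling Normal, and an array/count I'm calling vertices and NUM_VERTICES -- adjust the names to your own code):

// Run once at init time, after filling in the cube's vertex data.
for (DWORD i = 0; i < NUM_VERTICES; ++i)
    D3DXVec3Normalize(&vertices[i].Normal, &vertices[i].Normal);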
It sounds a lot like color banding. If that's the case, then check the bpp settings.
If you're using a 16-bit display, colors will band. You can counteract this by setting the D3DRS_DITHERENABLE renderstate to 1. Dithering makes 16-bit look rather nice.
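It's a single render state on the device, something like this (g_pDevice standing in for whatever you call your IDirect3DDevice9 pointer):

// Turn on dithering to smooth out banding on 16-bit back buffers.
g_pDevice->SetRenderState(D3DRS_DITHERENABLE, TRUE);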
Well, I set my normals manually at the beginning, and I have them sticking straight out in the X, Y, or Z direction, depending on which way the face points (i.e., all normals for vertices on the top face are (0.0f, 1.0f, 0.0f), etc.). My understanding of a normalized vector is that its magnitude is 1, so aren't my normals normalized to begin with in this case? (The magnitude of a 3D vector being sqrt(x^2 + y^2 + z^2).)

Is there something I'm forgetting?
DrakeX:

How do I set a higher color depth than 16? And where? Is it a property of the D3D device?
Hmm, your normals seem just fine, so grainy lighting is not their fault...
Quote: Original post by Michael Kron
How do I set a higher color depth than 16? And where? Is it a property of the D3D device?

Yes, it's a device property. It's the BackBufferFormat member of D3DPRESENT_PARAMETERS. Try setting it to some 32-bit D3DFORMAT, such as D3DFMT_A8R8G8B8, if you haven't already.

Are you running the program in windowed mode? If so, it will use the color depth that your desktop is using. You can change it in the display properties in the control panel.

If you're running in fullscreen, you set the screen depth when you create the D3D Device. There is a structure called D3DPRESENT_PARAMETERS that you pass into the IDirect3D::CreateDevice function. One member of that structure is called BackBufferFormat; set it to D3DFMT_X8R8G8B8 to run in 32-bit mode.
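
As a rough sketch (Direct3D 9, with made-up names like g_pD3D, g_pDevice, and hWnd -- map these onto your own init code):

D3DPRESENT_PARAMETERS d3dpp;
ZeroMemory(&d3dpp, sizeof(d3dpp));
d3dpp.Windowed               = FALSE;              // fullscreen
d3dpp.BackBufferWidth        = 1024;
d3dpp.BackBufferHeight       = 768;
d3dpp.BackBufferFormat       = D3DFMT_X8R8G8B8;    // 32-bit back buffer
d3dpp.SwapEffect             = D3DSWAPEFFECT_DISCARD;
d3dpp.EnableAutoDepthStencil = TRUE;
d3dpp.AutoDepthStencilFormat = D3DFMT_D16;
d3dpp.hDeviceWindow          = hWnd;

g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                     D3DCREATE_HARDWARE_VERTEXPROCESSING,
                     &d3dpp, &g_pDevice);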
My app is full screen, so this might be the culprit. Thanks, I'll give that a try. BTW, what is the difference between the two 32-bit formats you guys are suggesting -- D3DFMT_X8R8G8B8 vs. D3DFMT_A8R8G8B8?
