# how does one do this?

## Recommended Posts

Hey. Notice the lights on the dark side. These are actually two separate textures, one for the dark side and one for the light side. How can I do this? Thanks.

##### Share on other sites
Sample the first ("daylight") texture.
Sample the second ("night") texture.
Compute diffuse lighting factor (n-dot-l). Perform a linear interpolation between the first and second colors.

Assuming the night texture is mostly dark, you might not even need to bother with actually applying lighting. This might not generate the most desirable results at the "seams" of the day/night boundary; you may have to play with the way you combine or select the colours, and with the make-up of the actual texture data.
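In HLSL, the steps above might look something like this (a minimal sketch; the sampler and parameter names are illustrative, not from the original post):

```hlsl
sampler2D DayTex;    // daylight surface texture (sampler 0)
sampler2D NightTex;  // night-lights texture (sampler 1)
float3 LightDir;     // normalised direction from the surface towards the light

float4 DayNightPS(float2 uv : TEXCOORD0, float3 normal : TEXCOORD1) : COLOR
{
    // Diffuse factor: 0 on the dark side, 1 where fully lit.
    float ndotl = saturate(dot(normalize(normal), LightDir));
    float4 day   = tex2D(DayTex, uv);
    float4 night = tex2D(NightTex, uv);
    // Blend towards the day texture as the surface turns to face the light.
    return lerp(night, day, ndotl);
}
```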

##### Share on other sites
The obvious way would be to lerp between two textures (one daytime and one night-time) based on your N.L result in your pixel shader, i.e. something like:

float lightVal = saturate(dot(normal, lightDirec));
return lerp(nighttimeColor, daytimeColor, lightVal);

BTW, what app is that? It's not from SGI by any chance, is it?

##### Share on other sites
One more thing: do you see any fixed-function (non-shader) way to do this?

Quote:
 Original post by griffin2000
 The obvious way would be lerp between to textures (on daytime and one night-time) based on your N.L result in your pixel shader. I.e. somethin
 float lightVal=dot(normal,lightDirec); return lerp(daytimeColor, nighttimeColor, lightVal);
 BTW what app is that its not from SGI by any change is it ?

It's Max (3ds Max)... I got the shot from here.

##### Share on other sites
The FFP is quite capable of multi-texturing (called texture blending, in DX documentation).

Back in the day, this would be done by rendering the geometry twice, changing texture in between, using appropriate alpha-blending factors for each pass. However, hardware has long been capable of doing it in a single pass.
You may have noticed that when you 'set' a texture, you have to pass the value 0. This tells the API to assign the texture to the first (index 0) sampler. Depending on your graphics card, you can use up to 16 samplers simultaneously, each with its own texture and state.
To use several samplers, you'll need to change your vertex declaration to accept more parameters. Often, supplying an extra set of texture coordinates is all you need in order to blend two textures, but the possibilities are diverse.
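If you want to stay on the fixed-function pipeline, a single-pass version of this blend can be sketched with Direct3D 9 texture stage states. This assumes the interpolated diffuse alpha has been arranged to carry the lighting factor (for example via the material and light alpha channels); `device`, `dayTex` and `nightTex` are illustrative names:

```cpp
// Fixed-function two-texture blend (Direct3D 9), single pass.
// device: IDirect3DDevice9*, dayTex/nightTex: IDirect3DTexture9*.
device->SetTexture(0, nightTex);
device->SetTexture(1, dayTex);

// Stage 0: pass the night texture through unchanged.
device->SetTextureStageState(0, D3DTSS_COLOROP,   D3DTOP_SELECTARG1);
device->SetTextureStageState(0, D3DTSS_COLORARG1, D3DTA_TEXTURE);

// Stage 1: result = dayTex * alpha + current * (1 - alpha),
// where alpha is the interpolated diffuse vertex alpha.
device->SetTextureStageState(1, D3DTSS_COLOROP,   D3DTOP_BLENDDIFFUSEALPHA);
device->SetTextureStageState(1, D3DTSS_COLORARG1, D3DTA_TEXTURE);
device->SetTextureStageState(1, D3DTSS_COLORARG2, D3DTA_CURRENT);

// If both textures share one UV set, point stage 1 at it too.
device->SetTextureStageState(1, D3DTSS_TEXCOORDINDEX, 0);
```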

Take a look at Toymaker/NeHe or whatever site you like for a tutorial.

Regards

##### Share on other sites
OK, I think I got it somewhat right:

However, notice the lighting isn't very good on the lit side. How can I integrate some decent lighting with the shaders I have, to handle this two-texture thing? Right now, my output colour is:

lerp(daytimecolor, nighttimecolor, lightval)

How do I add some decent diffuse, or even specular, to this?

thanks again

##### Share on other sites
First, generate vertex normals for the mesh, then use them in the usual Lambertian lighting model for a diffuse effect (specular lighting would look awful without some very subtle and finely-tuned bump-mapping).

The basic idea is to take the dot product of the normalised light vector with the normalised normal. After clamping the result to [0, 1] you have the diffuse lightness factor. You'll find tutorials all over the web.
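In the shader version from earlier in the thread, that clamped dot product can both drive the day/night blend and darken the day texture (an HLSL sketch with illustrative names):

```hlsl
sampler2D DayTex;
sampler2D NightTex;
float3 LightDir;  // direction from the surface towards the light

float4 LitDayNightPS(float2 uv : TEXCOORD0, float3 normal : TEXCOORD1) : COLOR
{
    // Lambertian diffuse factor, clamped to [0, 1].
    float ndotl = saturate(dot(normalize(normal), normalize(LightDir)));
    float4 day   = tex2D(DayTex, uv) * ndotl;  // diffuse-lit day side
    float4 night = tex2D(NightTex, uv);        // city lights act as emissive
    return lerp(night, day, ndotl);
}
```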

Toymaker, for Direct3D.
Lighthouse3D, for OpenGL.

Regards

##### Share on other sites
Quote:
 Original post by TheAdmiral
 First, generate vertex normals for the mesh, then use them in the usual Lambertian lighting model for a diffuse effect (specular lighting would look awful without some very subtle and finely-tuned bump-mapping).
 The basic idea is to take the dot product of the normalised light vector with the normalised normal. After clamping the result to [0, 1] you have the diffuse lightness factor. You'll find tutorials all over the web.
 Toymaker, for Direct3D. Lighthouse3D, for OpenGL.
 Regards, Admiral

OK, but if I'm passing a colour out of the vertex shader, won't it be overwritten by the fragment colour in the fragment shader?

##### Share on other sites
Quote:
 if im passing a color to the vertex in the vertex shader, won't it be overwritten by the frag color in the fragment shader?

It is indeed possible to write a pixel/fragment shader that outputs a colour regardless of what the vertex data say, but the default FFP-equivalent pixel shader modulates (pointwise multiplies) its sampled colour with the vertex-interpolated colour. This is how textured geometry can be lit per-vertex. Just tell the pixel shader to accept a COLOR parameter and float4-multiply by it.
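As an HLSL sketch (parameter names illustrative): the vertex shader writes its computed lighting to a COLOR0 output, and the pixel shader multiplies the interpolated value with the sampled texel, just like the FFP modulate:

```hlsl
sampler2D DayTex;

float4 ModulatePS(float2 uv     : TEXCOORD0,
                  float4 litCol : COLOR0) : COLOR
{
    // The interpolated vertex colour arrives as an input; nothing overwrites it.
    // We choose to combine it with the texture sample, FFP-modulate style.
    return tex2D(DayTex, uv) * litCol;
}
```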

Regards
