
question about glMultiTexCoord3fARB


Recommended Posts

The code below:

glBegin(GL_QUADS);
{
    glNormal3f(0.0, 1.0, 0.0);

    // tangent
    glMultiTexCoord3fARB(GL_TEXTURE0 + 1, 1.0, 0.0, 0.0);
    // binormal
    glMultiTexCoord3fARB(GL_TEXTURE0 + 2, 0.0, 0.0, 1.0);

    glTexCoord2f(0.0, 0.0); glVertex3f(-1.0, 0.0, -1.0);
    glTexCoord2f(0.0, 1.0); glVertex3f(-1.0, 0.0,  1.0);
    glTexCoord2f(1.0, 1.0); glVertex3f( 1.0, 0.0,  1.0);
    glTexCoord2f(1.0, 0.0); glVertex3f( 1.0, 0.0, -1.0);
}
glEnd();

Why can glMultiTexCoord3fARB be used here? Is it an input of the vertex shader, or of the fragment shader?

Vertex shader, written in Sh:

vsh = SH_BEGIN_VERTEX_PROGRAM {
    ShInputPosition4f SH_DECL(ipos);
    ShOutputPosition4f SH_DECL(opos);
    ShInOutTexCoord3f SH_DECL(tc);
    ShInputNormal3f SH_DECL(normal);
    ShInputNormal3f SH_DECL(tangent);
    ShInputNormal3f SH_DECL(binormal);
    ShOutputVector3f SH_DECL(tan_eye);
    ShOutputVector3f SH_DECL(tan_light);

    // Transform position by the modelview-projection matrix
    opos = ModelViewProjection | ipos;

    // Set the 3rd component of the texture coordinate; it will be passed through automatically
    tc(2) = 1.0;

    // Map the eye vector into tangent space
    ShVector3f SH_DECL(eye);
    eye = (ModelViewInverse | ShPoint3f(0,0,0)) - ipos(0,1,2);
    tan_eye = -ShVector3f(tangent | eye, binormal | eye, 1.0/bumpdepth * (normal | eye));

    // Map the light vector into tangent space
    ShVector3f SH_DECL(lightVec);
    lightVec = lightPos - ipos(0,1,2);
    tan_light = ShVector3f(tangent | lightVec, binormal | lightVec, normal | lightVec);
} SH_END;

Fragment shader, written in Sh:

fsh = SH_BEGIN_FRAGMENT_PROGRAM {
    ShInputPosition4f SH_DECL(pos);
    ShInputTexCoord3f SH_DECL(itc);
    ShInputVector3f SH_DECL(tan_eye);
    ShInputVector3f SH_DECL(tan_light);

    // Compute the offset vector by normalizing and multiplying with the normalization factor
    ShVector3f offset = normalize(tan_eye);
    offset(0,1) *= ((double)dmap_depth) / distmap.width();

    // March a ray
    ShTexCoord3f tc = itc;
    for (int i = 0; i < iterations; i++) {
        ShAttrib1f distance = distmap(tc);
        tc = mad(distance, offset, tc);
    }

    // Compute derivatives of the unperturbed texture coordinates
    ShAttrib2f ddx = dx(itc(0,1));
    ShAttrib2f ddy = dy(itc(0,1));

    // Do tangent-space normal mapping
    tan_light = normalize(tan_light);
    ShNormal3f tan_normal;
    if (filter) {
        tan_normal = 2.0 * normals(tc(0,1), ddx, ddy) - 1.0;
    } else {
        tan_normal = 2.0 * normals(tc(0,1)) - 1.0;
    }

    // Apply lighting to the base color
    ShColor1f diffuse = tan_normal | tan_light;
    ShOutputColor3f c;
    if (filter) {
        c = colors(tc(0,1), ddx, ddy) * diffuse;
    } else {
        c = colors(tc(0,1)) * diffuse;
    }
} SH_END;

Quote:
Why can glMultiTexCoord3fARB be used here? Is it an input of the vertex shader, or of the fragment shader?


Why not?
Everything goes to the vertex shader first; that's how the graphics pipeline is designed. The fragment shader only sees whatever the vertex shader forwards to it through its outputs.
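To make that concrete, here is a minimal GLSL 1.20-style vertex shader sketch of the same setup (the uniform names lightPos and bumpdepth mirror the Sh code, but the whole snippet is an illustration, not your actual program). The tangent and binormal you submit with glMultiTexCoord3fARB arrive as the built-in vertex attributes gl_MultiTexCoord1 and gl_MultiTexCoord2, and they only reach the fragment stage if the vertex shader forwards them, or something derived from them, as varyings:

uniform vec3  lightPos;   // assumed uniform, mirrors lightPos in the Sh code
uniform float bumpdepth;  // assumed uniform, mirrors bumpdepth in the Sh code

varying vec3 texCoord;
varying vec3 tanEye;
varying vec3 tanLight;

void main()
{
    // Per-vertex inputs: glNormal3f feeds gl_Normal, glTexCoord2f feeds gl_MultiTexCoord0,
    // and the two glMultiTexCoord3fARB calls feed gl_MultiTexCoord1 (tangent) and
    // gl_MultiTexCoord2 (binormal). These attributes are visible only in the vertex shader.
    vec3 normal   = gl_Normal;
    vec3 tangent  = gl_MultiTexCoord1.xyz;
    vec3 binormal = gl_MultiTexCoord2.xyz;

    // Eye position in object space, as in the Sh version (ModelViewInverse applied to the origin)
    vec3 eye      = (gl_ModelViewMatrixInverse * vec4(0.0, 0.0, 0.0, 1.0)).xyz - gl_Vertex.xyz;
    vec3 lightVec = lightPos - gl_Vertex.xyz;

    // Whatever the fragment shader needs must be forwarded explicitly as varyings
    tanEye   = -vec3(dot(tangent, eye), dot(binormal, eye), dot(normal, eye) / bumpdepth);
    tanLight =  vec3(dot(tangent, lightVec), dot(binormal, lightVec), dot(normal, lightVec));
    texCoord =  vec3(gl_MultiTexCoord0.xy, 1.0);

    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}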

I don't use Sh. It seems to be a library for helping users write shaders. Any reason why you don't use GLSL?
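If you do try GLSL, the matching fragment shader would look roughly like this sketch (normals and colors are assumed sampler uniforms, and the distance-map ray march from your Sh version is left out to keep it short). Note that it reads only the interpolated varyings; the texcoord attributes set with glMultiTexCoord3fARB are never visible here directly:

uniform sampler2D normals;  // tangent-space normal map, assumed bound by the application
uniform sampler2D colors;   // base color map, assumed bound by the application

varying vec3 texCoord;
varying vec3 tanEye;
varying vec3 tanLight;

void main()
{
    // Interpolated varyings written by the vertex shader are the only per-fragment inputs
    vec3  n       = normalize(2.0 * texture2D(normals, texCoord.xy).xyz - 1.0);
    float diffuse = max(dot(n, normalize(tanLight)), 0.0);
    gl_FragColor  = vec4(texture2D(colors, texCoord.xy).rgb * diffuse, 1.0);
}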
