OpenGL: Different results in RenderMonkey vs. GL application [solved]


Recommended Posts

Hello everyone, I'm having a little problem: using the same shader in RenderMonkey and in an OpenGL application gives different results. This is a simple cubemap reflection, and I don't really know what I could have done wrong. Here is my vertex shader code:
varying vec3  ReflectDir;
varying float LightIntensity;

uniform vec3  LightPos;

void main()
{
    gl_Position    = ftransform();
    vec3 normal    = normalize(gl_NormalMatrix * gl_Normal);
    vec4 pos       = gl_ModelViewMatrix * gl_Vertex;
    vec3 eyeDir    = pos.xyz;
    ReflectDir     = reflect(eyeDir, normal);
    LightIntensity = max(dot(normalize(LightPos - eyeDir), normal), 0.0);
}
And here is the pixel shader:

uniform vec3  BaseColor;
uniform float MixRatio;

uniform samplerCube EnvMap;

varying vec3  ReflectDir;
varying float LightIntensity;

void main()
{
    // Look up environment map value in cube map
    vec3 envColor = vec3(textureCube(EnvMap, ReflectDir));

    // Add lighting to base color and mix
    vec3 base = LightIntensity * BaseColor;
    envColor  = mix(envColor, base, MixRatio);

    gl_FragColor = vec4(envColor, 1.0);
}
This is my renderloop in my GL application:




glBegin(GL_TRIANGLES); // immediate-mode submission implied by the glVertex3f calls
for (int i = 0; i < m_ObjectBuffer[0].iFaceCount * 3; i += 3)
{
    glVertex3f(va[i].x,   va[i].y,   va[i].z);
    glVertex3f(va[i+1].x, va[i+1].y, va[i+1].z);
    glVertex3f(va[i+2].x, va[i+2].y, va[i+2].z);
}
glEnd();
Can anyone help? Thanks in advance for any response. Greets, Chris [Edited by - Hydrael on August 12, 2006 9:22:03 AM]

Check for errors. Are you binding the shader?

BTW, you don't need glColor3f(1.0f, 1.0f, 1.0f) before EnvMapper->Activate() (if this calls glTexEnv).

EnvMapper->Activate() just "activates" the environment mapping shader (it calls glUseProgramObjectARB(glProgram);).

The shader is bound and compiles correctly.
If you take a look at the screenshot again, you can see that the bluish color of the OpenGL teapot is part of the sky within the cubemap (as seen in the top part of the correct RenderMonkey render) - so the shader itself works.
It seems to me like the cubemap somehow gets scaled, or it gets mapped wrongly... but I don't know why :/

Edit: Here is another screenshot, which probably shows the problem a little better
