Aztral

OpenGL Texture and Vertex Shader


Recommended Posts

I'm working on water rendering in OpenGL, which uses a CPU-generated heightfield to create (and then regularly update) an OpenGL texture. The initial texture is created with the following code:




GLuint gltex;
glGenTextures(1, &gltex);
glBindTexture(GL_TEXTURE_2D, gltex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_REPEAT);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_REPEAT);
glTexImage2D(GL_TEXTURE_2D, 0, GL_ALPHA32F_ARB, m_heightFieldU, m_heightFieldV, 0, GL_ALPHA, GL_FLOAT, m_heightField.GetArray());




m_heightField.GetArray() is simply an array of floats of size m_heightFieldU * m_heightFieldV. I use the internal format GL_ALPHA32F_ARB and format GL_ALPHA because I only need to send/store a single float value per texture element. The vertex shader samples this texture at each vertex and uses the value to offset the vertex's z-coordinate before transforming it; it is quite simple:



uniform sampler2D heightField;

void main(void) {
    float vertHeight = texture2D(heightField, gl_MultiTexCoord0.st).a;
    gl_Position = gl_ModelViewProjectionMatrix * (gl_Vertex + vec4(0.0, 0.0, vertHeight, 0.0));
    gl_FrontColor = vec4(1.0, 1.0, 1.0, 1.0);
}


Also note that the height field is updated regularly by updating the existing OpenGL texture via


glBindTexture(GL_TEXTURE_2D, m_texture);
glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, m_width, m_height, m_format, m_type, pData);


where m_width, m_height, m_format, m_type are members of a Texture class that simply wraps the OpenGL data. In this case:
m_width = 32;
m_height = 32;
m_format = GL_ALPHA;
m_type = GL_FLOAT;

to correspond with the initial texture creation.

The issue I'm having is that the alpha of the entire mesh this texture is tiled on seems to be "pulsing": the whole mesh fades in and out, momentarily visible and then momentarily invisible, and I can't figure out why. I assume it has something to do with using the GL_ALPHA* formats, but again I don't understand why. It is my (probably incorrect?) understanding that these simply specify how OpenGL loads/stores the data internally, and that they shouldn't actually affect rendering. Since my vertex shader explicitly sets the vertex front color to white with a 1.0 alpha, I don't see how or why the alpha would be changing.

Initially I thought that my texture may at certain times be calculating horribly wrong values and sending the mesh to who knows where in the world, but even if I modify the shader to ignore the texture value completely the same thing happens.

Am I wrong in thinking that the internal format/format used here is rather arbitrary, seeing as I explicitly set the vertex color? Does OpenGL do more with the texture data than what I tell it to do in the vertex shader, particularly in the absence of a fragment shader? If I stop updating the texture, the pulsing stops, which tells me something is being done with the texture that I'm not seeing.

Thanks very much for any info! Let me know if I can clarify anything.

What's your pixel shader doing? Are you using a default one? Setting the colour to white is a good idea, but perhaps the pixel shader is doing something else (like reading the alpha from your texture).

I don't know exactly what the behaviour is when you have only a vertex shader and no fragment shader (the part that deals with texture application). But I would assume that without a basic fragment shader, the texture is applied the same as if you had no shader at all, which means it applies your rippling heightfield as a texture, and given the format you're using, what pulses is the alpha!

So I would suggest adding a simple fragment shader that overrides the default functionality.
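A minimal sketch of such a fragment shader, assuming you just want to keep the flat white colour set by the vertex shader above (there's no need to sample the heightfield here, since it only drives vertex displacement):

```glsl
// Minimal pass-through fragment shader: ignores the heightfield texture
// entirely and outputs the interpolated vertex colour (opaque white).
void main(void) {
    gl_FragColor = gl_Color;
}
```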

Makes sense, thanks to both of you. I will give that a shot as soon as I get home.

One question comes to mind: since it is the responsibility of the vertex shader to interpolate values as necessary to be sent to the fragment shader (notably the texture coordinates in this case), and my vertex shader doesn't do that (it wouldn't do that by default, would it?), how would the fragment shader determine which texture values to use? Or would it use some default, i.e. the same coordinate for every fragment?

This would make sense, as the alpha fading seems to be uniform across the mesh (I think, as far as I can tell with just my eyeballs).

It also fits that the fading seems to follow the same pattern I use to generate the height map (a rather simple wave function).

Yeah, it would do that by default.

I have no idea how it would be determining that with the code you have, but I assume you might have some texture info with your mesh that it could use. Again, I don't know much about using a vertex shader with no fragment shader, as that is just crazy!




The hardware interpolators take care of interpolating the vertex shader outputs over the pixels for you. The vertex shader is just a program that transforms your input vertex data into the desired output vertex data.


This is why you have to normalise normals when you calculate them in the vertex shader and use them in the pixel shader: the interpolation makes them non-unit-length.
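As a hedged illustration of that point: even if the vertex shader writes unit-length normals, the value interpolated to each pixel is generally shorter than unit length, so a fragment shader has to renormalise. A minimal sketch (the varying name `v_normal` is just an example, not from the code above):

```glsl
varying vec3 v_normal;  // written per-vertex, interpolated per-fragment

void main(void) {
    // Linear interpolation between two unit vectors yields a vector of
    // length <= 1, so renormalise before using it in any lighting math.
    vec3 n = normalize(v_normal);
    gl_FragColor = vec4(n * 0.5 + 0.5, 1.0);  // visualise the normal
}
```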

If you use no pixel shader, the card will just revert to the non-programmable pipeline, even though this might be emulated by driver-supplied shaders to cope with the fact that the fixed-function silicon isn't actually there. (The DX runtime shows this when you run a DX9.0 app on a DX10-based card with the following lines:
Direct3D9: (INFO) :Using FF to VS converter
Direct3D9: (INFO) :Using FF to PS converter
Just using DX as an example here.)



Thanks everyone. Plugging in a basic fragment shader (one that, for now at least, just sets the fragment color to white) solved the problem.
