cv_student

OpenGL and Vertex Texture access

Hi, I am trying to use vertex textures. I read Nvidia's paper about how to go about doing that, but I can't seem to load the texture in the right format. Nvidia's paper states that I have to use:
// original nvidia code
GLuint vertex_texture;
glGenTextures(1, &vertex_texture);
glBindTexture(GL_TEXTURE_2D, vertex_texture);
// vertex textures only support nearest filtering
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER,
                GL_NEAREST_MIPMAP_NEAREST);
// single-channel 32-bit float internal format from ATI_texture_float
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0,
             GL_LUMINANCE, GL_FLOAT, data);
but whenever I specify the image I get an error saying "invalid enum", which is probably because of the texture storage format. The program compiles fine and I have the newest drivers, so what's going on? Does anybody have a sample program?

Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h, or any other extension loader header)? If not, then that would explain it.
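
If your glext.h turns out to be too old to define it, you could add the token by hand as a stop-gap; the value below is the one given in the ATI_texture_float entry of the extension registry, so double-check it against a current glext.h before relying on it:

// fallback in case the header is missing the ATI_texture_float tokens
// (value taken from the ATI_texture_float extension spec - verify it)
#ifndef GL_LUMINANCE_FLOAT32_ATI
#define GL_LUMINANCE_FLOAT32_ATI 0x8818
#endif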

[Edited by - _the_phantom_ on June 23, 2005 7:40:35 AM]

Quote:
Original post by _the_phantom_
Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h, or any other extension loader header)? If not, then that would explain it.


It'd have to exist, or the code wouldn't even compile, and then it couldn't get as far as producing the 'invalid enum' error reported by glGetError(). The 'invalid enum' error is returned when one of the arguments, such as the internal or external pixel format of the image data, is invalid.

In this case, I'd read the spec for the extension that allows you to use GL_LUMINANCE_FLOAT32_ATI, since it lists the things you can do with it. Even better: check that you actually have support for that extension. If you have the support and you're using it correctly, then I'm out of ideas ('cause it should work fine).

[EDIT]
I just read part of the PDF, which says that GL_LUMINANCE_FLOAT32_ATI is an internal format, so that looks fine, and GL_LUMINANCE is fine for the external format, as is GL_FLOAT for the data type. Going by that, I'd check whether the extension (ATI_texture_float; it shows up as GL_ATI_texture_float in the extension string) can actually be used.
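
Something like this would do as a quick check (just a sketch, assuming a current GL context; the usual caveat applies that strstr() can also match a prefix of a longer extension name):

// sketch: look for ATI_texture_float in the extension string
#include <string.h>
#include <GL/gl.h>

int has_float_textures(void)
{
    // glGetString(GL_EXTENSIONS) returns a space-separated list of names
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_ATI_texture_float") != NULL;
}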

[EDIT 2]
You should also check whether the error is already set before you even call these functions (I remember trying to fix something I only thought was wrong, before I learnt to do this). A rule I've learnt: track the error down before you try to fix it. ;)
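
In code that could look something like this (a sketch around your glTexImage2D call, with width, height and data as in your snippet): first drain anything left over from earlier calls, then check straight after the upload:

// sketch: clear stale errors first, then pin the error to this one call
while (glGetError() != GL_NO_ERROR)
    ; // discard errors left over from earlier GL calls

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0,
             GL_LUMINANCE, GL_FLOAT, data);

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage2D failed: 0x%04x\n", err);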

Quote:
Original post by Gorax
Quote:
Original post by _the_phantom_
Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h, or any other extension loader header)? If not, then that would explain it.


It'd have to exist, or the code wouldn't even compile, and then it couldn't get as far as producing the 'invalid enum' error reported by glGetError(). The 'invalid enum' error is returned when one of the arguments, such as the internal or external pixel format of the image data, is invalid.


Heh, good point, sorry; that's what I get for trying to answer a question when I've just crawled out of bed. [grin]

Thx for the quick replies.

For one, I read about the texture format in Nvidia's paper concerning vertex texture access (which only works on the 6xxx series). I didn't really think about whether I would need a GF 6xxx just to use the texture format.

I am developing on a GF 5xxx at home and therefore can only find out at my university whether my code really works (they have a 6xxx there). I thought I could develop my code at home up to the point where the actual texture lookup occurs (in the vertex shader), and since I want to test as much as possible, I gradually started to test my implementation...

Anyway, I guess I will just buy a very cheap 6200 to keep developing at home.

If anybody gets a simple demo (using vertex textures) to work, please post the working code. Nvidia has a demo which I can't get to compile on my machine :)
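
In the meantime, this is roughly what I understand the vertex-shader side of the lookup to look like, written out as a GLSL source string (just a sketch from reading the paper, untested since I don't have a 6xxx yet; the heightMap sampler and the displacement along Y are placeholders of mine):

// sketch of the vertex texture lookup (untested, based on the paper);
// vertex texture fetch doesn't do mip selection, hence the explicit LOD
const char* displace_vertex_shader =
    "uniform sampler2D heightMap;\n"
    "void main()\n"
    "{\n"
    "    float h = texture2DLod(heightMap, gl_MultiTexCoord0.xy, 0.0).x;\n"
    "    vec4 displaced = gl_Vertex + vec4(0.0, h, 0.0, 0.0);\n"
    "    gl_Position = gl_ModelViewProjectionMatrix * displaced;\n"
    "}\n";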

Thx again.

Mike
