OpenGL and Vertex Texture access

Hi, I am trying to use vertex textures. I read Nvidia's paper about how to do that, but I can't seem to load the texture in the right format. Nvidia's paper states that I have to use:

// original nvidia code
GLuint vertex_texture;
glGenTextures(1, &vertex_texture);
glBindTexture(GL_TEXTURE_2D, vertex_texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST_MIPMAP_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0,
             GL_LUMINANCE, GL_FLOAT, data);
but whenever I upload the image I get an "invalid enum" error, which is probably because of the texture storage format. The program compiles fine and I have the newest drivers. So what's going on? Does anybody have a sample program?
Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h or any other extension loader header)? If not, then that would explain it.

[Edited by - _the_phantom_ on June 23, 2005 7:40:35 AM]
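
For reference, if the header turns out to be too old to define the token, a minimal fallback like the sketch below would at least let the code compile; the 0x8818 value is taken from the ATI_texture_float extension spec, so double-check it against a current glext.h.

// Minimal sketch: use glext.h if it already defines the token, otherwise
// fall back to the value listed in the ATI_texture_float extension spec.
#include <GL/gl.h>
#include <GL/glext.h>

#ifndef GL_LUMINANCE_FLOAT32_ATI
#define GL_LUMINANCE_FLOAT32_ATI 0x8818
#endif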
Quote:Original post by _the_phantom_
Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h or any other extension loader header)? If not, then that would explain it.


It'd have to exist, or the code wouldn't even compile, so it could never get as far as producing the 'invalid enum' error (as reported by glGetError()). That error is returned when one of the arguments, such as the internal or external pixel format, is invalid.

In this case, I'd be checking the spec for the extension that provides GL_LUMINANCE_FLOAT32_ATI, since it lists the ways you're allowed to use it. Even better: check that you actually have support for that extension. If you have the support and you're using it correctly, then I'm out of ideas ('cause it should work fine).

[EDIT]
I just read part of this PDF, which says that GL_LUMINANCE_FLOAT32_ATI is an internal format, so that looks fine, and GL_LUMINANCE is fine for the external format, as is GL_FLOAT for the data type. Going by that, I'd be checking that the extension (GL_ATI_texture_float) can actually be used.
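
A minimal sketch of that runtime check, assuming a GL context is already current (the extension string to look for is an assumption based on the ATI_texture_float spec):

// Minimal sketch: look for GL_ATI_texture_float in the extension string.
// Assumes a valid OpenGL context is current; a plain substring search is
// used here for brevity.
#include <cstring>
#include <GL/gl.h>

bool hasATITextureFloat()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext != 0 && std::strstr(ext, "GL_ATI_texture_float") != 0;
}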

[EDIT 2]
You should also check whether the error is generated before you even call these functions (I remember trying to fix something I only thought was wrong before I learnt this). A rule I've learnt: track the error down before you try to fix it. ;)
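
In other words, something along these lines (a rough sketch reusing the width/height/data variables from the original snippet): drain any stale error first, then check immediately after the call you suspect.

// Rough sketch: clear any error left over from earlier calls, then test
// the suspect call in isolation.
#include <cstdio>

while (glGetError() != GL_NO_ERROR)
    ;  // discard stale errors

glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE_FLOAT32_ATI, width, height, 0,
             GL_LUMINANCE, GL_FLOAT, data);

GLenum err = glGetError();
if (err != GL_NO_ERROR)
    printf("glTexImage2D error: 0x%x\n", err);  // 0x0500 is GL_INVALID_ENUM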
Quote:Original post by Gorax
Quote:Original post by _the_phantom_
Are you including a header which defines GL_LUMINANCE_FLOAT32_ATI anywhere (glext.h, glee.h or any other extension loader header)? If not, then that would explain it.


It'd have to exist, or the code wouldn't even compile, so it could never get as far as producing the 'invalid enum' error (as reported by glGetError()). That error is returned when one of the arguments, such as the internal or external pixel format, is invalid.


Heh, good point, sorry, that's what I get for trying to answer a question when I've just crawled out of bed [grin]

Probably a silly question, but... you are using a GeForce 6 with the latest drivers, right? :)
Orin Tresnjak | Graphics Programmer, Bethesda Game Studios
Standard Disclaimer: My posts represent my opinions and not those of Bethesda/Zenimax, etc.
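
One way to verify that at runtime rather than by eyeballing the card: ARB_vertex_shader exposes GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS_ARB, which is 0 on hardware without vertex texture fetch. A rough sketch, assuming glext.h and a current context:

// Rough sketch: ask how many texture image units the vertex shader can use.
// Hardware without vertex texture fetch (e.g. GeForce FX) reports 0 here;
// if ARB_vertex_shader isn't exposed at all, this query itself raises
// GL_INVALID_ENUM and leaves the value untouched.
#include <cstdio>
#include <GL/gl.h>
#include <GL/glext.h>

GLint maxVertexTextureUnits = 0;
glGetIntegerv(GL_MAX_VERTEX_TEXTURE_IMAGE_UNITS_ARB, &maxVertexTextureUnits);
printf("Vertex texture image units: %d\n", maxVertexTextureUnits);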
Thanks for the quick replies.

For one, I read about the texture format in Nvidia's paper on vertex texture access (which only works on the 6xxx series). I didn't really think about whether I would need a GF 6xxx just to use the texture format.

I am developing on a GF 5xxx at home and therefore can only try out at my university whether my code really works (they have a 6xxx there). I thought I could develop my code at home up to the point where the actual texture lookup occurs (in the vertex shader). And since I want to test as much as possible, I gradually started to test my implementation ...

Anyway, I guess I will just buy a very cheap 6200 to keep on developing at home.

If anybody gets a simple demo (using vertex textures) to work, please post the working code. Nvidia has a demo which I can't get to compile on my machine :)

Thanks again.

Mike

