I know a big difference between GL_TEXTURE_RECTANGLE_ARB and GL_TEXTURE_2D is the way texture dimensions work (rectangle textures use unnormalized pixel coordinates, while 2D textures use normalized 0..1 coordinates). My images come in all sorts of sizes, so I decided I would just take the longest dimension of an image and pass a square texture to glTexImage2D, then work out the cropping, etc. in the shader:
    /* "orig_data" is the value returned from the image reader, etc. It is the
       actual pixel data as a grayscale image (only one color plane).
       "orig_length" is the length of the actual data in bytes. */
    int width = 1024;
    int height = 768;

    glEnable(GL_TEXTURE_2D);

    /* make a bigger-than-normal buffer to hold the image plus the padding
       needed to make the texture square */
    GLubyte *new_data = (GLubyte*)malloc(height*height);

    /* copy existing data into the bigger buffer */
    memcpy(new_data, orig_data, orig_length);

    glTexImage2D(GL_TEXTURE_2D, 0, 1, width, width, 0, GL_LUMINANCE,
                 GL_UNSIGNED_BYTE, (GLubyte*)new_data);
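For reference, here is what I understand the padding step would have to look like if it were done row by row instead of as one flat memcpy. This is just a sketch of the idea, not my actual code: the helper name pad_to_square is made up, and it assumes a tightly packed single-channel source image with the square side taken as the larger of the two dimensions.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Pad a width x height single-channel image into a side x side square
       buffer, where side = max(width, height). Each source row is copied
       to the start of the corresponding destination row, and the unused
       area is left zero-filled (black). Caller frees the result. */
    static unsigned char *pad_to_square(const unsigned char *src,
                                        int width, int height, int *side_out)
    {
        int side = width > height ? width : height;
        unsigned char *dst = calloc((size_t)side * (size_t)side, 1);
        if (!dst)
            return NULL;
        for (int row = 0; row < height; row++)
            memcpy(dst + (size_t)row * side,   /* padded row stride is "side" */
                   src + (size_t)row * width,  /* source row stride is "width" */
                   (size_t)width);
        *side_out = side;
        return dst;
    }

    int main(void)
    {
        /* tiny 4x2 test image padded into a 4x4 square */
        unsigned char img[8] = {1, 2, 3, 4, 5, 6, 7, 8};
        int side = 0;
        unsigned char *sq = pad_to_square(img, 4, 2, &side);
        if (!sq)
            return 1;
        /* side, first pixel, last pixel of row 1, first padding pixel */
        printf("%d %d %d %d\n", side, sq[0], sq[1 * side + 3], sq[2 * side]);
        free(sq);
        return 0;
    }

Note that when width > height (as in my 1024x768 case), a flat memcpy of the original data would only be equivalent to this per-row copy by accident, since the row strides of the source and the padded square differ whenever the square side is larger than the source width.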
However, this doesn't work, even though I'm using sampler2D and texture2D in my shader. What I usually get is either garbled data or solid black.
Any idea why this isn't working as expected? Is it something to do with my padding of the image data?