glDrawPixels/Textures/other method

Hello. What is the best way to paint a 1920x1080 image from video RAM to the screen? The image is split into two 960x1080 subimages. Currently I am using glDrawPixels:

glRasterPos2f(-1, -1);
glDrawPixels(hwidth, height, GL_RGB, GL_UNSIGNED_BYTE, partialFrame1);
glRasterPos2f(0, -1);
glDrawPixels(hwidth, height, GL_RGB, GL_UNSIGNED_BYTE, partialFrame2);

However, it is too slow. I heard textures are good, but my images are not 2^n in size. Also, the height of 1080 is just more than 1024, so jumping to 2048 seems like a waste. Any advice? Thanks
I'll probably get my head bitten off for saying this, but if you load the image as a mip-mapped texture then you can use non power of two (NPOT) texture sizes.
Quote:Original post by AndyEsser
I'll probably get my head bitten off for saying this, but if you load the image as a mip-mapped texture then you can use non power of two (NPOT) texture sizes.


You sure will. Mipmaps are no different when it comes to power-of-two restrictions. Either your card doesn't support non-power-of-two textures, in which case mipmapping won't help, or it does, in which case it supports them for non-mipmapped textures as well.

Were you thinking of gluBuild2DMipmaps, maybe? It resizes the image to the nearest power of two before uploading it as a texture to OpenGL, so there is no magic workaround for the power-of-two limitation there.
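To find out which of those two cases applies on a given card, you can search the driver's extension string. A minimal sketch in C (it assumes a current GL context; the function name is illustrative):

#include <string.h>
#include <GL/gl.h>

/* Returns non-zero if the driver advertises NPOT texture support.
   A plain substring search is good enough for this extension name. */
int has_npot_textures(void)
{
    const char *exts = (const char *)glGetString(GL_EXTENSIONS);
    return exts != NULL && strstr(exts, "GL_ARB_texture_non_power_of_two") != NULL;
}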
What are mipmaps? Sorry for the noob question.

What cards will support non-power-of-two textures?

I am willing to buy a new card.
gluBuild2DMipmaps obtains the input image and generates all mipmap images (using gluScaleImage) so that the input image can be used as a mipmapped texture image. glTexImage2D is then called to load each of the images. If the dimensions of the input image are not powers of two, then the image is scaled so that both the width and height are powers of two before the mipmaps are generated.

It says the image will be scaled? My image cannot be scaled when it is output to the screen. It has to remain 1920x1080.
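For reference, a minimal sketch of the call that passage describes (the fullFrame pointer is illustrative, not from the original code). With a 1920x1080 input, the internal rescale to power-of-two dimensions is exactly the problem:

#include <GL/glu.h>

/* gluBuild2DMipmaps rescales the 1920x1080 frame to power-of-two
   dimensions internally (via gluScaleImage) before building the
   mipmap chain, so the texels no longer match the source 1:1. */
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, 1920, 1080,
                  GL_RGB, GL_UNSIGNED_BYTE, fullFrame);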
I am trying to use

GL_TEXTURE_RECTANGLE_ARB

but it says: error C2065: 'GL_ARB_texture_rectangle' : undeclared identifier

Is this a compiler problem or a library problem? I'm using Visual C++ 6.
Quote:Original post by aznium
Is this a compiler problem or a library problem? I'm using Visual C++ 6.

Without more information (read: "code"), it's hard to say. One thing to be sure of, though, is that you should ditch VC6. It's not standards-compliant, and there's no reason to keep it now that Visual Studio 2005 Express Edition is free. It's not hard to learn or get used to, especially if you're already familiar with an older VS flavor.

-jouley

[Edit: The code I'm looking for, specifically, is where you're trying to use GL_TEXTURE_RECTANGLE_ARB, and where the error is cropping up. It seems like you're actually trying to use "GL_ARB_texture_rectangle" somewhere, which would (and should) throw a flag.]

I tried to glEnable it.

Perhaps you can give me a code snippet? I can't seem to Google this.
>> GL_TEXTURE_RECTANGLE_ARB
>> but it says error C2065: 'GL_ARB_texture_rectangle' : undeclared identifier

Use an extension loading library, e.g. GLEW or GLee.
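To make that concrete, here is a minimal sketch of the rectangle-texture route using GLEW. The 960x1080 size and the partialFrame1 pointer come from the original post; everything else (the texture handle, the quad coordinates) is illustrative, and a GL context is assumed to exist already:

#include <GL/glew.h>   /* must be included before any other GL header */

GLuint tex;

/* One-time setup, after the GL context is created: */
glewInit();
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_RECTANGLE_ARB, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, 960, 1080, 0,
             GL_RGB, GL_UNSIGNED_BYTE, NULL);

/* Each frame: re-upload the pixels and draw a quad over the left half
   of the screen. Rectangle textures are addressed in texels
   (0..960, 0..1080), not the usual 0..1 range. */
glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
glTexSubImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, 0, 0, 960, 1080,
                GL_RGB, GL_UNSIGNED_BYTE, partialFrame1);
glEnable(GL_TEXTURE_RECTANGLE_ARB);
glBegin(GL_QUADS);
    glTexCoord2f(0.0f,   0.0f);    glVertex2f(-1.0f, -1.0f);
    glTexCoord2f(960.0f, 0.0f);    glVertex2f( 0.0f, -1.0f);
    glTexCoord2f(960.0f, 1080.0f); glVertex2f( 0.0f,  1.0f);
    glTexCoord2f(0.0f,   1080.0f); glVertex2f(-1.0f,  1.0f);
glEnd();
glDisable(GL_TEXTURE_RECTANGLE_ARB);

A second texture handles partialFrame2 the same way, with the quad covering x in [0, 1]. If the image comes out upside down, swap the 0 and 1080 t coordinates.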
Your graphics card may not support this particular extension. What card are you using?

[Edit: zedzeek shows you how to find out, and this is just for reference.]
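Once glewInit() has run, the support question can also be answered in code; a minimal sketch:

#include <stdio.h>

/* GLEW exposes one boolean flag per extension after glewInit(). */
if (!GLEW_ARB_texture_rectangle)
    printf("GL_ARB_texture_rectangle is not supported on this card.\n");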

