aznium

glDrawPixels/Textures/other method



Hello. What is the best way to paint a 1920x1080 image from video RAM to the screen? The image is split into two 960x1080 subimages. Currently I am using glDrawPixels:

glRasterPos2f(-1, -1);
glDrawPixels(hwidth, height, GL_RGB, GL_UNSIGNED_BYTE, partialFrame1);
glRasterPos2f(0, -1);
glDrawPixels(hwidth, height, GL_RGB, GL_UNSIGNED_BYTE, partialFrame2);

However, it is too slow. I heard textures are good, but my images are not 2^n, and the height of 1080 is just over 1024, so jumping to 2048 seems like a waste. Any advice? Thanks.
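[Editor's note] For concreteness, here is a minimal sketch of the kind of texture-based replacement the replies below discuss: each 960x1080 half is uploaded into a padded 1024x2048 power-of-two texture with glTexSubImage2D, so the pixels are never rescaled, and the quad's texture coordinates simply stop at 960/1024 and 1080/2048. The function names, texture ids, constants, and the assumption that the default projection/modelview is in place are all illustrative, not the original poster's code.

#include <GL/gl.h>

#define HWIDTH 960
#define HEIGHT 1080
#define TEX_W  1024   /* next power of two >= 960  */
#define TEX_H  2048   /* next power of two >= 1080 */

static GLuint texHalf[2];

/* Call once after the GL context exists. */
void initTextures(void)
{
    int i;
    glGenTextures(2, texHalf);
    for (i = 0; i < 2; ++i) {
        glBindTexture(GL_TEXTURE_2D, texHalf[i]);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        /* Allocate padded power-of-two storage once; no pixel data yet. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, TEX_W, TEX_H, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, NULL);
    }
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
}

/* Call every frame with the two 960x1080 RGB buffers from the post. */
void drawFrame(const void *partialFrame1, const void *partialFrame2)
{
    const void *frames[2] = { partialFrame1, partialFrame2 };
    const float s = (float)HWIDTH / TEX_W;   /* 960/1024  */
    const float t = (float)HEIGHT / TEX_H;   /* 1080/2048 */
    int i;

    glEnable(GL_TEXTURE_2D);
    for (i = 0; i < 2; ++i) {
        float x0 = (i == 0) ? -1.0f : 0.0f;  /* left half, then right half */
        float x1 = x0 + 1.0f;

        glBindTexture(GL_TEXTURE_2D, texHalf[i]);
        /* Upload only the 960x1080 region; the padding is never touched,
           so the image is never resampled. */
        glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, HWIDTH, HEIGHT,
                        GL_RGB, GL_UNSIGNED_BYTE, frames[i]);

        /* t = 0 at the bottom keeps the same orientation as glDrawPixels,
           which draws the first row at the raster position and goes up. */
        glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(x0, -1.0f);
        glTexCoord2f(s,    0.0f); glVertex2f(x1, -1.0f);
        glTexCoord2f(s,    t);    glVertex2f(x1,  1.0f);
        glTexCoord2f(0.0f, t);    glVertex2f(x0,  1.0f);
        glEnd();
    }
    glDisable(GL_TEXTURE_2D);
}

In this sketch, initTextures would run once at startup and drawFrame would replace the two glDrawPixels calls above.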

I'll probably get my head bitten off for saying this, but if you load the image as a mip-mapped texture then you can use non power of two (NPOT) texture sizes.

Quote:
Original post by AndyEsser
I'll probably get my head bitten off for saying this, but if you load the image as a mip-mapped texture then you can use non power of two (NPOT) texture sizes.


You sure will. Mipmaps are no different when it comes to power-of-two restrictions: either your card doesn't support non-power-of-two textures, in which case mipmapping won't help, or it does, in which case it supports them for non-mipmapped textures as well.

Thinking about gluBuild2DMipmaps, maybe? It will resize the image to the nearest power of two before uploading it as a texture to OpenGL, so there's no magic workaround for the power-of-two limitation there.
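[Editor's note] For illustration, on hardware that does expose GL_ARB_texture_non_power_of_two (or OpenGL 2.0+), an exact-size upload needs no mipmaps at all. A minimal sketch, with the function name and 'tex' assumed rather than taken from the thread:

#include <GL/gl.h>

/* Sketch: upload one 960x1080 half directly, no mipmaps, no scaling.
   Only valid when the driver advertises GL_ARB_texture_non_power_of_two
   (or is OpenGL 2.0+); 'tex' is assumed to come from glGenTextures and
   'pixels' is one of the RGB half-frames from the original post. */
void uploadHalfNpot(GLuint tex, const void *pixels)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    /* The default min filter expects mipmaps; switch to plain linear. */
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 960, 1080, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);   /* exact size, no rescale */
}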

What are mipmaps? Sorry for the noob question.

What cards will support non-power-of-two textures?

I am willing to buy a new card.
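[Editor's note] One way to answer that for the card you already have is to ask the driver at runtime. A rough sketch (the function name is an assumption; a robust check would tokenize the space-separated extension string rather than rely on strstr alone):

#include <stdio.h>
#include <string.h>
#include <GL/gl.h>

/* Sketch: print which of the relevant extensions the driver advertises.
   Requires a current OpenGL context. */
void printNpotSupport(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (ext == NULL)
        return;
    printf("NPOT 2D textures  : %s\n",
           strstr(ext, "GL_ARB_texture_non_power_of_two") ? "yes" : "no");
    printf("Rectangle textures: %s\n",
           strstr(ext, "GL_ARB_texture_rectangle") ? "yes" : "no");
}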

gluBuild2DMipmaps obtains the input image and generates all mipmap images (using gluScaleImage) so that the input image can be used as a mipmapped texture image. glTexImage2D is then called to load each of the images. If the dimensions of the input image are not powers of two, then the image is scaled so that both the width and height are powers of two before the mipmaps are generated.

It says the image will be scaled? My image cannot be scaled when output to the screen; it has to remain 1920x1080.

I am trying to use

GL_TEXTURE_RECTANGLE_ARB

but it says error C2065: 'GL_ARB_texture_rectangle' : undeclared identifier.

Is this a compiler problem or a library problem? I'm using Visual C++ 6.
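[Editor's note] For context: "GL_ARB_texture_rectangle" is the extension name that appears in the glGetString(GL_EXTENSIONS) string, not a C identifier; the enum you pass to GL calls is GL_TEXTURE_RECTANGLE_ARB, and it is not declared in the OpenGL 1.1 gl.h that ships with VC6, so it has to come from glext.h or an extension loader. A hedged sketch (function and variable names are assumptions):

#include <GL/gl.h>

/* GL_TEXTURE_RECTANGLE_ARB normally comes from <GL/glext.h> or an extension
   loader such as GLEW/GLee; as a last resort the value can be defined by
   hand, since it is fixed by the extension spec. */
#ifndef GL_TEXTURE_RECTANGLE_ARB
#define GL_TEXTURE_RECTANGLE_ARB 0x84F5
#endif

/* Sketch: create one rectangle texture per 960x1080 half. Rectangle textures
   take the exact size and are addressed with unnormalized coordinates
   (0..960, 0..1080); 'tex' and 'pixels' are illustrative names. */
void uploadHalfRect(GLuint tex, const void *pixels)
{
    glEnable(GL_TEXTURE_RECTANGLE_ARB);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    glTexImage2D(GL_TEXTURE_RECTANGLE_ARB, 0, GL_RGB, 960, 1080, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, pixels);
}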

Quote:
Original post by aznium
Is this a compiler problem or a library problem? I'm using Visual C++ 6.

Without more information (read: "code"), it's hard to say. One thing to be sure of, though, is that you should ditch VC6. It's not standards compliant, or anything worthwhile, really, now that Visual Studio 2005 Express Edition is free. It's not hard to learn or get used to, especially if you're already familiar with an older VS flavor.

-jouley

[Edit: The code I'm looking for, specifically, is where you're trying to use GL_TEXTURE_RECTANGLE_ARB, and where the error is cropping up. It seems like you're actually trying to use "GL_ARB_texture_rectangle" somewhere, which would (and should) throw a flag.]

[Edited by - jouley on January 29, 2007 7:47:21 PM]

>> GL_TEXTURE_RECTANGLE_ARB
>> but it says error C2065: 'GL_ARB_texture_rectangle' : undeclared identifier

Use an extension loading library, e.g. GLEW or GLee.
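[Editor's note] With GLEW, for example, the setup boils down to something like the sketch below: glew.h must be included before gl.h, and glewInit must run after the OpenGL context has been created. The function name and fallback structure are assumptions.

#include <GL/glew.h>   /* include before gl.h; link against glew32.lib on Windows */
#include <stdio.h>

/* Sketch: call once right after the OpenGL context has been created.
   Returns nonzero if some form of non-power-of-two texturing is usable. */
int initExtensions(void)
{
    if (glewInit() != GLEW_OK) {
        fprintf(stderr, "glewInit failed\n");
        return 0;
    }
    if (GLEW_ARB_texture_rectangle) {
        /* GL_TEXTURE_RECTANGLE_ARB is now declared and supported. */
        return 1;
    }
    if (GLEW_ARB_texture_non_power_of_two) {
        /* Plain GL_TEXTURE_2D accepts 960x1080 directly. */
        return 1;
    }
    /* Otherwise fall back to padding up to a power-of-two texture. */
    return 0;
}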

Your graphics card may not support this particular extension. What card are you using?

[Edit: zedzeek shows you how to find out, and this is just for reference.]
