
Archived

This topic is now archived and is closed to further replies.

sc0rpy

Background.. how?


Recommended Posts

Ok, here are my first results and comments:
glBitmap() appears to draw only two-color images (fonts, simple bitmaps), so it's not very useful for what I want to do.

glDrawPixels() works, but a2k is DEAD ON: it SUCKS!
Just drawing my scene with no bitmap: 49 fps. Drawing with the bitmap using glDrawPixels(): 5.3 fps. The image was only 300x300 pixels at 24-bit color, so I'd imagine 640x480 would be nasty slow.
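For reference, here is a minimal sketch of the glDrawPixels path being timed here; the function name, the 640x480 window size, and the assumption that `pixels` is a tightly packed 300x300 24-bit RGB buffer are mine, not from the post:

```c
#include <GL/gl.h>

/* Hypothetical sketch of the glDrawPixels path timed above. Every frame
 * this copies the whole image from system memory to the framebuffer,
 * which is what makes it slow on most consumer drivers. */
void draw_background(const unsigned char *pixels)
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(0.0, 640.0, 0.0, 480.0, -1.0, 1.0); /* match window coordinates */

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); /* 24-bit rows are not 4-byte aligned */
    glRasterPos2i(0, 0);                   /* lower-left corner of the image */
    glDrawPixels(300, 300, GL_RGB, GL_UNSIGNED_BYTE, pixels);

    glPopMatrix();
    glMatrixMode(GL_MODELVIEW);
}
```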

I have an Annihilator Pro DDR, so I don't think it's a video card thing; it must simply be the implementation of glDrawPixels(). My guess is that moving the bitmap from system memory to video memory is what's killing it.

a2k, if you still have the code for how you drew your 'larger' image, please paste some of it so we can see what you did.

I guess I'll have to write a handler to break the image into tiles and use them as textures.
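A quick sketch of the bookkeeping that tiling plan needs (helper names here are mine): round each dimension up to the next power of two for the tile textures, and count how many fixed-size tiles cover the image.

```c
/* Pure helpers for the tiling approach: OpenGL 1.x requires power-of-two
 * texture dimensions, so each tile texture must be rounded up. */

/* Smallest power of two >= n, e.g. next_pow2(300) == 512. */
unsigned next_pow2(unsigned n)
{
    unsigned p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

/* How many tile_size x tile_size textures cover a w x h image; the last
 * row/column of tiles is only partially filled.
 * e.g. tiles_needed(640, 480, 256) == 3 * 2 == 6. */
unsigned tiles_needed(unsigned w, unsigned h, unsigned tile_size)
{
    unsigned tx = (w + tile_size - 1) / tile_size;
    unsigned ty = (h + tile_size - 1) / tile_size;
    return tx * ty;
}
```

Each tile would then be uploaded with glTexImage2D and drawn as a grid of quads.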


Edited by - sc0rpy on 4/16/00 1:02:21 PM

---
Well, I don't really know how fast (or how slow) raster graphics are (I work mainly with software rendering; my computer does not have any 3D card), so I obviously don't see any difference (raster graphics might actually be faster in this case).

If raster graphics are so slow, that might explain why I only get under 10 FPS on my friend's Athlon 700 with a GeForce when I have to draw bitmaps... I thought the bottleneck was the palette stuff; now I'll know what to do.

Damn! I can't wait to buy a new computer!

Thanks guys!

Eric Laberge

---
No worries. In all honesty I hadn't heard of either of the two functions you mentioned, so your mentioning them has at least helped me become aware of their purpose and their limitations when dealing with high-framerate graphics.

I've found an interesting URL on dealing with oddly shaped textures:
http://www.gegi.polymtl.ca/info/granger/cours/3.430/OpenGL/UserGuide/OpenGLonWin-13.html

Read the section "Updating Textures Quickly With Subtextures". Here is a quick excerpt from that URL:

"When you're working with an image that has a width and height that's not a power of 2, you can't directly use it as a texture. This can be a problem, for example, when you're using video frames as textures: none of the common video formats (NTSC, PAL, SECAM, HD-TV, and so on) have a height and width of those dimensions. You can solve this problem by creating a NULL texture whose dimensions are larger than the video frame and then load the video frame as a subtexture."

I'll post any comments and an implementation of the above if I can decipher the text.
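If it helps, the technique from that excerpt can be sketched like this; the 640x480 frame size, the 1024x512 padded texture, and the function names are my assumptions for illustration:

```c
#include <GL/gl.h>
#include <stddef.h>

static GLuint tex;

/* Allocate an empty 1024x512 texture (the next powers of two above
 * 640x480) by passing NULL as the pixel data. Done once at startup. */
void create_video_texture(void)
{
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 1024, 512, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, NULL);
}

/* Load each new frame into the lower-left 640x480 corner; when drawing,
 * only texture coordinates 0..640/1024 and 0..480/512 are used. */
void upload_frame(const unsigned char *frame)
{
    glBindTexture(GL_TEXTURE_2D, tex);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, 640, 480,
                    GL_RGB, GL_UNSIGNED_BYTE, frame);
}
```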

---
Hey, is the OpenGL SuperBible online? Well, that's where I got the code for loading up bitmaps and textures. If you know how to make texture objects, just apply that texture to a rectangle in ortho mode, so that you're looking at the rectangle straight on. Switch the matrix mode to texture, then use the scale transformation matrix to mess around with the x, y coordinates until the picture matches the size of the screen. Then switch to perspective mode, and you can render your 3D geometry.
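One possible reading of that recipe, sketched out; the texture name, the 0..1 quad coordinates, and the matrix save/restore details are my guesses, not a2k's actual code:

```c
#include <GL/gl.h>

/* Draw a screen-filling textured quad in ortho mode, using the texture
 * matrix to scale the image to the screen, then restore the matrices so
 * the 3D pass can run in perspective mode afterwards. */
void draw_background(GLuint tex)
{
    glMatrixMode(GL_PROJECTION);
    glPushMatrix();
    glLoadIdentity();
    glOrtho(0.0, 1.0, 0.0, 1.0, -1.0, 1.0);  /* look at the quad straight on */

    glMatrixMode(GL_TEXTURE);
    glPushMatrix();
    glLoadIdentity();
    glScalef(1.0f, 1.0f, 1.0f);  /* adjust x/y scale here until the image fits */

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, tex);
    glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(0.0f, 0.0f);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(1.0f, 0.0f);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(1.0f, 1.0f);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(0.0f, 1.0f);
    glEnd();

    glPopMatrix();               /* restore the texture matrix */
    glMatrixMode(GL_PROJECTION);
    glPopMatrix();               /* restore projection for the 3D pass */
    glMatrixMode(GL_MODELVIEW);
}
```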

---
A2K:

What happens if the texture is > 256x256?

My video card supports large textures, but what happens if your video card doesn't (like 3dfx cards)?

Actually, has anyone tried the 3ds loader program posted today on opengl.org:
http://home1.pacific.net.sg/~gishsh05/

He doesn't check whether the texture is > 256x256, so I'm curious what happens for you 3dfx people. Also, does anyone know if the software driver (Micro$oft) has any texture size limitation?

Thanks.


---
I dunno. I think I even loaded up a 640x480 texture onto my surface...

a2k

---
Well, a 640x480 texture won't work, since it doesn't follow the power-of-2 rule. Large textures (e.g. 1024x256) work with the software driver, but I'm pretty sure I read somewhere that a good percentage of the 3D cards out there don't support textures > 256x256. I wonder if there is a way, or a call in OpenGL, that will return the maximum resolution of a texture.
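There is such a call: glGetIntegerv with GL_MAX_TEXTURE_SIZE reports the largest dimension the implementation accepts, and a proxy texture can test one specific size/format combination. A sketch (function names are mine; both calls need a current GL context):

```c
#include <GL/gl.h>
#include <stddef.h>

/* Largest texture dimension this implementation accepts,
 * e.g. 256 on Voodoo-class cards. */
GLint max_texture_size(void)
{
    GLint size;
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &size);
    return size;
}

/* Returns 1 if a w x h GL_RGB texture would fit, 0 otherwise: a proxy
 * texture goes through the full size/format check without allocating
 * anything, and reports width 0 if the texture is rejected. */
int texture_fits(GLsizei w, GLsizei h)
{
    GLint got_width;
    glTexImage2D(GL_PROXY_TEXTURE_2D, 0, GL_RGB, w, h, 0,
                 GL_RGB, GL_UNSIGNED_BYTE, NULL);
    glGetTexLevelParameteriv(GL_PROXY_TEXTURE_2D, 0,
                             GL_TEXTURE_WIDTH, &got_width);
    return got_width != 0;
}
```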

---
