mrfixit

OpenGL runs ten times faster on Linux?



Hello, I've made a 2D particle simulation; it's really very basic. At first I drew the particles as triangles and scaled each one according to the net force acting on it - on Linux I got around 8000 fps for 100 particles. Then I stopped using the scale function and drew each particle as a point, which gave me quite a boost: 12000 fps for 100 particles. Then I thought it would be nice to see how my simulation does on Windows XP with the latest NVIDIA drivers. I compiled it with MinGW so it would be as similar as possible to the Linux version, and I got 1000 fps (for the points with no scaling - 750 when scaling triangles!). How can that be?!
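Roughly, the two draw paths look like this (a simplified sketch for illustration, not the exact code; the Particle struct and the triangle size are placeholders):

#include <GL/gl.h>

struct Particle { float x, y, force; };

// Path 1: one point per particle, no per-particle transform.
void drawAsPoints(const Particle* p, int n) {
    glBegin(GL_POINTS);
    for (int i = 0; i < n; ++i)
        glVertex2f(p[i].x, p[i].y);
    glEnd();
}

// Path 2: a small triangle per particle, scaled by the net force.
void drawAsScaledTriangles(const Particle* p, int n) {
    for (int i = 0; i < n; ++i) {
        glPushMatrix();
        glTranslatef(p[i].x, p[i].y, 0.0f);
        glScalef(p[i].force, p[i].force, 1.0f);
        glBegin(GL_TRIANGLES);
        glVertex2f( 0.00f,  0.01f);
        glVertex2f(-0.01f, -0.01f);
        glVertex2f( 0.01f, -0.01f);
        glEnd();
        glPopMatrix();
    }
}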

my guess is that the timers you are using on Linux have a higher resolution than the ones you are using on Windows (there's a quick probe for that sketched below)

-- or --

you don't have the right drivers installed for Windows
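Here's the probe - just a sketch, not tested against your setup. It spins on glutGet(GLUT_ELAPSED_TIME) and reports the smallest tick it can observe:

#include <GL/glut.h>
#include <cstdio>

int main(int argc, char** argv) {
    glutInit(&argc, argv);                // GLUT_ELAPSED_TIME counts from here
    int smallest = 1 << 30;
    for (int i = 0; i < 100; ++i) {
        int t0 = glutGet(GLUT_ELAPSED_TIME);
        int t1;
        do { t1 = glutGet(GLUT_ELAPSED_TIME); } while (t1 == t0);  // spin until it ticks
        if (t1 - t0 < smallest)
            smallest = t1 - t0;
    }
    printf("smallest observable tick: %d ms\n", smallest);
    return 0;
}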

also, that's still not a huge difference in absolute terms - only about 0.9 milliseconds per frame (12000 fps is ~0.08 ms/frame, 1000 fps is 1 ms/frame). For comparison, running at 60 fps is about 17 ms per frame, so an extra 0.9 ms would only take you from 60 fps down to around 57.
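A quick sanity check on those numbers:

#include <cstdio>

int main() {
    double linuxMs = 1000.0 / 12000.0;  // ~0.083 ms per frame
    double winMs   = 1000.0 /  1000.0;  //  1.000 ms per frame
    printf("difference: %.2f ms/frame\n", winMs - linuxMs);  // ~0.92
    printf("60 fps = %.1f ms/frame\n", 1000.0 / 60.0);       // ~16.7
    return 0;
}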

-me

Not to mention that framerates are vague and nearly useless as a metric when you have so little happening. Frame rates rarely scale linearly as your project grows. And on top of that, anything over your monitor's refresh rate (usually 60-75 Hz depending on the monitor) is just spinning your wheels, since those extra frames are never displayed anyway. I wouldn't even worry about it.

Quote:
Original post by mrfixit
8000fps...12000 fps...1000 fps...

Framerates above 100 FPS or so are not relevant for performance comparisons.
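To see why, add the same small fixed cost to a very high and a normal framerate and watch how differently the fps figure reacts (numbers made up for illustration):

#include <cstdio>

int main() {
    const double extraMs = 0.1;                // hypothetical extra work per frame
    const double fps[2]  = { 12000.0, 60.0 };
    for (int i = 0; i < 2; ++i) {
        double ms = 1000.0 / fps[i] + extraMs;
        printf("%6.0f fps -> %6.1f fps\n", fps[i], 1000.0 / ms);
    }
    // 12000 fps collapses to ~5455 fps; 60 fps barely moves (~59.6 fps).
    return 0;
}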

I'm using glutGet(GLUT_ELAPSED_TIME) for the timer, so I don't think there should be any difference between Windows and Linux there...
and both my Linux and Windows installs use the latest drivers...

anyway, I guess my next move is to limit the drawing to 60 fps and use a counter in the idle func to see what the real speed of the loop is...
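Something like this, maybe (just a sketch of the idea, not my actual code):

#include <GL/glut.h>
#include <cstdio>

static int  lastDraw   = 0;
static int  lastReport = 0;
static long idleCalls  = 0;

void idle() {
    ++idleCalls;
    int now = glutGet(GLUT_ELAPSED_TIME);   // milliseconds since glutInit
    if (now - lastDraw >= 16) {             // cap redraws at ~60 fps
        lastDraw = now;
        glutPostRedisplay();
    }
    if (now - lastReport >= 1000) {         // once a second, report raw loop speed
        printf("idle loop: %ld iterations/sec\n", idleCalls);
        idleCalls  = 0;
        lastReport = now;
    }
}

// registered with glutIdleFunc(idle) in main()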

heh, now that I've limited it to 30 fps, my idle loop runs 23500 times per second on Linux and 27000 times per second on WinXP... maybe if I had my kernel optimized it would be the same, though...

Maybe the Linux you are using was written by Bill Gates? [grin]
Can you upload the binary, and maybe the source, somewhere?
And what Linux distribution are you using?


