directNoob

OpenGL [Solved] Using new OpenGL features on older hardware? [GeForce 5200 Go]


Good evening. In the CG course at university, we have to do computer graphics work with OpenGL. The trouble is that the hardware in my notebook is a bit old. Recently I bought a very good new graphics card, a GeForce 8600 GTS. For it, I ended a long relationship with ATI; I had been using ATI cards since 3dfx went bankrupt, so this was a hard hit for me, as you can imagine.

Anyway, the problem is this: at home I have very new graphics hardware with support for OpenGL 2.1, but the graphics chip in my notebook is quite old, an NVIDIA GeForce 5200 Go, and I can't find drivers for it with newer OpenGL support. So the next question comes up: how can I use the newest features of OpenGL? I can use shaders in DirectX without any problems, but in OpenGL I can't even use VBOs on my notebook. For example, calling glGenBuffers fails with a null pointer. Do I have to use the extension mechanism for this? And if so, how can I make my program use the proper OpenGL functions on both my newest and my older hardware? How do I enumerate the hardware capabilities like in DX, or how does this work?

I'm very new to OpenGL, so I would really appreciate it if somebody gave me a helping hand!

Very grateful,
Alex

[Edited by - directNoob on May 16, 2007 11:29:45 AM]
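
For reference, OpenGL has no device enumeration like DirectX; you create a context and then ask the driver what it exposes. A minimal sketch of that query, assuming a GL context is already current (the function name printGLInfo is just illustrative):

    #include <cstdio>
    #include <cstring>
    #include <GL/gl.h>

    // Query what the current context/driver actually offers.
    void printGLInfo()
    {
        printf("Vendor:   %s\n", (const char*)glGetString(GL_VENDOR));
        printf("Renderer: %s\n", (const char*)glGetString(GL_RENDERER));
        printf("Version:  %s\n", (const char*)glGetString(GL_VERSION));

        // The extension string lists every optional feature the driver exposes.
        const char* ext = (const char*)glGetString(GL_EXTENSIONS);
        if (ext && strstr(ext, "GL_ARB_vertex_buffer_object"))
            printf("VBOs are available, at least via the ARB extension.\n");
    }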

Get GLEE or GLEW to load all the extensions you need for OpenGL. It's easier that way, vs. doing it yourself.
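
For what it's worth, with GLEW the startup check looks roughly like this. glewInit(), GLEW_VERSION_1_5 and GLEW_ARB_vertex_buffer_object are real GLEW names; the surrounding function is only a sketch and assumes a GL context already exists:

    #include <cstdio>
    #include <GL/glew.h>   // include before other GL headers

    bool initExtensions()
    {
        GLenum err = glewInit();   // must be called after the context is created
        if (err != GLEW_OK)
        {
            fprintf(stderr, "GLEW init failed: %s\n",
                    (const char*)glewGetErrorString(err));
            return false;
        }

        if (GLEW_VERSION_1_5)
            printf("GL 1.5 core entry points (glGenBuffers, ...) are available.\n");
        else if (GLEW_ARB_vertex_buffer_object)
            printf("Only the ARB form (glGenBuffersARB, ...) is available.\n");
        else
            printf("No buffer objects at all -- plan a fallback.\n");

        return true;
    }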

http://www.laptopvideo2go.com/

Try that for the latest drivers.

Alex,

I have bad news... OpenGL and laptop video cards are not the best of friends.
Unfortunately, you will need to abandon your laptop if you plan on using modern
features of OpenGL... there is no workaround.


You can just disable the features on your laptop and use them only on your other PC with the 8600. You should be checking for support of every extension you plan to use in any OpenGL application you write anyway.

You will have to write an alternate rendering path to fall back on when the features aren't supported.
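
To make that concrete, a common pattern is to pick the path once at startup and branch on it when drawing. This is only a sketch built on GLEW's support flags; the enum and function names here are made up for illustration:

    #include <GL/glew.h>

    enum RenderPath { PATH_VBO_CORE, PATH_VBO_ARB, PATH_VERTEX_ARRAYS };

    // Decide once, after glewInit() has succeeded, which drawing route to use.
    RenderPath chooseRenderPath()
    {
        if (GLEW_VERSION_1_5)               return PATH_VBO_CORE;   // glGenBuffers etc.
        if (GLEW_ARB_vertex_buffer_object)  return PATH_VBO_ARB;    // glGenBuffersARB etc.
        return PATH_VERTEX_ARRAYS;          // plain vertex arrays, core since GL 1.1
    }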

The 5200 is really bad performance-wise, but I'd like to point out that it's fully PS2.0 compliant (PS2_a, which is almost PS3.0).
Although the new G8x stuff is really exciting, you shouldn't suffer that much. PS2.0 hardware is a real step forward with respect to programmability, features and learning opportunities. The FX family had a lot of issues, but it did get full FP32 support (except for blending).
The only bad thing is the lack of dynamic branching, but that's not something I would scream for anyway.

Hi, and thanks for your replies!

I'm using the latest GLEE lib I could get via OpenGL.org. But the worst thing is that I can't use VBOs, which is a very basic technique for rendering graphical primitives!

I'm also spoiled by Direct3D, where there is no painfully slow call like glVertex*. The glVertex*-style functions are nice because they're easy to use, but their performance is so poor that I've struck them completely from my list of potential drawing calls.

Instead, I need to use the VBO model on my notebook. This is even more important to me than using the newest shaders, because you can't see anything without geometry!
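
To be clear, by "the VBO model" I mean roughly something like this; just a sketch of what I'm trying to do, assuming glGenBuffers and friends actually resolve (the triangle data and function names are made up):

    #include <GL/glew.h>

    GLuint vbo = 0;

    // Upload the geometry once, at load time.
    void createTriangle()
    {
        const GLfloat verts[] = { -1.0f, -1.0f, 0.0f,
                                   1.0f, -1.0f, 0.0f,
                                   0.0f,  1.0f, 0.0f };
        glGenBuffers(1, &vbo);
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glBufferData(GL_ARRAY_BUFFER, sizeof(verts), verts, GL_STATIC_DRAW);
    }

    // Draw from the buffer every frame -- no per-vertex glVertex* calls.
    void drawTriangle()
    {
        glBindBuffer(GL_ARRAY_BUFFER, vbo);
        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, 0);   // reads from the bound VBO
        glDrawArrays(GL_TRIANGLES, 0, 3);
        glDisableClientState(GL_VERTEX_ARRAY);
        glBindBuffer(GL_ARRAY_BUFFER, 0);
    }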

So, as I understood it, the nice Mr. GLee loads the needed AND supported extensions for me, and I can simply call, for example, glGenBuffers() or any other function without explicitly specifying the EXT suffix or so?!

I mean, VBOs have been standard since OpenGL 1.5, so why do I get null pointers when I call, for example, glGenBuffers???

Is it simply not supported, or what's going on???

This is so strange; it can't be right, because in D3D I can simply use vertex and index buffers, but in OpenGL it doesn't seem to be possible!

Honestly, how many people actually have the newest graphics hardware at their disposal?? There must be a way... or not?

Thanks
Alex

If you really have a GeForce 5200 Go, then it supports GL 2.1, so it's just a matter of updating your drivers. Search for the Omega drivers or go to that laptopvideo2go website someone mentioned earlier.

Yes, VBO has been core for some years now (since GL 1.5).
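
If you want to double-check after the driver update, parsing the version string is a crude but serviceable way to confirm you really have 1.5+ before touching glGenBuffers. Just a sketch, and it assumes a current context:

    #include <cstdio>
    #include <GL/gl.h>

    // Returns true if the driver reports OpenGL 1.5 or newer.
    bool hasGL15()
    {
        int major = 0, minor = 0;
        const char* ver = (const char*)glGetString(GL_VERSION);
        if (!ver || sscanf(ver, "%d.%d", &major, &minor) != 2)
            return false;
        return (major > 1) || (major == 1 && minor >= 5);
    }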

OK, everything is fine now. Thanks to you guys, and thanks for the link.

I downloaded the latest 93.xx drivers for the GeForce FX 5200 (without Go) from developer.Nvidia.com. Then I unpacked the package and replaced nvdisplay.inf with a modified one, which I found here:
http://www.fredfred.net/skriker/index.php/geforce-fx-go5200-with-latest-forceware-drivers/

Then I was able to install the package. NO error message!!!

Thanks again
Alex

P.S.
@Nicholas Christopher:
If you have a similar problem, have a look at the link!
It is possible to use newer features on older laptop hardware.
And I have to say, it is running very smoothly!
