bytebucket

OpenGL and hardware acceleration



I'm using an nVidia Quadro FX 1400 video card on my development machine, with the latest drivers, running Windows XP SP2 and Visual C++ 2005 SP1. I ran gDEBugger from Graphic Remedy, which has a feature to enumerate all available pixel formats. When I use this, every pixel format shows "None" in the Hardware column. This matches what I've seen in code when enumerating the available pixel formats (by calling DescribePixelFormat in a loop).

Is the issue that this video card really does not support any hardware acceleration for OpenGL rendering? I'm posting this here in case I'm missing something, or there's some other way I need to get hardware accelerated rendering. I was under the impression that the Quadro FX did support OpenGL rendering; if that really only means software rendering then I guess it's just a hardware issue, but I thought I'd check here just in case.

Quote:
Original post by bytebucket
Is the issue that this video card really does not support any hardware acceleration for OpenGL rendering? I'm posting this here in case I'm missing something, or there's some other way I need to get hardware accelerated rendering. I was under the impression that the Quadro FX did support OpenGL rendering; if that really only means software rendering then I guess it's just a hardware issue, but I thought I'd check here just in case.
The card shipped with OpenGL 1.5 support, and I believe recent drivers upped that to GL 2.0. Make sure your drivers are installed correctly.

Quote:
Original post by swiftcoder
The card shipped with OpenGL 1.5 support, and I believe recent drivers upped that to GL 2.0. Make sure your drivers are installed correctly.

When you say OpenGL 1.5/2.0 support, do you mean with hardware acceleration? That is, should I expect to get hardware acceleration? I would have thought so, but the pixel formats I'm getting seem to indicate otherwise. I do have the latest drivers installed.

Quote:
Original post by bytebucket
When you say OpenGL 1.5/2.0 support do you mean with hardware acceleration?
Yes, with hardware acceleration - it's a DirectX 9-level card, and IMHO a software device isn't really useful these days (apart from debugging) [smile]
Quote:
i.e. should I expect to get hardware acceleration? I would have thought so, but the pixel formats I'm getting seem to indicate otherwise.
Print out the values of glGetString(GL_VENDOR) and glGetString(GL_VERSION) once you have set up a context with one of those formats - there is a possibility that you are getting the ancient Microsoft software implementation.
Quote:
I do have the latest drivers installed.
Probably worth reinstalling them.

Quote:
Original post by swiftcoder
Print out the values of glGetString(GL_VENDOR) and glGetString(GL_VERSION) once you have set up a context with one of those formats - there is a possibility that you are getting the ancient Microsoft software implementation.

I'm getting the below.

GL_VENDOR = "Microsoft Corporation"
GL_VERSION = "1.1.0"

So it looks like I am getting the ancient MS software implementation. Granted, if I enumerate over all the available pixel formats, they all indicate "None" for hardware acceleration, so I'm not sure picking a different pixel format would help.

I saved the system info obtained using gDEBugger to a text file and noticed the following.

////////////////////////////////////////////////////////////
// Graphic Card
////////////////////////////////////////////////////////////
Item                       Value
Renderer Vendor            Microsoft Corporation
Renderer name              GDI Generic
Renderer Version           1.1.0
Shading Language Version   N/A
Renderer type              Generic OpenGL software renderer

So perhaps, as you suggested, the drivers are not correctly installed, which is why I'm only getting pixel formats for the generic MS driver; it's probably not even using the nVidia driver for OpenGL at all.

I'll re-install the drivers and see what happens. Thanks for the tips!

I think the problem is solved :) Here's what I found.

I uninstalled and then reinstalled the nVidia drivers. After doing this, gDEBugger showed the following in the Graphic Card section of the System Information output, instead of the generic MS values.


////////////////////////////////////////////////////////////
// Graphic Card
////////////////////////////////////////////////////////////
Item                       Value
Renderer Vendor            NVIDIA Corporation
Renderer name              Quadro FX 1400/PCI/SSE2
Renderer Version           2.0.1
Shading Language Version   1.10 NVIDIA via Cg 1.3 compiler
Renderer type              Installable client


It also showed HW accelerated pixel formats. So there was definitely a driver issue at play.

However, this alone didn't get my app to use the HW accelerated pixel formats. Our code that picked the pixel format expected the PFD_GENERIC_ACCELERATED flag to be set in the dwFlags member of the PIXELFORMATDESCRIPTOR; that is how it determined that a pixel format was hardware accelerated.

As it turns out, on my system none of the pixel formats have this flag set, not even the ones gDEBugger indicates are hardware accelerated. Instead, for the hardware accelerated pixel formats, neither PFD_GENERIC_ACCELERATED nor PFD_GENERIC_FORMAT is set.

I updated our code so that if both flags are clear, it treats the format as hardware accelerated. After doing this, our app correctly chooses one of the hardware accelerated pixel formats, and performance got a HUGE boost!

I suppose the correct interpretation of these flags is that the nVidia driver provides acceleration but not _GENERIC_ acceleration: PFD_GENERIC_ACCELERATED only applies when the generic Microsoft implementation is being accelerated by a device driver, while a full installable client driver (ICD) replaces the generic implementation entirely and sets neither flag.

Anyway, hopefully this helps someone else avoid the same headache :)
