Accelerated GL?

Started by
18 comments, last by Mayrel 22 years, 5 months ago
Kippesoep: okay, what you might want to do is write your own OpenGL implementation and put it in the directory you run your OpenGL application from. it will link the "OpenGL32.dll" from the one in the current directory. i've done it myself. i dunno, maybe it was a fluke or my Windows installation was different or something. anyhow, i don't feel like arguing about this anymore. if you say it links it from the registry then i guess it does.

To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.
True. It's not a question of which OpenGL32.dll gets loaded. Windows always prefers the one in the current directory. What I meant was that the standard MS OpenGL32.dll uses this registry key to load the vendor specific one in order to get hardware acceleration.

If the vendor specific driver implements all of the OGL spec, I suppose the decrease in call overhead would actually make your program run slightly faster if you call it directly.
Kippesoep
I have a Dell laptop with said ATI Mobility (that's Mobility, not Stability) Rage 128.

OpenGL works accelerated out of the box using NeHe tutorials/PortaLib and so on.

How do you know you are not getting acceleration?

~~~
Cheers!
Brett Porter
PortaLib3D : A portable 3D game/demo library for OpenGL
Community Service Announcement: Read How to ask questions the smart way before posting!
Enumerate the Pixel Formats and check the flags for them (the dwFlags member). The following flags tell whether it is an accelerated Pixel Format or not (these flags are *not* vendor specific):

PFD_GENERIC_ACCELERATED
The pixel format is supported by a device driver that accelerates the generic implementation. If this flag is clear and the PFD_GENERIC_FORMAT flag is set, the pixel format is supported by the generic implementation only.

PFD_GENERIC_FORMAT
The pixel format is supported by the GDI software implementation, which is also known as the generic implementation. If this bit is clear, the pixel format is supported by a device driver or hardware.

So simply check for the PFD_GENERIC_ACCELERATED flag. If none of your Pixel Formats have this, then try reinstalling the GL drivers for your graphics card.

Look here for more info about PIXELFORMATDESCRIPTOR.
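The check described above can be sketched in C. This is a minimal sketch, not code from this thread: it assumes you already have a valid HDC for your window, the function names `format_is_accelerated` and `list_pixel_formats` are made up, and the two flag values are copied from wingdi.h so the accelerated/software test itself compiles even off Windows.

```c
#include <stdio.h>

#ifdef _WIN32
#include <windows.h>
#else
/* Flag values copied from wingdi.h so the test compiles anywhere. */
#define PFD_GENERIC_FORMAT      0x00000040
#define PFD_GENERIC_ACCELERATED 0x00001000
#endif

/* 1 if dwFlags describe a hardware-accelerated pixel format:
 * either PFD_GENERIC_ACCELERATED is set (accelerated generic driver),
 * or neither generic bit is set (full vendor ICD). */
int format_is_accelerated(unsigned long flags)
{
    if (flags & PFD_GENERIC_ACCELERATED)
        return 1;
    return (flags & PFD_GENERIC_FORMAT) == 0;
}

#ifdef _WIN32
/* Walk every pixel format on the DC and report which are accelerated. */
void list_pixel_formats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    /* Given any valid index, DescribePixelFormat also returns the
     * maximum pixel format index the DC supports. */
    int count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);
    int i;
    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);
        printf("format %d: %s\n", i,
               format_is_accelerated(pfd.dwFlags) ? "accelerated" : "software");
    }
}
#endif
```

If no format in the list comes out accelerated, that matches the advice above: reinstall the GL drivers for the card.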
I know I'm not getting acceleration because my Voodoo4 4000 allows me to set separate gamma settings for D3D, OpenGL, video and GDI applications. I've also got anti-aliasing turned on.

If I run a game using OpenGL (like Quake2 or Homeworld), then I can see that it's using the GL gamma, and it *is* antialiased. On the other hand, when running my program (which is based on the NeHe tuts) it uses the GDI gamma, and isn't antialiased.

I know that I have acceleration, because the games are accelerated. (I also know I have acceleration because that's why I paid quite a lot of money for the card.)

I've noticed there's no EnumPixelFormat; I think I was thinking of display modes.

If you call GetPixelFormat after SetPixelFormat, and pass the index it returns to DescribePixelFormat, you can test the dwFlags member of the resulting structure to determine what kind of acceleration is being used:

PFD_GENERIC_FORMAT = Software
PFD_GENERIC_FORMAT & PFD_GENERIC_ACCELERATED = Software and Hardware
Neither = Hardware

That's how I understand the MSDN's (vague) documentation. However, when I test it the pfd indicates that the mode is entirely hardware accelerated, whilst it clearly isn't.
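The three cases above can be collapsed into one small helper. A sketch of that reading of the documentation, not anything from MSDN itself: the function name `driver_kind` is made up, and the flag values are copied from wingdi.h so it compiles off Windows too.

```c
#ifdef _WIN32
#include <windows.h>
#else
/* Flag values copied from wingdi.h. */
#define PFD_GENERIC_FORMAT      0x00000040
#define PFD_GENERIC_ACCELERATED 0x00001000
#endif

/* Map a pixel format's dwFlags to the driver model it implies. */
const char *driver_kind(unsigned long flags)
{
    int generic = (flags & PFD_GENERIC_FORMAT) != 0;
    int accel   = (flags & PFD_GENERIC_ACCELERATED) != 0;
    if (generic && accel) return "software and hardware (MCD)";
    if (generic)          return "software (generic GDI)";
    return "hardware (ICD)";
}
```

You would feed it the dwFlags of the PIXELFORMATDESCRIPTOR you get back from DescribePixelFormat for the index GetPixelFormat returned.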

Uuuuuulrika-ka-ka-ka-ka-ka
CoV
Ok, I think I've figured it out. Thanks for all your help guys. Basically, unless you specifically tell GL to use software, or choose a mode in your PIXELFORMATDESCRIPTOR structure that isn't supported by the hardware, your program should run using the hardware driver. My problem, running XP, was that I had to download the updated driver for the ATI Mobility card. As soon as I did, the hardware acceleration worked. Though now the smoothing doesn't look as good as I think it should, but I'm working on that.

Mayrel, if you put the PFD_GENERIC_ACCELERATED flag in the PIXELFORMATDESCRIPTOR structure, it should run using your hardware. I'm still confused about how to enumerate the modes, other than creating the hDC first, then getting its supported modes and copying those to a PFD structure.
And how do I enumerate display modes that are accelerated, without setting them?

I use EnumDisplaySettings to get all supported display modes, and that works fine. But how can I determine, *without* activating it, whether a specific mode is accelerated, and whether that mode supports stencil masks, z buffer depth, etc.? Generally speaking, I would like to get a valid PIXELFORMATDESCRIPTOR for a certain display setting.


There's a flag you can pass to ChangeDisplaySettings that prevents it from actually making the change. Perhaps that's what you want?
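The flag in question is CDS_TEST: it asks the driver to validate a mode without switching to it, so the screen never flickers. A minimal sketch (Win32 only; the helper names `mode_is_supported` and `mode_worth_probing` are made up). Note it only answers "is this mode supported", not whether a GL pixel format in that mode would be accelerated.

```c
#ifdef _WIN32
#include <windows.h>

/* 1 if the display driver accepts this mode; no mode switch happens. */
int mode_is_supported(int width, int height, int bpp)
{
    DEVMODE dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize       = sizeof(dm);
    dm.dmPelsWidth  = width;
    dm.dmPelsHeight = height;
    dm.dmBitsPerPel = bpp;
    dm.dmFields     = DM_PELSWIDTH | DM_PELSHEIGHT | DM_BITSPERPEL;
    /* CDS_TEST validates the mode without applying it. */
    return ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL;
}
#endif

/* Portable sanity filter to run first, so you don't probe hundreds of
 * modes your game could never use anyway. */
int mode_worth_probing(int width, int height, int bpp)
{
    if (width < 320 || height < 200)
        return 0;
    return bpp == 8 || bpp == 16 || bpp == 24 || bpp == 32;
}
```

Typical use would be: walk the EnumDisplaySettings list, drop anything that fails `mode_worth_probing`, then CDS_TEST the survivors before showing them to the user.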

I demand unsatisfaction!
CoV
Well, first of all, don't bother with the enumerate pixel format functions; use ChoosePixelFormat and it should find a format that's as close to what you wanted, choosing an accelerated one over a non-accelerated one, if memory serves. You can try the GENERIC_ACCELERATED flag and its ilk if you want, but they didn't do squat for me: I wanted a software format and I kept getting hardware. Anyhow, try checking for error messages where you create your contexts and bind them, and that your window is appropriate for GL rendering, etc. Failing that, try doing things in a different order: instead of creating the window, then showing it, then creating your contexts and binding them, try doing it in a different (yet acceptable) order. It might just be that your drivers are picky about what they accept.

------------
- outRider -
outRider:

This is how I used to do it, but it's not acceptable anymore. Some display formats do not have the properties my game requires (e.g. a stencil buffer) and I want to sort them out before presenting the user a list of modes he can choose from. I don't want unaccelerated modes on that list either. I want all modes that I present to the user to be valid, accelerated modes that work on his card. Giving him a generic list and then telling him, 'Sorry mate, the mode you selected is not hardware accelerated, and doesn't have a stencil buffer. Please select another one!', is just unprofessional. I can't check through all modes either; there may be hundreds of display settings, I can't try all of them. Imagine the screen flickering...

This topic is closed to new replies.
