Accelerated GL?


19 replies to this topic

#1 Mayrel   Members   -  Reputation: 348


Posted 25 October 2001 - 04:31 AM

I have a simple problem: my (Win32) GL apps aren't being accelerated. It's quite plain that games I have, like UT, Homeworld, and the like, are being accelerated. The test is quite simple: the Voodoo4 lets me set GL gamma separately from GDI gamma. I set the GL gamma to 0.1, and accelerated programs are pitch black. I know, then, that my apps are not accelerated, rather than just being slow due to programmer incompetence. I assume I need to 1) enable acceleration using some wgl function, or 2) use a replacement opengl32.dll that is accelerated (like id's miniGL, for example, although I want a full implementation). Does anyone know which it is, and how to go about the correct solution?

Uuuuuulrika-ka-ka-ka-ka-ka


#2 Dactylos   Members   -  Reputation: 122


Posted 25 October 2001 - 04:38 AM

Make sure that you select an accelerated PixelFormat.

#3 Prosper/LOADED   Members   -  Reputation: 100


Posted 25 October 2001 - 05:02 AM

quote:
Original post by Dactylos
Make sure that you select an accelerated PixelFormat.


Yes, some cards let you select a 24 bpp format but aren't accelerated for it.



#4 Mayrel   Members   -  Reputation: 348


Posted 25 October 2001 - 05:18 AM

Ah... I should use EnumPixelFormat for that, right?

Uuuuuulrika-ka-ka-ka-ka-ka

#5 GLUbie   Members   -  Reputation: 122


Posted 25 October 2001 - 06:32 AM

Looking over the VC++ help files on the OpenGL stuff, and from the previous comments, it seems like all one has to do is set up a PIXELFORMATDESCRIPTOR structure and choose what would be accelerated settings for it.

Though this doesn't work for me ;-), it still runs using the software renderer.
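
[For reference, a minimal sketch of the usual Win32 setup being described here, assuming hdc comes from an already-created window; this is an illustration, not any poster's actual code:]

#include <windows.h>

/* Request a double-buffered 32-bit RGBA format with a 24-bit depth buffer. */
BOOL SetupPixelFormat(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format;

    ZeroMemory(&pfd, sizeof(pfd));
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;
    pfd.iLayerType = PFD_MAIN_PLANE;

    /* ChoosePixelFormat picks the closest match; it can still hand back a
       software-only format, so the flags need checking afterwards. */
    format = ChoosePixelFormat(hdc, &pfd);
    if (format == 0)
        return FALSE;
    return SetPixelFormat(hdc, format, &pfd);
}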

#6 ANSI2000   Members   -  Reputation: 122


Posted 25 October 2001 - 08:39 AM

If you set the fields in the PIXELFORMATDESCRIPTOR to the highest values possible, the system will pick the closest format the hardware can do, if I remember correctly.

#7 GLUbie   Members   -  Reputation: 122


Posted 25 October 2001 - 08:44 AM

Yeah, but how do you tell OpenGL to use the hardware renderer instead of the standard GDI renderer? That's what I'm having trouble figuring out.

#8 jenova   Members   -  Reputation: 122


Posted 25 October 2001 - 09:36 AM

hmmm. assuming you have hardware acceleration support. *** you didn't mention which video card you have ***. you NEED the vendor-specific OpenGL32.dll either in the "system" or "system32" directory under your Windows directory (assuming ICD), or in the directory your application is run from (ICD or MCD). the Windows application loader will automatically link the "dll" that is in the program path or in the environment variable search path (i.e. winnt/system32). you may need to select accelerated support in the "PIXELFORMATDESCRIPTOR" structure, but that would be implementation specific.

ICD <- installable client driver: full OpenGL implementation.
MCD <- mini client driver: partial OpenGL implementation.

#9 GLUbie   Members   -  Reputation: 122


Posted 25 October 2001 - 09:52 AM

This is a Dell laptop with an ATI MOBILITY 128. So it's just a matter of finding the driver, probably copying it to the directory the program resides in, and running it from there? And in case there is a flag in the PIXELFORMATDESCRIPTOR structure for the accelerated option, is there any way to find out what it is based on the driver?

--Thanks!

#10 Kippesoep   Members   -  Reputation: 892


Posted 25 October 2001 - 11:01 AM

Actually, the Microsoft OpenGL32.dll will automatically load the vendor-specific driver (from a registry key: HKEY_LOCAL_MACHINE\Software\Microsoft\Windows\CurrentVersion\OpenGLDrivers). If there is none installed, it'll use software rendering. Copying the driver yourself won't just fail to make a difference; the program might not run at all if it needs certain things that the driver doesn't implement.

#11 jenova   Members   -  Reputation: 122


Posted 25 October 2001 - 11:32 AM

Kippesoep: okay, what you might want to do is write your own OpenGL implementation and put it in the directory you want to run your OpenGL application from. it will link the "OpenGL32.dll" in the current directory. i've done it myself. i dunno, maybe it was a fluke or my Windows installation was different or something. anyhow, i don't feel like arguing about this anymore. if you say it links it from the registry then i guess it does.

To the vast majority of mankind, nothing is more agreeable than to escape the need for mental exertion... To most people, nothing is more troublesome than the effort of thinking.

#12 Kippesoep   Members   -  Reputation: 892


Posted 25 October 2001 - 11:13 PM

True. It's not a question of which OpenGL32.dll gets loaded. Windows always prefers the one in the current directory. What I meant was that the standard MS OpenGL32.dll uses this registry key to load the vendor-specific one in order to get hardware acceleration.

If the vendor-specific driver implements all of the OGL spec, I suppose the decrease in call overhead would actually make your program run slightly faster if you call it directly.

#13 brettporter   Members   -  Reputation: 122


Posted 26 October 2001 - 12:33 AM

I have a Dell laptop with said ATI Mobility (that's Mobility, not Stability) Rage 128.

OpenGL works accelerated out of the box using NeHe tutorials/PortaLib and so on.

How do you know you are not getting acceleration?

~~~
Cheers!
Brett Porter
PortaLib3D : A portable 3D game/demo library for OpenGL
Community Service Announcement: Read How to ask questions the smart way before posting!

#14 Dactylos   Members   -  Reputation: 122


Posted 26 October 2001 - 12:53 AM

Enumerate the Pixel Formats and check the flags for them (the dwFlags member). The following flags tell whether it is an accelerated Pixel Format or not (these flags are *not* vendor specific):

PFD_GENERIC_ACCELERATED
The pixel format is supported by a device driver that accelerates the generic implementation. If this flag is clear and the PFD_GENERIC_FORMAT flag is set, the pixel format is supported by the generic implementation only.

PFD_GENERIC_FORMAT
The pixel format is supported by the GDI software implementation, which is also known as the generic implementation. If this bit is clear, the pixel format is supported by a device driver or hardware.

So simply check for the PFD_GENERIC_ACCELERATED flag. If none of your Pixel Formats have this, then try reinstalling the GL drivers for your graphics card.

Look here for more info about PIXELFORMATDESCRIPTOR.
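
[To make that concrete, a sketch of such an enumeration loop. One nuance that follows from the flag descriptions above: a format supported by a full ICD has *neither* flag set, so checking PFD_GENERIC_ACCELERATED alone would miss fully accelerated formats:]

#include <windows.h>
#include <stdio.h>

/* Classify every pixel format exposed on the given device context. */
void ListPixelFormats(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int i, count;

    /* On success, DescribePixelFormat returns the maximum format index. */
    count = DescribePixelFormat(hdc, 1, sizeof(pfd), &pfd);

    for (i = 1; i <= count; ++i) {
        DescribePixelFormat(hdc, i, sizeof(pfd), &pfd);

        if (!(pfd.dwFlags & PFD_GENERIC_FORMAT))
            printf("format %d: ICD, fully hardware accelerated\n", i);
        else if (pfd.dwFlags & PFD_GENERIC_ACCELERATED)
            printf("format %d: MCD, accelerated generic format\n", i);
        else
            printf("format %d: software (generic) only\n", i);
    }
}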

#15 Mayrel   Members   -  Reputation: 348


Posted 26 October 2001 - 12:58 AM

I know I'm not getting acceleration because my Voodoo4 4000 allows me to set separate gamma settings for D3D, OpenGL, video and GDI applications. I've also got anti-aliasing turned on.

If I run a game using OpenGL (like Quake2 or Homeworld), then I can see that it's using the GL gamma, and it *is* antialiased. On the other hand, when running my program (which is based on the NeHe tuts) it uses the GDI gamma, and isn't antialiased.

I know that I have acceleration, because the games are accelerated. (I also know I have acceleration because that's why I paid quite a lot of money for the card.)

I've noticed there's no EnumPixelFormat; I think I was thinking of display modes.

If you use GetPixelFormat after SetPixelFormat (it returns the index of the format actually set, which you can pass to DescribePixelFormat), you can test the dwFlags member of the resulting structure to determine what kind of acceleration is being used:

PFD_GENERIC_FORMAT = Software
PFD_GENERIC_FORMAT & PFD_GENERIC_ACCELERATED = Software and Hardware
Neither = Hardware

That's how I understand MSDN's (vague) documentation. However, when I test it, the pfd indicates that the mode is entirely hardware accelerated, whilst it clearly isn't.
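
[A sketch of that check, for what it's worth. One hedge: since GetPixelFormat only returns an index, the flags have to come from DescribePixelFormat; testing the pfd originally filled in for SetPixelFormat says nothing about the format actually chosen, which is one plausible way to get a "hardware" answer for a software mode:]

#include <windows.h>

/* Returns TRUE if the format currently set on hdc is hardware accelerated
   (either a full ICD format or an accelerated MCD format). */
BOOL IsCurrentFormatAccelerated(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd;
    int format = GetPixelFormat(hdc);   /* index of the format actually set */

    if (format == 0)
        return FALSE;
    DescribePixelFormat(hdc, format, sizeof(pfd), &pfd);

    if (!(pfd.dwFlags & PFD_GENERIC_FORMAT))
        return TRUE;                                       /* ICD: hardware */
    return (pfd.dwFlags & PFD_GENERIC_ACCELERATED) != 0;   /* MCD vs software */
}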

Uuuuuulrika-ka-ka-ka-ka-ka

#16 GLUbie   Members   -  Reputation: 122


Posted 26 October 2001 - 07:54 AM

Ok, I think I've figured it out. Thanks for all your help, guys. Basically, unless you specifically tell GL to use software, or choose a mode in your PIXELFORMATDESCRIPTOR structure that isn't supported by the hardware, your program should run using the hardware driver. My problem, running XP, was that I had to download the updated driver for the ATI Mobility card. As soon as I did, the hardware acceleration worked. Though now the smoothing doesn't look as good as I think it should; I'm working on that.

Mayrel, if you put the PFD_GENERIC_ACCELERATED flag in the PIXELFORMATDESCRIPTOR structure, it should run using your hardware. I'm still confused about how to enumerate the modes, other than to create the hDC first, then get its supported modes and copy those to a PFD structure.

#17 Anonymous Poster   Guests


Posted 26 October 2001 - 06:22 PM

And how do I enumerate display modes that are accelerated, without setting them?

I use EnumDisplaySettings to get all supported display modes, and that works fine. But how can I determine, *without* activating it, whether a specific mode is accelerated, and whether that mode supports stencil masks, a given z buffer depth, etc.? Generally speaking, I would like to get a valid PIXELFORMATDESCRIPTOR for a certain display setting.




#18 Mayrel   Members   -  Reputation: 348


Posted 27 October 2001 - 03:59 AM

There's a flag you can pass to ChangeDisplaySettings that prevents it from actually making the change. Perhaps that's what you want?
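
[The flag being described is CDS_TEST: the driver validates the requested mode and returns DISP_CHANGE_SUCCESSFUL without touching the screen. A sketch of filtering the mode list with it:]

#include <windows.h>
#include <stdio.h>

/* List every display mode the driver says it could actually switch to. */
void ListValidModes(void)
{
    DEVMODE dm;
    DWORD i;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    for (i = 0; EnumDisplaySettings(NULL, i, &dm); ++i) {
        /* CDS_TEST asks whether the mode would work, without changing it. */
        if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
            printf("%lu x %lu @ %lu bpp\n",
                   dm.dmPelsWidth, dm.dmPelsHeight, dm.dmBitsPerPel);
    }
}

[Note that this only validates the mode itself; it doesn't say whether OpenGL is accelerated, or whether a stencil buffer is available, in that mode, so it answers only part of the question above.]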

I demand unsatisfaction!

#19 outRider   Members   -  Reputation: 852


Posted 27 October 2001 - 12:15 PM

Well, first of all, don't bother with the enumerate-pixel-format functions; use ChoosePixelFormat and it should find a format that's as close as possible to what you wanted, choosing an accelerated one over a non-accelerated one, if memory serves. You can try the GENERIC_ACCELERATED flag and its ilk if you want, but they didn't do squat for me; I wanted a software format and I kept getting hardware. Anyhow, try checking for error messages where you create your contexts and bind them, and check that your window is appropriate for GL rendering, etc. Failing that, try doing things in a different order: instead of creating the window, then showing it, then creating your contexts and binding them, try doing it in a different (yet acceptable) order. It might just be that your drivers are picky about what they accept.
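
[On the error-checking point, a sketch of context creation with the checks being suggested; the GetLastError codes are only a starting point, but they at least show where things fall over. Assumes SetPixelFormat already succeeded on hdc:]

#include <windows.h>
#include <stdio.h>

/* Create a GL rendering context on hdc and make it current,
   reporting which step fails. */
HGLRC CreateGLContext(HDC hdc)
{
    HGLRC hrc = wglCreateContext(hdc);

    if (hrc == NULL) {
        printf("wglCreateContext failed, error %lu\n", GetLastError());
        return NULL;
    }
    if (!wglMakeCurrent(hdc, hrc)) {
        printf("wglMakeCurrent failed, error %lu\n", GetLastError());
        wglDeleteContext(hrc);
        return NULL;
    }
    return hrc;
}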

------------
- outRider -

#20 Anonymous Poster   Guests


Posted 27 October 2001 - 03:16 PM

outRider:

This is how I used to do it, but it's not acceptable anymore. Some display formats do not have the properties my game requires (e.g. a stencil buffer) and I want to filter them out before presenting the user with a list of modes he can choose from. I don't want unaccelerated modes on that list either. I want all modes that I present to the user to be valid, accelerated modes that work on his card. Giving him a generic list and then telling him, 'Sorry mate, the mode you selected is not hardware accelerated and doesn't have a stencil buffer. Please select another one!', is just unprofessional. I can't check through all the modes either; there may be hundreds of display settings, and I can't try all of them. Imagine the screen flickering...




