64-bit OpenGL


16 replies to this topic

#1 Tom Backton   Members   -  Reputation: 100


Posted 20 September 2009 - 06:06 AM

1. If I write a 64-bit program in C++ and use OpenGL, GLU, GLX, WGL, etc., will all of them support the full functionality in 64-bit? For example, in a 32-bit build the OpenGL functions take 32-bit pointers as parameters, but in a 64-bit build pointers are 64 bits wide, so will the OpenGL functions accept them? Can I use OpenGL in 64-bit just like I'm used to using it in 32-bit? Does it depend on the OpenGL version I use?

2. Do GPUs store doubles? Or do functions like glVertex3d() convert the doubles to floats and then send them to the GPU (or are they sent as doubles and converted to floats by the GPU)?

#2 Martins Mozeiko   Crossbones+   -  Reputation: 1422


Posted 20 September 2009 - 06:22 AM

1. Yes, a 64-bit program should work fine with OpenGL. If it is not working correctly (crashing), then it's the driver's fault. Send a bug report to the vendor.

2. Usually no. Pre-DX11 GPUs don't support doubles, so the driver converts them to floats.

#3 Tom Backton   Members   -  Reputation: 100


Posted 20 September 2009 - 07:40 AM

Thanks for the answers! :D

More questions:

3. So even if I use an OpenGL 1.1 DLL file, it will work with 64-bit pointers passed as parameters to the functions? Or does it depend on the header file version? Or on the graphics card (I have a GeForce 7300)?

4. How can I tell which version my OpenGL files (dll, lib and C/C++ header) are? glGetString() gives the latest version supported by the GPU, right?

5. Which is generally better: converting doubles to floats or letting the driver do it?

6. Let's say I need to do pixel-based collision detection. Which would be a better choice, an FBO or an auxiliary buffer?

7. What exactly is glVertexAttrib()? I saw it in several places, but I learned OpenGL from the old online red book and this function doesn't appear there.

8. Let's say I have a textured rectangle. I apply transformations to it, but the vertices and texture coordinates themselves never change; they're constant numbers. Which is the best choice for me, considering "deprecation vs. old fast features": immediate mode, immediate mode with "v" functions (glVertex2fv(), etc.), vertex arrays, display lists or VBOs?

#4 Yann L   Moderators   -  Reputation: 1798


Posted 20 September 2009 - 07:49 AM

Quote:
Original post by Tom Backton
3. So even if I use an OpenGL 1.1 DLL file, it will work with 64-bit pointers passed as parameters to the functions? Or does it depend on the header file version? Or on the graphics card (I have a GeForce 7300)?

The OpenGL DLLs are supplied by the OS and the graphics drivers. A 64-bit application running on a 64-bit OS will be linked against those.

Quote:
Original post by Tom Backton
4. How can I tell which version my OpenGL files (dll, lib and C/C++ header) are? glGetString() gives the latest version supported by the GPU, right?

It gives the version supported by the currently installed driver at runtime. The version exposed by the header, lib and dll is irrelevant under Windows. You'll have to use extensions anyway (or some third party library).

Quote:
Original post by Tom Backton
5. Which is generally better: converting doubles to floats or letting the driver do it?

Since you will be using VBOs, convert them yourself before uploading the array.
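The advice above can be sketched as a tiny helper. The function name and the glBufferData call in the comment are illustrative assumptions, not code from this thread:

```c
#include <stddef.h>

/* Hypothetical helper: narrow an array of doubles to floats on the CPU
   once, before uploading, instead of letting the driver convert. */
void doubles_to_floats(const double *src, float *dst, size_t count)
{
    for (size_t i = 0; i < count; ++i)
        dst[i] = (float)src[i];
}

/* Usage sketch (assumes a VBO is bound to GL_ARRAY_BUFFER):
   doubles_to_floats(positions_d, positions_f, n);
   glBufferData(GL_ARRAY_BUFFER, n * sizeof(float), positions_f,
                GL_STATIC_DRAW);
*/
```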

Quote:
Original post by Tom Backton
6. Let's say I need to do pixel-based collision detection. Which would be a better choice, an FBO or an auxiliary buffer?

FBO.

Quote:
Original post by Tom Backton
7. What exactly is glVertexAttrib()? I saw it in several places, but I learned OpenGL from the old online red book and this function doesn't appear there.

It's the standard way to specify per-vertex attributes in GL3. All the others (glVertex, glColor, glTexCoord, etc.) are deprecated.

Quote:
Original post by Tom Backton
8. Let's say I have a textured rectangle. I use transformations on it but the specification of vertices and tex coords doesn't change, it's constant numbers. Which is the best choice for me, considering "deprecation vs old fast features": immediate mode, immediate mode with "v" functions (glVertex2fv(), etc.), vertex arrays, display lists or VBO?

VBO. All others are slow and deprecated.
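For a quad whose vertices and texture coordinates never change, the VBO route might look like the sketch below. The interleaved struct layout and the GL 2.1 calls in the comment are an illustrative assumption, not code from this thread:

```c
#include <stddef.h>

/* Interleaved layout for one static textured quad: a 2D position
   followed by a 2D texture coordinate per vertex. */
struct Vertex {
    float pos[2];
    float tex[2];
};

static const struct Vertex quad[4] = {
    {{0.0f, 0.0f}, {0.0f, 0.0f}},
    {{1.0f, 0.0f}, {1.0f, 0.0f}},
    {{1.0f, 1.0f}, {1.0f, 1.0f}},
    {{0.0f, 1.0f}, {0.0f, 1.0f}},
};

/* Upload once at init time; the data never changes, so GL_STATIC_DRAW:
   GLuint vbo;
   glGenBuffers(1, &vbo);
   glBindBuffer(GL_ARRAY_BUFFER, vbo);
   glBufferData(GL_ARRAY_BUFFER, sizeof quad, quad, GL_STATIC_DRAW);
   glVertexPointer(2, GL_FLOAT, sizeof(struct Vertex),
                   (void *)offsetof(struct Vertex, pos));
   glTexCoordPointer(2, GL_FLOAT, sizeof(struct Vertex),
                     (void *)offsetof(struct Vertex, tex));
*/
```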


#5 Tom Backton   Members   -  Reputation: 100


Posted 26 September 2009 - 07:02 AM

" Quote:Original post by Tom Backton
4. How can I tell which version my OpenGL files (dll, lib and C/C++ header) are? glGetString() gives the latest version supported by the GPU, right?

It gives the version supported by the currently installed driver at runtime. The version exposed by the header, lib and dll are irrelevant under Windows. You'll have to use extensions anyway (or some third party library). "

What about other operating systems? Is it possible that a new OS ships new OpenGL DLL, LIB and header files that support features not supported by the graphics card? It's not a problem, since I can tell what's supported using glGetString() with GL_VERSION, but I'm still wondering whether the DLL is supplied by the OS independently of the card, or whether the DLL depends on the card and is therefore of the same version...

If everything except VBOs is deprecated, what is the standard way to deal with polygons that change between frames? For example, in computer games and physics simulation programs there are non-rigid objects: jelly, blobs, mass-spring-modelled objects, etc. Would I have to update/recreate the VBO every frame?





#6 Brother Bob   Moderators   -  Reputation: 8428


Posted 26 September 2009 - 07:29 AM

Quote:
Original post by Tom Backton
What about other operating systems? Is it possible that a new OS ships new OpenGL DLL, LIB and header files that support features not supported by the graphics card? It's not a problem, since I can tell what's supported using glGetString() with GL_VERSION, but I'm still wondering whether the DLL is supplied by the OS independently of the card, or whether the DLL depends on the card and is therefore of the same version...

On Windows, the DLL that provides the OpenGL interface is opengl32.dll, an old file from many years ago that supplies a reference implementation and an interface for OpenGL 1.1. (On Vista I heard it was supposed to be a newer one for 1.4, but I haven't seen much evidence of that, and it doesn't change anything for the sake of explaining how it works and what the files are.) The library and header file correspond to this DLL and provide what you need for OpenGL 1.1 only. This DLL is a system file and part of the operating system itself, not a stand-alone application interface.

Other implementations of OpenGL, such as your graphics card's driver, are provided by each card's manufacturer and hook into this interface (note: they do not replace it, only hook into it), allowing the DLL to forward calls to the driver, which in turn provides support for later versions of OpenGL. Since later versions introduce new functions, there is also a mechanism to load functions that are not handled by the DLL, so you can call any function from later versions.

This mechanism for loading any OpenGL function from the driver pretty much makes the interface unnecessary, as you can hook directly into the driver yourself. It also makes the version the DLL provides completely irrelevant: anything it provides is overridden by the driver you install for your graphics card. All you need is a way to load function pointers (manually via wglGetProcAddress, or using an extension library like GLEE) and the glext.h header, which provides the symbols for all versions and extensions.

In short, the only thing that matters is the driver for the graphics card.
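The loading mechanism described above boils down to one pattern: ask the driver for an entry point by name and cast the result to the right function-pointer type. The sketch below uses a stub lookup in place of the real wglGetProcAddress (or glXGetProcAddress on X11) so it is self-contained; the stub names are invented for illustration:

```c
#include <stddef.h>
#include <string.h>

typedef void (*GenericFn)(void);
typedef GenericFn (*GetProcAddressFn)(const char *name);

/* Stand-ins for a driver entry point and for wglGetProcAddress,
   purely so the pattern can run without a GL context. */
static void stub_glBindBuffer(void) {}

static GenericFn stub_lookup(const char *name)
{
    if (strcmp(name, "glBindBuffer") == 0)
        return (GenericFn)stub_glBindBuffer;
    return NULL;   /* this "driver" does not export the entry point */
}

/* The pattern itself: with a real context current you would pass
   wglGetProcAddress here and cast the result to the matching PFN...
   typedef from glext.h; NULL means the feature must be disabled. */
static GenericFn load_entry_point(GetProcAddressFn lookup, const char *name)
{
    return lookup(name);
}
```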

Quote:
Original post by Tom Backton
If everything except VBOs is deprecated, what is the standard way to deal with polygons that change between frames? For example, in computer games and physics simulation programs there are non-rigid objects: jelly, blobs, mass-spring-modelled objects, etc. Would I have to update/recreate the VBO every frame?

You update the VBO every time. But that is no different from what you already have to do without VBOs; you need to update the vertex data one way or another in order to render the updated objects. Just make sure you give the proper usage hints when creating the VBO to indicate that you intend to write to the buffer often.
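As a sketch of that per-frame flow (the update function and names are invented for illustration; only the GL_DYNAMIC_DRAW usage hint and glBufferSubData re-upload in the comment reflect the advice above):

```c
#include <stddef.h>

/* Recompute CPU-side vertex data each frame; here a trivial uniform
   displacement of every y coordinate stands in for a real simulation. */
void update_vertices(float *xy, size_t vertex_count, float offset)
{
    for (size_t i = 0; i < vertex_count; ++i)
        xy[2 * i + 1] += offset;   /* displace the y component */
}

/* Each frame, re-upload into a VBO that was created with the
   GL_DYNAMIC_DRAW usage hint:
   glBindBuffer(GL_ARRAY_BUFFER, vbo);
   glBufferSubData(GL_ARRAY_BUFFER, 0,
                   vertex_count * 2 * sizeof(float), xy);
*/
```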

#7 V-man   Members   -  Reputation: 805


Posted 27 September 2009 - 12:36 AM

The DLL stuff is explained here
http://www.opengl.org/wiki/Getting_started

and here in the FAQ
http://www.opengl.org/wiki/FAQ

and VBO stuff
http://www.opengl.org/wiki/General_OpenGL

#8 Tom Backton   Members   -  Reputation: 100


Posted 01 October 2009 - 08:40 AM

Now I'm a little confused... I'd like my project to support hardware-accelerated OpenGL 2.1, which means it doesn't use newer features. On Windows (VC++) I just need to include gl.h and the driver's files will automatically be used, right? But I'm not sure about Linux. For OpenGL 2.1, will just including gl.h work, or will I have to use GLEE or GLEW?

#9 Brother Bob   Moderators   -  Reputation: 8428


Posted 01 October 2009 - 08:53 AM

OpenGL on Windows is version 1.1, so you need to load anything past that yourself or with some third party library.

#10 Tom Backton   Members   -  Reputation: 100


Posted 01 October 2009 - 09:50 PM

How exactly do I do that? And what about Linux?

I know it's something to do with wglGetProcAddress() and function pointers, but I don't understand exactly what I should do with them. And why do the function pointers have the same names as the functions (which could cause ambiguity)?

And are all the OpenGL 2.1 new features accessible in version 1.1 through extensions? If yes, can I simply use glext.h instead of loading the functions?

#11 Brother Bob   Moderators   -  Reputation: 8428


Posted 01 October 2009 - 10:18 PM

You load functions with wglGetProcAddress, yes, or with third-party libraries. I recommend going with the latter if you are confused about this process, so you can just get it working and forget about it.

If you want to use OpenGL 2.1, then forget about 1.1 and extensions. Load the functions for 2.1 and use them as if everything just works. That means: include whatever header is needed (glee.h for GLEE, for example), call the initialization function if needed, and just use whatever 2.1 functions you want.

#12 Prefect   Members   -  Reputation: 373


Posted 04 October 2009 - 06:20 AM

Quote:
Original post by Tom Backton
How exactly do I do that? And what about Linux?

No conceptual difference from Windows. Loading function pointers directly is done via glXGetProcAddress(), but using something like GLEW is highly recommended, as it saves you from writing a ton of boilerplate code.

#13 Tom Backton   Members   -  Reputation: 100


Posted 08 October 2009 - 11:39 PM

Now I see how it works...I'm still surprised none of the tutorials I read mentioned function pointer loading.

Anyway, I have another little question, about extensions. A few days ago I saw an extension list generated by glGetString(GL_EXTENSIONS). The graphics card is a GeForce 7300, but the driver wasn't the original one; it was newer. The latest OpenGL version supported is 2.1.2 (or something a little above 2.1), and, if I'm not mistaken, one of the extensions was GL_ARB_framebuffer_object. This extension was approved in 2008, so its full contents didn't exist when the card was released. So is there a connection between the version and the supported extensions? The OpenGL specification files (which can be found at www.opengl.org) list the new ARB extensions, but GL_ARB_framebuffer_object is newer than the graphics card and is still supported. Can a driver support an extension that the hardware doesn't support, or does listing GL_ARB_framebuffer_object guarantee that the card supports it? And in general, is it possible that an old card supports a new extension through a new driver (if it originally supported a limited EXT version and the new ARB one only adds features that don't require changing the hardware)?

In other words, can I rely only on glGetString(GL_VERSION) and the specification files, or is it a better idea to check which extensions are supported using glGetString(GL_EXTENSIONS)?

#14 Brother Bob   Moderators   -  Reputation: 8428


Posted 09 October 2009 - 12:38 AM

You should check for what you aim to use. If you want to use the extension ARB_framebuffer_object, then you have to check the extension string to see if it is supported. If you instead want to use the corresponding core functions, you must check the version number for the correct core version. You cannot rely on the version number to report extension support, nor on the extension string to report core support.

If a feature is reported as supported, whether via the extension string or the version number, you know that you can use that particular feature. It doesn't say anything about hardware support or performance, only that you can use it and it will behave as described in its specification. However, you can often assume that a reported extension is supported by the hardware, as is the corresponding core feature.
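When checking the extension string yourself, note that glGetString(GL_EXTENSIONS) returns one space-separated string, so a naive strstr() can match one extension name inside a longer one. A self-contained checker (the helper name is illustrative) could look like:

```c
#include <string.h>

/* Check a space-separated extension list (as returned by
   glGetString(GL_EXTENSIONS)) for an exact extension name.
   Matching whole tokens avoids false positives such as finding
   "GL_EXT_framebuffer_object" inside a longer extension name. */
int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;
    while ((p = strstr(p, name)) != NULL) {
        int starts = (p == extensions) || (p[-1] == ' ');
        int ends = (p[len] == '\0') || (p[len] == ' ');
        if (starts && ends)
            return 1;
        p += len;   /* partial match; keep scanning */
    }
    return 0;
}
```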

#15 V-man   Members   -  Reputation: 805


Posted 09 October 2009 - 03:44 AM

Quote:
Original post by Tom Backton
Now I see how it works...I'm still surprised none of the tutorials I read mentioned function pointer loading.

Anyway, I have another little question, about extensions. A few days ago I saw an extension list generated by glGetString(GL_EXTENSIONS). The graphics card is a GeForce 7300, but the driver wasn't the original one; it was newer. The latest OpenGL version supported is 2.1.2 (or something a little above 2.1), and, if I'm not mistaken, one of the extensions was GL_ARB_framebuffer_object. This extension was approved in 2008, so its full contents didn't exist when the card was released. So is there a connection between the version and the supported extensions? The OpenGL specification files (which can be found at www.opengl.org) list the new ARB extensions, but GL_ARB_framebuffer_object is newer than the graphics card and is still supported. Can a driver support an extension that the hardware doesn't support, or does listing GL_ARB_framebuffer_object guarantee that the card supports it? And in general, is it possible that an old card supports a new extension through a new driver (if it originally supported a limited EXT version and the new ARB one only adds features that don't require changing the hardware)?

In other words, can I rely only on glGetString(GL_VERSION) and the specification files, or is it a better idea to check which extensions are supported using glGetString(GL_EXTENSIONS)?


Read what Brother Bob said, and I can add that GL_ARB_framebuffer_object should have been supported in the first place instead of GL_EXT_framebuffer_object. With the EXT version, if you bind a color buffer, a depth buffer and any other attachments, they all must have the same dimensions; the ARB version got rid of that restriction. The ARB version also merged a bunch of individual extensions:
http://www.opengl.org/wiki/GL_EXT_framebuffer_object

It's too bad that OpenGL is so complicated.
I can suggest that you either use GL 2.1 + GL_EXT_framebuffer_object,
or GL 3.0 (where FBO is core), but the general public doesn't have GL 3.0 drivers.



#16 elFarto   Members   -  Reputation: 206


Posted 09 October 2009 - 04:48 AM

Quote:
Original post by V-man...but the general public doesn't have GL 3.0 drivers.

NVIDIA has had OpenGL 3.0+ in its proper drivers for a while now. I'm not sure about ATI; I don't follow what they do.

Regards
elFarto



#17 V-man   Members   -  Reputation: 805


Posted 09 October 2009 - 07:39 AM

Quote:
Original post by elFarto
Quote:
Original post by V-man...but the general public doesn't have GL 3.0 drivers.

NVIDIA has had OpenGL 3.0+ in its proper drivers for a while now. I'm not sure about ATI; I don't follow what they do.

Regards
elFarto


I doubt you'll find GL 3.0 drivers on every machine today; it depends too much on drivers. ATI has had 3.0 support since Catalyst 8.9. Not sure which GPUs.



