directNoob

OpenGL updating???


Hi. Now that I'm taking a CG course at university, I'm forced to learn OpenGL. The main difference compared to DX is the release model. I've looked everywhere. Where can I get the newest OpenGL files? The newest I found was at SGI: the OpenGL SDK, but the version inside was just 1.2, plus the ext OGL files.

For general understanding (and I'm sure this topic is well discussed): I have an ATI graphics card, and I just found an SDK on the AMD/ATI website with some ATI files inside. What are these files for? What is the difference between the functions in these files and those of the standard OGL version (from SGI?)? I mean, if I use a *_*_ATI or ATI_*_* or whatever function, is it also available on other manufacturers' graphics cards? I think so... but come on, then this isn't better than using D3D. Where is the advantage of using those extensions when they are so hardware dependent?

From the GameDev FAQ list, I found the article "Beyond OGL 1.1" or so, and it says it's safe to use functions with an ARB or EXT suffix or prefix. Does this mean they run on every graphics card, even when functions of this kind are defined in the ATI header files?

I'm also asking because everywhere I read about OpenGL 2.0, but I have only found 1.2... I mean, come on, what's wrong? Where can I find these libs and headers? Please, I need suggestions on what files I should use. Is it safe to use the ATI files, for example?

Oh man, D3D was so good to me, and now I have to use OpenGL. :( Now I can program hardware-specific apps instead of OS-specific stuff (DX, you understand). But although all this happens to me, I want (am forced) to use it, because I'm also interested in it! I need some help please!

Thanks
Alex

Accessing the OpenGL API is a little different than Direct3D. With Direct3D you download the SDK, point your compiler to the headers and point your linker to the libs, pretty much like any other SDK out there. Then as long as the user has the runtime you built against, your game should run.
With OpenGL, typically your compiler will come with the headers and the lib file, but they will likely be out of date (usually OpenGL 1.2, possibly 1.4). You can get up-to-date headers with the latest and greatest extensions from places like SGI and opengl.org, but you will have to obtain function pointers at runtime using calls such as wglGetProcAddress.

The "runtime" for OpenGL is provided by the people who produce the driver for your video card. This is where the procedures live, and different drivers are going to have different levels of support for extensions. So if you are using an extension, you need to make sure the driver supports it at runtime. To make this easier there are libraries such as GLee.

So the basic procedure is: get GLee so that you have the latest OpenGL headers and don't have to go through contortions to load extensions. Make sure your compiler has a gl.h and an opengl32.lib so that you have "core" OpenGL functionality. Test the availability of extensions at runtime before you use them, so your program does not crash. And make sure you have the latest driver from your video card vendor, so that you have the latest OpenGL runtime. Couldn't be simpler ;).
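A minimal sketch of the runtime extension check described above. The parsing is plain C and works on any space-separated list of names; in a real program the list would come from glGetString(GL_EXTENSIONS) after the context exists, but here it is just a string so the logic can be followed anywhere:

```c
#include <string.h>

/* Returns 1 if `name` appears as a complete token in the space-separated
   extension list. A naive strstr() alone is not enough, because
   "GL_ARB_texture" would match inside "GL_ARB_texture_cube_map",
   so we also check the token boundaries. */
int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts_ok = (p == ext_list) || (p[-1] == ' ');
        int ends_ok   = (p[len] == '\0') || (p[len] == ' ');
        if (starts_ok && ends_ok)
            return 1;
        p += len;   /* partial match; keep scanning */
    }
    return 0;
}
```

GLee/GLEW do essentially this for you for every known extension, but it is worth seeing once why the boundary check matters.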

Hi CodeMunkie.

That sounds simple... You are right.
You said that I should use the latest graphics drivers.
But how does new functionality coincide with old versions of OGL?
In the ATI SDK there is no opengl.dll or opengl32.dll. So how can new
functionality be used when I'm just running with the opengl32.dll
from version 1.1? And what about shader support?
In D3D I just used D3DXCompileShaderFrom*(...) and Create*Shader(), then I could plug it in with Set*Shader()...
Now, how is this done in OGL?

But maybe this becomes clearer once I have used the extensions...
Right now everything is a bit confusing.

Alex

This is actually a good question, man. I had a lot of trouble myself trying to get the latest version of OpenGL.

I’m still slightly confused about the subject, for instance, if a game boasts that it supports “OpenGL 2.1”, exactly how many extensions from the latest spec is it required to implement to make such a claim?

Oh man....

First: OpenGL support is double ended. 1) You need the proper files to compile against it, and 2) you need the proper drivers to run it.

1) The proper files have been mentioned. I'd also direct you to GLEW (I don't know if it is the same as GLee) in order to get all the function pointers for everything ready to go. If you look in the headers, they are broken into sections, so it should be easy to see which extensions form the basis of each core release (i.e. 2.0 or 1.1).

Any WGL_ extension will be available on ALL Windows machines WITH appropriate hardware support.
Any GLX_ extension will be available on ALL Linux/X-server machines WITH appropriate hardware support.
Any NVIDIA_ or ATI_ extension will ONLY be available on that vendor's cards, of the appropriate hardware level.

Shader support is through the ARB_vertex_program and ARB_fragment_program extensions, OR, in GL 2.0, glCreateShader/glCompileShader.

2) Using OpenGL support at the highest version is as easy as getting the latest drivers for your card. Then you are done.
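For the GL 2.0 path mentioned above, the call sequence is the OpenGL analogue of D3DXCompileShaderFrom* plus Create*Shader: the driver itself compiles the GLSL source. The sketch below shows the order of calls. Since the real entry points only exist once a context is current and the pointers are loaded, the GL functions here are small stand-ins (names and enum values match gl.h; everything else is mocked so the flow can actually run anywhere):

```c
#include <stddef.h>

/* GL types/enums as in gl.h. The gl* functions below are STAND-INS so
   this sketch runs without a context; in a real program they come from
   the driver (via GLee/GLEW or wglGetProcAddress). */
typedef unsigned int GLuint;
typedef unsigned int GLenum;
typedef int GLint;
typedef int GLsizei;
typedef char GLchar;
#define GL_FRAGMENT_SHADER 0x8B30
#define GL_COMPILE_STATUS  0x8B81

static GLuint next_name = 1;
static GLuint glCreateShader(GLenum type) { (void)type; return next_name++; }
static void glShaderSource(GLuint s, GLsizei n, const GLchar **src,
                           const GLint *len) { (void)s; (void)n; (void)src; (void)len; }
static void glCompileShader(GLuint s) { (void)s; }
static void glGetShaderiv(GLuint s, GLenum pname, GLint *out)
{ (void)s; (void)pname; *out = 1; /* pretend compilation succeeded */ }

/* The GL 2.0 shader flow: create an object, hand over GLSL source,
   let the DRIVER compile it, then query the result. */
GLuint build_shader(GLenum type, const char *glsl_source)
{
    GLuint shader = glCreateShader(type);
    glShaderSource(shader, 1, &glsl_source, NULL);
    glCompileShader(shader);

    GLint ok = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    return ok ? shader : 0;   /* 0 means compilation failed */
}
```

The program-object side (glCreateProgram/glAttachShader/glLinkProgram) follows the same create/feed/query pattern.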

OpenGL has different "levels", I guess you could say, of extensions. You have hardware-specific extensions supported only by the vendor that developed the extension (or possibly by a very few others), and then you have extensions that are supported by multiple vendors. An extension will have a prefix that describes where it came from and can also give you a clue about how widely it is supported. Here is a list of some of the prefixes (just to name a few):

ARB – Extensions officially approved by the OpenGL Architecture Review Board
EXT – Extensions agreed upon by multiple OpenGL vendors
HP – Hewlett-Packard
INTEL – Intel
NV – NVIDIA corp
ATI - ATi/AMD corp
SGI – Silicon Graphics
WIN – Microsoft

Once enough vendors agree to support a certain extension, that extension gets promoted from vendor specific to EXT or ARB. As a developer, if you want to support the largest number of different cards and vendors, you should go for EXT and ARB extensions.
You probably know that Direct3D is governed by Microsoft. OpenGL is governed by an architecture review board, a group of folks from different companies who control the direction of OpenGL. Periodically the review board will meet and decide to release a "new version" of OpenGL. At that time certain ARB and EXT extensions are promoted to the OpenGL core. Almost all functionality added after OpenGL 1.0 started life as an extension.

Once an extension is promoted to core OpenGL, it must be supported by any implementer who claims to support that version of OpenGL. If a game says it supports OpenGL 2.1, it means it uses only what is considered the core functionality of version 2.1; in other words, it does not use extensions that were not part of the OpenGL 2.1 core. It also means that as long as your driver supports OpenGL 2.1, you will be able to run the game, because if a vendor implements OpenGL 2.1, they must implement all of the core functionality. They may also implement additional, non-standard extensions, but the core must be there or it cannot be called OpenGL 2.1 (or whatever version the implementation claims to support).

There is a really cool program called GLview over at www.realtech-vr.com. It will show you a list of core features by OpenGL version and tell you which ones are supported by your current driver. It will also take you to the specification for a particular extension so you can read all about it.
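One concrete check that follows from all this: parse the major.minor numbers out of the GL_VERSION string to see whether the driver's core level is high enough. The parsing is plain C; the version strings below are made-up examples, since the real one comes from glGetString(GL_VERSION) once a context exists:

```c
#include <stdio.h>

/* Parse "major.minor..." out of a GL_VERSION-style string; returns 1 on
   success. Vendor text after the numbers (e.g. "2.1.1 SomeVendor 169.21")
   is ignored by sscanf. */
int parse_gl_version(const char *version, int *major, int *minor)
{
    return sscanf(version, "%d.%d", major, minor) == 2;
}

/* Is the reported version at least the one we need? */
int version_at_least(const char *version, int need_major, int need_minor)
{
    int major, minor;
    if (!parse_gl_version(version, &major, &minor))
        return 0;
    return major > need_major ||
          (major == need_major && minor >= need_minor);
}
```

If the core version is too low, you fall back to checking the extension string for the ARB/EXT equivalent of the feature you want.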

[Edited by - CodeMunkie on April 5, 2007 8:51:24 PM]

Quote:
You said that I should use the latest graphics drivers.
But how does new functionality coincide with old versions of OGL?
In the ATI SDK there is no opengl.dll or opengl32.dll. So how can new
functionality be utilized when I'm just running with the opengl32.dll

With NVIDIA the GL driver is called nvoglnt.dll; I assume ATI's is something like atiogl32.dll. That driver does all the drawing, not opengl32.dll. opengl32.dll is an old, software-only OpenGL 1.1 implementation from many years ago.

Quote:
Original post by directNoob
Ok, GLee. I quickly found it, because I knew it is
located in the OpenGL SDK at opengl.org.

Now, this sounds just better. "up to 2.1".

Alex


You don't really need to download an SDK. Your compiler comes with everything you need to use OpenGL 1.1 already. You just need to get glext.h to use the latest extensions, and GLee if you want to use those extensions easily!

GLEE is at: http://elf-stone.com/glee.php

Quote:
Original post by KulSeran
Any NVIDIA_ or ATI_ extension will ONLY be available on that vendor's cards, of the appropriate hardware level.


Not quite true. My ATI X1600 mobile card has support for some GL_NV extensions, and my GF7800GTX has support for a bunch of GL_ATI extensions.

Quote:
Original post by SimonForsman
Quote:
Original post by KulSeran
Any NVIDIA_ or ATI_ extension will ONLY be available on that vendor's cards, of the appropriate hardware level.


Not quite true. My ATI X1600 mobile card has support for some GL_NV extensions, and my GF7800GTX has support for a bunch of GL_ATI extensions.


QFT.

Hi, and wow.

I have problems!!!

I downloaded the GLee SDK and put every file where it is supposed to be.
I put glee.h and glee.c in "include\gl", and the .lib just in the lib dir of the Platform SDK. I'm using VS 8 / 2005...

Today I wrote some GL code with buffer objects.
The first thing that happened was this:

1>Linking...
1>CNode.obj : error LNK2001: unresolved external symbol _pglBufferData
1>CNode.obj : error LNK2001: unresolved external symbol _pglBindBuffer
1>CNode.obj : error LNK2001: unresolved external symbol _pglGenBuffers

Then I added:

#pragma comment(lib, "GLee.lib" )

Then the linker said:
1>LINK : fatal error LNK1104: cannot open file 'LIBC.lib'

On the internet I found that I can get around this problem by adding
"LIBC.lib" to the linker's ignore-libraries input field.

That worked. But what does it mean?
The vertices in the buffer objects aren't drawn!
Could this problem be related to the ignored "LIBC.lib"?

From the debugger I can say that glGenBuffers is working, because I get
successive numbers from it.

Maybe I should open a new thread for the buffer object problem?!

But another question regarding the files:
If I include the glext.h which is 1.2 or 1.3, and I haven't got a newer version,
how can I use the features included in later versions?
For example buffer objects, which were added in 1.5?

?
Alex

I prefer to add extensions in myself as I need them rather than use a 3rd party lib. Doing things this way also forces you to see all the functionality required for an extension. Here is my handy macro:

#define OGLEXT(x, y) y = (x)wglGetProcAddress(#y); if (y == NULL) return -1;

Use this macro in a function like HRESULT InitOpenGL(), then:

hr = InitOpenGL();
if (FAILED(hr))
    <freak out>
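To make that pattern concrete, here is a self-contained sketch. The loader is a stand-in for wglGetProcAddress (which only works once a GL context is current), and the pointer typedef is simplified; the real PFNGLGENBUFFERSPROC lives in glext.h. Everything here is mocked just to show how the macro either resolves a pointer or bails out:

```c
#include <string.h>

/* Generic function-pointer type, like what wglGetProcAddress returns. */
typedef void (*GLPROC)(void);

static void dummy_entry_point(void) { /* pretend driver code */ }

/* Stand-in loader: "exports" glGenBuffers but nothing else. A real
   program would call wglGetProcAddress(name) here instead. */
static GLPROC fake_get_proc(const char *name)
{
    return strcmp(name, "glGenBuffers") == 0 ? dummy_entry_point : NULL;
}

/* Same shape as the macro from the post, pointed at the mock loader. */
#define OGLEXT(type, name) \
    name = (type)fake_get_proc(#name); \
    if ((name) == NULL) return -1;

/* Simplified pointer type; the real signature is in glext.h. */
typedef void (*PFNGLGENBUFFERSPROC)(int n, unsigned int *buffers);
static PFNGLGENBUFFERSPROC glGenBuffers = NULL;

int InitOpenGL(void)
{
    OGLEXT(PFNGLGENBUFFERSPROC, glGenBuffers)   /* resolves, or returns -1 */
    return 0;
}
```

The nice property is that a missing entry point (old driver) makes InitOpenGL() fail up front, instead of the program crashing at the first call through a NULL pointer.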

I was unable to get the .LIB from GLee to work myself.

However, you can compile it yourself - just add the GLee.c and GLee.h files to your project and call GLeeInit() after you create your rendering context, but before you do any extension stuff.

Just add the glee.c file to your project, include glee.h before gl.h and the like, and everything should work fine!

And god... directNoob... don't be so pessimistic/prejudiced! OpenGL has its advantages. For example, you can access some features like those in DirectX 10 (if your driver supports them) without a new DirectX version. Just load the extension and get the newest driver...

