OpenGL: Curious who is sticking with OpenGL now...

MARS_999    1627
I am curious to see who here is planning on sticking with OpenGL or moving on to DX or some other means. I'd like to see who is planning on leaving the community and who is staying around. Trying to get a feel for what support is left for OpenGL as a hobbyist game coder.

Hodgman    51223
My engine uses GL. The 3.0 spec hasn't broken anything and nothing has changed, so I don't have any reason to drop it...

I did have some ATI compatibility problems recently, but if you report bugs to them and send them instructions on how to reproduce the problem, then they do fix their drivers.

Lazy Foo    1113
The OpenGL situation didn't get worse, it just didn't get better when it really needed to.

There's a hole in the Good Ship OGL that's been slowly getting bigger, but can be patched up. They had the patch almost ready to go. Then they came a year late and put duct tape all over it. Now they're saying they'll have the needed patchwork ready less than 12 months from now.

Daaark    3553
Quote:
Original post by MARS_999
Trying to get a feel for what support is left for OpenGL as a hobbyist game coder.
No less support than you had a week ago. Which means a spec with more holes through it than a sponge, and shitty drivers from the GPU market leader. I abandoned it almost two years ago myself because of all the little problems I didn't want to put up with anymore, and which I don't feel like typing up again.

If you're a hobbyist GAME coder, why do you bother with all that anyway? Use XNA or something else that is a bit higher level, and focus on your game content instead. I don't see why all the hobbyist game coders around here aren't trying to get a game ready for XBLCG this fall.

sprite_hound    461
I'm sticking with OpenGL for now.

While I suppose I'm disappointed with OpenGL 3.0, I wasn't following what was happening particularly closely. Everything's still there, and in the short term this doesn't really change anything for me.

In the long term... well, who knows. I'm hoping before too long to get a shinier computer, on which I'll install Vista, and a better graphics card than I have at the moment. That may well be my cue to learn C# and DirectX (I've been wanting to do the former for a while).

bootstrap    100
I am staying with OpenGL. I have no choice, because I cannot allow my application to become dependent upon the evil empire. Either people forget how many times they have been shafted by macroshaft, or they haven't been developing software long enough yet to *repeatedly* find out and get angry enough to abandon THEM on principle - and out of self-preservation. I must admit, it took several times to sink into my lame brain.

In other words, I am one of those people who choose OpenGL because it runs efficiently on Linux, at least with nvidia drivers (the only ones I have tried during the past few years).

A question. Is now the time to create a new 3D graphics/rendering API for realtime applications [and leave OpenGL for CAD, if recent threads are on-point]? Is this kind of project impossible (unless you are nvidia) because no access to the low-level hardware or drivers is provided by the GPU makers?

BTW, I have seen many people complain that "games" are being sacrificed for the "CAD companies". I believe this is a very dangerous way to characterize the situation, because the concept of "games" tends to trivialize the importance of an enormous variety of realtime and interactive graphics/rendering applications - all of which have virtually identical requirements to "games". I refer to the presentation of simulations of physical [and fictional] processes (mechanical, chemical, atomic, optical, you-name-it), vision systems, robotic systems - every non-trivial application that must be realtime or interactive. ALL of these applications will be thrown out with the bathwater if people can deem them "games" (implication: "unimportant"). But that seems to be what is happening, at first blush. Furthermore, in a few years at most, every CAD application worth a penny will provide complex interactive handling and manipulation of [whatever aspects of reality it presents]. In other words, everything will BECOME a game, at least in terms of the support it requires.

Lazy Foo    1113
Quote:
Original post by Hodgman
Quote:
Original post by Lazy Foo
GL is dying
Does that mean that you were using GL and have chosen to drop it?


Well, first let me say OpenGL is "dying" - the question is the rate of death. It could be a small infection that'll be taken care of by the antibiotic that is a real object-based API released in "less than 12 months", or it could be getting hurled into a volcano and having a rope tossed to it after it's already been roasted in its own juices.

(I'll stop with the bad analogies now)

I haven't abandoned OpenGL because my site's gimmick is that everything works on Win/Linux/Mac, and if I were to switch to Windows only I'd lose a third of my audience and get lost among better Windows dev sites. I'm pissed, but I have no choice when writing multiplatform code.

Frederick    140
Hey Mars_999

I will stick... for a while or forever, dunno =)
The reason is, I just finished my base rendering library and want to move ON. I don't want to stay at the beginnings all the time and relearn the stuff for DX.
Someday I might write a DX backend and check out this cool NVPerfHUD etc...
I really dislike that knowledge is constantly thrown away in the computer community. I would prefer the industry to stay with a given technology for a while and max it out... Ever-newer technology is not the best way, at least not for programmers, though it surely is for marketing people... The Wii is a good example: it doesn't even have shaders, but who cares!?

I am disappointed that there is no fresh start for GL. This is probably the worst API I have ever used - maybe that's in the nature of the thing - and it took lots of trying and reading scattered information, but finally it works =)

So I will simply ignore what happened yesterday... I also work for a university, so I need it for that too; maybe DX wouldn't have got me the job.



Trapper Zoid    1370
My next game will be for both Windows and Mac, so the Mac version at least will use OpenGL. I haven't made up my mind whether I'll be using someone else's framework or rolling my own, but either way I bet there'll be OpenGL for the Mac version. I haven't yet decided whether I'll have a Direct3D version for Windows or not.

However in my case I'm only using hardware accelerated 2D sprites, so it isn't really an issue either way.

haegarr    7372
I will stick with OpenGL:
* For now, I don't see OpenGL's current feature set hindering the progress I have planned (there is enough stuff besides pure graphics I have to do, so I can wait and see what happens for a while).
* Many things in the Deprecation Model are not used by me anyway, and others can perhaps be dropped in the near future or (w.r.t. OpenGL 3.1) simply never adopted.
* I have no clue when and how Khronos may change the API model, but the actual graphics API is wrapped by my own code anyway.
* Okay okay, the fact that I'm developing on a Mac may play a role, too ;)
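Wrapping the graphics API behind your own code, as described above, is what insulates a project from spec churn: game code talks only to your interface, and an API change touches a single backend class. A minimal sketch of the idea (all class and function names here are hypothetical, and the GL calls are replaced by stand-ins so the sketch is self-contained):

```cpp
#include <cassert>
#include <string>
#include <vector>

// The game-facing interface. Real backends (GL, D3D, ...) implement it;
// nothing outside the backend ever sees a raw API call.
class IRenderDevice {
public:
    virtual ~IRenderDevice() = default;
    virtual unsigned createVertexBuffer(const std::vector<float>& data) = 0;
    virtual void draw(unsigned bufferId) = 0;
    virtual std::string name() const = 0;
};

// One concrete backend. A real implementation would call glGenBuffers,
// glBufferData and glDrawArrays here; this sketch just records the work.
class GLDevice : public IRenderDevice {
public:
    unsigned createVertexBuffer(const std::vector<float>& data) override {
        buffers_.push_back(data);               // stand-in for glGenBuffers/glBufferData
        return static_cast<unsigned>(buffers_.size() - 1);
    }
    void draw(unsigned bufferId) override {
        assert(bufferId < buffers_.size());     // stand-in for glDrawArrays
        ++drawCalls_;
    }
    std::string name() const override { return "OpenGL"; }
    int drawCalls() const { return drawCalls_; }
private:
    std::vector<std::vector<float>> buffers_;
    int drawCalls_ = 0;
};

// Game code depends only on IRenderDevice, so swapping backends
// (or surviving an API rewrite) never touches this function.
void renderFrame(IRenderDevice& dev) {
    unsigned vb = dev.createVertexBuffer({0.f, 0.f, 1.f, 0.f, 0.f, 1.f});
    dev.draw(vb);
}
```

The cost is one virtual call per draw command, which is negligible next to the driver work behind each call.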

Krohm    5030
Quote:
Original post by MARS_999
I am curious to see who here is planning on sticking with OpenGL or moving on to DX or some other means. I'd like to see who is planning on leaving the community and who is staying around. Trying to get a feel for what support is left for OpenGL as a hobbyist game coder.
I switched to D3D9 more than a year ago, before the project I was working on was frozen.

In theory, I would have planned a GL backend at some point. As things stand now, that isn't going to happen in the next few years for sure, even assuming I could somehow get that work back...

It sucks to say, but the marketing guy who told me to drop GL was right. The marketing guy was right!!!
Yesterday it took me about 80 km of cycling to come to terms with this. I still cannot believe it happened.

Kambiz    758
We didn't get the promised cleaned-up API, but OpenGL is still the only option on Mac and Linux. The current API might be hard to use, but it is up to date. I have written a clean C++ wrapper for OpenGL that makes it much easier to use... I will stick with OpenGL.
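A typical building block of such a C++ wrapper is an RAII handle: create the GL object in the constructor, guarantee deletion in the destructor, and forbid copying so ownership stays unambiguous. This is a hypothetical sketch, not Kambiz's actual code; the `fakeGen`/`fakeDelete` functions stand in for entry points like glGenBuffers/glDeleteBuffers so the sketch is self-contained:

```cpp
#include <cassert>
#include <functional>
#include <utility>

// RAII wrapper around an unsigned GL-style object name.
class GLHandle {
public:
    using Creator = std::function<unsigned()>;
    using Deleter = std::function<void(unsigned)>;

    GLHandle(Creator create, Deleter del)
        : id_(create()), del_(std::move(del)) {}
    ~GLHandle() { if (del_) del_(id_); }   // deletion can never be forgotten

    // Non-copyable (a GL object has exactly one owner), but movable.
    GLHandle(const GLHandle&) = delete;
    GLHandle& operator=(const GLHandle&) = delete;
    GLHandle(GLHandle&& other) noexcept
        : id_(other.id_), del_(std::move(other.del_)) {
        other.del_ = nullptr;              // moved-from handle no longer deletes
    }

    unsigned id() const { return id_; }
private:
    unsigned id_;
    Deleter del_;
};

// Stand-ins for real GL entry points, tracking live objects for the demo.
static int g_liveObjects = 0;
unsigned fakeGen() { ++g_liveObjects; return 42; }
void fakeDelete(unsigned) { --g_liveObjects; }
```

With a real context you would pass lambdas wrapping glGenBuffers/glDeleteBuffers instead; the wrapper itself stays API-agnostic.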

cignox1    735
I haven't worked on real time rendering for a while now, but unless something new happens in the next 12 months, DX will most probably be my choice when I return. I chose OpenGL because I didn't like the DX syntax (several years ago, DX 6 or 7 IIRC). Now they say that things are much different...

Josef Meixner    142
I am very disappointed with what OpenGL 3.0 has now been turned into, but for me the situation is actually easier now. My hobby project has to run on my laptop (not the newest; an ATI 7900 or something like that), so the originally intended OpenGL 3 would have been impossible (that old card doesn't even support 2.0, so 3.0 is also out of the question), and switching to the originally planned OpenGL 3.0 on another system would have made the project incompatible with OpenGL 1.5.

So in some twisted way the current situation is better technically, though I would have preferred it the originally planned way. I also develop on Linux, so there's no chance to switch to anything else.

Professionally I never had the option to switch to OpenGL 3.0, as I have to support Intel cards. So nothing changed there either.

_the_phantom_    11250
Quote:
Original post by bootstrap
I am staying with OpenGL. I have no choice, because I cannot allow my application to become dependent upon the evil empire. Either people forget how many times they have been shafted by macroshaft, or they haven't been developing software long enough yet to *repeatedly* find out and get angry enough to abandon THEM on principle - and out of self-preservation. I must admit, it took several times to sink into my lame brain.


As I said in the other thread: name them.
And let's not ignore the constant shafting the ARB has given OpenGL over the last 8 years either; you can't be selective in these things...

_neutrin0_    241
I know the situation with OpenGL 3.0 is not all that good, but as an OpenGL programmer it's not that difficult to shift to Direct3D. So even if you were planning to move to Direct3D now or sometime in the future, it's not that big a deal.

We have both renderers working for us, but it's a fact that the team is quickly losing interest in the OpenGL one. And that's been the case for some time now, even before this fiasco with 3.0. There have been several factors: OpenGL being slow while rendering to FBOs on certain hardware, the FBO mess with the ATI drivers, broken Intel drivers, the lack of debugging tools like PerfHUD, and many more. For our current project all the major renderer issues were in the OpenGL renderer as compared to DirectX, and they were all hardware and driver related.

Personally, being the most senior member of the team and a long time OpenGL user, I was the one pushing for the OpenGL renderer. Now, with this, it's just a slap in the face! So I am putting my tail between my legs and admitting that we will probably lean on the Direct3D renderer from now on. Though I am not ruling out OpenGL (which I might maintain separately in my personal time).

davepermen    1047
they lost me with the move to c#. for a while i just played with software rendering. then, when i thought "let's look at hardware rendering again", there came xna. i never looked back.

the new api would've been great - great for porting to other languages like c#, too. but as it stays where it is, it stays in c land, and gets less and less useful for any new project.

it's sad. i started with gl, and it was great (and still is, in concept). khronos is the worst thing that happened to gl (and it was in a terrible state before khronos came...)

