opengl

Quote:Original post by xZekex
That is a biased statement.
It's also true.
Quote:Original post by Brotocol
I just find the API very far fetched and not intuitive at all.


Far-fetched? I'd say that DX10 does a 1000% better job of presenting an API that resembles the way graphics hardware actually works.
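To put that in concrete terms, here's a rough sketch (my own illustration, not anyone's production code) of the D3D10 style: state is baked into an immutable object, validated once up front, and bound as a single unit. That maps far more directly onto how the hardware consumes state than a long series of individual glEnable/glDisable calls. The device pointer is assumed to be an already-initialized ID3D10Device; the calls themselves are the real API:

// Illustrative sketch: D3D10 immutable state objects.
// Assumes 'device' is an already-created ID3D10Device*.
#include <d3d10.h>

void SetupRasterizerState(ID3D10Device* device)
{
    D3D10_RASTERIZER_DESC desc = {};     // zero-init, then fill in what we care about
    desc.FillMode        = D3D10_FILL_SOLID;
    desc.CullMode        = D3D10_CULL_BACK;
    desc.DepthClipEnable = TRUE;

    // Validated once at creation time, not re-checked on every draw call...
    ID3D10RasterizerState* state = NULL;
    if (SUCCEEDED(device->CreateRasterizerState(&desc, &state)))
    {
        // ...and bound as a single atomic unit.
        device->RSSetState(state);
    }
}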

Quote:Original post by Demirug
The OpenGL ES for PS3 doesn't count, as it is not usable for real games.


I would have to disagree with that. I'm a programmer on a very real, very big game; most of the other games at my company use our engine, and the PS3 version of that engine is built on OpenGL ES. It works fine.
Ok, let's not have this get too heated [smile]

The "cleanliness" of code has come up a few times since I first raised it, but I think my point has been slight misunderstood compared with what I intended. Again, I state my bias and experience toward D3D first. It's the multiple path architecture that I've heard OpenGL developers discuss that I dislike. D3D9 isn't much better here but I do think D3D10 is a big step forwards with the fixed-caps design. The complexity of graphics code was (is?) too high - scaling on performance, on features, on driver stability etc... Again, from hearsay mostly but OpenGL seems to require many more paths if you want to truly utilize the hardware available.


Anyway.

What do people think of the Intel Larrabee and Tim Sweeney's latest comments? A fundamental move back to custom software rendering in the many-core world and the death of the GPU (plus OpenGL/D3D-style APIs), or just noise...?


Jack

Jack Hoxley [ Forum FAQ | Revised FAQ | MVP Profile | Developer Journal ]

Quote:Original post by BenMatlock
Quote:Original post by Demirug
The OpenGL ES for PS3 doesn't count, as it is not usable for real games.


I would have to disagree with that. I'm a programmer on a very real, very big game; most of the other games at my company use our engine, and the PS3 version of that engine is built on OpenGL ES. It works fine.


I don't deny that it works. But it eats far too much performance compared to the native graphics API. If you can live with that limitation, it's fine.

Quote:Original post by jollyjeffers

What do people think of the Intel Larrabee and Tim Sweeney's latest comments? A fundamental move back to custom software rendering in the many-core world and the death of the GPU (plus OpenGL/D3D-style APIs), or just noise...?



What Tim Sweeney has said is interesting, but I think at this point it's hard to tell what the landscape is going to look like in 5 years. How relevant is PC gaming going to be? Will Larrabee be a success or a failure? Will future ATI and Nvidia GPUs look more like Larrabee, or more like something else? What hardware will the consoles be using?

Even if Larrabee does take off big time, I'm not so sure everyone's going to be tripping over themselves to write a software rasterizer for it. I think if the D3D driver Intel provides is good, 99% of developers are going to want to use it to maintain compatibility with ATI and Nvidia. It might become more popular if, say, the next Xbox were to use Larrabee... but even then I'm not so sure many developers are going to want to work with something so specifically geared towards one platform over the others. Multi-platform releases have become increasingly popular, and I don't see that changing.
