TapewormTuna

Members
  • Content count: 8
  • Joined
  • Last visited

Community Reputation: 253 Neutral

About TapewormTuna
  • Rank: Newbie
  1. OpenGL What happened to Longs Peak?

    Late response, but to answer your question: Apple does not like open standards. I'm going to guess they had plans to build their own proprietary API like Microsoft did, and with the recent surge of low-level APIs they had an opportunity to do just that. I'm not sure what Blizzard could have possibly had against it.
  2. Java vs. C++ for my game?

    This is only partially true. You can decompile Java and get something very similar to the original source code. Obfuscation can help, but I don't know how well it works in practice (there are tools for de-obfuscating Java). C++ is, practically speaking, impossible to decompile. You can disassemble it, which could be enough for a cheater, but actually turning the executable back into working code is a huge task that I doubt any cheater would bother with.
  3. I'm wondering if it is possible (and, if so, whether there have been any attempts) to implement Vulkan on top of OpenGL. The reason being that it would allow the rendering engine to optimize for Vulkan while supporting older platforms that do not implement the API. All of the discussions and projects I've found are about implementing OpenGL on top of Vulkan (which seems like a silly thing to do) and not the other way around.
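     In case a sketch helps picture what such a layer would involve: the Vulkan-style entry points would mostly translate into equivalent GL calls plus bookkeeping. The types and names below are hypothetical simplifications (not from any real project); real Vulkan semantics such as command buffers, explicit synchronization, and memory heaps would be much harder to map.

        // Hypothetical shim: a Vulkan-flavoured buffer-creation call backed by OpenGL.
        #include <GL/glew.h>
        #include <cstdint>

        struct BufferCreateInfo { uint64_t size; };   // stand-in for VkBufferCreateInfo
        struct Buffer { GLuint glHandle; };           // stand-in for VkBuffer

        Buffer createBuffer(const BufferCreateInfo &info) {
            Buffer buf{};
            glGenBuffers(1, &buf.glHandle);
            glBindBuffer(GL_ARRAY_BUFFER, buf.glHandle);
            glBufferData(GL_ARRAY_BUFFER, (GLsizeiptr)info.size, nullptr, GL_DYNAMIC_DRAW);
            return buf;
        }

     Presumably the hard part would not be calls like this, but emulating Vulkan's threading and synchronization model on top of a GL driver that was never designed for it.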
  4. OpenGL What happened to Longs Peak?

    While it's good to see that many of these features have since been added to OpenGL, why did they scrap the new spec and wait years to implement them? The object model would have been a major change for the better; I don't understand why they thought it was a good idea not to include it. OpenGL's current object system feels like a bunch of hacks bolted onto a 25-year-old API.
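     To make that concrete, here is a small comparison, assuming a GL 4.5 context: classic GL edits whichever object happens to be bound to a global target, while the direct state access functions added in 4.5 (which cover part of what the Longs Peak object model proposed) address objects by handle.

        #include <GL/glew.h>   // assumes a GL 4.5 context is already current

        void createBuffers(GLsizeiptr size, const void *data) {
            // Classic bind-to-edit style: the call affects whatever buffer happens
            // to be bound to GL_ARRAY_BUFFER, so global state leaks between systems.
            GLuint vbo;
            glGenBuffers(1, &vbo);
            glBindBuffer(GL_ARRAY_BUFFER, vbo);
            glBufferData(GL_ARRAY_BUFFER, size, data, GL_STATIC_DRAW);

            // Direct state access (OpenGL 4.5): the object is addressed by handle,
            // closer to the per-object model Longs Peak described.
            GLuint vbo2;
            glCreateBuffers(1, &vbo2);
            glNamedBufferData(vbo2, size, data, GL_STATIC_DRAW);
        }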
  5. For the uninformed: back in ~2007 the Khronos Group planned a big revision to OpenGL called "Longs Peak", but killed it off in favor of what is now OpenGL 3.0. I read about it a while back and forgot about it, but I was recently reminded of it and I'm curious why it was killed off. There are still quite a few interesting articles and documents explaining the changes, such as this slide deck from GDC 2007: https://www.khronos.org/assets/uploads/developers/library/gdc_2007/OpenGL/A-peek-inside-OpenGL-Longs-Peak.pdf. What I found really interesting was that one of the stated reasons for changing the API was that OpenGL hadn't caught up with current hardware ("current" as of 2007, that is). Isn't that why Vulkan was created? I know the two APIs are very different, but how would Longs Peak have compared to the newer APIs? It seems like they had an idea for a wonderful new API but killed it before it was ever released. Why?
  6. This question has already been asked many times before. It probably won't matter which scripting language you choose; just pick whichever one works best for you. AngelScript is a very good library. The only downsides are that it's large, not very popular, and not particularly fast (though that isn't a problem 99% of the time).
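     For anyone curious what embedding it looks like, here is a minimal sketch of the usual AngelScript flow (engine, registered C++ function, module, context). The "game" module name, the print binding, and the inline script are just placeholders.

        #include <angelscript.h>
        #include <cstdio>

        void print(int x) { std::printf("%d\n", x); }   // C++ function exposed to scripts

        int main() {
            asIScriptEngine *engine = asCreateScriptEngine();
            engine->RegisterGlobalFunction("void print(int)", asFUNCTION(print), asCALL_CDECL);

            // Compile a script into a module, then call its main() through a context.
            asIScriptModule *mod = engine->GetModule("game", asGM_ALWAYS_CREATE);
            mod->AddScriptSection("script", "void main() { print(42); }");
            mod->Build();

            asIScriptContext *ctx = engine->CreateContext();
            ctx->Prepare(mod->GetFunctionByDecl("void main()"));
            ctx->Execute();

            ctx->Release();
            engine->ShutDownAndRelease();
        }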
  7. My rendering code is meant to be API-agnostic (I am actually using two different APIs). I have a VideoDriver class and a pure virtual HardwareBuffer class. Whenever I need a new HardwareBuffer, I create it via the VideoDriver. Internally, the VideoDriver implementation stores a vector of the concrete HardwareBuffer objects; when you create a new one, it pushes a new object onto the back of the vector and returns a pointer to the virtual base class. This worked until I changed some of the code: nvogl.dll would sporadically crash because it tried dereferencing a null pointer. I eventually figured out that it was caused by the creation of the buffers, but after probing various parts of the code I couldn't figure out why it wasn't working. I spent maybe 10 hours searching through my code trying to work out what could have possibly gone wrong. I eventually found that after a certain number of buffers were created, the vector would resize and every pointer it had previously returned became dangling (trashing the VBO handle). The fact that the memory addresses change after a reallocation had never crossed my mind. I think I've been using Java too much recently.
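     For anyone who hits the same thing, a stripped-down reproduction of the bug (HardwareBuffer here is just a stand-in struct, not the actual class):

        #include <cstdio>
        #include <vector>

        struct HardwareBuffer { unsigned vbo; };       // simplified stand-in

        int main() {
            std::vector<HardwareBuffer> buffers;
            buffers.reserve(2);

            buffers.push_back({1});
            HardwareBuffer *first = &buffers.back();   // pointer into the vector's storage

            // Growing past the current capacity reallocates: every element is moved
            // to new storage and 'first' silently becomes a dangling pointer.
            for (unsigned i = 2; i <= 8; ++i)
                buffers.push_back({i});

            std::printf("%u\n", first->vbo);           // undefined behaviour: dangling read
        }

     The usual fixes are storing std::unique_ptr<HardwareBuffer> (or using a pointer-stable container such as std::deque), or handing out indices/handles instead of raw pointers.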
  8. There are other libraries, such as GLFW and SFML, that are arguably a lot easier to use. The last time I used SDL2 I was left frustrated by how it needed to hijack main. SDL2 also doesn't have the same portability that SDL 1.2 did. What am I missing? Does it have some feature that none of the other libraries have? I'm assuming there's a reason it's popular, but what?
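     For context on the "hijack main" point: on several platforms SDL.h redefines main to SDL_main so SDL can run its own platform setup first. SDL2 does provide an opt-out if that bothers you; roughly:

        #define SDL_MAIN_HANDLED   // must come before SDL.h; disables the main() redefinition
        #include <SDL.h>

        int main(int argc, char *argv[]) {
            SDL_SetMainReady();                          // tell SDL we manage main() ourselves
            if (SDL_Init(SDL_INIT_VIDEO) != 0) return 1;
            /* ... create window, run loop ... */
            SDL_Quit();
            return 0;
        }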
  9. A hybrid system that uses each technique where it performs best makes sense, but it does sound like a pain to implement. Good to know.
  10. What I mean by hardware-accelerated is using a GPGPU API such as OpenGL's compute shaders. Correct me if I'm wrong, but from what I've heard, the only reason rasterization is faster is hardware optimization. So exactly how fast is ray tracing on modern GPUs? Would making a game with it be viable, or are we just not at that point yet?
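     To make the question concrete, this is roughly what the core of a compute-shader ray tracer looks like: one invocation per pixel, each tracing a primary ray (here against a single hard-coded sphere) and writing the shade to an image. A real tracer adds acceleration structures, materials, and many bounces, which is where rasterization's fixed-function hardware advantage shows up. The kernel below is only an illustrative sketch; compiling it and calling glDispatchCompute is omitted.

        // GLSL 4.30 compute kernel: shade one pixel by intersecting a primary ray
        // with a unit sphere at the origin, stored here as a C++ string constant.
        const char *kRayTraceCS = R"(
        #version 430
        layout(local_size_x = 8, local_size_y = 8) in;
        layout(rgba8, binding = 0) uniform writeonly image2D outImage;

        void main() {
            ivec2 pixel = ivec2(gl_GlobalInvocationID.xy);
            vec2  uv    = (vec2(pixel) + 0.5) / vec2(imageSize(outImage)) * 2.0 - 1.0;

            vec3 ro = vec3(0.0, 0.0, 3.0);         // camera origin
            vec3 rd = normalize(vec3(uv, -1.5));   // primary ray through this pixel

            // Analytic ray/sphere intersection.
            float b = dot(ro, rd);
            float h = b * b - (dot(ro, ro) - 1.0);

            vec3 color = vec3(0.1);                // background
            if (h >= 0.0) {
                vec3 n = normalize(ro + (-b - sqrt(h)) * rd);          // hit normal
                color  = vec3(max(dot(n, normalize(vec3(1.0))), 0.0)); // simple diffuse
            }
            imageStore(outImage, pixel, vec4(color, 1.0));
        }
        )";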