• Content count

  • Joined

  • Last visited

Community Reputation

144 Neutral

About chronozphere

  • Rank
  1. GLSL rounding errors?

    You are completely right! Thanks a lot. Now I know that I should not take such shortcuts in order to "optimize".
  2. GLSL rounding errors?

    Thanks for the reply. I made a very stupid mistake by writing aPosition instead of vPosition in the second code snippet.

    So the following gives artifacts:

        aPosition = uView * uModel * vPosition;
        gl_Position = uProjection * aPosition;

    The following works:

        aPosition = uView * uModel * vPosition;
        gl_Position = uProjection * uView * uModel * vPosition; // vPosition instead of aPosition

    It still seems strange to me that these two versions produce different results. About the depth testing: yes, I could have set it to GL_LEQUAL and left it there. Good point! Thanks
  3. Hey guys, I've got the following weird issue. I'm trying to implement multipass lighting and I've run into some sort of Z-fighting issue. I perform one ambient/diffuse pass with glDepthFunc(GL_LESS) and then an additive light pass with glDepthFunc(GL_EQUAL), with depth writes disabled. When I pan my camera around, I get huge flickering artifacts. The vertex shader for the light pass contains:

        uniform mat4 uModel;
        uniform mat4 uView;
        uniform mat4 uProjection;

        void main()
        {
            aPosition = uView * uModel * vPosition;
            gl_Position = uProjection * aPosition;
        }

     I pass aPosition to the fragment shader to do the lighting with. When I change this to:

        aPosition = uView * uModel * vPosition;
        gl_Position = uProjection * uView * uModel * vPosition;

     it suddenly works without flickering. What's going on? Does an assignment in between some math expressions create errors? Thanks a bunch!
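A quick way to see why the two shader variants can disagree: floating-point arithmetic is not associative, so computing the same product through a stored intermediate rounds differently than computing it in one expression. A minimal Java sketch of the effect (plain floats standing in for the matrix math; the variable names are illustrative, not from the shader):

```java
public class FloatAssociativity {
    public static void main(String[] args) {
        // The same three operands, grouped two ways. Each intermediate
        // result is rounded to float precision, so the groupings can
        // produce different final values -- just as gl_Position computed
        // via an intermediate aPosition in one pass can differ by a few
        // ULPs from the one-long-product version in another pass.
        float a = 1e20f, b = -1e20f, c = 1f;

        float grouped   = (a + b) + c; // (1e20 - 1e20) + 1  ->  1.0
        float regrouped = a + (b + c); // 1e20 + (-1e20 + 1) ->  0.0 (the 1 is lost)

        System.out.println(grouped == regrouped); // prints "false"
    }
}
```

With glDepthFunc(GL_EQUAL), the second pass only draws where its depth exactly matches the first pass, so even a one-ULP difference in gl_Position between the two shaders shows up as flickering; computing gl_Position with the exact same expression in both passes (or using GLSL's invariant qualifier on gl_Position) avoids it.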
  4. Can somebody explain to me how I can use the GL_NV_draw_buffers extension, either in Java or using native code? Thanks!
  5. Hi, I would like to use the GL_NV_draw_buffers extension, which should be available on all Tegra devices. I found a header file that declares it, but I'm not sure how to use it, as it is not part of Android. The official NDK doesn't contain a version of gl2ext.h that defines GL_NV_draw_buffers. I would like to know where to get the official headers as used by NVIDIA, and also how to use them properly. Should I put them in the android-ndk/platform directory somewhere? Thanks a bunch!
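Before worrying about headers, it's worth confirming at runtime that the extension is actually exposed: glGetString(GL_EXTENSIONS) returns a space-separated list of extension names. A small Java sketch, where hasExtension is a hypothetical helper (only the commented line touches real GL):

```java
public class ExtensionCheck {
    // Extension names are space-separated tokens; a plain contains()
    // could false-positive on a name that is a prefix of another name,
    // so match whole tokens instead.
    static boolean hasExtension(String extensionList, String name) {
        if (extensionList == null) return false;
        for (String token : extensionList.split(" ")) {
            if (token.equals(name)) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        // On Android you would pass the real list, e.g.:
        // String exts = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        String exts = "GL_OES_depth24 GL_NV_draw_buffers GL_OES_texture_npot";
        System.out.println(hasExtension(exts, "GL_NV_draw_buffers")); // prints "true"
    }
}
```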
  6. If you feel like optimizing, you could also store the normal using spherical coordinates (in R and G) and still have B left for other purposes. That would allow you to squeeze your image into an RGB format, saving 16 bits per pixel.
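The spherical-coordinate idea can be sketched like this in Java (encode/decode are hypothetical helper names; both angles are remapped to [0, 1] so they fit normalized texture channels):

```java
public class NormalEncoding {
    // Encode a unit normal (nx, ny, nz) as two values in [0, 1]:
    // theta = acos(nz) in [0, pi], phi = atan2(ny, nx) in [-pi, pi].
    static float[] encode(float nx, float ny, float nz) {
        float theta = (float) Math.acos(nz);      // polar angle
        float phi   = (float) Math.atan2(ny, nx); // azimuth
        return new float[] {
            theta / (float) Math.PI,                          // -> R channel
            (phi + (float) Math.PI) / (2f * (float) Math.PI)  // -> G channel
        };
    }

    // Rebuild the unit normal from the two stored channels.
    static float[] decode(float r, float g) {
        float theta = r * (float) Math.PI;
        float phi   = g * 2f * (float) Math.PI - (float) Math.PI;
        return new float[] {
            (float) (Math.sin(theta) * Math.cos(phi)),
            (float) (Math.sin(theta) * Math.sin(phi)),
            (float) Math.cos(theta)
        };
    }
}
```

The round trip reproduces the normal up to floating-point error; note that once the two channels are quantized to 8 bits each, precision drops, especially for the azimuth near the poles, so this is a trade-off rather than a free win.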
  7. Thanks for your helpful reply. I was only aware of GL_NV_draw_buffers, which didn't mention GL_MAX_COLOR_ATTACHMENTS in its spec. I've tested on my TF101 tablet, which has a Tegra 2 SoC. It reports that it has 8 color attachments. This is probably the same for Tegra 3.
  8. OpenGL glDelete*** on android/OpenGLES 2

    Thank you.
  9. Hi everybody, I would like to use FBOs on Android. I know it's possible because of GLES20.glGenFramebuffers() and the other related methods. However, I can't find GL_MAX_COLOR_ATTACHMENTS anywhere. Do I have to define it myself? Is it contained in some other class which I need to import? Thanks!
  10. Hey, do I always need to free every OpenGL resource using glDelete***, when programming for Android in Java? There is a garbage collector, but does it properly clean up these resources? Can it hurt to leave everything in VRAM? Thanks a bunch, Chronozphere
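On the glDelete question: the Java garbage collector only frees the Java-side handle values, not the GL objects they name, which live in the driver until you delete them or destroy the EGL context, so deleting explicitly is the safe habit. A minimal tracker sketch; the Deleter interface is an illustrative stand-in so the bookkeeping is testable off-device (on Android you'd plug in GLES20.glDeleteTextures and friends):

```java
public class GlResourceTracker {
    // Stand-in for the actual GL delete call, e.g.
    // (handle) -> GLES20.glDeleteTextures(1, new int[]{handle}, 0)
    interface Deleter { void delete(int handle); }

    private final java.util.List<Integer> handles = new java.util.ArrayList<>();
    private final Deleter deleter;

    GlResourceTracker(Deleter deleter) { this.deleter = deleter; }

    // Remember a handle right after glGen* returns it.
    int track(int handle) { handles.add(handle); return handle; }

    // Call when the surface/context is going away (e.g. in onPause),
    // while the GL context is still current.
    void releaseAll() {
        for (int h : handles) deleter.delete(h);
        handles.clear();
    }

    int count() { return handles.size(); }
}
```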
  11. Game loop of an event driven engine

    You are totally right. I'm thinking too much about what could go wrong. Instead, I should just try it and see what happens. :)
  12. Game loop of an event driven engine

    Ok, thanks! So there are no events leaking into the next frame? That is interesting, because I thought that could happen. Events will trigger other events in most cases, so there is a fair chance that there would be a long loop, which must fully execute before you proceed to the next frame. Is this "looping" scenario something to be afraid of? Looking forward to any new replies. :)
  13. Hey guys, I want to go down the event-driven/Observer-pattern path, but I need some advice now. All the entities in an event-driven engine continuously send events to each other, so when should I go to the next frame? Let me clarify my problem a bit more. Consider the following events:

     1. The user presses the CTRL key.
     2. InputEntity converts it to an ACTOR_FIRE event and sends it to the actors.
     3. The actor responds by sending a Raycast_callback event to the physics engine.
     4. The physics engine processes the event and sends Raycast_hit events to all the actors that were hit.
     5. All the actors that were hit, AND who are human, will die and send an ACTOR_DEAD event.
     6. etc., etc.

     Where should I stop and say "Hey, now we should render something and move on"? I will create a single-threaded engine (this may change in the future), so I must process the events of my subsystems in a specific order, following the steps of the previous scenario:

     > Process input (steps 1 and 2)
     > Process actors (step 3)
     > Process physics (step 4; step 5 will happen in the next frame)
     > Process rendering

     So some operations like raycasting may be split across multiple frames because they involve interaction between multiple subsystems. What kind of game loop do I need here? What kind of problems do I need to look out for? Thanks!
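One common answer to "when do I go to the next frame?" is a double-buffered event queue: events posted while handling this frame's events land in a second queue that is only drained next frame, so a chain like fire -> raycast -> die advances one hop per frame and can never loop forever within a single frame. A minimal Java sketch (EventPump, post, and pumpFrame are illustrative names, not from any particular engine):

```java
import java.util.ArrayDeque;
import java.util.Queue;
import java.util.function.Consumer;

public class EventPump {
    private Queue<String> current = new ArrayDeque<>();
    private Queue<String> next    = new ArrayDeque<>();

    // Events posted while processing land in `next`,
    // so a frame can never chase an endless event chain.
    public void post(String event) { next.add(event); }

    // Drain everything queued for this frame, then swap buffers.
    public void pumpFrame(Consumer<String> handler) {
        Queue<String> tmp = current;
        current = next;
        next = tmp;
        while (!current.isEmpty()) {
            handler.accept(current.poll());
        }
    }
}
```

With this scheme, each pumpFrame call advances the ACTOR_FIRE -> Raycast_hit -> ACTOR_DEAD chain by exactly one step, which matches the "step 5 will happen in the next frame" ordering in the post above; the trade-off is one frame of latency per hop in the chain.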
  14. Physics engine v.s Scene Graph

    Thanks. Those replies really helped me! So I should put the different representations of my entities in different data structures:

    > Render data structure: keeps objects in an efficient rendering order
    > Visibility data structure: for visibility queries
    > Actor data structure: keeps track of the game actors and updates them, etc.

    I'm now convinced that this separation is useful. Just stuffing everything in one big graph is very bad, as pointed out by the "scene graphs: just say no" article. However, this big shift raises new questions and challenges. For example, is it wise to represent each thing in the scene with one entity object, or should each have its own physics object, render object, visibility object, etc.? At this point, I know that I should ditch scene graphs. But what SHOULD I do? Are there any articles on modern engine design that discuss this? Thanks.
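The "one entity object vs. per-system objects" question is often answered with a component split: the entity is reduced to an identifier, and each subsystem keeps its own data structure keyed by that ID, ordered however suits that subsystem. A toy Java sketch of the idea (class and field names are mine; a real engine would use sorted arrays or spatial structures, not plain maps):

```java
import java.util.HashMap;
import java.util.Map;

public class ComponentStore {
    // The entity itself is nothing but an identifier.
    private int nextId = 0;
    public int createEntity() { return nextId++; }

    // Each subsystem owns its own structure; two toy examples.
    // A renderer would keep these sorted for batching,
    // a physics engine would keep bodies in its own world.
    public final Map<Integer, String>  renderables = new HashMap<>(); // e.g. mesh name
    public final Map<Integer, float[]> bodies      = new HashMap<>(); // e.g. position
}
```

An entity that should be drawn but never simulated simply has an entry in renderables and none in bodies, so no subsystem pays for data it doesn't use.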
  15. Physics engine v.s Scene Graph

    That is indeed a good question. ;) First of all, if you have multiple rooms, each with their own entity, you can have rigid body objects in those rooms. You could also build a house out of brick objects. However, you COULD also define all of these at the root level of your graph (which is absolutely not elegant if you ask me). And joints are another good reason to have a parent-child relationship in your scene graph. That makes sense to me: having my objects organized in a tree is a good way to organize them. If I actually want to build a house out of bricks and simulate it in my game, I would consider defining the brick entities as children of the house entity. By doing so, all the other mechanisms (most notably visibility determination) will still work as they should. If I just dump every physics object at the root level, I simply lose all the benefits of having a tree of objects. So the best solution would be to make some modifications to my entity class to allow physics code to change the world matrix. I'm still stuck on the problem of how I should manage my transformations in the tree to support all this. Any input would be highly appreciated!
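One common way to manage the transforms here: each entity keeps a local transform relative to its parent, the world transform is derived by composition, and a physics-driven entity instead receives its world transform from the simulation and back-solves its local one, so children still follow. A sketch using translation-only transforms instead of full 4x4 matrices to keep the logic visible (the class and method names are illustrative, not from any engine):

```java
public class Entity {
    // Translation-only "matrices" keep the composition readable;
    // a real engine would multiply and invert 4x4 matrices instead.
    Entity parent;
    float localX, localY;   // relative to parent
    float worldX, worldY;   // derived, or imposed by physics

    // Normal path: world = parent.world composed with local.
    void updateFromHierarchy() {
        float px = (parent != null) ? parent.worldX : 0f;
        float py = (parent != null) ? parent.worldY : 0f;
        worldX = px + localX;
        worldY = py + localY;
    }

    // Physics path: the simulation dictates the world transform,
    // and we back-solve local = inverse(parent.world) composed with world,
    // so the tree stays consistent for children and visibility queries.
    void setWorldFromPhysics(float wx, float wy) {
        worldX = wx;
        worldY = wy;
        float px = (parent != null) ? parent.worldX : 0f;
        float py = (parent != null) ? parent.worldY : 0f;
        localX = wx - px;
        localY = wy - py;
    }
}
```

So the bricks can stay children of the house entity: when the physics engine moves a brick, setWorldFromPhysics keeps its local transform in sync, and re-deriving the world transform from the hierarchy gives the same position back.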