About lawnjelly

  • Rank
    Advanced Member

  1. If you have a problematic game, another option is to try running it under Wine on Linux:
  2. Sorry, I should have worded that better; it is a good alternative solution.
  3. It is an established solution, and it's easy to demonstrate working: just code it up. The transform isn't necessary because you are simply interpolating a triangle in both cases. However, what may be causing the confusion is that, strictly speaking, the texture mapping used in general 3d rendering is not 'physically correct' in the way you are seeing it. If you use a fish-eye projection for a camera and draw a triangle, in theory the texture should also be distorted, but if you render it with standard 3d games hardware it will not be. Only the vertices go through the transform matrix; the rasterizer then interpolates attributes across the flat triangle (perspective-correct for a standard projection matrix, but it never re-applies a nonlinear camera projection per fragment). This may not be the case in a ray tracer. So you are actually right, in a way, I think.
  4. I had a little trouble conceptualizing this; I kind of see where you are going, but am not convinced it would work. You could try it out though: code it up and compare with the barycentric solution?
  5. You can do something like that, but it is doing essentially exactly the same as the very first suggestion (using barycentric coordinates), except in an extremely roundabout fashion (going on a roundabout trip via the GPU). Of course, it depends on the actual use case: whether the conversion is rare or needed as a realtime lookup. There are many cases where having a UV -> 3d mapping for the entire texture is more useful than e.g. applying the barycentric method per point, and using the GPU is one way to create this. In my own use cases I've been fine using the CPU to calculate this conversion texture; however, if you needed to recalculate it on a per-frame basis, the GPU might be an option, bearing in mind the potential for pipeline stalls if you have to read it back.
  6. Well, not quite true (I do inverse UV calculations in 3d paint in quite a few places, for instance; it might also be used in lightmapping, ray tracing etc.), but it is probably rarely done by developers relying on 3rd party engines and tools to do this sort of thing for them. To elaborate a little: if the OP is talking about the very specific case of using a uniform transform to get all 3d verts into UV space (such as matching a viewport transform, e.g. 'project from view' in Blender), rather than the more usual UV unwrapping or complex projections, then in theory to go from UV space back to 3d space you would simply use the reverse transform (e.g. the inverse matrix). However, in practice with a UV coordinate you have usually lost the 'depth' value (a degree of freedom, I think this is called?), so even if you got back to the viewport x, y, you would have lost the depth information and thus the 3d position. I think there might also be issues once you are not using a simple orthographic projection (consider a fish-eye lens and trying to extrapolate barycentric coordinates outside a central triangle; the relationship between UV and 3d space would break down, I think). If you still had the 3d -> UV matrix and the UV depth value, then in some cases it may be possible to go directly back to a 3d vertex (don't quote me on that, I might be missing something obvious lol ). It is similar to any matrix transform in that respect.
  7. This is essentially correct. The 3d position of a particular UV coordinate only makes sense in terms of the triangle (or triangles) that encompass that point. If you pick a UV coordinate outside of the triangles for instance, it doesn't really represent any point in 3d space.
  8. Copying and pasting from examples can be okay, but afterwards you should usually go through the code to make sure you fully understand what it is doing (of course, we all copy and paste code sometimes! ). Yes, when correct, the code should work whatever the speed of the projectile. Here are some tutorials on 2d vector math for games; they will explain it far better than I can, e.g. The code that alvaro posted is basically how to normalize a vector, although you should usually check for a zero-length vector to prevent a divide by zero, which will cause an error. Normalizing (aka unitizing) a vector takes a vector in any direction (except a zero-length vector, because its direction is undefined) and resizes it so that the length in that direction is 1.0. It is a very common operation in games.
  9. Part of the problem for us is that your code as shown includes too much irrelevant stuff to easily work out what is happening, or intended (for me, at least! ), and your explanation of what is intended is not clear. If you are reading the mouse x, y each loop and using the bullet start position each loop to determine the angle, you will get problems. Are you trying to have a bullet that has a constant direction? Or are you trying to create a homing missile? For a constant direction you could for example RECORD the target position when fired, then calculate the move towards this target each loop, e.g.

```cpp
// calculate offset from bullet to target
// (where Vector2 is using floats rather than ints)
Vector2 ptOffset = ptTarget - ptBullet;

// make offset length 1
ptOffset.Normalize();

// make length of offset equal to desired move length
ptOffset *= fMoveLength;

// probably have some logic for reaching target
// ....

// move the bullet towards target
ptBullet += ptOffset;
```

A similar approach to this might also work for a moving target (but usually you'd use something a bit more complex to limit bullet direction changes etc.). You could also, probably more appropriately, use this kind of thing for a constant-direction bullet:

```cpp
class CBullet
{
public:
    Vector2 m_ptPos;
    Vector2 m_ptVelocity;

    void Fire(const Vector2 &ptStart, const Vector2 &ptTarget)
    {
        m_ptPos = ptStart;
        m_ptVelocity = ptTarget - ptStart;
        m_ptVelocity.Normalize();
        m_ptVelocity *= SPEED;
    }

    void Update()
    {
        m_ptPos += m_ptVelocity;
    }
};
```

It seems like you are using a different x, y target position each time (more like a homing missile):

```cpp
SDL_GetMouseState(&x, &y);
```

In which case any direction for a bullet move should be based on the current bullet position, not the bullet start position.
  10. It just depends how accurate you want to go for. You can estimate a rough centre of mass for each bone by using something like this:

```cpp
// for through bones
for (int b = 0; b < 32; b++)
{
    Vertex3 ptAverage(0, 0, 0);
    float fTotalWeight = 0.0f;

    // for through verts
    for (int n = 0; n < nVerts; n++)
    {
        // for through weights
        for (int w = 0; w < 4; w++)
        {
            if (vert[n].boneID[w] == b)
            {
                ptAverage += vert[n].pos * vert[n].boneWeight[w];
                fTotalWeight += vert[n].boneWeight[w];
            } // if
        } // for w
    } // for n

    // guard against bones with no weighted verts (divide by zero)
    if (fTotalWeight > 0.0f)
    {
        ptAverage /= fTotalWeight;
        print("AveragePosition of bone " + b + " is " + ptAverage);
    }
} // for b
```

You can also estimate the orientations of the bones by e.g. making a guess at an oriented bounding box, guesstimating some joint positions, etc.
  11. Just a quick guess (I haven't properly read your code), but it sounds like it might be an integer rounding issue. If your destination is e.g. at 190, 100 and you move 1/100th of the way each time, converted to integers you might either move 1, 1 pixels at a time or 2, 1 (depending on how you round), so you would end up at 100, 100 or 200, 100 respectively. If you move faster it will be more accurate (e.g. 19, 10 per move); if slower, less accurate.
  12. lawnjelly

    Playing with water physics

    Haha yep!! I haven't decided on game type yet, there's a few hungry shark games that do well lol. Can't decide whether to do something modern or ye olde with sailing. I guess I can do both.
  13. lawnjelly

    Playing with water physics

    While I've been mostly busy this past month with moving house, I have been making a few models and experimenting with some water physics in Godot. When doing Frogger I got thinking about having a section with the frog driving a boat, but didn't really have time. Anyway, as far as I can see there is no built-in support for water in Godot using Bullet or the internal physics engine, but I figured it couldn't be too hard to hack something together.

    Fixed tick rate physics

    First things first, I figured out a good way of getting physics to work in combination with a fixed tick rate and interpolation. I will probably write a separate post on this, but I was successful in this case by completely separating the physics representation from the rendered object representation (the 3d model etc.) in the scene graph. This ensured that any interpolation code on the rendering side did not interfere with the physics.

    Buoyancy

    The next step was to turn off gravity and add some kind of manual forces to keep the objects around the waterline. It turns out that you have to be very careful how and when you apply manual forces on each tick, so as not to disrupt the simulation. In Godot I achieved this by using the _integrate_forces function, NOT _physics_process. Rather confusingly, I found that adding forces (particularly angular) within _physics_process borked the simulation; _integrate_forces, on the other hand, gives you access to the actual physics state. My current code to keep objects at the surface is rather simple: it calculates the y offset (y is up in Godot) of the object from the surface, and applies an impulse in proportion towards sea level. I actually increase this above water for some kind of gravity, but I'm sure I will come to a more robust solution for this, such as turning on real gravity when significantly above sea level.

    Righting

    This works reasonably well; however, there is a problem: boats etc. roll around in the water and don't stay upright.
    Clearly there needs to be some way of 'righting' objects. I posted a thread to get advice on this; however, in the end, rather than using a complex and CPU-intensive technique to calculate proper buoyancy, I have for now opted for a rather simple hack. I essentially calculate how far an imaginary mast is from pointing vertically up, and calculate an angular velocity towards this. The details are in the thread. This seems to work pretty well, at least in the flat-ocean case. I may have to either modify this or opt for a more complex technique for large waves.

    Waves

    It was pretty easy to put in some keyboard control to move a boat or similar about; the next step was to see if I could get some kind of waves in the ocean. It turns out Godot has a simplified method of writing multiplatform shaders, and it was pretty easy to follow a tutorial and get a vertex shader with some moving waves. This was already looking better, but I wanted the boats etc. to move with the waves. As there is no way of reading back the results from the shader, I essentially duplicated the shader wave-height function in the main code, did some hackery for scaling, and ended up with a function where I could calculate the wave height in the game at any location, matching the shader. My first thought was to simply plug the wave height into the existing buoyancy code; however, probably due to a delay between the impulse being applied and it having a decent effect, the objects ended up bobbing out of sync with the waves. Due to the velocity being applied, the objects tended to continually overshoot the surface. Really I just wanted the objects to match the surface with loads of friction, so I realised that as a hack I could apply the wave height to the rendered representation, and not apply it to the physics. I tried this out and it seems to basically work (for now). The physics works on a flat ocean, and the waves are added afterwards to what you see.
    Obviously this is not ideal: there will be collisions in the physics where there would not be in reality due to the wave heights, there is no momentum due to the waves, and so on. This is all a work in progress, but these are the results so far, and I think it is promising.

    Large Worlds

    Soon I want to experiment with a large world. For a boating game I'd like to be able to travel a long way at speed, and this may cause precision issues if using single-precision floats. So I'll see if I can have some kind of scheme where I teleport the player and objects once they move a certain distance from the origin, while keeping the physics and waves consistent. Another alternative is keeping the player at the origin and moving the objects relative to the player; however, that might be a nightmare to interface with the physics engine.
  14. Is this for a game engine driven by a scripting language, like Godot or Unity? The onus is usually on the user not to write infinite loops or performance-sapping code. You can easily check for this with something simple like:

```cpp
bool bQuit = false;
while (!bQuit)
{
    // run a batch of instructions between (possibly expensive) timer checks
    for (int n = 0; n < 10; n++)
    {
        bQuit = RunVMInstruction();
        if (bQuit)
            break;
    }

    if (TimeTooLong() || TooManyInstructions())
        bQuit = true;
}
```

If a timer check is expensive, surely you can move it out of the inner loop? Perhaps there is not enough information to properly understand why this is a problem, as Wessam says. Afaik, event-based timers involving the OS are often wildly inaccurate (+/- many milliseconds), so you are usually better off doing this kind of thing yourself. Even if calling your OS get-time function really were that expensive (even outside a loop), you could also just roughly calibrate the number of VM instructions that run in a given time, and use the instruction counter alone.