Cappah
Member

  • Content Count: 5
  • Joined
  • Last visited
  • Community Reputation: 175 Neutral
  • Rank: Newbie
1. Ok, let me clarify, because it seems the other answers focused on the first half but not the last half of what I said. I have already calculated the angle between points B and C. That's not the problem whatsoever, so that part is already solved.

The problem (tested in my program) is that the yaw of the camera is defined in world space, with z = north (0), -z = south (180), x = east (90), and -x = west (270). If the camera were statically placed at (0, y, 0) I would already have been done after the first half (since I'm only checking the horizontal plane), but that is not the case: the camera can be anywhere in the world, in any of the four quadrants. That makes the camera's yaw problematic, as I'll explain next.

Given the same image, if you move the diagram into Quadrant I, II, III, or IV, the world coordinates remain valid in relation to the camera's orientation, and the angle between points B and C matches that of the world.

But here is where everything fails: rotate the direction of the camera and point B by, for example, 180 degrees, but keep point C in the same spot. Point C will still be at 270 degrees in relation to the world, but in relation to the camera's direction (point B is always 0 degrees), point C is at 90 degrees.

So the problem is not the angle between points B and C. The problem is defining degrees relative to the direction the camera is facing, which is always 0 degrees no matter which way it rotates. If that isn't clear, the problem is no different from getting proper orientation in real life: if you are facing the world's north and your dog is standing to your left, then the dog is to your left, and in reference to the world the dog is also west of you. But if you turn 90 degrees counterclockwise, the dog is still west of you — is the dog still to your left? No. The dog is now in front of you.
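The dog example can be sketched directly: take the target's world bearing, subtract the camera's world-space yaw, and wrap the result. (Python here just to illustrate the math — the thread is about XNA/C#, and all function names below are mine, not from any library.)

```python
import math

def world_bearing(from_pos, to_pos):
    """Clockwise bearing in degrees using the post's convention:
    +z = north (0), +x = east (90). Positions are (x, z) pairs
    on the horizontal plane."""
    dx = to_pos[0] - from_pos[0]
    dz = to_pos[1] - from_pos[1]
    return math.degrees(math.atan2(dx, dz)) % 360.0

def relative_bearing(camera_yaw_deg, world_bearing_deg):
    """Angle of the target measured clockwise from the camera's facing
    direction, which is always treated as 0 degrees."""
    return (world_bearing_deg - camera_yaw_deg) % 360.0

# The dog example: you at the origin, the dog due west of you.
me, dog = (0.0, 0.0), (-10.0, 0.0)
bearing = world_bearing(me, dog)           # ~270 (west), regardless of facing
print(relative_bearing(0.0, bearing))      # facing north -> ~270, to your left
print(relative_bearing(270.0, bearing))    # facing west  -> ~0, directly ahead
```

The world bearing of the dog never changes; only the subtraction of the camera's yaw turns "west of you" into "to your left" or "in front of you".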
So, roughly: just as the world has its own Cartesian coordinate system, the camera needs an independent version based on the direction the camera is pointing.

P.S. Not all programmers read/write math notation, and I don't appreciate the condescending response. If I gave you a sentence written in Farsi script, should you just know how to say it, or should it be written phonetically so you can say it and use it? The same goes for programming. A lot of people can't read mathematical notation, but they can program the same equations just fine when written in a form they understand, such as when you said "the angle is equal to the inverse cosine of the dot product of two normalized vectors."

Quoted reply: Measure the angle from the world to C. Then measure to B. Then subtract B from C. If C is 180 and B is 45, then the number of degrees between B and C is 135. Also, you should be working with radians, not degrees; I suggest you study https://en.wikipedia.org/wiki/Unit_circle. If the answer is still off base, there exists some other issue — from your description it is difficult to tell what you want to solve. "But here is where everything fails: rotate the direction of the camera and point B, for example, 180 degrees, but keep point C in the same spot. Point C will still be 270 degrees in relation to the world, but in relation to the camera's direction (point B always = 0 degrees), point C = 90 degrees." So what do you want C degrees to be? Is 90 wrong for your use case? I apologize, I did not mean to come off condescending. If you hadn't downvoted, I would not have sounded so harsh.

My response: If you add 180 degrees to the camera direction/point B, it will be at 225 degrees in relation to the world. The angle between B and C will still be 90 degrees, but point C in relation to point B will be 90 degrees (to the right of the camera, based on the direction it is pointing).
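The subtraction described in the quoted reply works as long as the result is wrapped back into [0, 360); a minimal sketch using the numbers from the exchange (the function name is mine):

```python
def angle_between(b_deg, c_deg):
    """Clockwise degrees from B to C, wrapped into [0, 360)
    so the result is well-defined even when B > C."""
    return (c_deg - b_deg) % 360.0

print(angle_between(45.0, 180.0))    # 135.0 -- the reply's example
print(angle_between(225.0, 270.0))   # 45.0  -- after adding 180 to B = 45
print(angle_between(270.0, 45.0))    # 135.0 -- wraps past 0 correctly
```

Python's `%` already returns a non-negative result for a positive modulus, so the wrap needs no extra branching; in C# you would need `((c - b) % 360 + 360) % 360`.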
To make the explanation as simple as possible, relate the camera to yourself and world space to the Earth. The cardinal directions never change, but you do, depending on the direction you are facing. The direction you face is always 0 degrees. So no matter which cardinal direction you face in the world — N, NW, NE, S, SW, SE — if an object is directly to your right, it is 90 degrees to your right; if an object is behind you, it is 180 degrees behind you, regardless of your facing.

You've seen this enacted in games, movies, and real life. It's applied in navigation markers (arrows pointing offscreen to orient you toward your destination), or, most notably, 3D sound in a game: if an object blows up to the left of you, you take the angle of the explosion's position relative to the direction you are facing so that you can properly pan the sound in your speakers/headphones to the left, right, or center.

So point C's angle is measured clockwise from 0, the camera's direction. This is why the camera's facing direction must always remain 0. If it went by world space, the camera's direction would be arbitrary and would give false values depending on where the camera and point C are positioned in the world.

No apology needed. It's all gravy.
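The 3D-sound idea above can be sketched the same way: once you have the object's angle measured clockwise from the facing direction, a sine maps it to a stereo pan. (A sketch of the mapping only, not any particular audio API; the function name is mine.)

```python
import math

def pan_from_relative_angle(rel_deg):
    """Map a clockwise angle from the camera's facing (0 = straight ahead)
    to a stereo pan in [-1, 1]: -1 = full left, 0 = center, +1 = full right.
    sin() puts 90 (right) at +1, 270 (left) at -1, and both 0 (ahead)
    and 180 (behind) at center."""
    return math.sin(math.radians(rel_deg))

print(pan_from_relative_angle(0.0))     # 0.0, centered (ahead)
print(pan_from_relative_angle(90.0))    # ~1.0, full right
print(pan_from_relative_angle(270.0))   # ~-1.0, full left
```

Note that ahead and behind both pan to center — that ambiguity is inherent to plain stereo panning, which is exactly why the relative angle, not the world bearing, is the input.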
3. Sorry, but I'm not a mathematician, so I can't read math notation. Plus, you didn't explain how this applies.
4. What I'm trying to achieve is to get the angle of an object in relation to the direction the camera is facing. From the image I've provided: Point A is the camera position; Point B is a reference point that is always directly in front of the camera's view regardless of rotation; Point C is the object in question.

Calculating the angle between points B and C is fairly easy, but the problem is that the yaw of my camera (in degrees) is defined in world space, not in view (camera) space. This becomes problematic depending on which Cartesian quadrant the camera position currently resides in.

How do I define degrees based on the camera's direction, independent of world space?
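One way to get that camera-relative angle without ever touching the world-space yaw is to measure the signed angle between the A→B and A→C vectors directly, via atan2 of their cross and dot products. (A Python sketch of the math on the horizontal (x, z) plane; the names are mine, not XNA's.)

```python
import math

def relative_angle_deg(cam_pos, forward_point, obj_pos):
    """Clockwise angle in [0, 360) of obj_pos around cam_pos, measured
    from the camera's facing direction on the horizontal (x, z) plane.
    cam_pos = A, forward_point = B (always straight ahead), obj_pos = C."""
    fx, fz = forward_point[0] - cam_pos[0], forward_point[1] - cam_pos[1]
    tx, tz = obj_pos[0] - cam_pos[0], obj_pos[1] - cam_pos[1]
    # atan2(cross, dot) is the signed angle between the two vectors,
    # independent of where the camera sits in the world.
    signed = math.degrees(math.atan2(fx * tz - fz * tx, fx * tx + fz * tz))
    # Flip the sign so clockwise (to the camera's right) is positive.
    return (-signed) % 360.0

a = (5.0, 5.0)                                  # camera anywhere in the world
print(relative_angle_deg(a, (5.0, 6.0), (10.0, 5.0)))  # facing north, C east -> ~90
print(relative_angle_deg(a, (5.0, 4.0), (10.0, 5.0)))  # facing south, same C -> ~270
```

Because only the A→B and A→C difference vectors enter the formula, the quadrant the camera sits in is irrelevant, and B is 0 degrees by construction.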
5. I'm building a level editor and I've been stuck on getting the mouse to work properly. I've managed to display a model at the Vector3.Zero coordinates with my camera set at Vector3(0, 50, 150).

Keyboard input works as it should as an independent free camera. My end goal is to get the camera to freely move and rotate around a scene. The problem comes when I try to pitch and yaw the camera with the mouse. I've set the camera up to rotate in whatever direction the mouse moves while the right mouse button is held down, but the second I click the mouse button, my model disappears. I tried a bigger model (terrain this time), with the same result. From my code, does anyone see the problem?

Attached are the free camera and the custom panel for displaying and updating.

**NOTE** The Draw method in the panel class acts in place of Draw and Update in a regular XNA project (for those who don't know about XNA within WinForms).
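Without the attachment this is only a guess, but a model vanishing the instant the button goes down is very often caused by feeding the absolute mouse position (rather than a per-frame delta) into the rotation on the first frame of the drag, producing one enormous rotation. A minimal, engine-agnostic sketch of the usual structure — all names here are my own, not XNA's:

```python
import math

class FreeCamera:
    def __init__(self):
        self.yaw = 0.0          # radians, rotation about the world Y axis
        self.pitch = 0.0        # radians, clamped so the view can't flip over
        self.last_mouse = None  # None until the right button is held

    def on_mouse(self, x, y, right_button_down, sensitivity=0.005):
        if not right_button_down:
            self.last_mouse = None     # forget stale position on release
            return
        if self.last_mouse is None:
            self.last_mouse = (x, y)   # first frame after the click:
            return                     # delta must be 0, NOT (x, y)
        dx, dy = x - self.last_mouse[0], y - self.last_mouse[1]
        self.last_mouse = (x, y)
        self.yaw += dx * sensitivity
        limit = math.radians(89.0)     # clamp pitch short of straight up/down
        self.pitch = max(-limit, min(limit, self.pitch - dy * sensitivity))

cam = FreeCamera()
cam.on_mouse(400, 300, True)   # click: records position, no rotation yet
cam.on_mouse(420, 300, True)   # drag 20 px right
print(round(cam.yaw, 3))       # 0.1
```

The two things to check in the attached code are therefore: that the first frame of the drag contributes a zero delta, and that pitch is clamped (an unclamped pitch that passes ±90 degrees can also flip the view matrix and make the scene seem to disappear).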