Everything posted by Naruto-kun

  1. Naruto-kun

    Reverse transformation

    Bottom line, I have narrowed the fault down to this piece of code:

        double yr = (rate[0] * sb + rate[1] * cb) * (1.0 / cp);

    This is supposed to be the horizon-relative yaw rate, formed by combining the body pitch rate (rate[0]) and the body yaw rate (rate[1]). To get the proper value, I need to establish what relationship rate[1] has with pitch and heading, since those two combined are what push it off the expected value.
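    A hedged observation on the post above: the pr and rr lines in the earlier code match the first two rows of the body-to-level (horizon) rotation, i.e. the transpose of the pitch/bank matrix, but the yr line is the Euler yaw-rate formula, which divides by cos(pitch) instead of multiplying by it. Assuming the same rate[] indexing and sign conventions as the two lines that already work, the internally consistent third component would look like the sketch below. This is a candidate to verify against the simulation, not a confirmed fix.

        // Sketch: level-frame (horizon-relative) components of the sensed body rates.
        // Assumed indexing, as in the earlier post: rate[0] = pitch gyro, rate[1] = yaw gyro,
        // rate[2] = roll gyro; cb/sb = cos/sin(bank), cp/sp = cos/sin(pitch).
        double pr = rate[0] * cb - rate[1] * sb;                          // level pitch rate (unchanged)
        double rr = rate[2] * cp + rate[1] * sp * cb + rate[0] * sp * sb; // level roll rate (unchanged)
        // Third row of the same transpose: multiply by cos(pitch) and subtract a roll-gyro
        // term, rather than dividing by cos(pitch).
        double yr = -rate[2] * sp + (rate[0] * sb + rate[1] * cb) * cp;   // level yaw rate (candidate)

    With pitch = 0 this reduces to the original expression, which would be consistent with the latitude output only being correct in that case.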
  2. Naruto-kun

    Reverse transformation

    Hi guys

    Back again with reverse transformations for my RLG INS simulation. In the above image, an aircraft is depicted by the set of axes on the left, at 29.54233° latitude on the earth, with a heading of 150.12° and a pitch and bank of 0. The RLGs detect the earth's rotation rate of 15°/hr, and the P/R/Y rate values indicate the earth rate sensed by each RLG in its respective axis. By reversing the pitch and roll transformation, and then taking the atan2 of the resulting pitch and roll rates, I can calculate the aircraft's true heading (the CH value; ignore the difference between it and the actual heading, which is due to a normally distributed error applied to the calculation to simulate gyro inaccuracies). In addition, by taking the arcsine of the yaw rate divided by the earth rate, I can calculate the aircraft's latitude. This works fine when pitch is 0, but quickly goes haywire if I change pitch. I know my other transforms are correct because the calculated heading is always consistent with the actual heading. Could someone spot the error in my reverse transformation for the yaw rate in the code below?

    Thanks
    JB

        double cp = cos(D2A(att1[0])), sp = sin(D2A(att1[0])); //pitch
        double cb = cos(D2A(att1[1])), sb = sin(D2A(att1[1])); //bank
        double ch = cos(D2A(att1[2])), sh = sin(D2A(att1[2])); //heading
        double clat = cos(D2A(lat - 90.0)), slat = sin(D2A(lat - 90.0));

        double mat1[3][3];
        double mat2[3][3];

        mat1[0][0] = cp*cb;            mat1[1][0] = cp*sb;            mat1[2][0] = -sp;
        mat1[0][1] = sh*sp*cb - ch*sb; mat1[1][1] = sh*sp*sb + ch*cb; mat1[2][1] = sh*cp;
        mat1[0][2] = ch*sp*cb + sh*sb; mat1[1][2] = ch*sp*sb - sh*cb; mat1[2][2] = ch*cp;

        mat2[0][0] = clat; mat2[1][0] = 0; mat2[2][0] = -slat;
        mat2[0][1] = 0;    mat2[1][1] = 0; mat2[2][1] = 0;
        mat2[0][2] = 0;    mat2[1][2] = 0; mat2[2][2] = 0;

        //Transform pitch/roll/yaw matrix by latitude matrix
        double mat3[3][3] = {
            mat1[0][0]*mat2[0][0] + mat1[0][1]*mat2[1][0] + mat1[0][2]*mat2[2][0],
            mat1[0][0]*mat2[0][1] + mat1[0][1]*mat2[1][1] + mat1[0][2]*mat2[2][1],
            mat1[0][0]*mat2[0][2] + mat1[0][1]*mat2[1][2] + mat1[0][2]*mat2[2][2],
            mat1[1][0]*mat2[0][0] + mat1[1][1]*mat2[1][0] + mat1[1][2]*mat2[2][0],
            mat1[1][0]*mat2[0][1] + mat1[1][1]*mat2[1][1] + mat1[1][2]*mat2[2][1],
            mat1[1][0]*mat2[0][2] + mat1[1][1]*mat2[1][2] + mat1[1][2]*mat2[2][2],
            mat1[2][0]*mat2[0][0] + mat1[2][1]*mat2[1][0] + mat1[2][2]*mat2[2][0],
            mat1[2][0]*mat2[0][1] + mat1[2][1]*mat2[1][1] + mat1[2][2]*mat2[2][1],
            mat1[2][0]*mat2[0][2] + mat1[2][1]*mat2[1][2] + mat1[2][2]*mat2[2][2]
        };

        //Transform earth rate by pitch/roll/yaw/heading matrix
        double rate[3] = {
            15.0 * mat3[1][0] + 15.0 * mat3[1][1] + 15.0 * mat3[1][2], //Pitch rate
            15.0 * mat3[0][0] + 15.0 * mat3[0][1] + 15.0 * mat3[0][2], //Yaw rate
            15.0 * mat3[2][0] + 15.0 * mat3[2][1] + 15.0 * mat3[2][2], //Roll rate
        };

        double pr = rate[0] * cb - rate[1] * sb;                      //This one is fine
        double rr = rate[2] * cp + rate[1] * sp*cb + rate[0] * sb*sp; //This one is fine
        double yr = (rate[0] * sb + rate[1] * cb) * (1.0/cp);         //This one is faulty. Any ideas?

        //Calculate latitude from yaw rate
        double _calclat = A2D(asin(yr / 15.0));
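    A side note on the code above: the nine hand-expanded entries of mat3, and the summed terms in rate[], are plain 3x3 matrix-matrix and matrix-vector products. A small pair of helpers makes it harder to transpose an index by accident. This is only a sketch and assumes row-major [row][col] indexing, which may not match the indexing convention used above:

        // Sketch: generic 3x3 products, assuming row-major [row][col] indexing.
        void matMul3(const double a[3][3], const double b[3][3], double out[3][3])
        {
            for (int r = 0; r < 3; ++r)
                for (int c = 0; c < 3; ++c)
                    out[r][c] = a[r][0]*b[0][c] + a[r][1]*b[1][c] + a[r][2]*b[2][c];
        }

        void matVec3(const double m[3][3], const double v[3], double out[3])
        {
            for (int r = 0; r < 3; ++r)
                out[r] = m[r][0]*v[0] + m[r][1]*v[1] + m[r][2]*v[2];
        }

    The hand-expanded sums can then be written as calls to these helpers, which makes index mistakes much easier to spot.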
  3. Naruto-kun

    Reverse transformation

    I will explain the process again:

    1: The purpose here is to simulate the earth rotation rate that a set of 3 mutually orthogonal ring laser gyroscopes aligned with the aircraft's axes would detect, and then use these detected values to calculate the aircraft's latitude and true heading. An example: if the aircraft is at the equator and is pointing true north, the pitch and yaw rate gyroscopes would detect 0 rotation, while the roll rate gyroscope would detect a rotation rate of 15°/hr, since the aircraft is, in effect, barrel rolling around the earth's axis. If you were to change the heading to 90°, the roll rate would be 0 and the pitch gyro would detect a downward pitch rotation of 15°/hr. At 270°, it would be an upward pitch rotation of the same magnitude. Now back to the initial setup (latitude 0, i.e. the equator, pitch/bank/heading all 0): if you were to shift the latitude up or down, the detected roll rate would change by the cosine of the latitude, while the detected yaw rate would change by the sine of the latitude. It is this latter factor that I am attempting to resolve, since it is part of the alignment error checking mechanism in modern inertial navigation systems. E.g. if the pilot-entered initial latitude is correct and the RLG-detected latitude doesn't match, then there is likely something faulty with the system. Vice versa, the pilot needs to make sure his initial position entry is correct.

    2: The orientation of the aircraft will also affect the influence of the earth's rotation rate on the gyroscopes. You can try it in the app in the above link. Changing the aircraft pitch while at the equator with a heading and bank of 0 will have the same effect on the roll and yaw rate gyros as changing latitude, i.e. the detected roll rate will change by the cosine of the pitch, and the yaw rate by the sine of the pitch.

    3: To account for non-zero orientations in pitch and bank, the INS computer applies an inverse transform of the aircraft orientation to the detected rates output by the RLG units. This has the same effect as making the aircraft truly level. Then you can use the detected pitch and roll rates to calculate the aircraft's true heading, and the detected yaw rate to calculate the aircraft's latitude.

    4: The inverse transform is working well for pitch and roll, as I get a consistently correct true heading output from the calculations. But the inverse transform applied to the yaw rate is not working all that well, since the latitude output of the calculation is only correct when pitch is 0, or when heading is +-90 deg and pitch is not equal to 0.
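    As a sanity check, the level-aircraft case described in point 1 can be written down directly. Below is a minimal sketch under assumed sign conventions (x forward = roll, y right = pitch, z down = yaw; a negative pitch-gyro value meaning nose-down); none of these names come from the original code.

        #include <cmath>

        const double EARTH_RATE = 15.0;                    // deg/hr
        const double DEG = 3.14159265358979323846 / 180.0; // degrees to radians

        // Earth rate sensed by each gyro for a wings-level aircraft (pitch = bank = 0)
        // at latitude latDeg with true heading hdgDeg.
        void sensedRates(double latDeg, double hdgDeg, double& roll, double& pitch, double& yaw)
        {
            roll  =  EARTH_RATE * cos(latDeg * DEG) * cos(hdgDeg * DEG); // 15 deg/hr at the equator heading north
            pitch = -EARTH_RATE * cos(latDeg * DEG) * sin(hdgDeg * DEG); // nose-down at heading 090, nose-up at 270
            yaw   = -EARTH_RATE * sin(latDeg * DEG);                     // scales with the sine of latitude
        }

        // Working backwards, as in point 3:
        // heading = atan2(-pitch, roll), latitude = asin(-yaw / EARTH_RATE).

    The non-level case is then just this level result rotated through the aircraft pitch and bank, which is what the inverse transform in point 3 has to undo.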
  4. Naruto-kun

    Reverse transformation

    I find it difficult to believe that coincidence is involved, because I consistently get the true heading output correct, and the aircraft latitude output is also consistently correct when pitch is 0. Here is the test app so you can see for yourself what happens (note: keep the pfx.hlsl shader file in the same location as the exe): https://www.dropbox.com/s/54fr4gf22ri1n6q/EarthRateSim.zip?dl=0 If you compare the actual heading (labelled Heading in the text field) with the calculated heading (CH), you will notice they are always equal, and the heading error value (HER) is always 0. Calculated latitude (CLAT) will always be the same as Latitude, unless you change the aircraft pitch to a non-zero value; the exception is when pitch is non-zero and heading is 90 or -90. Such consistency doesn't allow for the probabilities required for coincidence.
  5. Naruto-kun

    Reverse transformation

    I understand that, but this is not your typical flight simulator. It is an advanced sensor model of an inertial navigation system, where I have to calculate the influence of the earth's rotation on the ring laser gyros (which output rotation rates to be integrated later), and then add error signal values to those rotation rates according to manufacturer specs, so that I can get an integrated orientation output with a specified uncertainty that will affect navigation performance. So there is a lot of transforming back and forth. This modelling of the sensor output is also required in order to simulate the alignment function, which finds true heading and also compares the sensed latitude with the pilot's latitude input. If there is a significant difference, a re-entry of the position data and a re-alignment are required in order to rule out pilot error and/or a failure of the system. By applying an inverse of the orientation transform to the sensed outputs, I get accurate pitch and roll rate values, which I can then use to determine true heading. The same ought to work on the yaw rate sensor, but for some reason it doesn't unless pitch is 0, and as a result I cannot get accurate latitude values if there is the slightest non-zero pitch. If you look at the second-to-last line in my code sample (and the two preceding it), you will see the inverse transformation, which is working fine for pitch and roll but not for yaw.
  6. Further update (sorry for the reply spamming; studying up a bit more on the subject): The link above also indicates that mod and % can only be used with integer values, just like in C++. However, you can create your own fmod function for floating point values. It will look like this (note: GML does have the floor function):

         //Name this script fmod
         {
             var result = argument0 - (floor(argument0 / argument1) * argument1);
             return result;
         }

         //Examples
         var test = fmod(5.3, 2);

     test will be 1.3, since 2 goes into 5 twice with 1.3 remaining.
  7. If you don't see a % anywhere, look for the term "mod" or "modulo", as that is what it is called. Side note: if you have never seen it before, the default Windows calculator has a key called "Mod" in the Scientific and Programmer modes which performs the modulo operation. Update: I just searched for it myself; GML has both % and mod. https://docs2.yoyogames.com/source/_build/3_scripting/3_gml_overview/12_expressions.html
  8. Update: I assumed C++ was in use here (bad assumption). C# allows the use of the % operator on floating point values and doesn't have an fmod function. I don't know about other languages.
  9. If you are using floating point (float or double) values in C++, however, you won't be able to use %. You would have to use fmod instead.
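     For the C++ case, a minimal illustration (std::fmod lives in <cmath>):

         // % is only defined for integer operands in C++; std::fmod handles floating point.
         #include <cmath>
         #include <cstdio>

         int main()
         {
             int    a = 5 % 2;               // 1
             double b = std::fmod(5.3, 2.0); // 1.3 (the result keeps the sign of the dividend)
             // double c = 5.3 % 2.0;        // does not compile: invalid operands to binary %
             std::printf("%d %f\n", a, b);
             return 0;
         }

     Note that std::fmod truncates toward zero, so for negative inputs it behaves differently from the floor-based GML script above: std::fmod(-5.3, 2.0) is -1.3, while the floor version gives 0.7.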
  10. Naruto-kun

    Reverse transformation

    I am using matrices for the most part, but for some reason my inverse of the matrix is not behaving as expected. Picture a sphere representing the earth in the image I posted. Then picture the aircraft (the 3 coloured lines on the left) as rotated according to its pitch/bank/heading around its own origin, then rotated in latitude around the earth's origin (the 3 coloured lines on the right), all while the entire assembly is rotating west to east at 15 deg/hr. The ring laser gyroscopes aligned with the aircraft's respective axes are detecting this rotation. I should then be able to take that sensed rotation and work backwards to get the aircraft's true heading and latitude. As mentioned, the calculations for true heading work. The calculation for latitude only works if pitch is 0. Solving the latter is what I am trying to do.
  11. Naruto-kun

    A transformation problem

    (facepalm) I have done this rotation on numerous occasions before. Somehow I got my sines and cosines mixed up. Resolved.
  12. Naruto-kun

    A transformation problem

    Hi guys

    I have a bit of a transformation challenge to deal with here. I have modelled the sensor outputs of a 3-axis ring laser gyro system which detects the earth's rotation. As you can see in the first image, a simple atan2 of the P rate and the R rate is sufficient to determine true heading. However, if the local horizon pitch or bank angle is changed, I can no longer use a simple atan2. In the second image, you can see pitch is set to -1.5 deg. Now the R rate has decreased and I have a signal on the Y rate gyro. This produces an error in the atan2 output even though the actual heading has not changed. Any suggestions on how to correct for this? I haven't been successful in my attempts to invert the pitch and roll rotations.

    Thanks
    JB
  13. Naruto-kun

    Qdot from 2 quaternions

    Hi guys

    I have a bit of a challenge here. I am attempting to determine body rotation rates from 2 sets of Tait-Bryan angles. I have each set of angles converted to a quaternion, and the time step dt. I know that the formula for obtaining body rotation rates is 2*(dq/dt)*inverse of q. My question is: how do I determine dq/dt from the 2 orientation quaternions, and which quaternion should I invert?

    Thanks
    JB
  14. Naruto-kun

    Qdot from 2 quaternions

    Thanks a bunch. Once I have sorted out the polarity, I take it I just subtract each individual element of the quaternion? And dividing by dt is again simply dividing each individual element by dt?
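    That element-wise difference and divide is the usual finite-difference approximation. Below is a minimal sketch of the whole chain, with a hypothetical quat type and helpers (not from any particular library); the multiplication order in the last step depends on the quaternion convention in use, so treat it as something to verify:

        // Sketch: body rates from two orientation quaternions q1 (at t) and q2 (at t + dt).
        struct quat { double w, x, y, z; };

        quat qmul(const quat& a, const quat& b)   // Hamilton product a*b
        {
            return { a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z,
                     a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y,
                     a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x,
                     a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w };
        }

        quat qconj(const quat& q) { return { q.w, -q.x, -q.y, -q.z }; } // inverse of a unit quaternion

        void bodyRates(quat q1, quat q2, double dt, double w[3])
        {
            // Keep both quaternions in the same hemisphere before differencing
            // (q and -q represent the same orientation), i.e. the polarity step.
            double d = q1.w*q2.w + q1.x*q2.x + q1.y*q2.y + q1.z*q2.z;
            if (d < 0.0) { q2.w = -q2.w; q2.x = -q2.x; q2.y = -q2.y; q2.z = -q2.z; }

            // Element-wise finite difference: qdot ~= (q2 - q1) / dt.
            quat qdot = { (q2.w - q1.w) / dt, (q2.x - q1.x) / dt,
                          (q2.y - q1.y) / dt, (q2.z - q1.z) / dt };

            // omega_body = 2 * conj(q1) * qdot under the qdot = 0.5 * q * omega_body convention;
            // 2 * qdot * conj(q1) gives world-frame rates instead. Verify against your setup.
            quat w4 = qmul(qconj(q1), qdot);
            w[0] = 2.0 * w4.x;
            w[1] = 2.0 * w4.y;
            w[2] = 2.0 * w4.z;
        }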
  15. Yes. I don't have the source code and am having to rely on an undocumented function that I located in one of the dll export lists. To give you an idea of what also occurs: if I have the camera pointing in the same direction as the aircraft nose and I pitch the nose up, the camera pitch/bank/heading values will be the same as the aircraft's. However, if I rotate the camera 90 deg horizontally to the side, the camera pitch and bank values will be swapped.
  16. Hi guys

      I have a bit of a challenge here. It is best illustrated in the screenshots below. In this scene, the camera is linked in the hierarchy to the cone. The cone has 0 rotation on it. The camera is rotated to the left (a yaw motion) by 15 degrees (see the Z value in the bottom right corner). I then rotate the cone upwards (a pitch motion on the X axis) by 30 degrees using the World transform. But then I switch from the World transform to the Local one and rotate the cone around the Y axis (a roll motion) by -25 deg (i.e. to the left). The camera is linked to the cone and follows it through these motions. You can see the rotation values in the bottom right corner.

      The above illustrates a problem I am trying to solve using reverse-engineered data from a flight simulator. I have the camera world rotation angles, and the aircraft (represented by the cone) world rotation angles in pitch and heading (yaw). My end goal is to calculate the camera rotation angles relative to the local aircraft rotation angles. But I am getting thrown for a loop by the switch to local in the roll axis. Anyone have any suggestions?
  17. I am aware of the dangers, but they do not apply to what I am trying to achieve. That is the idea, yes. I want to know where the camera is looking relative to the aircraft nose. The purpose ranges from a 3D sound engine to helmet mounted sights. I can get the aircraft pitch/bank/heading and make a rotation matrix out of it (inverting the signs, of course, to un-rotate as you say), and this works for locating the head position relative to the aircraft. But for some reason it isn't working on the heading rotation relative to the aircraft. Yes, it is an SR-71.
  18. I also tried this solution from a much older topic of mine (I left off this problem for a while due to other duties at work), but I only got confused by it. https://www.gamedev.net/topic/681548-converting-world-rotation-to-body-rotation/?view=findpost&p=5308675
  19. I see I am still not quite being understood... I tried Alundra's idea (multiplying the camera rotation matrix by the inverse aircraft rotation matrix and then decomposing the resultant matrix into its Euler angles as per this site: http://nghiaho.com/?page_id=846). However, this is the result I get. As you can see in this picture, the camera pitch and bank relative to the aircraft are correct (0), as is the heading (-48.83). In the next picture, the camera angle relative to the aircraft is again correct in pitch (-38.78), and in bank and heading (0). The moment I combine a pitch and heading change, however, this is what I get. Bank should still be 0, but as you can see it isn't. Any suggestions?
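      For reference, a minimal sketch of the approach being described: relative matrix = inverse (transpose) of the aircraft matrix times the camera matrix, then an Euler decomposition in the style of the nghiaho.com page linked above. The decomposition shown assumes R = Rz(yaw) * Ry(pitch) * Rx(roll) with row-major [row][col] storage; if the simulator composes its rotations in a different order, the extracted bank will come out wrong in exactly the combined pitch-plus-heading case, so the order is the first thing to check.

          #include <cmath>

          // Relative rotation: how the camera is oriented in the aircraft's body frame.
          // For pure rotation matrices the inverse is the transpose: Rrel = transpose(Rac) * Rcam.
          void relativeRotation(const double Rac[3][3], const double Rcam[3][3], double Rrel[3][3])
          {
              for (int r = 0; r < 3; ++r)
                  for (int c = 0; c < 3; ++c)
                      Rrel[r][c] = Rac[0][r]*Rcam[0][c] + Rac[1][r]*Rcam[1][c] + Rac[2][r]*Rcam[2][c];
          }

          // Euler extraction assuming R = Rz(yaw) * Ry(pitch) * Rx(roll), row-major [row][col]
          // (the convention used on the nghiaho.com page). Angles in radians.
          void toEuler(const double R[3][3], double& roll, double& pitch, double& yaw)
          {
              roll  = atan2(R[2][1], R[2][2]);
              pitch = atan2(-R[2][0], sqrt(R[2][1]*R[2][1] + R[2][2]*R[2][2]));
              yaw   = atan2(R[1][0], R[0][0]);
          }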
  20. Yes, the camera follows the cone and can be offset from it. I'm not sure I follow exactly, though. The cone has its own local transformation, which affects the world transformation as well as the camera. Basically, if I roll the cone by -25 deg, set the pitch of the cone to 0, and then rotate the camera heading (yaw) by -90 deg, the camera transform will show a heading of -90 deg, a bank angle of 0, and a pitch angle of -25 deg. If I rotate the camera heading back to 0, pitch will be the same as the cone's (0) and roll will be the same as the cone's (-25). I need to calculate the local angles of the camera relative to the cone, so that if I rotate the camera heading by -90 deg, I will get a heading of -90 deg, pitch 0, bank 0, no matter what the pitch/roll values are.
  21. Naruto-kun

    Direction to point for lighting purposes

    Perfect. Thanks a bunch.
  22. Hi guys

      I am working on a directional light system, where the light is collocated with the camera position. As such, the lighting returns will depend on the direction to the surface from the camera position. I want to get the eye direction to this point so I can perform the dot product calculation with the surface normal and determine how much light is reflected back. Would I simply normalize the vertex position after multiplying it by the view matrix?

      Thanks
      JB
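      Roughly, yes: with the light at the camera, the view-space vertex position already points from the eye to the surface, so normalizing it (and negating it, so it points back toward the eye) gives the light direction for the dot product. A minimal sketch of the math with a hypothetical vec3 type; the same expressions apply to the view-space position and normal inside the shader, and the normal must be transformed into view space as well.

          #include <cmath>

          struct vec3 { double x, y, z; };   // hypothetical minimal vector type

          double dot(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
          vec3 normalize(vec3 v)     { double l = std::sqrt(dot(v, v)); return { v.x/l, v.y/l, v.z/l }; }

          // posView: vertex position after multiplication by the view matrix (eye at the origin).
          // nrmView: surface normal transformed into the same view space.
          double headlightDiffuse(vec3 posView, vec3 nrmView)
          {
              vec3 toEye = normalize({ -posView.x, -posView.y, -posView.z }); // surface-to-light direction
              double d = dot(normalize(nrmView), toEye);
              return d > 0.0 ? d : 0.0;                                       // clamp back-facing light
          }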
  23. Hi guys

      I am reading the pitch/bank/heading values of a camera in this flight simulator. But I have a problem in that they are relative to the world, not the body. In the example image, the aircraft is banked 25 deg to the left, and the camera bank values reflect this. However, the camera bank angle relative to the aircraft is 0. This latter value is what I would like to calculate. Things get a little more complicated when I pan around to the left, as in this image: the bank angle becomes the pitch angle. Any suggestions?
  24. Naruto-kun

    Converting world rotation to body rotation

    Thanks a bunch. Apologies for the misunderstanding.
  25. Naruto-kun

    Converting world rotation to body rotation

    I need the camera rotation for custom 3D sound engine purposes. Since I didn't create the camera (it is created internally by the simulator and I have to use a bit of reverse engineering to get hold of it) I need to be able to transform it so that I know where it is looking relative to the nose of the aircraft.