Convert between coordinate spaces

Started by
5 comments, last by Mercurialol 11 years, 2 months ago


I'm trying to simulate a realistic camera in OpenGL ES, using the iPhone camera and data received from the gyroscope sensor. Basically, when I receive roll, pitch, and yaw rotations from the gyroscope, I pass them to the OpenGL context and rotate every object in the scene by the negative of each angle, simulating the camera's rotation.


// first object rotation/translation
CC3GLMatrix *modelView = [CC3GLMatrix matrix]; // cocos3D matrix

[modelView populateFromRotation:CC3VectorMake(-pitch, -roll, -yaw)];
[modelView translateBy:CC3VectorMake(0, 0, -distance)];
glUniformMatrix4fv(_modelViewUniform, 1, GL_FALSE, modelView.glMatrix);
// draw object

// second object rotation/translation
CC3GLMatrix *modelView2 = [CC3GLMatrix matrix];
[modelView2 populateFromRotation:CC3VectorMake(-pitch, -roll-30, -yaw)];
[modelView2 translateBy:CC3VectorMake(0, 0, -distance)];
glUniformMatrix4fv(_modelViewUniform, 1, GL_FALSE, modelView2.glMatrix);
// draw object

And it works just fine. The problem is that the iPhone coordinate system rotates with the phone, so, for example, roll is always "hitting" the sides of the phone. So when I add another object alpha degrees to the left/right, that's left/right in the iPhone coordinate system, not the world coordinate system.


The first image is with no rotation, looking ahead.

[image: resizeofb5xpx.png]

But when I rotate the phone to the right by around 30-40 degrees, the objects' sides are no longer parallel, and it's obvious that the coordinate axes are bound to the device.

[image: resizeofb2lwx.png]

As the phone is inclined to the right, so are the objects, and the only difference in the objects' positions is -30 in roll.

Is there a way to convert these coordinates to the world coordinates or any other way to rotate the whole scene?


I'm not too familiar with Cocos3D, but if I'm reading your code correctly, it looks like you're performing the rotation and then moving the objects in world space. The problem is that you're performing those two operations in the wrong order. In general, you want the most "local" transformations to be applied first; for example, if an object is rotating around its own axis, you'll perform that rotation first, then translate it to its position in the world. However, if it's rotating around the camera, you'll want to translate it before rotating. Every translation/rotation is applied in the camera's space, not in the space of the object that received the last transformation.

To visualize: using simple matrix transformations, if you put an object in front of yourself and translate it "up" (+Y), it will always move toward your "up", not toward its "top" side (unless it's already right-side-up). If you rotate it, it will always move as if it's orbiting around your head. If you want an object to be rotating around its own axis but staying in one location, you need to put it "inside" your head, rotate it there, and move it out to its final position; otherwise, it will move around your head as it rotates. A little unintuitive, but the math works out nicely.
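To make the order-of-operations point concrete, here's a minimal sketch (assuming a right-handed, column-vector convention; the matrix helpers are hand-rolled for illustration, not Cocos3D's API):

```python
import math

def rot_y(deg):
    """3x3 rotation matrix about the Y axis (right-handed, column vectors)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def apply(m, v):
    """Multiply a 3x3 matrix by a column vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def add(a, b):
    return [x + y for x, y in zip(a, b)]

origin = [0.0, 0.0, 0.0]
offset = [0.0, 0.0, -5.0]   # "-distance" along -Z, in front of the viewer
r = rot_y(90)

# Rotate first, then translate: the object spins "inside your head" and
# ends up at the offset, still facing a new direction but in one place.
spin = add(apply(r, origin), offset)   # ≈ [0, 0, -5]

# Translate first, then rotate: the object orbits around the viewer's head.
orbit = apply(r, add(origin, offset))  # ≈ [-5, 0, 0]
```

Same two operations, opposite order, and the object lands in two very different places.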

Yeah, that's what I initially thought, but it looks like it goes the other way around. Not sure why... I don't think that's the issue here, since you can see in the first image that I have the correct order. Otherwise you'd see both objects in the same (-distance) position, just rotated differently.

That's quite odd. It seems as though the changing device rotation alters the objects' rotation, but the position is only affected by the first rotation value and then stays fixed. Weird...

Is the origin of each object (without any transformation) at the center of the rectangle, or is it at some other point?

The origin is at center.

Remember, I'm getting the gyroscope data from the iPhone, and the rotation data is relative to the device. It's as simple as this: the difference between the two objects is 30 degrees (roll). When I rotate the device, the device's coordinate space rotates with the phone, while the world coordinate space stays static, obviously. After the rotation the difference is still 30 degrees in roll; the thing is that roll is always hitting the sides of the phone, so it's no longer the same axis as it was before the rotation.

Aha! I think I've got it. When you subtract 30 from the roll value, you're doing it in the conversion from model space to view space, so the second object's position will change by 30 degrees in view space, not in world space. Try splitting it into two transformations: one to rotate its position vector 30 degrees in world space, followed by the (-pitch, -roll, -yaw) transformation to get it into view space. You might have to swap the order of those two transformations because Cocos3D seems to perform matrix operations in the inverse order (at least from what I've seen your code do, but I'm not completely sure because I've never messed with it).
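Here's a small numeric sketch of that distinction (hand-rolled rotation matrices; the axis conventions are my assumption, not Cocos3D's). With the device rolled 40 degrees and the second object meant to sit 30 degrees around the world up-axis, rotating in world space before the view transform gives a different answer than folding the 30 degrees into the view-space angles:

```python
import math

def rot_y(deg):
    """Rotation about the Y (up) axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_z(deg):
    """Rotation about the Z axis (device roll)."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def apply(m, v):
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

p = [0.0, 0.0, -5.0]   # first object, straight ahead at -distance
view = rot_z(-40)       # view rotation: undo a 40-degree device roll

# Correct: offset the object 30 degrees in WORLD space,
# then apply the view rotation.
world_then_view = apply(view, apply(rot_y(30), p))

# Wrong: apply the 30-degree offset in VIEW space; it rotates about the
# camera's (rolled) axis, so the object stays glued to the device's horizon.
view_then_offset = apply(rot_y(30), apply(view, p))

print(world_then_view)   # the object rises relative to the rolled camera
print(view_then_offset)  # Y stays 0: level with the device, the bug above
```

Because rotations about different axes don't commute, the two results diverge as soon as the device is rolled, which is exactly the symptom in the screenshots.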

Oh man, you're a lifesaver!


[modelView populateFromRotation:CC3VectorMake(-pitch, -roll, -yaw)];
[modelView rotateByY:-30];
[modelView translateBy:CC3VectorMake(0, 0, -distance)];

One of those problems... when you feel stupid after realizing how simple the solution was. :)

Thanks again

This topic is closed to new replies.
