I'm trying to simulate a realistic camera in OpenGL ES on the iPhone, using data received from the gyroscope sensor. Basically, when I
receive the roll, pitch, and yaw rotations from the gyroscope, I pass them to the OpenGL context and rotate every object in the scene by the negative of each angle, simulating the camera's rotation.
// first object rotation/translation
CC3GLMatrix *modelView = [CC3GLMatrix matrix]; // cocos3d matrix
[modelView populateFromRotation:CC3VectorMake(-pitch, -roll, -yaw)];
[modelView translateBy:CC3VectorMake(0, 0, -distance)];
glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView.glMatrix);
// draw object
// second object rotation/translation
CC3GLMatrix *modelView2 = [CC3GLMatrix matrix];
[modelView2 populateFromRotation:CC3VectorMake(-pitch, -roll-30, -yaw)];
[modelView2 translateBy:CC3VectorMake(0, 0, -distance)];
glUniformMatrix4fv(_modelViewUniform, 1, 0, modelView2.glMatrix);
// draw object
And it works just fine. The problem is that the iPhone's coordinate system rotates with the phone, so, for example, roll is always about the device's own axis (it always "hits" the sides of the phone). So when I add another object alpha degrees to the left/right, that left/right is in the iPhone's coordinate system, not the world coordinate system.
The first image is with no rotation, looking ahead.
But when I rotate the phone to the right by around 30-40 degrees, the objects' sides are no longer parallel, and it's obvious that the coordinate axes are bound to the device.
As the phone is inclined to the right, so are the objects, even though the only difference in the objects' positions is -30 in roll.
Is there a way to convert these coordinates to world coordinates, or any other way to rotate the whole scene?