Coordinate systems

Started by Steve · 2 comments, last by Stani R 14 years, 10 months ago
Hi, my math is pretty basic, but I'm very much interested in learning more. I haven't really touched it since uni over 12 years ago; I used to be OK at it. Could somebody explain certain coordinate systems, such as why I would want to convert spherical coordinates to Cartesian? And what is the cylindrical coordinate system, and what would it be used for? Many thanks, Steve

If it isn't working, take a bath, have a think and try again...

Different co-ordinate systems make certain types of problems easier to work on. For instance, if you have something that radiates out from a point, like light, a spherical co-ordinate system usually makes the math easier. Or, for water flowing through a cylindrical pipe, a cylindrical co-ordinate system is more appropriate.
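For example, here is a minimal sketch in plain C++ of converting spherical and cylindrical coordinates back to Cartesian (the function names are just illustrative, and the angle conventions vary between textbooks):

#include <cmath>

struct Vec3 { double x, y, z; };

// Spherical (r, theta = polar angle from +z, phi = azimuth) -> Cartesian.
// This is the common "physics" convention; others swap theta and phi.
Vec3 sphericalToCartesian(double r, double theta, double phi) {
    return { r * std::sin(theta) * std::cos(phi),
             r * std::sin(theta) * std::sin(phi),
             r * std::cos(theta) };
}

// Cylindrical (rho = distance from the axis, phi = azimuth, z = height) -> Cartesian.
Vec3 cylindricalToCartesian(double rho, double phi, double z) {
    return { rho * std::cos(phi),
             rho * std::sin(phi),
             z };
}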

The final answers should be fundamentally the same no matter what co-ordinate system you use, they are merely a convenience.

--www.physicaluncertainty.com
--linkedin
--irc.freenode.net#gdnet

Thanks for that,

I can understand converting cartesian to polar coordinates:

r = √(x² + y²)   (Pythagoras to get r)

θ = atan(y / x)   (to get the angle)
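In code the same conversion is usually written with atan2 rather than atan, since atan2 handles the quadrant and the x = 0 case for you. A minimal sketch in plain C++ (function names are just illustrative):

#include <cmath>

// Cartesian (x, y) -> polar (r, theta). std::atan2 returns an angle in [-pi, pi],
// so points in every quadrant come out correct, unlike plain atan(y / x).
void cartesianToPolar(double x, double y, double& r, double& theta) {
    r = std::sqrt(x * x + y * y);
    theta = std::atan2(y, x);
}

// Polar (r, theta) -> Cartesian, the inverse.
void polarToCartesian(double r, double theta, double& x, double& y) {
    x = r * std::cos(theta);
    y = r * std::sin(theta);
}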

Anybody know of some good examples in game development where different types of coordinate systems are used?

Thanks,
Steve

If it isn't working, take a bath, have a think and try again...

I can think of just one (related to something I'm working on, but not implemented yet): let's say you are making a game set in space and you want to build the skybox (spacebox?). You can pull pregenerated images from the net, but their resolution is usually nowhere near sufficient and they don't map easily to a box (most use some weird cylindrical projection or something along those lines). So you'll have to generate your own, taking into account the position and brightness (magnitude, spectral class) of various stars; unless you want it to stay accurate over long time spans and/or distances, this is just a step in the asset pipeline. Star catalogs give their data in some kind of celestial coordinate system, and you're trying to project it onto six textures, so you have some interesting math ahead of you.
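As a rough sketch of the math involved (hypothetical names and conventions, not tied to any particular catalog or engine): convert right ascension and declination to a unit direction vector, then let the dominant axis component pick which of the six cube faces the star falls on:

#include <cmath>

struct Dir { double x, y, z; };

// Right ascension (ra) and declination (dec), both in radians, to a unit vector.
Dir celestialToDirection(double ra, double dec) {
    return { std::cos(dec) * std::cos(ra),
             std::cos(dec) * std::sin(ra),
             std::sin(dec) };
}

// Pick the cube face: whichever axis has the largest absolute component.
// 0..5 = +X, -X, +Y, -Y, +Z, -Z (the numbering is arbitrary here).
int cubeFace(const Dir& d) {
    double ax = std::fabs(d.x), ay = std::fabs(d.y), az = std::fabs(d.z);
    if (ax >= ay && ax >= az) return d.x > 0 ? 0 : 1;
    if (ay >= ax && ay >= az) return d.y > 0 ? 2 : 3;
    return d.z > 0 ? 4 : 5;
}

Dividing the two smaller components by the dominant one then gives the position on that face, modulo the usual remapping from [-1, 1] to texture coordinates.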

Then again, "Real Space"(tm) isn't all that interesting to look at compared to "fancy colored nebulae and space fog", and since you'd have to regenerate the textures once you move a large enough distance from Earth or enough time passes, the real question is "why bother?". Part of the reason why I haven't even thought much about implementing it yet :)

For games, from what I've seen so far, the problem is mostly not "spherical vs. Cartesian" but "world space vs. object space". This gets more fun because your scenegraph stores local transforms while your physics needs world transforms, and the results have to be re-integrated. It wouldn't be so bad if it were just positions, but orientations and scaling are in the mix as well.
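To make that concrete, here is a minimal sketch (hypothetical types, not from any particular engine) of how a scenegraph builds a world transform by composing local transforms up the parent chain:

#include <array>

// Minimal 4x4 row-major matrix; a real engine would use its own math types.
struct Mat4 {
    std::array<double, 16> m{};
};

Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r;
    for (int row = 0; row < 4; ++row)
        for (int col = 0; col < 4; ++col) {
            double sum = 0.0;
            for (int k = 0; k < 4; ++k)
                sum += a.m[row * 4 + k] * b.m[k * 4 + col];
            r.m[row * 4 + col] = sum;
        }
    return r;
}

struct Node {
    Mat4 local;                    // transform relative to the parent node
    const Node* parent = nullptr;  // null for the scene root
};

// World transform = parent's world transform composed with the local one.
// Physics runs on the world transform; re-integrating means multiplying the
// result by the inverse of the parent's world transform to get a new local.
Mat4 worldTransform(const Node& node) {
    if (!node.parent) return node.local;
    return multiply(worldTransform(*node.parent), node.local);
}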

