mathman_pir2

Simple Question (I think)


I've searched but can't determine why the field-of-view area is not proportional to the screen resolution in XNA. For instance, with an 800x600 screen size the maximum x value is something like 17000.0f, making the horizontal extent 2 x 17000 when there should rightfully be only 800 pixels. I don't understand that. How do I calculate the field-of-view area (exactly)?

You define the field of view yourself when you build the camera matrix.

It looks like what you're wondering is why something moves by 17000 units to travel from one side of the window to the other. Correct? It all depends on what field of view you're using and how far away from the camera your object is.

Do you need to be using 3D? From the sounds of it you might only need to be working in 2D. What are you trying to do?

I think you are correct. My camera is positioned 25000 units back along the z axis and aimed at (0,0,0). I'm making an Asteroids-style game that I want to be full screen and resizable, which means I have to know the boundaries of the viewable screen area in order to display the score and have objects wrap around the virtual space when they drift off the edges.

With my 800x600 screen, if I place an asteroid at x = 17000.0f it appears along the right edge. If I place it at 800 (the horizontal width in pixels) it appears near the center of the screen.

I just need to be able to calculate the boundaries of the field of view in 3D space. I will be making my game first person in the future, which is why I'm sticking with 3D.

If you're using 3D, then world co-ordinates are often separate from screen co-ordinates. That means 800 units usually doesn't mean 800 pixels. You need to set things up specifically for that to be true, but my advice is: don't bother.

World units should not equal pixels. If they do, it should just be a coincidence. If a world unit equals a pixel, you're going to run into trouble when someone resizes the window.

Well... I think that's established considering I just said when I draw to (800,0,0) the object appears in the center and when I draw to (17000, 0, 0) it is on the right edge of the screen.

I realize the space is different, now what I need elaborating on is:
Quote:
World units should not equal pixels. If they do, it should just be a coincidence.
I want to understand why they "should not equal pixels" so I can calculate exactly where the edges of my screen are in terms of my field of view area.

Quote:
Original post by mathman_pir2
Well... I think that's established considering I just said when I draw to (800,0,0) the object appears in the center and when I draw to (17000, 0, 0) it is on the right edge of the screen.

I realize the space is different, now what I need elaborating on is: "World units should not equal pixels. If they do, it should just be a coincidence."

I want to understand why they "should not equal pixels" so I can calculate exactly where the edges of my screen are in terms of my field of view area.

World-space units should not be tied to pixels because you'll run into problems, like the situation I mentioned about the resizing of the window. In that case, the number of world-space units is the same, while the number of pixels changes.

Instead of trying to find the edge of the screen in pixels, why don't you just define your world so that it's something easier to use? In my 2D game, I have my camera, objects and projection set up so that (1, 1) is the top right hand corner of the screen and (-1, -1) is the bottom left hand corner of the screen. Using some arbitrary numbers like 25000 units for the camera position is bound to give you odd values like 17000 for the edge of the screen, depending on what you set your FOV to.
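To make the "(1, 1) top right hand corner" idea concrete, here is a quick numeric sketch (Python rather than XNA, and the helper names are mine, not library calls), using the numbers mentioned in this thread: a camera 25000 units from the focal plane, a 45-degree vertical FOV, and a 4:3 window. Once you know the visible half-extents of the world at the plane you draw on, dividing by them maps any world position so the screen edges land exactly at ±1.

```python
import math

def visible_half_extents(distance, fov_deg, aspect):
    # The frustum opens at half the FOV on each side of the view axis,
    # so at a given distance the visible half-height is distance * tan(fov / 2).
    half_h = distance * math.tan(math.radians(fov_deg) / 2.0)
    # The FOV parameter is the vertical angle; width is scaled by the aspect ratio.
    return half_h * aspect, half_h  # (half_width, half_height)

def to_normalized(x, y, half_w, half_h):
    # Map a world (x, y) on the focal plane to [-1, 1] screen coordinates.
    return x / half_w, y / half_h

half_w, half_h = visible_half_extents(25000.0, 45.0, 800.0 / 600.0)
nx, ny = to_normalized(half_w, half_h, half_w, half_h)
print(nx, ny)  # a point at the top-right frustum corner maps to (1.0, 1.0)
```

The same mapping run in reverse (multiplying normalized coordinates by the half-extents) gives you the world position of any screen point, which is all the edge-finding this thread is asking about.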

I really thought this would be a simple question.

With this: CreateOrthographicOffCenter, it looks like I have to specify the values by hand, which isn't what I want. Although, to be honest, I really don't understand it, so it may or may not be the solution.

Having the left side of the screen = -1 and the right side = 1 would be wonderful and all that, but how?

It's time to rephrase my question, maybe I've made it seem more complicated than it is...


Vector3 cameraPosition = new Vector3(0.0f, 0.0f, GameConstants.CameraHeight);

projectionMatrix = Matrix.CreatePerspectiveFieldOfView(
    MathHelper.ToRadians(45.0f),           // the angle of view
    aspectRatio,
    GameConstants.CameraHeight - 1000.0f,  // the near plane
    GameConstants.CameraHeight + 1000.0f); // the far plane

viewMatrix = Matrix.CreateLookAt(cameraPosition, Vector3.Zero, Vector3.Up); // this says we are looking at point (0,0,0) in world space.


Now I just want to know what the offset is of the left, right, bottom, and top so I can create a "wrap" effect. (When objects drift off the screen, they wrap around to the other side).

Elementary math says to me: (-CameraHeight,0,0) = left side of viewable world space because tan45 = x/CameraHeight -> x = CameraHeight. But if I position an object at (-CameraHeight, 0, 0) I'm unable to see it.
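For what it's worth, a quick numeric check of that elementary math (a Python sketch, using the 45-degree FOV, 25000-unit camera distance, and 800x600 window from the posts above): CreatePerspectiveFieldOfView takes the full vertical angle, so the visible half-height at distance d is d * tan(fov / 2), not d * tan(fov), and the horizontal extent is that scaled by the aspect ratio.

```python
import math

# Numbers taken from earlier posts in this thread.
camera_height = 25000.0   # distance from the camera to the z = 0 plane
fov = math.radians(45.0)  # CreatePerspectiveFieldOfView takes the FULL vertical angle
aspect = 800.0 / 600.0    # 800x600 window

# The frustum opens at half the FOV on either side of the view axis,
# so the visible half-height at distance d is d * tan(fov / 2) ...
half_height = camera_height * math.tan(fov / 2.0)
# ... and the FOV parameter is vertical, so the half-width adds the aspect ratio.
half_width = half_height * aspect

print(half_height)  # ≈ 10355.3, not 25000
print(half_width)   # ≈ 13807.1
```

Under those assumptions, x = -CameraHeight is well outside the roughly ±13807 visible range, which would explain why an object placed there never shows up. The ~17000 figure from the first post doesn't match these numbers either, which suggests the FOV or camera distance in that test differed from the values quoted here.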

I really don't know how to make it any clearer than this: I need to find the coordinate of the edge of my view...

(The reason I don't just want to use an arbitrary constant value is that I'd like my window to be resizable and still work properly.)
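Once the visible half-extents are known, the wrap effect itself is just modular arithmetic. A minimal sketch (Python; `wrap` is a name made up for illustration, not an XNA call), assuming a visible half-width of, say, 14000 world units:

```python
def wrap(value, half_extent):
    # Map any coordinate into [-half_extent, half_extent), so an object
    # drifting off one edge reappears at the opposite edge.
    span = 2.0 * half_extent
    return ((value + half_extent) % span) - half_extent

# An asteroid 500 units past the right edge reappears 500 units
# inside the left edge.
print(wrap(14500.0, 14000.0))  # -13500.0
```

Recomputing the half-extents from the current aspect ratio and camera distance whenever the window is resized keeps this working after a resize, which is exactly the concern in the parenthetical above.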

