The FOV used in a game depends primarily on the genre. FPSs need large FOVs to be playable, and some distortion is expected; the first thing I found on Google describes how Valve handles it. If you are making an RTS with an overhead camera (like Warcraft III), you can probably use a smaller FOV. Isometric projection can be seen as the limit of perspective projection as the FOV tends to zero, and even that is acceptable. There is a Wikipedia page on FOV in video games, but it doesn't seem very informative.
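To make that limit concrete, here is a small sketch (plain Python, a hypothetical 1-D helper, not from any engine): if you shrink the FOV while pulling the camera back so that tan(fov/2) × distance stays roughly constant, the projected position stops depending on depth, which is exactly the orthographic/isometric behaviour.

```python
import math

def perspective_x(x, z_offset, cam_dist, fov_deg):
    """Screen-space x of a point, with the camera pulled back cam_dist,
    using a symmetric perspective projection (1-D slice for illustration).
    z_offset is the point's depth relative to the scene origin."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)  # focal-length-like scale
    return f * x / (z_offset + cam_dist)

# Shrink the FOV while moving the camera back so tan(fov/2) * cam_dist stays
# ~constant: near and far points converge to the same projected position,
# i.e. depth stops mattering and the projection becomes effectively isometric.
for fov, dist in ((90.0, 1.0), (10.0, 11.43), (1.0, 114.6)):
    near = perspective_x(1.0, 0.0, dist, fov)
    far = perspective_x(1.0, 2.0, dist, fov)
    print(f"fov={fov:5.1f}  near={near:.3f}  far={far:.3f}")
```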
But how does this work in the game industry, then?
Remember that as the field of view tends to zero, you lose all sense of depth (as Alvaro said, the limit is isometric projection, which has no notion of depth). First-person shooters need very good depth perception for aiming, so this takes precedence over edge distortion, and a high field of view is thus preferred.

This sounds interesting and is nothing I had really thought about. I always thought that the only thing the field of view does is act as a scaling factor (a common example would be the perspective projection matrix used in OpenGL etc.). A smaller FOV, from what I understand, would mean that my objects get scaled bigger, which would be similar to zooming in with a camera. How does this affect the depth perception of the scene?
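For what it's worth, here is a rough sketch of what I mean by "scaling factor", assuming a standard OpenGL-style symmetric perspective projection (the helper name and numbers are just for illustration):

```python
import math

def project_point(x, y, z, fov_y_deg, aspect):
    """Project a camera-space point to normalized device coordinates using an
    OpenGL-style perspective projection. z is the distance in front of the camera."""
    f = 1.0 / math.tan(math.radians(fov_y_deg) / 2.0)  # the FOV-dependent scale
    ndc_x = (f / aspect) * x / z
    ndc_y = f * y / z
    return ndc_x, ndc_y

# The same point fills more of the screen as the FOV shrinks -- i.e. zooming in.
for fov in (90.0, 60.0, 30.0):
    print(fov, project_point(1.0, 1.0, 5.0, fov, 16 / 9))
```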
You could also handle the screen as not just a rectangle, I guess. If you treated it as a curved surface (imagine some five-monitor horizontal setup), it might keep a sphere looking spherical even at the edge of your view (not sure if that works in practice). Kind of like those panorama pics you can make with cameras and some phones.
This is a proposal that I've seen in a few places and which could solve the problem. I believe I read about it in another thread here in gamedev while searching for information about my problem, where the same idea was proposed, but according to people in that thread a curved surface would result in other artifacts in the scene instead. Still, being able to implement a curved surface just to see the difference between it and my current screen would be interesting, except I have no idea where to even start, let alone how I would implement one. I will have to search more about it; relevant links on this would be appreciated.
You could also handle the screen as not just a rectangle, I guess. If you treated it as a curved surface (imagine some five-monitor horizontal setup), it might keep a sphere looking spherical even at the edge of your view (not sure if that works in practice).

Yes, that works in practice. It's kind of hard with a rasterizer, as you need pretty well-tessellated geometry, but for a ray tracer it should be pretty easy to set up.
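If it helps, here is a minimal sketch of what that curved "screen" could look like in a ray tracer: instead of shooting rays through a flat image plane, map pixel position linearly to viewing angle (a cylindrical projection). The function names and numbers are purely illustrative.

```python
import math

def planar_ray(px, width, fov_deg):
    """Ray direction through a flat image plane (standard perspective), 1-D slice.
    px in [0, width); returns an (x, z) direction with the camera looking down +z."""
    half_w = math.tan(math.radians(fov_deg) / 2.0)
    x = (2.0 * (px + 0.5) / width - 1.0) * half_w
    n = math.hypot(x, 1.0)
    return x / n, 1.0 / n

def cylindrical_ray(px, width, fov_deg):
    """Ray direction on a curved (cylindrical) screen: pixel position maps linearly
    to viewing angle, so objects keep the same angular size at the edges."""
    theta = (2.0 * (px + 0.5) / width - 1.0) * math.radians(fov_deg) / 2.0
    return math.sin(theta), math.cos(theta)

# Compare the two mappings at the left edge, center, and right edge of the image.
for px in (0, 256, 511):
    print(px, planar_ray(px, 512, 120.0), cylindrical_ray(px, 512, 120.0))
```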
I've managed to reduce the distortion somewhat (well, it's really only reduced for specific settings). The biggest problem I had failed to notice was that the camera was positioned way too close to the scene objects. This caused the projectors/rays to be spread over a really wide angle for objects close to the edge, and the distortion only increased the closer the camera got to them. Moving the camera and the view plane further away from the objects lessened the distortion significantly. Of course, the problem then was that I was "zooming" out, so my objects became smaller, but this is where I used the field of view to correct that, by decreasing it. I'm not sure if this is a correct approach, but it did give me less distortion (see image 1).
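For reference, that "move back and narrow the FOV" compensation can be made exact for an object near the view axis by keeping tan(fov/2) × distance constant (the classic dolly-zoom relationship). A small sketch, assuming distances are measured along the view direction; the function name is just illustrative:

```python
import math

def compensated_fov(old_fov_deg, old_dist, new_dist):
    """FOV that keeps an object at distance new_dist the same apparent size it had
    at old_dist with old_fov_deg (dolly-zoom). Assumes the object is near the view axis."""
    half = math.atan(math.tan(math.radians(old_fov_deg) / 2.0) * old_dist / new_dist)
    return math.degrees(2.0 * half)

# Pulling the camera back 3x and narrowing the FOV accordingly keeps the subject
# the same size on screen while flattening the distortion at the edges.
print(compensated_fov(90.0, 5.0, 15.0))  # ~36.9 degrees
```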