# Determining maximum resolution with hardware-support


## Recommended Posts

I'm new to OpenGL and have tried some of the tutorials. I recently wanted to give the user not only the fullscreen/windowed choice, but also an option to select the fullscreen resolution and bpp, so I used the Windows EnumDisplaySettings function to retrieve the available display settings. However, if I set the combination of resolution and bpp too high, OpenGL seems to fall back to the software implementation rather than hardware (the frame rate for the simple crate in lesson 7 drops below 1 fps). I have also noticed the same drop in frame rate when I resize the GL app and the window reaches a certain size.

I believe this probably happens due to lack of video memory. When I tried one of MS's D3D samples at the same resolutions I got the message: "Not enough video memory. Switching to the reference rasterizer, a software device that implements the entire Direct3D feature set, but runs very slowly." It also seems like graphical applications running in the background have some influence on the resolution at which this happens.

Is there some way to determine the maximum resolution/bpp before the driver goes into software mode, so that I can filter out higher resolutions/bpps or stop the user from resizing the window too large?

My graphics card: GeForce2 MX/AGP/SSE2 32MB
Last resolutions/bpps before this happens: 1024x768/32 or 1152x864/16

##### Share on other sites
let's do some math and see.

Frame buffer + back buffer + Z-buffer:
(1024x768x4x2)+(1024x768x2)=
6291456+1572864=7864320, or 7.5MB

textures:
(256x256)+(256x256)+(256x256x1.3)=
131072+85196.8=216268.8, or 0.20625MB

Lesson 7 at 1024x768 uses around 7.7MB of video memory so there should be plenty left even on a 32MB card, but as you said, there might be something with applications running in the background.

However, the GF2MX might have a fixed memory area for the framebuffers, and that might be the cause, but I'm not sure about any of this since I can't double-check it. It could also be the drivers; make sure they are up to date.

I have just one piece of advice for you if you're still struggling with a GF2: DON'T!
Get a new one. There are cards on the market that are so annoyingly cheap they will almost give you money for buying them, and they can still outperform a GF2 MX several times over.

##### Share on other sites
Thanks for the quick reply. I’m going to build a new PC soon, so I’ll get rid of that GF2 card.

I found that background applications reduced the maximum resolution that still gave a high frame rate. Even the difference between one and two instances of VS 2003 running changed the maximum resolution.

When trying D3D again, I didn't get the same error; it might have been caused by a GL app running in the background. Now D3D seems to work at all resolutions, independent of background applications. From this it looks like D3D handles fullscreen better than OpenGL, unless there is a better way to do OpenGL fullscreen than the one described in NeHe lesson 01.

Although a new graphics card solves the problem of low frame rate at high resolution, I still want to be able to determine when this problem appears, for instance so I can restrict users of older graphics cards to resolutions where the problem does not occur.

##### Share on other sites
I also want to tell you to forget everything older than NV3x or R3xx. The fixed-function pipeline programming model those cards require blows up the effort needed to get what you want. It is much easier to use ARB_vp + ARB_fp, and even easier to use GLSL.
While I understand the installed base of older cards is still very good, I had to accept the fact that GLSL-enabled cards are going to be commodity items very soon.

As for determining the maximum resolution, I think there's a glGet parameter called GL_MAX_VIEWPORT_DIMS (or maybe that only applies to pbuffers?) which tells you. My glGet manpage says:
> GL_MAX_VIEWPORT_DIMS: params returns two values, the maximum supported width and height of the viewport. These must be at least as large as the visible dimensions of the display being rendered to. See glViewport.

So it looks like what you're searching for. Well, not exactly, but maybe it simply works.
As I remember, NV25 supports rendering to pbuffers up to 4k x 4k but only 1920x1440 framebuffers. I really don't understand the reasoning behind this difference, so I'm probably wrong.
I hope this helps.

##### Share on other sites
I had a look at some games using OpenGL (CS (HL) and ET), and the same problem with low frame rate at high resolutions appears in both (in CS the low frame rate even causes some network problems).

As there appears to be no simple way of detecting this problem before it happens, I've decided to include code to monitor the frame rate and take action if it drops below a defined threshold. I don't want users of my apps to have the same trouble shutting it down that I had when I tried to quit ET at a frame rate of about two frames per minute (the mouse pointer moved about once a minute).

##### Share on other sites
Oops, didn't see your post before I posted mine, Krohm.

I tried to get GL_MAX_VIEWPORT_DIMS, but it only returns 4096 x 4096.

Thanks anyway.