HUD elements with fixed size

Hello!

I wonder what the standard method in PC games is to make HUD elements with a fixed size, for example a mini map in the top right corner or a menu on the left side.
If I make these HUD elements with relative sizes (for example, the menu bar on the left side takes 1/4 of the width), they may look good at some resolutions but stretched and ugly at wide resolutions. On the other hand, if I use absolute pixel sizes, the element always has the same pixel size, but doesn't that mean it looks smaller at high resolutions?

What's the best way to make HUD elements that always have the same size (e.g. 10x10 cm, but it does not have to be that precise)?
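For example (made-up numbers, just to illustrate the two options I mean):

[code]
// Hypothetical placement of a left-side menu bar, both ways.
struct Rect { int x, y, w, h; };

// Relative: always 1/4 of the screen width -> stretches on wide screens.
Rect MenuBarRelative(int screenW, int screenH)
{
    return Rect{ 0, 0, screenW / 4, screenH };
}

// Absolute: always 256 pixels wide -> relatively smaller at high resolutions.
Rect MenuBarAbsolute(int /*screenW*/, int screenH)
{
    return Rect{ 0, 0, 256, screenH };
}
[/code]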
Most games I play go with fixed pixel sizes, which means the HUD takes up less space at higher resolutions. If you want relative sizes, I'd go with creating the HUD for the highest supported resolution and scaling it down for smaller resolutions with the use of mipmaps.
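A minimal sketch of that idea, assuming a made-up design resolution of 1920x1080:

[code]
// Author the HUD for the highest supported resolution and scale it
// down uniformly; mipmapped textures keep the minified bitmaps smooth.
const float kDesignW = 1920.0f;  // assumed authoring resolution
const float kDesignH = 1080.0f;

float HudScale(float screenW, float screenH)
{
    // Uniform scale so the HUD shrinks without stretching.
    float sx = screenW / kDesignW;
    float sy = screenH / kDesignH;
    return (sx < sy) ? sx : sy;
}
[/code]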
You pretty much summarized the issue.

Using relative sizes naively causes stretching of the UI, and if there are bitmaps, they get stretched too.

Using absolute sizes has the effect that the larger the resolution, the smaller the item on the screen (relatively).

If you don't want to keep the size of the UI fixed at an absolute size, you need to resize, but in a way that is smarter than simply stretching the bitmaps. One way is to avoid bitmap content altogether and create the graphics required for the UI widget programmatically at the given pixel sizes, so that they appear pixel-perfect. This means using vector fonts instead of bitmap fonts, and vector graphics/vector-based draw commands for the UI. For scaling bitmaps, a method called "window-cross" or "9-patch" stretching is sometimes used. See e.g. http://developer.android.com/guide/developing/tools/draw9patch.html . This is not for scaling arbitrary images, but e.g. for the backgrounds of windows and controls.
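A rough sketch of the 9-patch idea, with a stubbed-out draw call standing in for whatever textured-quad/blit routine your renderer provides:

[code]
#include <cstdio>

struct Rect { int x, y, w, h; };

// Stand-in for the renderer's blit; here it just logs the mapping so
// the sketch is self-contained.
void DrawTexturePart(const Rect& src, const Rect& dst)
{
    std::printf("src(%d,%d %dx%d) -> dst(%d,%d %dx%d)\n",
                src.x, src.y, src.w, src.h, dst.x, dst.y, dst.w, dst.h);
}

// 9-patch ("window-cross") stretch: corners are copied 1:1, edges are
// stretched along one axis only, and only the center stretches both ways.
void DrawNinePatch(const Rect& dst, int texW, int texH, int border)
{
    int xs[4] = { 0, border, texW - border, texW };  // source columns
    int ys[4] = { 0, border, texH - border, texH };  // source rows
    int xd[4] = { dst.x, dst.x + border, dst.x + dst.w - border, dst.x + dst.w };
    int yd[4] = { dst.y, dst.y + border, dst.y + dst.h - border, dst.y + dst.h };

    for (int row = 0; row < 3; ++row)
        for (int col = 0; col < 3; ++col)
        {
            Rect src{ xs[col], ys[row], xs[col + 1] - xs[col], ys[row + 1] - ys[row] };
            Rect out{ xd[col], yd[row], xd[col + 1] - xd[col], yd[row + 1] - yd[row] };
            DrawTexturePart(src, out);
        }
}
[/code]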

One method to alleviate the effects of stretching is to author the UI in several sizes manually, and at runtime scale the UI from the authored size that matches the current size most closely, but this can look nasty, since the jump from one bitmap set to another is so sudden.
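A sketch of the selection step, with made-up authored scales:

[code]
#include <cmath>

// Pick the authored UI scale closest to what the current resolution
// wants, so bitmaps are rescaled as little as possible before display.
float ClosestAuthoredScale(float wantedScale)
{
    const float authored[] = { 0.5f, 1.0f, 1.5f, 2.0f };  // assumed asset sets
    float best = authored[0];
    for (float s : authored)
        if (std::fabs(s - wantedScale) < std::fabs(best - wantedScale))
            best = s;
    return best;
}
[/code]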

There is really no magic solution. One can use very large images and different algorithms for good down-scaling, but they never beat the result of having done 1:1 pixel art in the first place. Vector-based approaches probably work the best in general (try out different Flash applets around the web to compare), but can be tricky to implement.

Personally, I am increasingly confident the solution is to not stretch at all. Not by default at least.
[s]Most desktop monitors work at 72 dpi.[/s] Take advantage of that. This is what's typically done by most operating systems. On increasing the resolution, the UI effectively gets smaller.
Looking back, I simply cannot understand why I didn't see that in the first place.

If you need to do curved contours (for accurate widget borders when resizing), you might want to look at

Resolution Independent Curve Rendering using Programmable Graphics Hardware
Uses a custom pixel shader to evaluate Bezier curves. While the algorithm itself is rather straightforward, the machinery needed to drive it is not. For a given set of widgets the problem can be short-circuited by hard-coding the necessary triangles, but the general solution takes months of work.
Pro: appears to be able to deliver quality even on extreme magnification. Can be done in real time.
Cons: quality in minification is just so-so, it requires work, and it's a polygonal trick which won't natively interact with other effects (including rendering multiple contours).
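For reference, the per-pixel test at the heart of that paper is tiny; this is my own plain-C++ rendition of the quadratic case (in the paper it runs in a pixel shader, with (u, v) interpolated from the per-vertex values (0,0), (1/2,0), (1,1)):

[code]
// Loop/Blinn-style inside test for one quadratic Bezier triangle.
// The curve is the zero set of f(u, v) = u^2 - v in the interpolated
// (u, v) space; the sign of f picks the filled side.
bool InsideQuadraticCurve(float u, float v)
{
    return u * u - v < 0.0f;
}
[/code]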

Improved Alpha-Tested Magnification for Vector Textures and Special Effects
This evaluates an "almost vector quality" contour by means of a distance field. The math behind it is really neat yet simple. The good news: the most naive implementation requires almost no work. More good news: being essentially a texture effect, there's no need to deal with polygon offsets, and it natively interacts with most manipulations you can throw at it. The cons are limited: in my experience this method tends to break at 6-10x magnification... and the need to build the distance map can be a problem.
But it's still really good - just compute the distance field and feed it to the hardware.
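To give an idea of how little work the naive version is, here's a brute-force sketch (my own illustration; O(n^4), so offline only - a real pipeline would use a smarter sweep):

[code]
#include <cmath>
#include <vector>

// Brute-force signed distance field from a high-resolution binary glyph
// image: for each texel, find the distance to the nearest texel of the
// opposite state, negated for texels outside the shape.
std::vector<float> BuildDistanceField(const std::vector<bool>& inside,
                                      int w, int h)
{
    std::vector<float> field(w * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
        {
            bool self = inside[y * w + x];
            float best = 1e30f;  // stays huge if the image is uniform
            for (int j = 0; j < h; ++j)
                for (int i = 0; i < w; ++i)
                    if (inside[j * w + i] != self)
                    {
                        float dx = float(i - x), dy = float(j - y);
                        best = std::fmin(best, std::sqrt(dx * dx + dy * dy));
                    }
            field[y * w + x] = self ? best : -best;
        }
    return field;
}
[/code]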

EDIT: no! They actually work at 96 dpi!

Previously "Krohm"

I want to make it as easy as possible, so I guess I'll just go for absolute pixel coordinates.

Just out of curiosity: I did not understand the meaning of this dpi stuff. Wikipedia says it's "the number of dots per inch", but that does not sound very helpful to me. An inch in which direction? Horizontal, vertical, diagonal? If I have a resolution of 800x600 and change it to 800x400, the number of dots per inch in the horizontal direction does not change at all, while the resolution does.
And your statement suggested the dpi number is fixed for a monitor. So it is not dependent on the resolution? I am confused. Could someone clarify this dpi stuff?
I'll elaborate a bit as I have cut a few corners above.

Historically, there have been two standards when it comes to pixel density. You might have read this on the Wikipedia page.
Monitors are manufactured with pixels of a fixed size. This size is called dot pitch, and it is the distance between two adjacent pixels. It is a physical property which does not change.
When the resolution is changed, the monitor will stretch the image (or maybe not, depending on driver settings) to cover as much of the panel as it can. There will be fewer "logical pixels" for the same amount of "physical pixels". The difference is interpolated somehow, but in theory every monitor would always run at its highest setting.
To get an idea of this fixed density, you can disable monitor scaling.
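To put some numbers on it (the panel width is made up, just for illustration):

[code]
#include <cstdio>

int main()
{
    // Hypothetical panel 20.5 inches wide, 1920 physical pixels across.
    const float panelWidthInches = 20.5f;
    std::printf("native : %.0f dpi\n", 1920 / panelWidthInches);  // ~94 dpi
    std::printf("800x600: %.0f dpi\n",  800 / panelWidthInches);  // ~39 logical dpi
    // The physical pixel density never changed; only the logical one did.
    return 0;
}
[/code]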

In the end, this all boils down to making stuff work as the user expects, and the user expects the GUI to get bigger when the resolution is lowered, because that's how it has worked for a while. I'm not suggesting probing the system to check out the real size (in inches) of the hardware display.

The apparent pixel density (the density of logical pixels) does indeed change with resolution, unless no scaling is being used, but that's fairly rare IMHO.

Previously "Krohm"

