Renderer and GUI

Started by
3 comments, last by XeonXT 13 years, 7 months ago
I'm just curious how most people handle their GUI rendering in terms of their renderer. Currently, for all my 3D objects, I fill an object and send it to the renderer, where it's put in a render queue that handles sorting and drawing it. Should I do the same with the 2D/GUI stuff and send some kind of specialised object for drawing 2D elements?

If so, would it be better to put them in separate queues and handle them separately?

What are some of the ways others are handling this sort of thing?

Cheers
I'd recommend adding the idea of views to your renderer. Each view can have its own projection/view matrices associated with it. With this setup, you can just create a 3D view and a 2D view and render all of your frontend to the 2D view.

The 2D view, of course, would have an orthographic projection set up as its projection matrix.
In general, a GUI comprises one or more textured screen-space quads drawn after all your other rendering. Yes, it would probably be better to give them their own queue. But do you really need a queue for the GUI? IMO, it's best to handle the entire GUI system as a single vertex and index buffer so it all takes only a single draw call. Of course, the tricky part of this is packing everything you need into a single texture.

There's really no need to worry about transforms, as you can specify your vertex coordinates directly in normalized coordinates (or, even better, viewport coordinates, dividing automatically by the dimensions of your viewport).
I like the idea of having one VB/IB for rendering the GUI. The packing of the textures I could probably achieve by writing some kind of custom sprite/texture generator which takes all the images used and compiles a texture after initialization. However, I'm not sure one texture could represent the entire GUI. Surely it couldn't fit all of the frontend GUI plus the in-game GUI?


I'm still not sure how it fits in nicely with the renderer. For the 3D world I send an abstract DrawAtom structure to the renderer, which contains just the bare basics needed to render the object. With the GUI/HUD, are you saying it'd be best to have a separate 2D rendering function where I simply send the VB/IB/texture from the GUI manager?
Hmm, I think it really depends on how complex your GUI is, and whether or not you'd be able to fit it into one sheet. If you are clever about tiling backgrounds on large windows and such, you could fit it on one texture. However, if you're going to have an ornate GUI in which every piece is unique (i.e. no tiling and no reusing pieces), I suppose it will be challenging.
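As a rough sketch of the atlas-building idea mentioned above, a naive "shelf" packer (all names here are hypothetical) lays images out left to right, starting a new row when the current one is full:

```cpp
#include <vector>

struct Rect { int x, y, w, h; };

// Naive shelf packer: places each image to the right of the previous
// one, dropping down to a new shelf when the atlas width is exceeded.
// Returns the placement of each input size, or an empty vector if
// something doesn't fit.
std::vector<Rect> packShelf(const std::vector<Rect>& sizes,
                            int atlasW, int atlasH) {
    std::vector<Rect> placed;
    int penX = 0, penY = 0, shelfH = 0;
    for (Rect r : sizes) {
        if (penX + r.w > atlasW) {  // current shelf is full
            penY += shelfH;
            penX = 0;
            shelfH = 0;
        }
        if (r.w > atlasW || penY + r.h > atlasH)
            return {};              // image doesn't fit in the atlas
        r.x = penX;
        r.y = penY;
        placed.push_back(r);
        penX += r.w;
        if (r.h > shelfH) shelfH = r.h;
    }
    return placed;
}
```

Sorting the images by height before packing tightens the result considerably; real tools use smarter algorithms, but this is enough to generate UV rectangles for a simple GUI sheet.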

I believe you're making the rendering part of it too complicated. Granted, I don't know how you have your engine set up, but it shouldn't be a problem to set up a GUI object that encapsulates a single VB/IB storing the GUI information, then either do an extra draw call for the GUI after your other rendering is done or send it to the engine to be drawn after everything else.

Perhaps your problem is reconciling the 2D with the 3D. Understand this: the GUI is 3D, but the z-values of its vertices are 0 and the vertices are not subject to view or projection matrices. You should have a way to enable a shader that simply draws vertices without modifying their positions. This is what you would use to render the GUI.

In this sense, the GUI can be treated exactly like any other object in your engine (assuming there's some degree of flexibility in shader usage). The only special thing about it is that it must be drawn last (assuming it has transparency; otherwise, draw it first so you don't even have to render the pixels it occludes). You can implement that however you like.

[Edited by - XeonXT on September 3, 2010 3:26:10 PM]

