Got some scattered questions regarding rolling your own GUI system. I have my own game engine with a renderer and window/input handling in place but no GUI stuff yet.
1. Setting up a window and reading mouse/keyboard input involves the Win32 API. I know the API is archaic, but is it unreasonable to try to build GUI controls (buttons, text boxes, menus) on top of it? Don't newer versions of Windows ship with newer UI libraries?
2. Can checking whether a button was clicked simply be done by testing whether the mouse position falls inside the button's rectangle whenever WM_LBUTTONDOWN is posted?
3. How are more complex controls, such as drop-down menus, implemented?
4. (DX11) What is the best/most flexible way to render text? Say you have a text box. It's easy enough to render the box itself, but how do you render arbitrary text inside it? What about different sizes/fonts? Word wrap?
5. (DX11) Since the GUI is rendered on top of the in-game graphics, could you just render it directly into the back buffer after all 3D rendering has been done?