I am about 60-70% done with my game engine editor, and I have some questions about how best to design its threading architecture.
Currently, I have 2 separate threads: one that only handles OS events (call this the UI thread), and another that is the "main" thread of the editor, which updates in a loop at 60Hz and handles all of the rendering/simulation updates. My editor uses a custom GUI rendered with OpenGL, and all GUI updates/rendering also happen on the main thread. Events (mouse, keyboard) received on the UI thread are double-buffered and sent to the main thread, where they are dispatched to the hierarchy of GUI widgets. This worked fine until I started adding more features.
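For reference, the hand-off currently looks roughly like this (a simplified sketch; `InputEvent` and the class name are stand-ins for my real types):

```cpp
#include <mutex>
#include <utility>
#include <vector>

struct InputEvent { /* mouse/keyboard payload */ };

class EventBuffer {
public:
    // UI thread: called from the OS event callback.
    void Push(const InputEvent& e) {
        std::lock_guard<std::mutex> lock(mutex_);
        writeBuffer_.push_back(e);
    }

    // Main thread: swap buffers once per frame, then dispatch the
    // drained events to the widget hierarchy with no lock held.
    std::vector<InputEvent>& Drain() {
        readBuffer_.clear();
        {
            std::lock_guard<std::mutex> lock(mutex_);
            std::swap(writeBuffer_, readBuffer_);
        }
        return readBuffer_;  // valid until the next Drain()
    }

private:
    std::mutex mutex_;
    std::vector<InputEvent> writeBuffer_;  // filled by the UI thread
    std::vector<InputEvent> readBuffer_;   // consumed by the main thread
};
```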
Recently, I implemented drag and drop, which requires handling everything on the UI thread so that it can inform the OS whether a drop can be performed. So far, I have mostly ignored thread safety. When I drag objects around the editor (say, from a list into a scene viewport), the dropped objects can end up having their graphics data initialized from the UI thread, which crashes because there is no OpenGL context on the UI thread. The same problem occurs with native OS menus: selecting a menu item (e.g. creating a new mesh) invokes a callback on the UI thread, and if I create the object directly in that callback, it can crash when the graphics data is initialized.
I could synchronize these callbacks, but that is quite a lot of work, since I have dozens of menus that would each need to safely communicate to the main thread which item(s) were selected. Drag and drop is more problematic because the UI thread callback has to return the result of the drag operation immediately.
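For the menus at least, the synchronization I'm dreading would boil down to a generic "run this on the main thread" queue, something like the following (an illustrative sketch with made-up names, not my actual code):

```cpp
#include <functional>
#include <mutex>
#include <utility>
#include <vector>

class MainThreadQueue {
public:
    // UI thread: a menu callback posts a closure instead of touching
    // engine objects directly.
    void Post(std::function<void()> task) {
        std::lock_guard<std::mutex> lock(mutex_);
        tasks_.push_back(std::move(task));
    }

    // Main thread: called once per frame, before the simulation update.
    void Flush() {
        std::vector<std::function<void()>> pending;
        {
            std::lock_guard<std::mutex> lock(mutex_);
            pending.swap(tasks_);
        }
        for (auto& task : pending)
            task();  // the GL context is current here, so creation is safe
    }

private:
    std::mutex mutex_;
    std::vector<std::function<void()>> tasks_;
};

// e.g. the "New Mesh" menu callback on the UI thread would become:
//   gMainQueue.Post([]{ gEditor->CreateMesh(); });  // hypothetical names
```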
So, I'm asking for advice on how to proceed. I see 2 possible paths:
- Merge the UI/main threads into a single thread that does everything. This would be the safest option, but it could hurt responsiveness, and I'm worried about what happens when I add VR support and need precise control over the frame rate. On OS X (my main development platform), I don't control the UI thread's event loop, so I'm not sure how to guarantee a callback every 60Hz or 90Hz to keep up the frame rate (my best idea so far is sketched below this list). It would also mean rewriting a lot of the windowing/buffer-swapping code to work from a UI thread callback instead of a simple loop.
- Keep it as is, but synchronize the hell out of it. This is a LOT of work (probably 20-40 man-hours, which I'd rather avoid): every single menu in this very large editor would need thread-safe event communication, and drag and drop becomes harder to implement. I would have to mutex with the main thread, do everything on the UI thread, and then hand the dropped objects over to the main thread when a drop occurs, at every place in the GUI that accepts a drag operation (dozens of locations; see the drag-and-drop sketch after this list). BUT, this way I keep more control over the frame rate (e.g. I can drop to 10Hz when the editor is not the foreground app), VR timing becomes easier, and I don't have to worry about stalling the UI thread if my rendering takes too long.
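For option 1, the closest thing I've found to a guaranteed periodic callback on OS X is a GCD timer on the main queue, something like this (a sketch, compiled as Objective-C++ or C++ with clang blocks; `RenderFrame` stands in for my loop body), though I don't know how well it holds up for VR-grade timing:

```cpp
#include <dispatch/dispatch.h>

void RenderFrame();  // placeholder for my per-frame update + render + swap

// Drive the editor at ~60Hz from the main run loop, interleaved with OS
// event delivery, instead of from a dedicated loop.
void StartFrameTimer() {
    dispatch_source_t timer = dispatch_source_create(
        DISPATCH_SOURCE_TYPE_TIMER, 0, 0, dispatch_get_main_queue());
    dispatch_source_set_timer(timer,
                              dispatch_time(DISPATCH_TIME_NOW, 0),
                              NSEC_PER_SEC / 60,    // 60Hz interval
                              NSEC_PER_SEC / 600);  // small leeway
    dispatch_source_set_event_handler(timer, ^{
        RenderFrame();  // runs on the main (UI) thread
    });
    dispatch_resume(timer);
    // Intentionally never released; the timer lives for the app's lifetime.
}
```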
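For option 2, the drag-and-drop handshake I'd have to replicate at every drop target would look something like this (a rough sketch with invented names):

```cpp
#include <mutex>
#include <vector>

// Illustrative types and names; not my real API.
struct DragPayload { /* asset IDs, source widget, etc. */ };

class DropTarget {
public:
    // UI thread, inside the OS drag-over callback: must answer the OS
    // synchronously, so take the GUI mutex and only hit-test; no GL work.
    bool CanAcceptDrop(const DragPayload& payload, int x, int y) {
        std::lock_guard<std::mutex> lock(guiMutex_);
        return HitTest(x, y) && Accepts(payload);
    }

    // UI thread, inside the OS drop callback: just record the drop.
    void CommitDrop(const DragPayload& payload, int x, int y) {
        std::lock_guard<std::mutex> lock(guiMutex_);
        pendingDrops_.push_back({payload, x, y});
    }

    // Main thread, once per frame: create the dropped objects while the
    // OpenGL context is current. The main thread would also take
    // guiMutex_ around its own GUI updates so hit-tests stay consistent.
    void FlushDrops() {
        std::vector<PendingDrop> drops;
        {
            std::lock_guard<std::mutex> lock(guiMutex_);
            drops.swap(pendingDrops_);
        }
        for (const auto& d : drops)
            CreateDroppedObject(d);  // GPU resources initialized safely here
    }

private:
    struct PendingDrop { DragPayload payload; int x, y; };

    // Placeholders for the real widget logic.
    bool HitTest(int, int) const { return true; }
    bool Accepts(const DragPayload&) const { return true; }
    void CreateDroppedObject(const PendingDrop&) {}

    std::mutex guiMutex_;
    std::vector<PendingDrop> pendingDrops_;
};
```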
How do existing game engine editors (Unity, Unreal, etc.) manage their threads? I believe the Unity editor runs entirely on the UI thread (including rendering), but I may be wrong. What is the best overall architecture that is versatile, safe, performant, and future-proof?