Engine UI


I'm currently in the early stages of programming a cross-platform game engine. My goal is to create an engine kind of like Unity, but I'm stumped on how to do it.

In Unity, the editor and the game feel like the same program: the game view is a window inside the editor, even though the game itself lives in a separate project. I don't know how to achieve this.

The game is sort of running and sort of not, sort of a separate project and sort of not, and it can be edited while it's (sort of) running?

I have no clue how to pull this off. Any help would be appreciated; don't hesitate to ask questions. Thanks.


Hello, I'd like to provide a little information which may be useful. I'm not a professional engine developer; I've only made a few engines of my own for fun, so these are just personal ideas.

For a small team, I think it is much better to use WPF plus a subprocess (for the engine) to simplify the whole task. Solutions like Unreal's (Slate, which is rendered by Unreal itself) and Unity's (based on an immediate-mode GUI) are too complex for you to build such a GUI yourself.

I recommend looking at Capcom's RE Engine architecture: https://cedil.cesa.or.jp/cedil_sessions/view/1484

They built a system where the editor is written in WPF (so it can use MVVM) and the engine runs in a separate process, which can be on the same machine or on a console. The editor and engine communicate via RPC. This approach seems really useful for a small team compared with Unreal or Unity. You can sync the rendered image back to the editor, or, much more easily, just put the engine's output window inside the editor with a native window host in WPF.
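To make the RPC part a bit more concrete, here is a minimal sketch of a length-prefixed wire format an editor and an engine subprocess could exchange over a local socket or pipe. This is purely my own illustration (the command names and layout are made up, not anything from the RE Engine talk):

```cpp
#include <cstdint>
#include <cstring>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical wire format: [u32 payload size][u32 command id][payload bytes].
// The editor and the engine subprocess would exchange these over a local
// socket or pipe; only the framing itself is shown here.
enum class Command : std::uint32_t { SetObjectName = 1, RequestFrame = 2 };

std::vector<std::uint8_t> Encode(Command cmd, const std::string& payload)
{
    std::vector<std::uint8_t> buf(8 + payload.size());
    const std::uint32_t size = static_cast<std::uint32_t>(payload.size());
    const std::uint32_t id   = static_cast<std::uint32_t>(cmd);
    std::memcpy(buf.data(),     &size, 4);
    std::memcpy(buf.data() + 4, &id,   4);
    std::memcpy(buf.data() + 8, payload.data(), payload.size());
    return buf;
}

bool Decode(const std::vector<std::uint8_t>& buf, Command& cmd, std::string& payload)
{
    if (buf.size() < 8) return false;
    std::uint32_t size = 0, id = 0;
    std::memcpy(&size, buf.data(),     4);
    std::memcpy(&id,   buf.data() + 4, 4);
    if (buf.size() < 8 + size) return false;
    cmd = static_cast<Command>(id);
    payload.assign(buf.begin() + 8, buf.begin() + 8 + size);
    return true;
}

int main()
{
    // Editor side: ask the engine to rename an object.
    const auto msg = Encode(Command::SetObjectName, "Player01");

    // Engine side: parse the message and act on it.
    Command cmd;
    std::string payload;
    if (Decode(msg, cmd, payload) && cmd == Command::SetObjectName)
        std::cout << "Engine renames object to: " << payload << "\n";
}
```

A real setup would add request ids and error replies, but the shape stays the same: the editor never touches engine state directly, it only sends messages.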

I hope this can help. Please ask any questions. I'm happy to discuss.

 

I wrote a responsive UI for mine using just OpenGL, Lua, and elbow grease, and I wouldn't recommend going down that path: responsive UI is difficult to lay out correctly and optimize, on top of having all the other engine junk running.

If you just add some static click-state buttons (i.e. click inside a rectangle image) on the screen, that's easy. However, if you're in C++ land I'd recommend you encapsulate your engine in Qt. Qt is very powerful, has tons of optimized pre-written stuff (even math), and works great as a foundation for game engines. If you don't trust it, you can also do something like @godofpen said and create a UI in C# (WPF, WinForms) or another toolkit (Java AWT, maybe even native JS) and route button messages to your game process.

However, if you plan on making a game that uses the same UI you're coding for the engine, it might help to have the whole thing in one window like Unity. That's not to mention creating a separate custom game UI editor outside of the engine UI itself; we'd hope you have a small team together before you tackle that dragon of a project. Now, if someone knows how to host the game process itself inside a Unity-like WPF window, that would be sweet, because I haven't figured out how to do that yet.
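For what it's worth, one common trick for getting a separately running game to show up inside an editor window on Windows is to re-parent the game's native window into a panel of the editor. Below is a rough sketch of the game-process side, assuming the editor has already handed over the HWND of the panel it wants the game to live in (how you pass that handle, e.g. on the command line, and the editor-side host panel are left out; all names are my own):

```cpp
#include <windows.h>

// Hypothetical helper on the game-process side: take the HWND of an editor
// panel (received e.g. as a command-line argument) and re-parent the game's
// own top-level window into it, so the game appears "inside" the editor.
void EmbedGameWindowInEditor(HWND gameWindow, HWND editorPanel)
{
    // Turn the game window into a child window and strip its own frame.
    LONG_PTR style = GetWindowLongPtr(gameWindow, GWL_STYLE);
    style &= ~static_cast<LONG_PTR>(WS_POPUP | WS_CAPTION | WS_THICKFRAME);
    style |= WS_CHILD;
    SetWindowLongPtr(gameWindow, GWL_STYLE, style);

    // Attach it to the editor panel and size it to fill the panel's client area.
    SetParent(gameWindow, editorPanel);
    RECT rc;
    GetClientRect(editorPanel, &rc);
    SetWindowPos(gameWindow, nullptr, 0, 0,
                 rc.right - rc.left, rc.bottom - rc.top,
                 SWP_NOZORDER | SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}
```

I haven't verified how this behaves with every swap-chain, focus and fullscreen setup, so treat it as a starting point rather than a drop-in solution; in WPF specifically you'd expose the panel through something like an HwndHost-derived control.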

Unity is kind of complicated to explain to someone who hasn't worked with it so far. I worked with it for years and shipped a lot of titles, so I'd claim some good insight into how it works. Unity games run in a standalone player that provides everything platform specific: the abstraction layer and whatever else the game needs, like input, rendering and so on. Your game is just a set of assets together with your IL code compiled from C#. The player embeds a Mono runtime to execute your game code, along with a C# interface library (UnityEngine.dll, which you pull in with a using statement). The Unity Editor is almost the same thing, extended with UnityEditor.dll, which adds all the editor features like the editor UI on top. So it is basically the same environment, and hitting the Play button just starts some kind of debug playback mode internally.

Unreal, on the other hand, is based on C++/CLI and partially C#, so they have access to the .NET windowing toolkits, but I can only guess what they used for their current editor.

However, I have had an idea, and it turns out Unity has had the same idea, which means it can't be that bad: using HTML/CSS as the platform to write the editor and game UI in. Unity's newest release uses UXML (an HTML-like markup) and USS (Unity Style Sheets, a subset of CSS); I wanted, and still want, to implement my engine's UI in full-featured HTML5 and CSS3. This has some advantages over any other UI toolkit:

  • It is responsive by default; HTML runs in many browsers on different platforms and resolutions, so it has to handle that out of the box
  • Easy to use; HTML/CSS can be written and tested outside of the game in any browser, and is well known
  • Powerful; HTML/CSS offers a lot of layout options such as Grid and Flexbox, and even supports SVG rendering
  • Easy integration; HTML elements offer a lot of possible ways to hook into the game. You just put the name of a function into an element's attributes and your engine can bind a native callback to it for whatever action you like (see the sketch below)
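To illustrate the last point, here is a minimal sketch of how such a name-based binding could look on the engine side. Everything here is hypothetical; a real HTML view layer (CEF, Ultralight or similar) would sit in front of it and report which attribute value was clicked:

```cpp
#include <functional>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical binding table: the UI markup refers to native actions by name,
// e.g. <button onclick="StartGame">, and the engine resolves that string to a
// registered callback when the UI layer reports a click on the element.
class UiBindings
{
public:
    void Register(const std::string& name, std::function<void()> fn)
    {
        table_[name] = std::move(fn);
    }

    // Called by the HTML/JS layer with the attribute value it found.
    void Invoke(const std::string& name) const
    {
        const auto it = table_.find(name);
        if (it != table_.end())
            it->second();
        else
            std::cout << "No native binding for '" << name << "'\n";
    }

private:
    std::unordered_map<std::string, std::function<void()>> table_;
};

int main()
{
    UiBindings bindings;
    bindings.Register("StartGame", [] { std::cout << "Starting game...\n"; });

    // The UI layer saw onclick="StartGame" and the user clicked the element.
    bindings.Invoke("StartGame");
}
```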

After ScaleForm this is the inevitable next step, and I'm happy to see Unity share my opinion. It also makes editor integration very uncomplicated, because you write the same UI in HTML/CSS and use desktop frameworks such as Node.js + Electron and possibly EdgeJS, so you won't have any trouble porting your editor to different platforms and can use a rich set of existing features while still having the possibility to embed your engine into it. Writing the editor functions in JS/TypeScript lets you provide a feature that Unity also offers: editor scripts and real-time compilation/transpiling.

This is at least one of my goals for the current iteration of my engine framework.

Thanks for all the info!

I have done something similar to Capcom's RE Engine solution (at least based on godofpen's explanation; they require you to log in to download the document and, meh). My editor is a WinForms application that starts up and connects to an instance of the game running in a special editor mode. The game then renders into a panel in the editor window, making it look like everything happens inside the same program. Technically the editor and the game are two different processes, but they keep track of each other and react if the other one crashes, etc.

You can see the editor in action in this development video: https://www.youtube.com/watch?v=fUFVh-2Aefg (video-production-wise not that great, I know; working on getting better)

There are pros and cons to doing it like this:

Pros:

  • The editor is true WYSIWYG, since the scene is rendered by the game's own renderer rather than by a separate renderer in the editor.
  • The editor UI is quite easy to keep responsive as most of the heavy stuff is performed by the game side. It does mean you have to manually block the user from doing stuff while the back-end is busy though.
  • You get to use the large library of ready-to-use components available for WinForms (also true for WPF and, I assume, Qt)
  • In my case, since I actually run the game exe through the editor, I can add editor-side functionality on the game side. E.g. if I have an object type in a game which needs to behave in a special way in the editor, I can put that code into the game exe and the core engine libraries aren't touched.

Cons:

  • Many things that are simple in a single-process editor become much more complicated. E.g. changing the name of an object in the scene involves a step on the editor side, a step on the game side, and finally another step on the editor side to keep everything in sync (there's a small sketch of this round trip after the list).
  • Debugging the editor becomes a lot trickier since you are working with two different processes.
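To make the renaming example concrete, here is a hypothetical sketch of that round trip; the object ids, message types and the synchronous flow are all made up for illustration. The point is that the editor cannot simply update its own scene view, it has to send the request and wait for the game's confirmation before its copy of the scene is allowed to change:

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical messages exchanged between the editor and the game process.
struct RenameRequest { std::uint64_t objectId; std::string newName; };
struct RenameAck     { std::uint64_t objectId; bool ok; };

// Game side: owns the authoritative scene data.
class EngineScene
{
public:
    RenameAck HandleRename(const RenameRequest& req)
    {
        const auto it = objects_.find(req.objectId);
        if (it == objects_.end())
            return {req.objectId, false};
        it->second = req.newName;
        return {req.objectId, true};
    }

    std::unordered_map<std::uint64_t, std::string> objects_{{1, "Enemy"}};
};

// Editor side: only mirrors the scene, and must wait for the ack before it
// updates its own tree view (in a real editor this exchange is asynchronous).
int main()
{
    EngineScene engine;                              // stands in for the remote game process
    std::unordered_map<std::uint64_t, std::string> editorView = engine.objects_;

    const RenameRequest req{1, "Boss"};              // step 1: editor sends the request
    const RenameAck ack = engine.HandleRename(req);  // step 2: game applies the change
    if (ack.ok)                                      // step 3: editor syncs its copy
        editorView[req.objectId] = req.newName;

    std::cout << "Editor now shows: " << editorView[1] << "\n";
}
```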

My case is complicated further by the fact that the editor functionality wasn't planned from the beginning. Instead, the editor was bolted onto the engine at a later point. Because of this I still don't have "Play in editor" functionality, i.e. the kind of thing you get by hitting Play in Unity. It is certainly possible to do, but it would require me to re-architect some core parts of the engine. Another issue is that parts of my engine runtime never expect there to be any issues with missing resources or the like. Most editors show an error message in these cases; mine might just crash, because the loading code never had to handle those situations until the editor was bolted onto the core engine. Most of these things are fairly simple to fix though...
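Speaking of missing resources, the kind of guard that helps here is cheap to add up front. A completely hypothetical sketch (the --editor flag and the placeholder approach are my own assumptions, not necessarily how the poster's engine works): when the game exe is started by the editor, resource lookups fall back to a placeholder instead of hard-failing, so the editor can show a warning rather than crash:

```cpp
#include <cstdlib>
#include <iostream>
#include <string>
#include <unordered_map>

// Hypothetical resource cache. In normal game builds a missing resource is a
// hard content error; when launched by the editor (here signalled with a made-up
// --editor flag) it falls back to a placeholder so the editor can warn instead.
struct Texture { std::string name; };

class ResourceCache
{
public:
    explicit ResourceCache(bool editorMode) : editorMode_(editorMode)
    {
        loaded_["placeholder"] = Texture{"placeholder"};
    }

    const Texture* GetTexture(const std::string& name)
    {
        const auto it = loaded_.find(name);
        if (it != loaded_.end())
            return &it->second;

        if (editorMode_)
        {
            std::cerr << "[editor] missing texture '" << name << "', using placeholder\n";
            return &loaded_["placeholder"];
        }

        // Shipping/game mode: treat it as a fatal content error.
        std::cerr << "fatal: missing texture '" << name << "'\n";
        std::abort();
    }

private:
    bool editorMode_ = false;
    std::unordered_map<std::string, Texture> loaded_;
};

int main(int argc, char** argv)
{
    bool editorMode = false;
    for (int i = 1; i < argc; ++i)
        if (std::string(argv[i]) == "--editor")
            editorMode = true;

    ResourceCache cache(editorMode);
    cache.GetTexture("missing_asset");  // placeholder in editor mode, abort otherwise
}
```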

If you are thinking about using web technology for your editors it might be worth reading through Insomniac's postmortem about their experiences: https://deplinenoise.files.wordpress.com/2017/03/webtoolspostmortem.pdf They switched away from web tech some time ago and it sounds like they weren't having that much fun going that route after all.

20 hours ago, Shaarigan said:

However, I have had an idea, and it turns out Unity has had the same idea ... using HTML/CSS as the platform to write the editor and game UI in.

I really agree with the HTML-based UI solution, too. I think @Shaarigan provides a really good point of view on this. And it fits even better if you already have a JS-based scripting system, because then it is much easier to connect the UI to the background logic than through a string-based event handler.

I want to add a little more information about Unreal. Unreal Engine 3 mixed a lot of technologies for its UI. For Unreal Engine 4, however, they built a completely self-written UI framework called Slate, which is rendered by Unreal's own rendering system. This framework is also used in games through a higher-level abstraction called UMG. UMG can be edited with a visual editor, but Slate cannot.

Actually, the right UI solution for your engine depends on a lot of things. Some of them are:

  • What is your engine for?
    • Is it a generic engine, something for internal use, or just a personal project?
    • Does your game need a complex editor?
  • What is your team's background?
    • Could they handle developing a large UI framework, or would they rather reuse an existing one?
    • Do they think a complex editing system is worth it, or is an in-game editor enough?

We can offer our own ideas, but the best solution can only be determined by you, based on your own situation.

9 hours ago, godofpen said:

Unreal Engine 3 mixed a lot of technologies for its UI

That's right. I struggled with their ScaleForm implementation having a memory leak when porting a game to PlayStation, so Flash is a no-go for me right now :D

Both have been discontinued, Flash in 2017 and ScaleForm in 2018, so I agree you shouldn't integrate them into new products.

Some more input if you like the HTML route:

  • Coherent Labs' various products [Paid] (aka Scaleform's more modern alternative)
  • Ultralight [Free for individuals and small indies] (aka the lighter-weight, open alternative to Scaleform and Coherent)
    • Essentially just an HTML rendering engine with JavaScript binding support
    • Just got to version 1.0
    • Missing lots of features in comparison to Scaleform/Coherent
    • https://ultralig.ht/

