This is a repost from my external blog http://codingintransit.blogspot.ca
I spent the last two weeks working on a two-week sprint to try and get the beginnings of a UI editor in place.

Week 1
The first step was cleaning up all the UI Elements I'd accrued over the last year or so. The big problem was that as my codebase evolved I didn't want to spend the time to go back and clean up old systems. On one hand this worked to my advantage because I was able to iterate much faster. On the other hand it left me with a lot of cruft in my codebase.
The first thing I wanted to clean up was the UIClickable. Maybe I should go into a bit of background before I get into the details, though. Almost every UIComponent is made up of four core sprockets: TransformSprocket, UIDimensions, UIClickable, and UIRenderable.
TransformSprocket: A common sprocket that just wraps a matrix. I use it in any component that needs a transform of some kind.
UIDimensions: This guy contains a width and height and is used to calculate the bounding box of a UI element.
UIClickable: Things get a little messy here; the name was a bad choice. This guy basically takes care of any input a UI component might need, which currently means mouse events and keyboard events.
UIRenderable: I haven't been a fan of this guy from the start, but basically any UI element that can be displayed has one of these and gets linked up through the IO system with a render callback which handles any necessary rendering.
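To make the composition concrete, here's a minimal sketch of how the four sprockets might fit together. The names come from the post, but the fields, the matrix layout, and the centered bounding box are my assumptions for illustration:

```cpp
#include <cassert>

// Hypothetical layouts for the four core sprockets described above.
struct TransformSprocket { float matrix[16]; };       // wraps a transform matrix
struct UIDimensions      { float width, height; };    // bounding-box extents
struct UIClickable       { bool hovered, pressed; };  // input state
struct UIRenderable      { void (*renderCallback)(const UIRenderable&); };

struct UIComponent {
    TransformSprocket transform;
    UIDimensions      dimensions;
    UIClickable       clickable;
    UIRenderable      renderable;
};

// Assumes a column-major matrix with translation in elements 12/13,
// and dimensions centered on the transform's position.
inline bool containsPoint(const UIComponent& c, float x, float y) {
    float cx = c.transform.matrix[12];
    float cy = c.transform.matrix[13];
    return x >= cx - c.dimensions.width  * 0.5f &&
           x <= cx + c.dimensions.width  * 0.5f &&
           y >= cy - c.dimensions.height * 0.5f &&
           y <= cy + c.dimensions.height * 0.5f;
}
```
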
Ok, so where was I? Ah yes, the UIClickable. Previously I had an old message passing system which packed any necessary data into a message class, so any function that took a message had to pull the data it needed out of the Message object. The problem with this, of course, is that the person implementing the function has no idea what's actually in the message, since it's really just a blob of data. Over the course of the project this evolved into my Energon IO system, which takes strongly typed messages so you know exactly what you're getting. The problem was that I still had a lot of the old message handling going on, and UIClickable had paths for both. So the first step was ripping out all the old message handling and cleaning up any UI elements that were still using that code path.
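The difference between the two styles can be sketched roughly like this. The Energon API isn't shown in the post, so the port/connect/fire shape below is my own illustration of "strongly typed messages":

```cpp
#include <cassert>
#include <functional>
#include <vector>

// Old style (hypothetical): handlers dig fields out of an untyped blob.
struct Message { std::vector<unsigned char> blob; };

// New style: each event is its own type, so a handler knows exactly
// what it's getting at compile time.
struct MouseDownEvent { float x, y; int button; };

struct MouseDownPort {
    std::vector<std::function<void(const MouseDownEvent&)>> handlers;

    void connect(std::function<void(const MouseDownEvent&)> fn) {
        handlers.push_back(std::move(fn));
    }
    void fire(const MouseDownEvent& e) {
        for (auto& fn : handlers) fn(e);  // every handler sees typed data
    }
};
```
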
While I was cleaning up the code paths, another problem I had with the UIClickable was that all of my coordinates for an object were based on their parent's coordinate system: for a button at 0,0 that lived in a window, the 0,0 would refer to the center of the window. To accomplish this I had put all of the conversion code in the mouse event callbacks of every UI element, which meant a lot of duplicated code, and in some cases the math was slightly different than in other elements.
To clean this up I moved all of the convert-to-local-space calculations into the UIClickable. Since every sprocket can access its parent, and from there the TransformSprocket and UIDimensions, it wasn't much work to refactor all this code. The downside was that not only did I have to rip out all the old code paths in the UI elements, I also had to refactor the current code paths to account for the fact that this was now being calculated inside the UIClickable.
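The conversion itself can be sketched as a walk up the parent chain. The post doesn't show the math, so this assumes 2D translation-only transforms with dimensions centered on each element's origin:

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Hypothetical element: position is in the parent's space,
// where 0,0 means the parent's center.
struct Element {
    Vec2  position;
    float width, height;
    const Element* parent;
};

// Subtract each ancestor's position so the result is relative
// to this element's own origin.
inline Vec2 toLocal(const Element& e, Vec2 p) {
    for (const Element* cur = &e; cur; cur = cur->parent) {
        p.x -= cur->position.x;
        p.y -= cur->position.y;
    }
    return p;
}

inline bool hit(const Element& e, Vec2 screen) {
    Vec2 local = toLocal(e, screen);
    return local.x >= -e.width  * 0.5f && local.x <= e.width  * 0.5f &&
           local.y >= -e.height * 0.5f && local.y <= e.height * 0.5f;
}
```

Centralizing this in one place means every element's hit test uses identical math, instead of each mouse callback carrying its own slightly different copy.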
Another thing I put in that I didn't have before was mouse over events. I think this gives a really nice touch to the UI; everything feels more alive when there's some kind of texture change or effect on mouse over. This led me to rework my buttons a little more. As it stood, the buttons had support for textures on different states, i.e. mouse over, mouse down, and idle. However, I hadn't actually hooked those textures up to the UISprite which is part of the button. So I set about adding a simple state machine for the button and setting up the sprite to take a different texture depending on the state the button was in. I set up some safeguards: if a texture for a state isn't set in the create params it will just use the idle texture, and of course if there isn't an idle texture set it will assert.
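That fallback logic might look something like the sketch below. Texture names stand in for whatever the engine actually stores (presumably asset handles), and the struct layout is assumed:

```cpp
#include <cassert>
#include <string>

enum class ButtonState { Idle, MouseOver, MouseDown };

// Hypothetical create params: idle is required, the rest optional.
struct ButtonCreateParams {
    std::string idleTexture;
    std::string mouseOverTexture;
    std::string mouseDownTexture;
};

struct Button {
    ButtonCreateParams params;
    ButtonState state = ButtonState::Idle;

    // Pick the texture for the current state, falling back to idle
    // when a state's texture wasn't set; assert if idle is missing.
    const std::string& currentTexture() const {
        assert(!params.idleTexture.empty() && "idle texture is required");
        switch (state) {
        case ButtonState::MouseOver:
            return params.mouseOverTexture.empty() ? params.idleTexture
                                                   : params.mouseOverTexture;
        case ButtonState::MouseDown:
            return params.mouseDownTexture.empty() ? params.idleTexture
                                                   : params.mouseDownTexture;
        default:
            return params.idleTexture;
        }
    }
};
```
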
At this point I took a little break from the UI and had a look at revamping some of the code I had in place for hot swapping assets. What I had in place previously was a system which allowed you to specify a directory and a callback and if any files in that directory changed the callback would be called with the name of the file allowing you to perform any needed code to reload the asset.
This didn't really work very well for assets that were nested within multiple directories, or when multiple asset types lived in one folder. What I decided to do instead was change my FileWatcher task to register callbacks with the AssetInventory based on asset type, so there's a callback for Textures, Shaders, VoxelSprites, and so on. I set up callbacks for all the main asset types I'm currently loading and plan on expanding it down the road with user-specified type callbacks. The watcher now queries the AssetInventory every so often for the loaded assets. It calculates an initial timestamp for any new assets; if an asset is already registered with the watcher, it checks whether the timestamp has changed. If it has, the asset is reloaded and swapped out in the AssetInventory. This way it's seamless to all the other systems, since all assets in the engine are passed around as handles which point back to the AssetInventory.
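The timestamp-polling part of that loop can be sketched like this. The names and signatures here are assumptions; the real watcher gets its timestamps from disk and its asset list from the AssetInventory:

```cpp
#include <cassert>
#include <cstdint>
#include <functional>
#include <map>
#include <string>

using ReloadCallback = std::function<void(const std::string& path)>;

// Hypothetical watcher core: remember the last-seen timestamp per
// asset, and fire the reload callback only when the stamp changes.
struct FileWatcher {
    std::map<std::string, uint64_t> lastSeen;
    ReloadCallback onReload;

    // Called periodically for each loaded asset with its current
    // on-disk timestamp.
    void poll(const std::string& path, uint64_t timestamp) {
        auto it = lastSeen.find(path);
        if (it == lastSeen.end()) {
            lastSeen[path] = timestamp;   // new asset: just record it
        } else if (it->second != timestamp) {
            it->second = timestamp;       // changed: reload and swap
            if (onReload) onReload(path);
        }
    }
};
```
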
At least... all assets should be handles. The reality of it was that I had written a lot of my code that works with textures before the AssetInventory was in place. So this meant refactoring a lot of my material system to work with handles to textures instead of pointers. While there's a bit of a hit dereferencing the handles I think it's worth it in the long run. It'll help a lot when I get into streaming textures, so that I'll be able to seamlessly load in different mip levels by just swapping out textures in the AssetInventory.
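The reason handles make the swap seamless is the extra level of indirection: systems hold a stable slot into the AssetInventory rather than a raw pointer. A minimal sketch, with the slot-based layout being my assumption:

```cpp
#include <cassert>
#include <cstdint>
#include <string>
#include <vector>

struct Texture { std::string name; };

// A handle is just a stable index back into the inventory.
struct TextureHandle { uint32_t slot; };

struct AssetInventory {
    std::vector<Texture> textures;

    TextureHandle add(Texture t) {
        textures.push_back(std::move(t));
        return {static_cast<uint32_t>(textures.size() - 1)};
    }

    // This lookup is the "bit of a hit" mentioned above.
    Texture& resolve(TextureHandle h) { return textures[h.slot]; }

    // Hot swap: replace the asset in place; every outstanding
    // handle now resolves to the new data with no further work.
    void swap(TextureHandle h, Texture t) { textures[h.slot] = std::move(t); }
};
```
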
Finally I started fleshing out the UIEditor. I set up a new game state that's selectable from the main screen. I also added some of the components I had in the level editor: a protoform type list box for selecting what type of sprocket I'd like to create, and a protoform property window, which allows editing of the create params for a sprocket type. I also started to ponder how I would hook up these UI screens I create to actual gameplay logic. I settled on a solution where I create an interface sprocket containing a number of inputs and outputs that I can link up in my IO Editor component to handle most of the logic the screen would need. I also pondered setting up a scripting language to use with the front end. I'd like to get a scripting language in eventually for gameplay, so I thought it might be as good a time as any. I put that on the back burner for now though, because it comes with its own set of problems that I'm sure will keep me busy for a few weeks.

Week 2
While I had added the protoform property window in the previous week, with all my changes to the UI nothing worked on it. It didn't even render! So I spent most of the day getting all the UI sprockets it uses working again. It was mostly easy work, just a lot of ripping out code and waiting for builds.
The next thing I wanted to get going was a gizmo for quickly editing a UI sprocket. My plan was basically a border around the object with three buttons in three of the corners: one each for translation, rotation, and scale. Clicking and dragging the appropriate button would perform that transformation. I was able to get a button spawning, but it's a bit of a hack right now and the gizmo isn't hooked up to anything yet.
I also began pondering what my next sprint would be. I think I'd like to spend one more week fleshing out the basics of the UI Editor and then start working on an Animation Editor. It's kind of a big jump but I'd really like to get going on gameplay mechanics again and one of the things that was holding me back was getting in some placeholder animations. I think an animation editor would help get me going again.
Unfortunately this week was mostly all about the builds. I was changing a lot of core headers, which caused 15-20 minute builds on this little netbook; that's a good chunk of my commute! I ran into a lot of problems with my protoforms, which are essentially an XML tree of variable, input, and output definitions with state stored alongside them. I use protoforms as a way of serializing objects in and out, as well as editing the create params for a sprocket before I actually spawn it. All of this is done via a few handy macros that do some code generation to create the functions the protoforms query to get reflection data on all of the create params, inputs, and outputs an object can have.
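The macro-generated reflection might look roughly like the sketch below. The real protoform macros and the VariableInfo layout aren't shown in the post, so everything here is illustrative:

```cpp
#include <cassert>
#include <cstddef>
#include <string>
#include <vector>

// Hypothetical per-variable reflection record, in the spirit of the
// VariableInfo mentioned later in the post.
struct VariableInfo {
    std::string name;
    std::string type;
    size_t      offset;  // byte offset into the create params struct
    size_t      size;
};

// A macro like this can generate a record per create param.
#define REFLECT_VAR(Params, field, typeName)                         \
    { #field, typeName, offsetof(Params, field), sizeof(Params::field) }

struct ButtonCreateParams {
    float width;
    float height;

    // The protoform would query something like this for its
    // reflection data on the create params.
    static const std::vector<VariableInfo>& reflection() {
        static const std::vector<VariableInfo> info = {
            REFLECT_VAR(ButtonCreateParams, width,  "float"),
            REFLECT_VAR(ButtonCreateParams, height, "float"),
        };
        return info;
    }
};
```
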
I also started cleaning up another element of my UI Sprockets. Every UI Sprocket took an FEContext as a create param. The FEContext had pointers to all the systems the sprocket would need to create itself: things like memoryAllocators, renderContext, fontSystem, and assetInventory. The thing is, though, I later added a ConstructionContext that gets automatically stored with every create param, and it already has all the necessary systems in it. I added that a while after I started building my first set of UI Sprockets, so none of them were really taking advantage of the ConstructionContext.
I spent a lot of time at the end of the week thinking about refactoring how the protoforms currently work. One thing that really bugs me about them is that they encompass both the reflection data and the state data. I'd like to separate them, so each class has only one set of reflection data and the actual state for each of those pieces of data is stored separately in the protoform as a tree of state data. For example, I have a VariableInfo that stores things like the name, the type, the data offset from the beginning of the create params, the size, etc., but it also stores the actual state of the variable inside an object that manages converting to and from a string representation. I'd like to make these two concepts very separate. It'd be a big overhaul of the system as it stands, though, so for now I'll just leave it how it is.
So that brings us up to date. I'm currently working on a one week sprint that cleans up some of the leftover things like addressing the gizmo a bit more in depth, saving/loading screens and properly handling multiple UI objects. I also need to address properly parenting a child UI Sprocket to a parent UI Sprocket, but that's all for another week.
Before Mouse Over
After Mouse Over
First shot of my gizmo for editing. I know it looks awful; it's just placeholder.
Showing a spawned button
Illustrating a pressed button on the gizmo; when dragging that button, the width and height of the UI element get changed.
A bunch of spawned buttons of different sizes.