class VzAppManager - base class that stores persistent data and provides support services - GetGraphics(), GetWindow(), SetEditMenu(), etc.
class MeshEdit3Manager - inherits from VzAppManager. An explicit class for my mesh editor. I could have just expanded VzAppManager to handle everything, but since this is my first attempt at adapting L. Spiro's gamestate architecture to an editor, I want to keep classes such as VzAppManager reusable.
class VzMsgProc - base class with a single virtual function:

virtual LRESULT MsgProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam);
Used for main window procedure dispatching. I.e.,
LRESULT CALLBACK WndProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam)
{
    // Pointer stored at window creation via SetWindowLongPtrA(hWnd, GWLP_USERDATA, (LONG_PTR)proc).
    // Note: use the ...Ptr variants (GetWindowLongPtrA / GWLP_USERDATA) so the
    // pointer isn't truncated in 64-bit builds.
    VzMsgProc* proc = (VzMsgProc*)GetWindowLongPtrA(hWnd, GWLP_USERDATA);
    if (proc != NULL)
        return proc->MsgProc(hWnd, message, wParam, lParam);
    ... // default processing
}
class VzEditMode - base class for editing modes. Similar to L. Spiro's gamestate. VzEditMode inherits from VzMsgProc so any mode can be at the top of the messaging chain. This class also has L. Spiro's virtual methods:
virtual void Init(VzAppManager *appManager) {}
virtual void Destroy(VzAppManager *appManager) {}
virtual void Tick(VzAppManager *appManager);
virtual void Draw(VzAppManager *appManager) {}
virtual LRESULT MsgProc(HWND hWnd, UINT message, WPARAM wParam, LPARAM lParam);
Now, for example, I have a class for editing the mesh object as a whole (not individual verts or faces).
class VzModeObject - inherits from VzEditMode and implements:
Init - gets the mesh pointer from appManager (may be null if nothing's loaded or the mesh was deleted), and sets up the main window's Edit menu and the keyboard accelerators associated with that menu. At the moment, the Edit menu for this mode is just "Verts V" -> go into vertex edit mode through the menu, or via its keyboard shortcut, V.
Tick - calls the app manager's CameraHandleMouseWheel() function, which zooms/unzooms the view - that's what I want the mouse wheel to do in this mode. Not implemented yet (TBD): checking for user commands to translate/scale/rotate the mesh.
Draw - draws the mesh with the user's current options for wireframe/solid fill, lighting on/off, cull-backface/cull-none, etc.
MsgProc - at the moment, just checks whether the window message is WM_COMMAND with a message ID of IDM_EDIT_MENUSTART - that's the ID sent by the Edit menu "Verts" option or its accelerator (keyboard shortcut). If so, mgr->SetNextMode( VZMODE_EDITVERTS ) is called, and VzAppManager sets a flag to change the mode at the next Tick() call in the main loop.
MsgProc then simply returns mgr->MsgProc(hWnd, message, wParam, lParam) to give the MeshEdit3Manager instance a chance at the window messages.
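The deferred mode switch described above can be sketched roughly as below. This is a minimal, platform-neutral mock-up assuming hypothetical names (VzModeVerts, CurMode(), m_nextModeId) modeled on the post, not the actual code; the key point is that SetNextMode() only records a request, and the swap happens at the top of the next Tick() so a mode is never destroyed while its own MsgProc is still on the call stack.

```cpp
#include <cassert>
#include <cstddef>

// Hypothetical mode IDs, modeled on VZMODE_EDITVERTS from the post.
enum VzModeId { VZMODE_NONE, VZMODE_OBJECT, VZMODE_EDITVERTS };

class VzAppManager; // forward declaration

class VzEditMode {
public:
    virtual ~VzEditMode() {}
    virtual void Init(VzAppManager*) {}
    virtual void Destroy(VzAppManager*) {}
    virtual void Tick(VzAppManager*) {}
};

class VzModeVerts : public VzEditMode {};

class VzAppManager {
public:
    VzAppManager() : m_curMode(NULL), m_nextModeId(VZMODE_NONE) {}

    // Called from a mode's MsgProc; only records the request.
    void SetNextMode(VzModeId id) { m_nextModeId = id; }

    // Called once per frame from the main loop; performs the deferred
    // swap, then ticks whatever mode is current.
    void Tick() {
        if (m_nextModeId != VZMODE_NONE) {
            if (m_curMode) { m_curMode->Destroy(this); delete m_curMode; }
            m_curMode = CreateMode(m_nextModeId);
            if (m_curMode) m_curMode->Init(this);
            m_nextModeId = VZMODE_NONE;
        }
        if (m_curMode) m_curMode->Tick(this);
    }

    VzEditMode* CurMode() const { return m_curMode; }

private:
    static VzEditMode* CreateMode(VzModeId id) {
        return id == VZMODE_EDITVERTS ? new VzModeVerts() : NULL;
    }
    VzEditMode* m_curMode;
    VzModeId    m_nextModeId;
};
```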
MeshEdit3Manager::MsgProc does bunches of editor-specific stuff - checks for commands to exit and to change wireframe, culling, and lighting states; stores keyboard input; stores flags and mouse positions for LMB down/up, mouse moves, and mouse-wheel state; and handles window resizing, making calls to Graphics() to resize the backbuffer, etc. [ Come to think of it, the app-exit command should probably be handled by VzAppManager instead. --> Added to the TODO list. ]
MeshEdit3Manager::MsgProc then returns the result of VzAppManager::MsgProc().
VzAppManager::MsgProc (at the moment) handles just WM_PAINT and WM_DESTROY. If the message is handled, it returns 0; otherwise it returns the one and only DefWindowProc call in the chain.
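The whole chain - mode MsgProc, then manager MsgProc, then the single DefWindowProc at the end - is essentially a chain of responsibility. A stripped-down, platform-neutral sketch (stand-in message IDs and a fake DefProc replace the real HWND/UINT/DefWindowProc plumbing, so the names here are assumptions):

```cpp
#include <cassert>

typedef long LRESULT;
enum { MSG_PAINT = 1, MSG_DESTROY = 2, MSG_COMMAND = 99 };

LRESULT DefProc(int) { return -1; }  // stand-in for DefWindowProc

struct VzAppManagerProc {
    // Handles a few messages itself; everything else falls through
    // to the one and only DefWindowProc call at the end of the chain.
    LRESULT MsgProc(int msg) {
        switch (msg) {
        case MSG_PAINT:
        case MSG_DESTROY:
            return 0;               // handled
        }
        return DefProc(msg);        // default processing
    }
};

struct VzModeObjectProc {
    VzModeObjectProc(VzAppManagerProc* m) : mgr(m), sawCommand(false) {}
    // The mode peeks at the message first, then always forwards it so
    // the manager gets a chance at every message too.
    LRESULT MsgProc(int msg) {
        if (msg == MSG_COMMAND) sawCommand = true;  // mode-specific handling
        return mgr->MsgProc(msg);
    }
    VzAppManagerProc* mgr;
    bool sawCommand;
};
```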
Of note:
case WM_DESTROY:
    // The current mode and its MsgProc will soon be deleted,
    // so provide a MsgProc until the app is closed.
    GetMainWindow().SetMsgProc(this);
    PostQuitMessage(0);
    break; // eventually returns 0
Next up, I'm thinking: set up VzMode_VertEdit - first part: how to display selected/unselected vertices.
Good approach. My modeller uses a similar system: there is a base Tool class that each of the editing tools derives from, and the Tool interface looks a lot like what you have. The ToolBar widget takes care of setting the current tool, and the View class interacts only with the base Tool interface which, as well as having methods for mouse move, press, etc., also supports an interface for custom drawing on the view - so the View is completely ignorant of what the current tool is actually doing.
Actually, there are up to four views in my editor on screen at any one time, so all the Tool methods take a pointer to a View as their first parameter; whichever view you click on, for example, is made available to the Tool method locally like that. That saved a great deal of pain.
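In other words, the interface looks something like the following sketch (all names here are my guesses at the shape of it, not the actual modeller's code) - the View* parameter means the tool never has to ask "which view am I in?":

```cpp
#include <cassert>
#include <cstddef>

// One of the (up to four) on-screen views.
struct View { int id; };

// Base Tool interface: every method receives the originating View.
class Tool {
public:
    virtual ~Tool() {}
    virtual void OnMousePress(View* v, int x, int y) = 0;
};

// Example tool that remembers which view was last clicked.
class SelectTool : public Tool {
public:
    SelectTool() : lastView(NULL) {}
    virtual void OnMousePress(View* v, int, int) { lastView = v; }
    View* lastView;
};
```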
It's a very nice, extensible approach, the way you are going.