
Walking Towards The Sun

Nothing dev about it - Taking care of yourself

Posted 23 February 2012 · 741 views

Pitter patter goes the heart. Something I forget so often, and I've noticed it amongst my peers: not taking care of myself. No, I'm not talking about that, you perv. I find myself so caught up in the minds of others that I don't think about how I feel. It's been almost two weeks, and that's exactly why. Too worried that I'm upsetting others, not realizing I'm not moving forward.

So let's all put those demons away, put on a fat smile, and walk towards the sun. I'm not asking for a world of narcissists, but if not for our own lives, what's the point?

I'll leave it with my next ink... a painful back-of-the-arm spot to boot... going to be a great release: "The demons that distract us"

Code on dev world!


Posted 10 February 2012 · 727 views

I was listening to this song earlier and it just had this great verse in it.

Cause you
You're so calm
I don't know where you are from
You're so young
I don't care what you've done wrong

To me it screams the essence of childhood: being an adult and watching children grow up. To not take the world so seriously that you never do anything. Everyone makes mistakes, and it's not the end of the world. I constantly find myself reminding my own kids not to worry so much. Seeing how easily people get frustrated these days, it's not what I remember as a kid. "Tokyo Police Club - Shoulders and Arms" is the song, for anyone interested.

Well, I started doing a little more remodeling than I thought I was going to. The following snippet is the current, and probably soon-to-be old, renderList. These lists are the backbone of my renderer and do all the heavy lifting for me. The list stores a struct called drawcall, which holds the basic information about what is being drawn and how: position, size, texture, shader, rotation/origin, and diffuse coloring. That actually isn't the original layout either. The old drawcall stored the actual vertices along with the texture, shader, and so forth.
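For illustration, a drawcall along those lines might look like the following. This is a minimal sketch: TextureID, ShaderID, Vec2/Vec3, Color, and the field names are all placeholders of mine, not the engine's actual types.

```cpp
#include <cassert>

// Placeholder handle types standing in for the real engine/D3D types.
typedef unsigned int TextureID;
typedef unsigned int ShaderID;

struct Vec2  { float x, y; };
struct Vec3  { float x, y, z; };
struct Color { unsigned char r, g, b, a; };

// One queued draw request: everything the fill step needs to build a quad later.
struct Drawcall {
    Vec3      position;   // world position (z doubles as the sort depth)
    Vec2      size;       // quad width/height
    TextureID texture;    // state: which texture to bind
    ShaderID  shader;     // state: which effect to bind
    Vec3      rotation;   // rotation / origin info
    Color     diffuse;    // per-call tint
};
```

Storing only this per call (rather than four vertices) keeps the list small; the quad is expanded to vertices at fill time.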

A year or two ago, I switched to the concept that all I needed was rectangles/quads. Why store all the extra data when I could just translate them later? Since then I've had countless times where a triangle or line would've been so much more helpful than trying to align a quad textured with the shape just right. If you poked around my last post, you probably remember the DrawQuad function; there used to be DrawTriangle, DrawLine, and so on... hence why I didn't call it something more associative like DrawSprite.

Once all drawcalls have been made for the frame, everything moves on to the renderer, which manipulates the render lists and presents them to the screen. It starts with PreFill: any sorting of the list happens here. I sort all my calls so that transparent textures are at the back, ordered back to front, and opaque textures are ordered front to back, using the z depth of the textures. This allows a lot of flexibility in game design. There's no need to design your code around the fact that certain images have to be drawn at certain times to appear right. As long as their positions are set right, they'll appear correctly.
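That sort order can be expressed as a single comparator. A minimal sketch, assuming smaller z means closer to the camera (the engine may use the opposite convention, in which case flip the comparisons):

```cpp
#include <algorithm>
#include <vector>
#include <cassert>

struct Call { float z; bool transparent; };

// Opaque calls first (front to back), then transparent calls (back to front),
// matching the order described above.
bool DrawOrder(const Call& a, const Call& b) {
    if (a.transparent != b.transparent)
        return !a.transparent;    // opaque group sorts before transparent
    if (a.transparent)
        return a.z > b.z;         // transparent: far to near (correct blending)
    return a.z < b.z;             // opaque: near to far (early-z rejection)
}
```

Front-to-back opaque ordering lets the depth buffer reject hidden pixels early; back-to-front transparent ordering is what makes alpha blending come out right.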

After sorting, any render target swapping happens. This is one aspect that is being remodeled. Currently the renderList only supports one render target per list. This meant I had to hardwire a separate pipeline for doing lighting, or else suffer the fact that I'd need a separate renderList with the exact same data for each buffer required by the lighting system; relocking the vertexbuffer for each list, and everything else that has to be done per list, added up pretty fast. The new list is going to store chains of effects. Each link in the chain is simply a renderTarget and a shader to be used. The target can even be the backbuffer, just rendering with a different shader each time. Each link can also have a set of secondary textures that should be included in the stream, such as passing the gBuffer along with what is being rendered to do lighting. It will be a bit experimental, as honestly some of the functionality I've never used before but always wanted to.
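As a rough sketch of the chain idea (the names and layout here are my guesses at what such a structure could look like, not the engine's actual types):

```cpp
#include <vector>
#include <cassert>

typedef unsigned int TargetID;   // placeholder render-target handle (0 = backbuffer)
typedef unsigned int ShaderID;   // placeholder shader handle
typedef unsigned int TextureID;  // placeholder texture handle

// One link in a render chain: draw the list into 'target' with 'shader',
// binding any extra inputs (e.g. gBuffer textures) alongside.
struct ChainLink {
    TargetID target;
    ShaderID shader;
    std::vector<TextureID> secondaryTextures;
};

// A chain is just the links applied in order; the same drawcall data can be
// rendered several times with different targets and shaders.
typedef std::vector<ChainLink> RenderChain;
```

This is what replaces the hardwired lighting pipeline: a gBuffer pass and a final lighting pass become two links on the same list instead of two duplicated lists.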

Once things are sorted, the render target is set. Fill locks the vertex buffer for the current texture and shader combo. Using the drawcall iterator, Fill walks through the drawcall vector; any call that matches the current state gets translated and placed into the vertexbuffer. If a new state is found, Fill stops immediately. This is all to take advantage of the iterator and avoid processing any drawcall more than once... i.e. the pipeline should walk through the list only once.
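The fill loop described above can be sketched like this. Plain ints stand in for the real texture/shader handles, and an index stands in for the iterator; an illustration of the batching logic, not the engine code:

```cpp
#include <vector>
#include <cstddef>
#include <cassert>

struct Call { int texture; int shader; };

// One Fill() pass in miniature: starting at 'it', consume calls that match
// the first call's texture/shader state, and stop at the first mismatch.
// The caller issues one draw for the batch, then calls this again with the
// returned index -- so the whole list is walked exactly once per frame.
std::size_t FillBatch(const std::vector<Call>& calls, std::size_t it,
                      int& texture, int& shader) {
    if (it >= calls.size()) return it;
    texture = calls[it].texture;
    shader  = calls[it].shader;
    while (it < calls.size() &&
           calls[it].texture == texture && calls[it].shader == shader)
        ++it;   // this call matches the current state: it joins the batch
    return it;  // index of the first call with a new state
}
```

Because the list was sorted in PreFill, all calls sharing a state are contiguous, which is what makes the single-pass walk possible.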

All the translation, though, is being moved to the point when the drawcall is made, except the view -> screen translation, which happens in the shader using an orthogonal projection. The pipeline uses its own camera to translate the call from world to view space, then culls, then rotates if necessary. Culling is really generic: I use an oversized view (bigger than the screen) and just do a quick bounds check between the center of the quad and the screen. As long as you're not trying to draw a 512x512 or bigger texture, it shouldn't have any popping as things leave the screen.
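The oversized-view cull might look something like this (a sketch; the struct names and margin value are my assumptions). The margin should exceed half the largest sprite you draw, or sprites will pop at the screen edge, which is exactly the 512x512 caveat above:

```cpp
#include <cassert>

struct View { float left, top, right, bottom; };

// Quick-reject cull against a view rectangle grown by 'margin' on all sides,
// testing only the quad's center point as described above.
bool Visible(float cx, float cy, const View& screen, float margin) {
    return cx >= screen.left - margin && cx <= screen.right  + margin &&
           cy >= screen.top  - margin && cy <= screen.bottom + margin;
}
```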

PostFill is pretty much the opposite of PreFill. It resets the render target to the back buffer if it was changed, then resets the drawcall iterator so new drawcalls will start by overwriting the old ones before pushing new ones onto the stack.

namespace ZEGraphics
{
    class RenderList
    {
        LPDIRECT3DSURFACE9      backBufferCopy;      // used to save the backbuffer while a secondary buffer is being used.
        LPDIRECT3DSURFACE9      tempRenderTarget;
        IDirect3DVertexBuffer9* vertexBuffer;        // dynamic vertexBuffer used by this renderList only.

        std::vector<ZEGraphics::Sprite*> renderTargets;    // list of secondary targets that will get rendered to, in order, with the assigned shaders.

        std::vector<ZEGraphics::Drawcall>           drawcalls;
        std::vector<ZEGraphics::Drawcall>::iterator drawcallIterator;    // used by the fill process only, to save the reader's position in between state changes.

        RenderList(ZEGraphics::Device* device);    // basic constructor; sets the iterator and reserves space for 5K drawcalls.
        ~RenderList();                             // releases all data, including the vertexBuffer, drawcall vector, etc.

        void Reset();                              // resets the drawcallIterator and other temp settings.

        /** PreFill - sets up the list for converting to the vertexbuffer. Sorts by state change,
         *  resets the iterator, and prepares the render target to be used. */
        void PreFill(ZEGraphics::Device* device);

        /** Fill - enters the vertex data for the drawcalls of the same state. */
        void Fill(ZEGraphics::Camera* camera, IDirect3DTexture9*& _texture, DWORD& _shader, UINT& _primitiveCount, bool& _isTransparent, bool& _isEmpty);

        /** PostFill - resets the renderTarget and clears the list for re-entering new draw calls. */
        void PostFill(ZEGraphics::Device* device);

        /** AddDrawcall - pushes a single drawcall onto the stack. */
        void AddDrawcall(ZEGraphics::Drawcall _drawcall);

        bool Full();        // returns true if the drawcalls have exceeded the vertexBuffer capacity: 100K calls (200K triangles).

        /** Helper functions to split the big functions up and make them easier to read and edit, blah blah blah. */
        bool NextDrawcall();    // move iterator to the next available drawcall; return false if none available.
        void GenerateQuad(ZE::VEC3 pos, ZE::VEC2 size, ZE::VEC3& vert1, ZE::VEC3& vert2, ZE::VEC3& vert3, ZE::VEC3& vert4);
        void SetVertex(DWORD index, ZEGraphics::Vertex*& vertexData, ZE::VEC3 pos, ZE::VEC3 normals, ZE::VEC2 uv, ZEGraphics::COLOR diffuse);
    };
}

The new renderList will be moving on from the all-in-one bucket for every drawcall. I'm also going back to the old way of storing vertices in the drawcall, to facilitate the difference between triangles, quads, and lines. The latter is the only one that is actually a special case. For circles, I use a precision variable to determine how many triangles to build them from.

Drawcall buckets, that's what I'll be calling it now. The idea is to store drawcalls in separate buckets for each state combo. So shader and texture will be stored in the bucket instead of in individual drawcalls. No need to check if a drawcall is valid anymore: if it's in the bucket, then use it with this state. Shaders might even be moved up to only the renderList and the render chains, so if a shader is going to be used by a renderList, it applies to all of its drawcalls. Off the top of my head I can't think of any time I used a separate shader within the same renderList. So buckets will probably be sorted just by texture.
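A bucket layout like that could be sketched as follows; the std::map, placeholder handle type, and AddCall helper are my own illustration, not the engine's actual containers:

```cpp
#include <map>
#include <vector>
#include <cassert>

typedef unsigned int TextureID;   // placeholder texture handle
struct Call { float x, y; };      // per-call data, with texture removed

// One vector of calls per texture: the state is stored once per bucket
// instead of once per call, so no per-call state check is needed at fill
// time -- everything in a bucket is drawn with that bucket's texture.
typedef std::map<TextureID, std::vector<Call> > Buckets;

void AddCall(Buckets& buckets, TextureID texture, const Call& call) {
    buckets[texture].push_back(call);   // creates the bucket on first use
}
```

Sorting also disappears as a separate step: iterating the map visits each texture's calls contiguously by construction.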

Render chains will be another big change. As already mentioned, a chain link is a render target and a shader to be applied. It will also have a way of assigning secondary textures to the stream. Say you build your color buffer, normals, and other gBuffer goodies, then need to render your scene to the screen using all that light data, but you can only use one texture at a time. Makes it kinda impossible, right?

Well, the skeleton is in place and some of the data types are finished; I still have a lot of coding to do before it's what I want. I am trying to keep the old pipeline untouched though, so all my old projects still work just fine. They just can't use any new features. :) But now I'm all typed out... good luck and happy coding dev world... YORDLES!

zoloEngine - Cloud and a walking disease

Posted 07 February 2012 · 767 views

Gotta love how easily kids pass along sickness. I'm a walking disease today: no work, and I don't want to hang out for fear I may be too far from a bathroom ;) So here I am, clean or code... let's code. While we're at it, let's toot my horn as well. Yes, there is more than one in this head.

I wasn't expecting to start this till tomorrow, but given my situation, I started today anyway. I tore my old render interface a new one and put together the skeleton for the new interface. Be proud of me: I kept the old interface, and everything in the engine is still completely intact. There are actually two interfaces to choose from at the moment. The only things lacking from the new one are font and particleSystem support. The old system had a hackish way of using fonts, and the particle system was very reliant on the old interface to work properly.

Funny, I'm looking at the two headers... the old header is 145 lines, ~15 of them comments... the new header is 95 lines, about half of it comments. And that's with the window, input, and frame timer incorporated into the new interface as well.

The new interface:

#ifndef _ZECloud_H
#define _ZECloud_H

#include <string>
#include "ZEMath.h"
#include "ZESystemWindow.h"
#include "ZESystemTimer.h"
#include "ZEInputInterface.h"

#include <d3d9.h>
#include <d3dx9.h>
#include "ZEGraphicsDevice.h"
#include "ZEGraphicsTextureCache.h"
#include "ZEGraphicsEffectCache.h"
#include "ZEGraphicsCamera.h"
#include "ZEGraphicsRenderList.h"
#include "ZEGraphicsInterfaceParameters.h"

/** ZoloEngine - Cloud
    2/2012 - Corey Marquette */

namespace zecloud
{
    class Cloud
    {
    public:
        /** Default constructor - this constructor should not be used and will pop up a
            warning message if it is. */
        Cloud();

        /** Constructor(...) - this constructor must be used to initialize the engine.
            It initializes everything based on the data from a config file. */
        Cloud(HINSTANCE instance, std::string configFilepath);

        /** Destructor - all cleanup is done here; releases everything from textures on down. */
        ~Cloud();

        /** Update - calls update on all sub components, and returns false if the engine
            has shut down from any errors. If it has shut down, the engine should not be used. */
        bool Update();

        /** Render - processes all renderLists and presents to the screen; use this
            function after all drawcalls have been made for the frame. */
        bool Render();

        /** These accessors are meant to be used to create resources for the engine to use,
            instead of doubling implementations in the interface. */
        ZEGraphics::TextureCache& TextureCache();
        ZEGraphics::EffectCache&  EffectCache();

        /** CreateRenderList - adds a new renderList to the queue and gives back a pointer
            to be used for rendering. Ownership of the RLs remains with the engine
            interface; renderLists are never released until the engine is. */
        bool CreateRenderList(ZEGraphics::Sprite* renderTarget, bool clearTarget, ZEGraphics::RenderList*& renderList);

    private:
        ZESystem::Window   window;
        ZEInput::Interface input;

        ZESystem::Timer    frameTimer;

        // Renderer specific data
        ZEGraphics::Device                  device;
        ZEGraphics::Camera                  camera;
        ZEGraphics::TextureCache            textureCache;
        ZEGraphics::EffectCache             effectCache;
        std::vector<ZEGraphics::RenderList> renderLists;
    };
}
#endif

Simple, just the way I like it. Starting a new project is as easy as adding a couple of directories and creating a winmain file like so.

#include <windows.h>
#include "ZECloud.h"

int WINAPI WinMain(HINSTANCE _instance, HINSTANCE _prevInstance, LPSTR _cmdLine, int _cmdShow)
{
    // Engine initialization
    zecloud::Cloud cloud(_instance, "data/config.xml");

    while (cloud.Update())
    {
    }

    return 0;
}

Super simple... me really like. I have the tendency to create new projects constantly just to test out new ideas. So having a fast setup process is very crucial to me.

The old interface took on a master-type role... every sub component had a double set of functions in the interface that would call the actual functions in the sub components. I don't know if that would be considered "PIMPL" or not; either way it was cumbersome. Anytime I wanted to add new functionality, I basically had to do it twice. Then ownership started to get blurry, which brings up the truth behind font rendering and the particle system. Because they are separate from the render interface (and they should be), they required the interface to render anything. This meant any changes to how the DrawQuad function worked would affect how those systems worked as well. Especially when I added diffuse coloring to allow quickly changing font colors: I had to dig deep into the renderList, then change things in the interface, then change things in how textures were loaded... it was just a mess.

This time I've centralized the act of making drawcalls to the renderList only, similar to how XNA uses spriteBatch. The interface is no longer required to draw sprites to the screen. It is instead the point where resources are created, and it manages the workflow of the system. It's still a major joint in the system, but it should accommodate changes better.

The one hurdle I'm still contemplating is the difference between translated and untranslated coordinates. My camera runs in software, so it's simple enough to just skip it when processing translated coordinates, i.e. for things like the UI or other screen-aligned sprites. I'm not sure if I want to store a translated flag in the drawcall struct or make it global and keep it in the renderList class. The former would allow a lot of flexibility, but requires an if branch for every drawcall (upwards of ~150K per frame). Keeping it in the renderList would mean only one if branch per renderList, but also more work for the user (me): creating separate renderLists for translated and untranslated rendering. I'm leaning towards the latter, since I kinda already do this... but not always. It could easily lead to renderLists with only 10-20 calls stored but requiring that many state changes, defeating the purpose of batching.
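The per-list option can be sketched like this (hypothetical names, just to make the branch-hoisting concrete): the translated check runs once per list instead of once per drawcall.

```cpp
#include <vector>
#include <cstddef>
#include <cassert>

struct Camera { float x, y; };
struct Call   { float x, y; };

// Per-list flag: one branch per list. A screen-space (translated) list skips
// the camera entirely; a world-space list gets every call translated.
void ProcessList(std::vector<Call>& calls, const Camera& cam, bool translated) {
    if (translated) return;          // the single branch, hoisted out of the loop
    for (std::size_t i = 0; i < calls.size(); ++i) {
        calls[i].x -= cam.x;         // world -> view translation
        calls[i].y -= cam.y;
    }
}
```

The per-call alternative would move that `if` inside the loop, which at ~150K calls per frame is the cost being weighed above.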

This will also probably mean more clutter in the draw function. I've been trying to minimize the required arguments over time, so I really didn't want to add more. Should I just create separate functions for translated and untranslated? Probably not.

RenderList.DrawQuad(position, size, sprite, shader, rotation, diffuse, translatedFlag)

not too bad I guess.

Another big change for me is moving from only pre-translated coordinates to using world coordinates and letting the camera translate them. I've been contemplating letting the engine do all the culling instead of doing it in game logic. Should this be done early, at the point when drawQuad is used, or later, when the renderer starts processing the renderLists? I'm thinking early, but this can have problems. Take for example: we make some drawcalls that get rejected for being outside the camera, but before the renderLists are processed, we move the camera and the old calls would now be in view. This would cause popping, where objects aren't visible for a split second, then suddenly are. But it would also keep the renderLists at reasonable sizes. They have always been a bottleneck in the pipeline, so anything to make them a little quicker is beneficial... it's just that the drawbacks are icky.

For the weirdos out there, here's the old interface for comparison and a look into my madness.
#include <d3d9.h>
#include <d3dx9.h>
#include <vector>
#include "ZEGraphicsInterfaceParameters.h"
#include "ZEGraphicsTextureCache.h"
#include "ZEGraphicsEffectCache.h"
#include "ZEGraphicsDevice.h"
#include "ZEGraphicsCamera.h"
#include "ZEGraphicsRenderList.h"
#include "ZEGraphicsSprite.h"
#include "ZEGraphicsColor.h"
#include "ZEVector3.h"
#include "ZEVector2.h"

namespace ZEGraphics
{
    /** DEBUG information struct. */
    struct DI_Interface
    {
        DI_Interface()
            : drawPrimitiveCalls(0),
              trianglesDrawn(0),
              smallestBatch(0),
              largestBatch(0) {
        }

        void Clear() {
            drawPrimitiveCalls = 0;
            trianglesDrawn     = 0;
            smallestBatch      = 0;
            largestBatch       = 0;
        }

        DWORD drawPrimitiveCalls;
        DWORD trianglesDrawn;
        DWORD smallestBatch;
        DWORD largestBatch;
    };

    /** Graphics API main Interface. */
    class Interface
    {
        ZEGraphics::Device device;

        ZEGraphics::Camera*       camera;
        ZEGraphics::TextureCache* textureCache;
        ZEGraphics::EffectCache*  effectCache;

        std::vector<ZEGraphics::RenderList> renderLists;
        IDirect3DVertexBuffer9* vertexBuffer;

        ZEGraphics::DI_Interface dInfo;

        /** Data used by the deferred lighting pipeline. */
        ZEGraphics::Sprite gbNormals;
        ZEGraphics::Sprite gbPositions;
        ZEGraphics::Sprite gbColors;

        DWORD gbShader;
        DWORD dirLightShader;

        ZEGraphics::InterfaceParameters parameters;

    public:
        Interface() { };
        ~Interface() { this->Release(); };

        int Create(HWND _windowHandle, ZEGraphics::InterfaceParameters& _parameters);

        /** Display will process all the renderLists. Internally it will run these lists through the desired pipeline, specified by the renderList. */
        void Display();

        /** Specific pipeline for rendering the drawcalls. */
        void DefaultPipeline(DWORD rl, IDirect3DTexture9* _texture, DWORD _shader, UINT _primitiveCount, bool _isTransparent);

        /** Release... releases all the allocated resources. */
        void Release();

        /** Drawing functions for drawing textured primitives. */
        void DrawQuad(DWORD _renderList, ZE::VECTOR3 _pos, ZE::VECTOR2 _size, ZEGraphics::Sprite* _sprite, DWORD _shader, ZE::VECTOR3 _rot, ZEGraphics::COLOR _diffuse);
        void DrawQuad(DWORD _renderList, ZE::VECTOR3 _pos, ZE::VECTOR2 _size, ZEGraphics::Sprite* _sprite, DWORD _shader, float _alpha);

        /** Texture loaders. */
        ZEGraphics::Texture* CreateTexture(std::string _filename, D3DFORMAT _format, bool _transparent);
        ZEGraphics::Texture* CreateRenderTarget(int _width, int _height, D3DFORMAT _format, bool _transparent);

        /** Effect loaders. */
        bool CreateEffect(std::string _file, DWORD& _index) {
            if (effectCache == NULL)
                return false;

            return effectCache->CreateEffect(device.direct3DDevice, _file, _index);
        }

        /** Deferred lighting interface. */
        bool SetupDeferredLighting();
        void BuildGBuffer(LPDIRECT3DTEXTURE9 _texture, UINT& _primitiveCount);
        void ProcessLights();

        /** The renderer stores pointers to a light source, so that it can be moved easily outside the renderer.
            The renderer won't cull the lights, so it's up to the user to remove lights that are no longer visible. */
        void AddLight();
        void RemoveLight();
        void ClearLights();

        /** Misc resource loaders. */
        DWORD CreateRenderList(ZEGraphics::Sprite* _renderTarget, bool _clearTarget);

        /** Resource access. */
        ZEGraphics::EffectCache* EffectCachePTR() { return effectCache; };

        /** Debug info. */
        ZEGraphics::DI_Interface& DebugInfo() {
            return dInfo;
        }

        void ClearDebugInfo() {
            dInfo.Clear();
        }

        /** Anchors return the screen position at the specified position. */
        ZE::VECTOR3 anchorTopLeft();
        ZE::VECTOR3 anchorTopCenter();
        ZE::VECTOR3 anchorTopRight();
        ZE::VECTOR3 anchorCenterLeft();
        ZE::VECTOR3 anchorCenter();
        ZE::VECTOR3 anchorCenterRight();
        ZE::VECTOR3 anchorBottomLeft();
        ZE::VECTOR3 anchorBottomCenter();
        ZE::VECTOR3 anchorBottomRight();
    };
}

Tomorrow I'll bother you again about the renderLists, the backbone of my renderer... it's a yordle, run!

Memory Blocks release - zoloEngine Cloud

Posted 06 February 2012 · 966 views

Later in the week, next week, same thing. Right?.... right? Real simple, real quick: get the highest score possible and brag to yourself about it, cause that's what life is all about. Right?

Playing the game - You're given a set amount of turns; each time you reveal a colored square it takes a turn away. Every time you reveal two matching colors, you get points and your turns back. Each pair starts a combo timer (4 sec); every time you reveal another pair in that time, the clock gets reset and a chain counter goes up. When the time runs out, you get bonus points and turns based on the combo chain.

Each pair revealed reveals the surrounding blocks for a few seconds, but also explodes, blinding the area. Good luck and have fun.

Download the game here -

For those who don't have Visual Studio 2010 installed, you can get the runtime (~5 MB) here -

The game also requires DirectX 9.0c or higher to run; you can find that here -

Ok, I feel better now, a quick project out of the way. Time to focus on a few other thingies. I've been dying to do some rework on my framework, mainly making it a little easier to translate from the physics portion into the graphics portion and vice versa. As they were designed so far apart, they use different variable layouts, let alone different coordinate systems. I.e. my graphics always considers the position to be the upper left corner, with the bounding box stretching outwards from that. My physics component took on the convention that position is the center of the bounding box, and size stretches in all directions from the center. This is mostly because of my choice to use Box2D, which uses a similar approach.
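The corner-vs-center mismatch boils down to a half-extent offset. A pair of hypothetical conversion helpers (the names are mine, not the engine's) keeps the two sides honest:

```cpp
#include <cassert>

struct Vec2 { float x, y; };

// Graphics convention: top-left corner, box stretches outwards.
// Physics (Box2D-style) convention: center position, half-extents each way.
Vec2 TopLeftToCenter(Vec2 topLeft, Vec2 size) {
    return Vec2{ topLeft.x + size.x * 0.5f, topLeft.y + size.y * 0.5f };
}

Vec2 CenterToTopLeft(Vec2 center, Vec2 size) {
    return Vec2{ center.x - size.x * 0.5f, center.y - size.y * 0.5f };
}
```

Centralizing the conversion in two functions like these means the offset bug can only live in one place, instead of being re-derived at every physics/graphics boundary.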

A few major rethinkings of my renderer interface need to be done to finally get me back on track with "Kylar's Valcano". The current system has some major limitations on how many pixels I can push. A quick couple of refactorings already pushed me from ~20K triangles @ 30fps to 250K triangles @ ~40fps. This, of course, is all at 1280x720x32, using my particle emitter as a testing ground. Which brings me to another portion that did get rewritten, but was lost when my PC decided to die.

A few of the changes were me hacking around with how the emitters handle generating new particles. It wasn't nearly as efficient before, and that actually accounts for a lot of the performance improvement. Lots of redundancies that I can only assume were done the way they were for the sake of getting it coded faster. Always fun reading comments a couple of years old and seeing the programmer mindset I was in back then. Boy, do I see things differently now.


Like everyone else, I've always had the idea that I would eventually release my framework to the masses. Then they would all praise me like a programming god. "Oh Corey, you're a genius... how is this even possible?" "Oh, it was nothing, now throw away that useless UDK you've got over there." I can dream, right?.... right?

Currently, as it stands, the framework is split into several major sections with their own interfaces. Components not included are all third party, such as sound.

Misc helpers

But as this is for me, and I always strive for simplicity, since my a.d.d. has me starting a new project every other day, I always like to design for fast prototyping. So I'm going back to more of an all-in-one interface. "Cloud" is the new name I'm dubbing this interface. Though the only thing I'm rewriting is the graphics interface and renderer pipeline (sounds like a lot, but really it ain't). I'm basically incorporating the other interfaces into the graphics interface, so I can have an all-in-one initializer and container for the interfaces.

While you're puzzled, I'll take my leave and bid you all farewell... watch out for the turnips.

Next time - Talk about the rendering pipeline and Day 4 (a 2-imager to finish off the tease).
