

Member Since 30 Aug 2010
Offline Last Active Jun 26 2014 11:25 AM

Topics I've Started

DLL Symbols Entity Framework

16 April 2014 - 11:20 AM

I'm not trying to get full type information or anything crazy; I'm just interested in creating a data-driven design for my entities. The thing is, I have no idea how to enumerate the function names exported by a DLL on Windows. I'm looking at things like DbgHelp, but the documentation is pretty poor and I'm having a hard time figuring out how to enumerate the symbols.


What I'm trying to do:

class GameComponent
{
public:
    virtual void XMLSave(XMLDocument& doc, XMLElement* parent) = 0;
    virtual GameComponent* XMLLoad(XMLElement* node) = 0;
    virtual int TypeID() const = 0;
    virtual const char* Name() const = 0;
    virtual void Update() {}
    virtual void Initialize() {}
};

The idea is to search through the symbol list when the engine initializes so that I can build a list of XMLSave/XMLLoad functions without having to hard-code a static list of them. It would also let me write additional DLLs in the future and extend the engine.

class GameComponentFactory
{
    // Note: keys probably need to be strings, not const char* --
    // const char* keys compare by pointer, not by text.
    map<const char*, delegate<GameComponent*(XMLElement*)>> ComponentLoadMap;
public:
    void Initialize(const char* dllPath);
    Scene* LoadXMLFile(const char* path);
};

void GameComponentFactory::Initialize(const char* dllPath)
{
    // How do you load the export table?
    vector<string> enumerations = LoadDLLExportTable(dllPath);

    // Determine whether each class is derived from GameComponent;
    // if so, add to the map the functions that belong to the main game loop
    // and the XML load/save functions.
    // I've read papers that say they've done this ... I just have no clue how.
}

Scene* GameComponentFactory::LoadXMLFile(const char* path)
{
    // Pseudo-code -- the real thing would need to be much safer.
    XMLDocument doc;
    doc.LoadFile(path);
    Scene* scene = ResourceManager::NewScene();
    for (auto obj = doc.FirstChildElement(); obj; obj = obj->NextSiblingElement())
    {
        GameObject* go = new GameObject(obj->Name());
        for (auto component = obj->FirstChildElement(); component; component = component->NextSiblingElement())
        {
            // look up component->Name() in ComponentLoadMap and attach the result to go
        }
    }
    return scene;
}

Getting color palette of non-shaded texture that ignores aliasing

22 September 2012 - 02:28 AM

Edit: I've gotten it from 5k down to about 60 by only checking 100%-opaque values and introducing floating-point tolerances, since Unity converts all the colors to floating point. But that's still about 13-15x the number of colors actually in the photo, and that's with a pretty generous tolerance of 0.1f, which is roughly +/- 25 on the 8-bit color scale.

So I'm trying to write a tool for my Unity game that will allow me to make a color-edit mode. I figured that if I went the extra step of keeping my colors completely separated it would be fairly easy, but I'm still getting far more colors than I actually used when I colored the picture. The picture was done with vector graphics, so I know the precise color of everything except the aliased lines, which are interpolated with a fully transparent color.

I'm not 100% sure that aliasing is the problem, but otherwise I'm completely unsure why there would be on the order of 4000 different colors showing in my photo. They all seem to be very similar versions of the same color, so aliasing is the only thing that makes sense. The idea is to have a color mask and a palette to map user-selected colors to at run time.

This is just a sample file of Peach from Paper Mario I'm using to test the system: Attached File PrincessColor.png (4.25MB)

The goal is to use this color map together with the BW image and a shade map to construct the final picture. I was separating the BW from the color to try to solve the aliasing issue, but that isn't the solution.

Hopefully it will look like this in the end: Attached File ColorPrincess.png (297.65KB)

I got it working in Photoshop but need to find a way to export this to my game. I could do it by exporting each clip mask as a separate picture and then recombining them mathematically to guarantee they are exact, but exporting 6-12 pictures per frame and recombining them is more work than it should have to be. Any help would be appreciated.


[source lang="csharp"]
public class ColorEditorWindow : EditorWindow
{
    static ColorEditorWindow window = null;
    List<Color> ColorPallette = new List<Color>();
    public Texture2D texture = null;

    // Use this for initialization
    [MenuItem("Window/ColorEditor")]
    static public void Start()
    {
        if (window == null)
            window = EditorWindow.GetWindow<ColorEditorWindow>();
    }

    public void GetPallette()
    {
        if (texture == null)
            return;

        // Reset palette if a previous one existed
        if (ColorPallette.Count != 0)
            ColorPallette.Clear();

        for (int i = 0; i < texture.height; i++)
        {
            for (int j = 0; j < texture.width; j++)
            {
                Color temp = texture.GetPixel(j, i); // get pixel to test
                if (temp.a == 0) // ignore fully transparent pixels
                    continue;

                // See if the color is already in the palette
                bool found = false;
                for (int k = 0; k < ColorPallette.Count; k++)
                {
                    if (ColorPallette[k].r == temp.r &&
                        ColorPallette[k].g == temp.g &&
                        ColorPallette[k].b == temp.b)
                    {
                        found = true;
                        break;
                    }
                }
                if (!found)
                    ColorPallette.Add(temp); // not in palette, so add it
            }
        }
    }

    public void OnGUI()
    {
        texture = EditorGUILayout.ObjectField(texture, typeof(Texture2D)) as Texture2D;
        if (GUILayout.Button("Get Pallette"))
            GetPallette();
        /*
        for (int i = 0; i < ColorPallette.Count; i++)
            ColorPallette[i] = EditorGUILayout.ColorField(ColorPallette[i]);
        */
        EditorGUILayout.LabelField("Colors: " + ColorPallette.Count);
    }
}
[/source]

Depth Buffer Issue

21 April 2012 - 05:41 PM

I've been following a tutorial to try to learn C++ DirectX. So far so good, but the depth buffer is giving me problems.

I'll work this into classes after I have a better idea of how it all fits together, but here's the code I have so far. The scene renders fine (except for the depth buffer not clipping anything) if I don't pass my depth buffer pointer into OMSetRenderTargets() and comment out ClearDepthBuffer().

This makes it pretty obvious to me that it has something to do with how I'm creating the buffer. I've looked at at least three different tutorials and tried to adapt their code, but none of it seems to work.

I've tried everything I can think of and have been trawling forums for about an hour, and nothing suggested seems to work.

I know it's pretty rough at the moment. I just want to get a better idea of how everything works so I don't have to rebuild my wrapper classes.

//How I Initialize Stuff
void InitRenderTarget()
{
    // The back buffer has to come from the swap chain before the view
    // can be created from it (pSwapChain is whatever your swap-chain
    // pointer is called)
    ID3D11Texture2D* pBackBuffer = nullptr;
    hr = pSwapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), (void**)&pBackBuffer);
    hr = pDev->CreateRenderTargetView(pBackBuffer, NULL, &pRenderTargetView);
    pBackBuffer->Release();
    if (hr != S_OK)
        isPlaying = false;

    // Width and Height are UINTs, not floats
    texd.Width = 800;
    texd.Height = 600;
    texd.ArraySize = 1;
    texd.MipLevels = 1;
    texd.SampleDesc.Count = 1;
    texd.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    texd.BindFlags = D3D11_BIND_DEPTH_STENCIL;
    texd.Usage = D3D11_USAGE_DEFAULT;
    hr = pDev->CreateTexture2D(&texd, NULL, &pDepthBuff);
    if (hr != S_OK)
        isPlaying = false;

    ddesc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;
    ddesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
    // This call was missing: the depth-stencil *view* never got created,
    // which is why OMSetRenderTargets() had nothing valid to bind
    hr = pDev->CreateDepthStencilView(pDepthBuff, &ddesc, &pDepthStencilView);
    pDepthBuff->Release();
    pDepthBuff = nullptr;

    rDesc.FillMode = D3D11_FILL_SOLID;
    rDesc.CullMode = D3D11_CULL_NONE;
    rDesc.DepthClipEnable = true;
    rDesc.FrontCounterClockwise = false;
    rDesc.MultisampleEnable = false;
    // These three are numeric fields, not booleans
    rDesc.SlopeScaledDepthBias = 0.0f;
    rDesc.DepthBias = 0;
    rDesc.DepthBiasClamp = 0.0f;
    rDesc.AntialiasedLineEnable = false;
    rDesc.ScissorEnable = false;

    D3D11_VIEWPORT viewport;
    viewport.TopLeftX = 0;
    viewport.TopLeftY = 0;
    viewport.Width = 800;
    viewport.Height = 600;
    viewport.MinDepth = 0.0f;
    viewport.MaxDepth = 1.0f;
}
//How I do some basic drawing
void RenderFrame()

Texture Batching?

01 April 2011 - 11:29 PM

Alright, I'm working on improving my current sprite rendering solution. Currently I have no support for batching by texture: I simply bind a large number of samplers, send over the texture data, and have each quad find the correct sampler based on data sent with it. Is there another way? I'm just curious whether there's a better approach, as this seems clunky to me.

Issue with creating a geometry shader for creating textured quads

27 March 2011 - 01:01 PM

Alright, I'm teaching myself OpenGL by going through the SuperBible, 5th ed. However, once you get into the later chapters they kind of leave you hanging, as they gloss over certain things that were taken care of by the GLTools library earlier in the book.

This leaves me with the issue of not knowing whether the problem is in how I'm creating the vertex array or in my shader. My shader compiles, lets me assign uniforms, and looks logically correct.

How I'm trying to init my vertex array

	const GLfloat data[] = {0.0f,0.0f,300.0f,400.0f};

	glGenBuffers(1, &vbo);
	glBindBuffer(GL_ARRAY_BUFFER, vbo);
	glBufferData(GL_ARRAY_BUFFER, sizeof(data), data, GL_STATIC_DRAW);
	glEnableVertexAttribArray(0);
	glVertexAttribPointer(0,4,GL_FLOAT,GL_FALSE,sizeof(float)*4,(const GLvoid*) 0);

How I try to draw it


The idea is that I send a single vec4 position with data organized as <x, y, width, height>. Once I get this working I'll elaborate it into a full-scale sprite-batch system, but one step at a time.

Vertex Shader

#version 330

layout (location = 0) in vec4 vPosition;

void main(void)
{
	gl_Position = vPosition;
}

Geometry Shader

#version 330

layout (points) in;
layout (triangle_strip, max_vertices = 4) out;

in vec4 vPosition[];

out vec2 UV;

uniform mat4 Ortho;

void main(void)
{
	// Emit corners in strip order (BL, BR, TL, TR) and call EmitVertex()
	// after setting each one; without EmitVertex()/EndPrimitive() the
	// shader emits no geometry at all.
	gl_Position = Ortho * vec4(vPosition[0].xy, 0.01f, 1.0f);
	UV = vec2(0.0f, 0.0f);
	EmitVertex();

	gl_Position = Ortho * vec4(vPosition[0].x + vPosition[0].z, vPosition[0].y, 0.01f, 1.0f);
	UV = vec2(1.0f, 0.0f);
	EmitVertex();

	gl_Position = Ortho * vec4(vPosition[0].x, vPosition[0].y + vPosition[0].w, 0.01f, 1.0f);
	UV = vec2(0.0f, 1.0f);
	EmitVertex();

	gl_Position = Ortho * vec4(vPosition[0].xy + vPosition[0].zw, 0.01f, 1.0f);
	UV = vec2(1.0f, 1.0f);
	EmitVertex();

	EndPrimitive();
}

Basic Draw texture fragment shader.

Fragment Shader

#version 330

in vec2 UV;

out vec4 FragmentColor;

uniform sampler2D Picture;

void main(void)
{
	FragmentColor = texture(Picture, UV);
}