

Member Since 27 Nov 2007
Offline Last Active May 05 2012 08:19 PM

Topics I've Started

Pandemic 2.5 on App Store for iPad / iPhone

02 May 2012 - 07:10 PM


Pandemic 2.5, an expanded and refined version of my popular flash game Pandemic 2, has hit the App Store today for iPads, iPhones and iPod Touches.

So far the response has been great, but since this is my first commercial effort, I'm attempting to branch out and reach as many people as possible without spending money I don't have on marketing.


Brief feature highlights:
  • Create your very own custom disease and watch as it spreads across the world through the human population.
  • Combine real symptoms to produce the most infectious and deadly disease the world has ever seen.
  • Select from Viral, Bacterial or Parasitic disease classes based on your play-style.
  • Outmaneuver governments, health organizations and doctors as the human world tries to prevent the spread of your disease through vaccine research, quarantines, body disposal, martial laws and more.
  • Be opportunistic: take advantage of natural disasters whenever and wherever they may strike.
  • Unlock achievements and compare high scores with full Game Center support.
  • Purchase once for just $0.99, and enjoy the game on any of your devices.
More info and screens available on my site: www.darkrealmstudios.com
Link to Pandemic 2.5 in the App Store: http://itunes.apple....37492?ls=1&mt=8

And a special thank you to GameDev and its members for their insight and help!

Odd texture filtering issue

24 September 2011 - 01:00 PM

I'm approaching completion on my first OpenGL ES game (yay!), but I recently ran into an odd issue with texture filtering. I was trying to improve the game's performance when linear filtering suddenly broke. The texture-creation code was this:

glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, data);

But now I'm having to set the texture parameters when binding textures and rendering geometry. Is this normal, and was it just a fluke that it worked before?
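For the record, here is the pattern I've switched to: setting the filter parameters once at creation time, right after binding. Is this the intended approach? (This is just a sketch; GL_LINEAR and GL_CLAMP_TO_EDGE are simply the settings I happen to want, and as far as I can tell the ES default minification filter expects mipmaps, so a texture without them needs GL_TEXTURE_MIN_FILTER set explicitly.)

```c
glGenTextures(1, &textureID);
glBindTexture(GL_TEXTURE_2D, textureID);

/* Filtering and wrapping are per-texture-object state, so they can be set
   once here rather than on every bind. The default min filter
   (GL_NEAREST_MIPMAP_LINEAR) samples mipmaps, so without mipmaps the
   texture is incomplete unless this is changed. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, data);
```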

Quick confirmation on Triangle Strips / Degenerate Triangles

11 August 2011 - 03:22 PM

In order to squeeze some extra performance out of iOS devices, I'm working on converting my geometry into indexed triangle strips. I found a program (htgen) that works with Wavefront OBJ files and produces a list of indices forming triangle strips. Brilliant, I thought! But when I tried rendering the newly generated data, the end result looked even worse than this:

[screenshot]

Now, initially I knew that I had to connect strips with degenerate triangles; however, I had no idea that the number of degenerate triangles between strips was not constant.

After spending hours reading up on triangle strips, I finally discovered that the winding of the following strip can be flipped if the wrong number of degenerate triangles is inserted between two strips. I spent a few more hours writing out basic triangle strips over pages and pages of paper and figured out (or so I thought?) exactly how many degenerate triangles are needed, depending on the number of the preceding triangle.

So please correct me if I'm wrong, but:
1) If I have 4 triangles, defined by the indices 0 1 2 3 4 5, OpenGL will draw 4 triangles in the following order: 0 1 2, 2 1 3, 2 3 4, 4 3 5 (which let's assume is CCW for this example).

2) When joining two strips, normally only 2 extra indices need to be inserted, generating 4 degenerate triangles. Building on the previous strip: if I want to join a new strip of 3 triangles defined by the indices 6 7 8 9 10 (triangles 6 7 8, 8 7 9, 8 9 10), the final joined index list would look like 0 1 2 3 4 5 5 6 6 7 8 9 10, where the repeated 5 and 6 are the extra inserted indices creating 4 new degenerate triangles.

3) If, when joining two strips, the first triangle of the new strip is odd (as in: the first strip had 3 triangles, numbered 0, 1 and 2, and the starting triangle of the 2nd strip is numbered 3), three additional indices must be inserted between the two strips, generating a total of 5 degenerate triangles, in order for the winding of the 2nd strip to remain CCW.
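To keep the rule straight, I condensed it into a small helper (just a sketch of my understanding; join_strips is my own made-up name, not from any library). It inserts either 2 or 3 duplicate indices depending on the parity of the index list built so far, so that every strip's first real triangle starts at an even position:

```c
#include <stddef.h>

/* Appends strip b (bLen indices) onto the index list out (currently outLen
   entries, ending with the previous strip), inserting duplicate indices so
   the combined list stays one triangle strip with consistent winding.
   Returns the new length. out must have room for outLen + bLen + 3. */
size_t join_strips(unsigned short *out, size_t outLen,
                   const unsigned short *b, size_t bLen)
{
    if (outLen > 0 && bLen > 0) {
        /* Duplicate the last index of the previous strip... */
        out[outLen] = out[outLen - 1];
        outLen++;
        /* ...and the first index of the new strip. If the list length is
           now odd, the new strip's winding would flip, so duplicate the
           first index once more (3 inserted indices / 5 degenerate
           triangles instead of 2 / 4). */
        out[outLen++] = b[0];
        if (outLen % 2 != 0)
            out[outLen++] = b[0];
    }
    for (size_t i = 0; i < bLen; i++)
        out[outLen++] = b[i];
    return outLen;
}
```

On the example above, joining 0 1 2 3 4 5 with 6 7 8 9 10 inserts just 5 6 (the even case), while a first strip of 0 1 2 3 4 triggers the odd case and inserts three indices.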

Now, based on what I just outlined above, I fixed up my code to insert the correct number of degenerate triangles between strips and got the above image, which, while an improvement over the original, is still in no way correct (the object should look like Canada and the US). So now I'm stuck, and I have to assume that if my understanding above is correct, the htgen program is not generating strips with consistent winding orders (the objects render perfectly fine if I turn off culling).

If that's the case, it looks like I'll have to try generating strips myself, because I literally cannot find another suitable application that generates triangle strips. So, in the interest of saving myself even more headache and time spent writing my own tool, I wanted to confirm that my understanding of the subject is indeed correct.

Possibly corrupt VBO/IBO?

09 August 2011 - 10:26 PM

I'm currently attempting to get vertex buffer objects to work with interleaved data for indexed triangles.

I've manually confirmed that both the index data and the actual vertex data are correct prior to being loaded into their respective buffer objects. The array holding the index data is simply an array of unsigned shorts. The vertex array is an array of floats with the following format (each bracket symbolizing a float):

[vect x][vect y][vect z] [texture u][texture v] [norm x][norm y][norm z] [r g b a]

In total, each vertex in the array is a series of 9 floats. I have the following offsets and strides:

newMesh.vertexOffset = 0;
newMesh.vertexStride = sizeof(GLfloat) * 6;
newMesh.uvOffset = sizeof(GLfloat) * 3;
newMesh.uvStride = sizeof(GLfloat) * 7;
newMesh.normalOffset = sizeof(GLfloat) * 5;
newMesh.normalStride = sizeof(GLfloat) * 6;
newMesh.colorOffset = sizeof(GLfloat) * 8;
newMesh.colorStride = sizeof(GLfloat) * 8;
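Part of me now wonders whether my strides are the problem. As I understand it, with a single interleaved array the stride passed to every gl*Pointer call should be the byte size of a whole vertex (the distance from one vertex's attribute to the next vertex's same attribute), identical for all attributes. A sketch of what that would give for the 9-float layout above (all names here are mine):

```c
#include <stddef.h>

/* 3 position + 2 uv + 3 normal + 1 packed-RGBA slot = 9 floats per vertex */
enum {
    POS_FLOATS    = 3,
    UV_FLOATS     = 2,
    NORMAL_FLOATS = 3,
    COLOR_SLOTS   = 1,  /* 4 unsigned bytes packed into one float-sized slot */
    VERTEX_FLOATS = POS_FLOATS + UV_FLOATS + NORMAL_FLOATS + COLOR_SLOTS
};

/* One shared stride: the byte size of a whole vertex (36 bytes). */
static const size_t vertexStride = sizeof(float) * VERTEX_FLOATS;

/* Byte offsets of each attribute within a vertex. */
static const size_t posOffset    = 0;
static const size_t uvOffset     = sizeof(float) * POS_FLOATS;
static const size_t normalOffset = sizeof(float) * (POS_FLOATS + UV_FLOATS);
static const size_t colorOffset  = sizeof(float)
                                 * (POS_FLOATS + UV_FLOATS + NORMAL_FLOATS);
```

That is, the same 36-byte stride would go to glVertexPointer, glTexCoordPointer, glNormalPointer and glColorPointer, with only the offsets differing.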

Here is where and how I create the buffer objects. I was originally using glMapBufferOES() but replaced it with a simpler approach in an attempt to find what's wrong. The arrays being passed to the buffer objects are allocated dynamically, but I'm unsure whether I should free them after handing them to the buffer objects.

glGenBuffers(1, &vertexBufferID);
glBindBuffer(GL_ARRAY_BUFFER, vertexBufferID);
glBufferData(GL_ARRAY_BUFFER, vertexBufferSize, vertexBufferList, GL_DYNAMIC_DRAW);
glBindBuffer(GL_ARRAY_BUFFER, 0);

//creating index buffer object
glGenBuffers(1, &indexBufferID);
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, indexBufferID);
glBufferData(GL_ELEMENT_ARRAY_BUFFER, indexBufferSize, indexBufferList, GL_STATIC_DRAW);

The buffer sizes have been checked and rechecked and are correct. And here is how I actually go about attempting to render the content. It's probably safe to ignore the texture code block but I included it just in case:


glTranslatef(tempMeshInstance.x ,tempMeshInstance.y, tempMeshInstance.z);
glRotatef(tempMeshInstance.rotX, 1.0f, 0.0f, 0.0f);
glRotatef(tempMeshInstance.rotY, 0.0f, 1.0f, 0.0f);
glRotatef(tempMeshInstance.rotZ, 0.0f, 0.0f, 1.0f);
glScalef(tempMeshInstance.scaleX, tempMeshInstance.scaleY, tempMeshInstance.scaleZ);

//binding vertices
glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, tempMeshInstance.meshPtr.indexBufferID);
//binding vertex buffer objects
glBindBuffer(GL_ARRAY_BUFFER, tempMeshInstance.meshPtr.vertexBufferID);
//passing the mesh instance data to OpenGL
glMaterialfv(GL_BACK, GL_AMBIENT_AND_DIFFUSE, tempMeshInstance.material);
glVertexPointer(3, GL_FLOAT, tempMeshInstance.meshPtr.vertexStride, ((char *)NULL + (tempMeshInstance.meshPtr.vertexOffset)));
glTexCoordPointer(2, GL_FLOAT, tempMeshInstance.meshPtr.uvStride, ((char *)NULL + (tempMeshInstance.meshPtr.uvOffset)));
glNormalPointer(GL_FLOAT, tempMeshInstance.meshPtr.normalStride, ((char *)NULL + (tempMeshInstance.meshPtr.normalOffset)));
glColorPointer(4, GL_UNSIGNED_BYTE, tempMeshInstance.meshPtr.colorStride, ((char *)NULL + (tempMeshInstance.meshPtr.colorOffset)));

//working with textures
if (tempMeshInstance.meshPtr.textureID != prevTextureID && tempMeshInstance.meshPtr.textureID != 0) {
	glBindTexture(GL_TEXTURE_2D, tempMeshInstance.meshPtr.textureID);
	//if 2d texturing is disabled, re-enable it
	if (!glIsEnabled(GL_TEXTURE_2D))
		glEnable(GL_TEXTURE_2D);
} else if (tempMeshInstance.meshPtr.textureID == 0 && glIsEnabled(GL_TEXTURE_2D)) {
	glDisable(GL_TEXTURE_2D);
}
prevTextureID = tempMeshInstance.meshPtr.textureID;

//rendering the mesh instance
glDrawElements(tempMeshInstance.meshPtr.renderFormat, tempMeshInstance.meshPtr.indices, GL_UNSIGNED_SHORT, ((char *)NULL + (0)));
//popping the matrix off the stack
glPopMatrix();

//continuing through list
gameObjs = gameObjs.nextNode;
glBindBuffer(GL_ARRAY_BUFFER, 0);

And this is the lovely rendering I'm rewarded with:

[screenshot]

It looks nothing like what the actual object used to look like. So, in summary: I don't know if the buffer objects are somehow becoming corrupted, if I'm using them incorrectly (strides and offsets?), or if I'm creating them incorrectly, but something is clearly wrong.

Any clues or hints as to what may be causing this would be more than welcome!

Odd Lighting Issue

02 August 2011 - 02:56 PM

Small disclaimer: I'm new to OpenGL and am working in Xcode, targeting my iPod touch with OpenGL ES.

I've been diving into OpenGL head first on iOS over the past week and have just finished writing a file parser for loading models. I'm currently just working with triangles, normals and lighting; I'm not attempting textures at the moment, and I'm simply organizing the data internally in an array of triangles before I attempt triangle strips.

The model I've loaded is concave and looks correct except at certain angles (I don't have a depth buffer in yet). I'm currently just rotating the model about its x-axis and have one light in the scene. The issue is that, even with culling on, the model appears completely white when turned 180 degrees and completely black when turned 0 degrees.

It's more than possible I'm just making a stupid beginner's mistake. The light position should be well outside the geometry of the object, and I've tried both back and front culling, so I'm not sure what else the issue could be.

Here is some code:

//creating openGL lighting
GLfloat lightAmbient[] = {0.2f, 0.2f, 0.2f, 0.0f};
GLfloat lightDiffuse[] = {1.0f, 1.0f, 1.0f, 0.0f};
GLfloat lightSpecular[] = {1.0f, 1.0f, 1.0f, 0.0f};

glLightfv(GL_LIGHT0, GL_AMBIENT, lightAmbient);
glLightfv(GL_LIGHT0, GL_DIFFUSE, lightDiffuse);
glLightfv(GL_LIGHT0, GL_SPECULAR, lightSpecular);


GLfloat lightPosition[] = {0.0f, 0.0f, 10.0f, 1.0f};
glLightfv(GL_LIGHT0, GL_POSITION, lightPosition);

//face culling
glEnable(GL_CULL_FACE);
glCullFace(GL_BACK);

[(EAGLView *)self.view setFramebuffer];

MeshInstance *tempMesh = [MeshLibrary createMeshInstance:MESH_RED_BLOOD_CELL];

glClearColor(0.5f, 0.5f, 0.5f, 1.0f);

glScalef(0.5, 0.5, 0.5);
glTranslatef(0, 0, 0);
rotX += 0.5f;
glRotatef(rotX, 1.0, 0, 0);
transY += 0.075f;

glVertexPointer(3, GL_FLOAT, 0, tempMesh.meshPtr.vertices);
glNormalPointer(GL_FLOAT, 0, tempMesh.meshPtr.normals);

glDrawArrays(GL_TRIANGLES, 0, tempMesh.meshPtr.vertexCount);
[tempMesh dealloc];

[(EAGLView *)self.view presentFramebuffer];
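One suspicion I want to sanity-check: since I call glScalef(0.5, 0.5, 0.5), the normals get scaled along with the geometry, which (as far as I understand) can throw the fixed-function lighting off. Would something like this be the right fix? (Just a sketch; I've also listed the enables I believe lit, array-driven drawing needs.)

```c
/* Fixed-function lighting transforms normals by the modelview matrix, so a
   uniform glScalef(0.5, 0.5, 0.5) also halves the normals' length, dimming
   the lighting unless they are renormalized. */
glEnable(GL_NORMALIZE);   /* or GL_RESCALE_NORMAL for uniform scales */

/* State that also needs to be on for lit vertex-array drawing: */
glEnable(GL_LIGHTING);
glEnable(GL_LIGHT0);
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_NORMAL_ARRAY);
```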

And here are some screenshots demonstrating the issue:
[screenshots]

Any help would be greatly appreciated! :)