About reapz

  1. I also thought that it might have something to do with a LH vs RH problem, but the matrix -> quaternion -> matrix conversions are well defined and don't know about handedness as I understand it. Funnily enough, I ended up implementing it as you suggested. The quaternion issue was still a problem though, because I use quaternion-driven scene nodes, so even after building the matrix I would have to convert to a quaternion, which then gets converted back and would be wrong. The code I ended up using to do the orientation:
[code]
// Create the right vector from the facing direction and the universal up direction
PVRTVec3 right = direction.cross(VEC3_UNIT_Y);
right.normalize();

// Create the local up vector
PVRTVec3 up = direction.cross(right);
up.normalize();

direction.normalize();

PVRTMat3 rotation(right.x,     right.y,     right.z,
                  up.x,        up.y,        up.z,
                  direction.x, direction.y, direction.z);

PVRTQUATERNION q;
QuaternionFromRotationMatrix(rotation, q);
m_ProjectileNode->setOrientation(q);
[/code]
Works like a charm.
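QuaternionFromRotationMatrix isn't shown above; for reference, a standard rotation-matrix-to-quaternion conversion (Shepperd-style branch selection for numerical stability) might look like the sketch below, using hypothetical plain structs rather than the PVRT types:

```cpp
#include <cmath>

// Hypothetical stand-ins for the PVRT types; field layout assumed.
struct Quat { float x, y, z, w; };
struct Mat3 { float m[3][3]; }; // m[row][col]; rows assumed to be right/up/direction as in the post

// Standard rotation-matrix-to-quaternion conversion: pick the largest
// diagonal-based term so the square root stays well-conditioned.
Quat quaternionFromRotationMatrix(const Mat3& r)
{
    Quat q;
    const float trace = r.m[0][0] + r.m[1][1] + r.m[2][2];
    if (trace > 0.0f) {
        const float s = std::sqrt(trace + 1.0f) * 2.0f; // s = 4*w
        q.w = 0.25f * s;
        q.x = (r.m[2][1] - r.m[1][2]) / s;
        q.y = (r.m[0][2] - r.m[2][0]) / s;
        q.z = (r.m[1][0] - r.m[0][1]) / s;
    } else if (r.m[0][0] > r.m[1][1] && r.m[0][0] > r.m[2][2]) {
        const float s = std::sqrt(1.0f + r.m[0][0] - r.m[1][1] - r.m[2][2]) * 2.0f; // s = 4*x
        q.w = (r.m[2][1] - r.m[1][2]) / s;
        q.x = 0.25f * s;
        q.y = (r.m[0][1] + r.m[1][0]) / s;
        q.z = (r.m[0][2] + r.m[2][0]) / s;
    } else if (r.m[1][1] > r.m[2][2]) {
        const float s = std::sqrt(1.0f + r.m[1][1] - r.m[0][0] - r.m[2][2]) * 2.0f; // s = 4*y
        q.w = (r.m[0][2] - r.m[2][0]) / s;
        q.x = (r.m[0][1] + r.m[1][0]) / s;
        q.y = 0.25f * s;
        q.z = (r.m[1][2] + r.m[2][1]) / s;
    } else {
        const float s = std::sqrt(1.0f + r.m[2][2] - r.m[0][0] - r.m[1][1]) * 2.0f; // s = 4*z
        q.w = (r.m[1][0] - r.m[0][1]) / s;
        q.x = (r.m[0][2] + r.m[2][0]) / s;
        q.y = (r.m[1][2] + r.m[2][1]) / s;
        q.z = 0.25f * s;
    }
    return q;
}
```

The sign convention here assumes column-vector matrices (R * v); a row-vector convention flips the subtraction terms.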
  2. It turns out the issue was in the PowerVR library, but I'm not quite sure if I am doing something wrong that causes it. The Quaternion to Matrix function inside the library has the following code:
[code]
void PVRTMatrixRotationQuaternionF(PVRTMATRIXf &mOut, const PVRTQUATERNIONf &quat)
{
    const PVRTQUATERNIONf *pQ;

#if defined(BUILD_DX9) || defined(BUILD_D3DM) || defined(BUILD_DX10)
    PVRTQUATERNIONf qInv;
    qInv.x = -quat.x;
    qInv.y = -quat.y;
    qInv.z = -quat.z;
    qInv.w = quat.w;
    pQ = &qInv;
#else
    [b]pQ = &quat; // Uses this[/b]
#endif

    // Uses pQ to build matrix mOut here
}
[/code]
I had to change it to this for it to work as expected:
[code]
void PVRTMatrixRotationQuaternionF(PVRTMATRIXf &mOut, const PVRTQUATERNIONf &quat)
{
    const PVRTQUATERNIONf *pQ;

#if defined(BUILD_DX9) || defined(BUILD_D3DM) || defined(BUILD_DX10)
    PVRTQUATERNIONf qInv;
    qInv.x = -quat.x;
    qInv.y = -quat.y;
    qInv.z = -quat.z;
    qInv.w = quat.w;
    pQ = &qInv;
#else
    [b]// Uses this
    PVRTQUATERNIONf qInv;
    qInv.x = -quat.x;
    qInv.y = -quat.y;
    qInv.z = -quat.z;
    qInv.w = quat.w;
    pQ = &qInv;[/b]
#endif

    // Uses pQ to build matrix mOut here
}
[/code]
Once this was done, I can convert a quaternion to a matrix and back again and it is correct.
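For anyone hitting the same thing: negating a quaternion's vector part (conjugating it) corresponds exactly to transposing, i.e. inverting, the rotation matrix it produces, which is why the two #if branches give mirrored results. A quick check with plain structs (hypothetical stand-ins for the PVRT types):

```cpp
#include <cmath>

struct Quat { float x, y, z, w; };

// Build a 3x3 rotation matrix (row-major, m[row][col]) from a unit quaternion,
// using the standard column-vector (R * v) convention.
void matrixFromQuaternion(const Quat& q, float m[3][3])
{
    m[0][0] = 1 - 2*(q.y*q.y + q.z*q.z); m[0][1] = 2*(q.x*q.y - q.w*q.z); m[0][2] = 2*(q.x*q.z + q.w*q.y);
    m[1][0] = 2*(q.x*q.y + q.w*q.z); m[1][1] = 1 - 2*(q.x*q.x + q.z*q.z); m[1][2] = 2*(q.y*q.z - q.w*q.x);
    m[2][0] = 2*(q.x*q.z - q.w*q.y); m[2][1] = 2*(q.y*q.z + q.w*q.x); m[2][2] = 1 - 2*(q.x*q.x + q.y*q.y);
}

// True if the matrix built from the conjugate of q equals the transpose
// of the matrix built from q (it always does for unit quaternions).
bool conjugateGivesTranspose(const Quat& q)
{
    Quat c{-q.x, -q.y, -q.z, q.w}; // negate the vector part, as the D3D branch does
    float a[3][3], b[3][3];
    matrixFromQuaternion(q, a);
    matrixFromQuaternion(c, b);
    for (int r = 0; r < 3; ++r)
        for (int col = 0; col < 3; ++col)
            if (std::fabs(a[r][col] - b[col][r]) > 1e-5f)
                return false;
    return true;
}
```

So the library's two branches effectively disagree about which direction the rotation runs, and the round trip only closes if both conversions pick the same convention.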
  3. I'm developing a game for OpenGL ES 2.0 using the PowerVR SDK, which comes with Vector, Matrix and Quaternion functionality. I have a scene hierarchy similar to Ogre's that uses Quaternions for Scene Node orientations. I am trying to orient a model to face a direction in world space. The model by default is oriented to face the Z axis. This is what I'm doing each frame:
[code]
PVRTQUATERNION q = getRotationTo(VEC3_UNIT_Z, direction);
//q.w = -q.w; // Doesn't work unless I do this, seems to be inverted
m_SceneNode->setOrientation(q);
[/code]
getRotationTo() code (from Game Programming Gems):
[code]
PVRTQUATERNION getRotationTo(const PVRTVec3& original, const PVRTVec3& dest, const PVRTVec3& fallbackAxis = VEC3_ZERO)
{
    PVRTQUATERNION q;
    PVRTVec3 v0 = original;
    PVRTVec3 v1 = dest;
    v0.normalize();
    v1.normalize();

    float dot = v0.dot(v1);
    if (dot >= 1.0f)
        return Q_IDENTITY;

    if (dot < 1e-6f - 1.0f)
    {
        if (!(fallbackAxis == VEC3_ZERO))
        {
            // Rotate 180 degrees about the fallback axis
            PVRTMatrixQuaternionRotationAxis(q, fallbackAxis, 3.14159265f);
        }
        else
        {
            // Generate an axis
            PVRTVec3 axis = VEC3_UNIT_X.cross(v0);
            if (axis.lenSqr() == 0.0f)
                axis = VEC3_UNIT_Y.cross(v0); // pick another axis if colinear
            axis.normalize();
            PVRTMatrixQuaternionRotationAxis(q, axis, 3.14159265f);
        }
    }
    else
    {
        float s = sqrtf((1 + dot) * 2);
        float invs = 1 / s;
        PVRTVec3 c = v0.cross(v1);
        q.x = c.x * invs;
        q.y = c.y * invs;
        q.z = c.z * invs;
        q.w = s * 0.5f;
        PVRTMatrixQuaternionNormalize(q);
    }
    return q;
}
[/code]
Inside the Scene Node class the quaternion gets converted to a matrix and then combined into a 4x4 scale, rotation and translation matrix. My problem is that unless I use the line q.w = -q.w; after the getRotationTo call, the orientation seems inverted and incorrect. I can't seem to figure out what the issue is and would appreciate some insight from a math guru.
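As a sanity check on the shortest-arc construction above, rotating the original vector by the resulting quaternion should land exactly on the destination. A stripped-down sketch with plain structs (hypothetical stand-ins for the PVRT types; only the general branch is shown, the antiparallel fallback is omitted):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Quat { float x, y, z, w; };

Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
}

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Shortest-arc quaternion between two unit vectors, same math as the
// general branch of getRotationTo() in the post.
Quat shortestArc(const Vec3& v0, const Vec3& v1)
{
    const float d = dot(v0, v1);
    const float s = std::sqrt((1.0f + d) * 2.0f);
    const float invs = 1.0f / s;
    const Vec3 c = cross(v0, v1);
    return { c.x * invs, c.y * invs, c.z * invs, s * 0.5f };
}

// Rotate v by unit quaternion q: v' = v + 2*cross(u, cross(u, v) + w*v), u = (q.x, q.y, q.z)
Vec3 rotate(const Quat& q, const Vec3& v)
{
    const Vec3 u{q.x, q.y, q.z};
    Vec3 t = cross(u, v);
    t.x += q.w * v.x; t.y += q.w * v.y; t.z += q.w * v.z;
    const Vec3 t2 = cross(u, t);
    return { v.x + 2.0f*t2.x, v.y + 2.0f*t2.y, v.z + 2.0f*t2.z };
}
```

If this check passes but the on-screen result is still mirrored, the construction itself is fine and the problem lies in whichever quaternion-to-matrix convention consumes q afterwards.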
  4. I finished a games programming degree last year, where we made an RTS game for our major project. It didn't become as polished as we would have liked, nor did the design on the code side really get to where we wanted. The main reason for this was time pressure, as we got an internship at the same time and prioritised that; it's where I am still working now, on mobile games. We decided to use OGRE, OpenSteer (for unit flocking) and FMOD for audio, but the rest was hand coded. We could have used something like Unity or UDK, but we wanted to enhance our C++ skill set. I'd say the most important thing is to get it fun early, and start EARLY! We also did a multiplayer RTS game for a networking project that turned out to be more fun than our major. It is funny how things on paper can sound more fun, but when they are implemented it doesn't always go according to plan. You should both shoot me a message if you want to have a chat about this stuff; I may shoot you one after work. I'd be happy to show you the game; my reel is around somewhere and I think the source is too. But I wasn't too happy with the source, truth be told; dirty, rushed code.
  5. [quote name='YogurtEmperor' timestamp='1320131636' post='4879164'] An owner and a parent are not the same thing. I am wondering if you originally meant that the rocket should parent the smoke. [/quote] With the game I'm working on, the game logic and rendering logic are separate, so on one side I have game Entity objects and on the other I have Scene objects in the classic scene hierarchy with cascading transforms. When I say owner, I mean the RocketEntity owns a ParticleEntity. The RocketEntity looks after a SceneNode object which holds a Mesh object, and the ParticleEntity looks after a SceneNode with the ParticleSystem object. I have these two separated because the ParticleEntity is freed after the Rocket is freed. There are two particle cases I've had to consider, and in both cases I have wanted the ParticleEntity to be re-added to a reusable pool when it has completed. The first case is a particle effect of infinite duration that will only stop emitting when told to; this does not mean the particles should just disappear, but that the emitters should stop creating new particles. The second case is a particle effect of finite length. In both cases, once they are finished they get reset and added back to a reusable pool of ParticleEntity objects. This behaviour needs to occur regardless of other game entities. [quote name='Madhed' timestamp='1320146027' post='4879206'] ParticleEmitter: SphereEmitter (emits inside a sphere), MeshEmitter (emits particles from the vertices of a mesh), RingEmitter, ... etc. ParticleAnimator: animates particle properties over time like size, rotation, color, UVs, position, adds gravity, etc. ParticleRenderer: BillboardParticleRenderer (aligns quads with camera), TrailParticleRenderer (creates a trail from the oldest to the newest particle), etc... [/quote] My current setup is something like this: a ParticleSystem can have a number of ParticleEffectGroups. A ParticleEffectGroup has a ParticleRenderer (Billboard, Beam, Ribbon etc.) as well as ParticleEmitters (Box, Point etc.) and ParticleAffectors (colour fade, velocity etc.). This has allowed me to get some complex particles into the game and hasn't provided more than was needed. My main concern has been where to store all the instances I need over the course of the game so I am not calling new/delete constantly; my solution was essentially to create a PoolManager. Think about this case: if the player holds down the shoot button, they can only ever have 2 rockets active in the game world at any one time. You might logically pre-cache only 2 smoke trail effects, one for each rocket. But what if the smoke trail lingers after the rocket's impact, and you go to fire another rocket but have no cached particle effect available, because the smoke trails are still fading out and have not been freed yet? My main problem is still how to handle the lifetime of the ParticleEntity objects so that they are not directly coupled to GameEntity objects. Right now I have wrapped a ParticleSystem in a ParticleEntity object which monitors the lifetime of the ParticleSystem and puts it back in the pool when done.
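The composition described above could be sketched like this (hypothetical, heavily trimmed class shapes; the real versions would carry particle buffers, timing and so on):

```cpp
#include <memory>
#include <vector>

// Hypothetical interfaces matching the description above.
struct ParticleEmitter  { virtual ~ParticleEmitter() = default; };  // Box, Point, ...
struct ParticleAffector { virtual ~ParticleAffector() = default; }; // colour fade, velocity, ...
struct ParticleRenderer { virtual ~ParticleRenderer() = default; }; // Billboard, Beam, Ribbon, ...

// One renderer plus any number of emitters/affectors.
struct ParticleEffectGroup {
    std::unique_ptr<ParticleRenderer> renderer;
    std::vector<std::unique_ptr<ParticleEmitter>> emitters;
    std::vector<std::unique_ptr<ParticleAffector>> affectors;
};

// A ParticleSystem is a collection of effect groups.
struct ParticleSystem {
    std::vector<ParticleEffectGroup> groups;
};

struct BoxEmitter : ParticleEmitter {};
struct ColourFadeAffector : ParticleAffector {};
struct BillboardRenderer : ParticleRenderer {};

// Assemble a smoke-trail-like system: one group with one emitter and one affector.
ParticleSystem makeSmokeTrail()
{
    ParticleSystem system;
    ParticleEffectGroup group;
    group.renderer = std::make_unique<BillboardRenderer>();
    group.emitters.push_back(std::make_unique<BoxEmitter>());
    group.affectors.push_back(std::make_unique<ColourFadeAffector>());
    system.groups.push_back(std::move(group));
    return system;
}
```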
  6. What I didn't say is that this project is for mobile devices, so as of right now I have no geometry shader. In my case a particle system is a collection of emitters and affectors, so I can't just spawn an emitter at every projectile location; each one is a self-contained effect. My main issue is that I can't just give every Projectile its own particle system, because sometimes the life of a particle exceeds the life of a projectile and vice versa. [quote name='J-dog' timestamp='1320128735' post='4879157'] I think also just consider how much optimization you really need - there's no need to go overboard, and work out a clean design first - "premature optimization is the root of all evil". [/quote] I don't consider pooling objects which are reused over the course of the game to be premature optimisation; I consider it intelligent optimisation. Using the new and delete keywords at runtime is generally bad for performance in games, and I am not delving into low-level memory management with custom allocators. [quote name='YogurtEmperor' timestamp='1320130181' post='4879161'] If the rocket owns the particle the particle will move with the rocket and die with it. [/quote] What if the smoke particle is a lingering effect? I can't just kill it once the rocket dies. I understand what you're saying about linking them through the scene structure rather than ownership; I'll have to mull that over.
  7. I'm currently working on optimising the allocation of particle systems in a game, and I'm curious to see what other people think. Let's say we have Projectiles, in this case a Rocket projectile. The rocket projectile has a smoke trail which persists even after the rocket's impact. Should the rocket have ownership of this particle effect? I say it should create a ParticleEntity which gets added to the world separately, and store a pointer to it. My current solution has been to have a ProjectileFactory which manages the reusable projectile objects; each one has a particle type ID which refers to a particle pool inside the ParticleSystemManager class. The ParticleSystemManager essentially holds pools of reusable ParticleEntity objects. All of these classes exist on the game side of the code, not the engine side. I guess I'm just curious how other people handle this sort of thing. I have also been meaning to take a look at the Q3 or HL2 source code to see how they do it, but haven't got around to it yet.
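A minimal sketch of the reusable-pool idea (hypothetical names; a real ParticleSystemManager would presumably keep one such pool per particle type ID):

```cpp
#include <cstddef>
#include <memory>
#include <vector>

// Minimal object pool: acquire() hands out a free object (growing only when
// exhausted), release() returns it for reuse, so in steady state no
// new/delete happens during gameplay.
template <typename T>
class Pool {
public:
    T* acquire()
    {
        if (m_free.empty()) {
            m_all.push_back(std::make_unique<T>()); // grow only when exhausted
        } else {
            T* obj = m_free.back();
            m_free.pop_back();
            return obj;
        }
        return m_all.back().get();
    }

    void release(T* obj) { m_free.push_back(obj); }

    std::size_t totalAllocated() const { return m_all.size(); }

private:
    std::vector<std::unique_ptr<T>> m_all;  // owns every object ever created
    std::vector<T*> m_free;                 // currently reusable objects
};

struct ParticleEntity { bool active = false; };
```

Because the pool grows on demand, the "smoke trail still fading when a third rocket fires" case from the earlier post is handled automatically: the pool briefly holds three entities instead of two, and they all get reused afterwards.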
  8. Thanks for the response. I guess it is pretty much what I expected. I think you would need some sort of Shell base class from which you would derive an AndroidShell and an IOSShell, again using #ifdefs. These two classes would be where the OS-specific Keyboard object is created and used. So the Shell would be the interfacing layer between the engine/app and the OS; in other words, the app never needs to know which OS is on the other side. Am I on the right track? Or is there a better way?
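A rough sketch of that layering (all names hypothetical, including the PLATFORM_* macros; the point is only that platform knowledge stays behind the Shell):

```cpp
#include <memory>

// Platform-neutral keyboard interface the app codes against.
struct Keyboard {
    virtual ~Keyboard() = default;
    virtual void show() = 0;
    virtual void hide() = 0;
};

// The Shell is the only layer that knows which OS is on the other side.
struct Shell {
    virtual ~Shell() = default;
    virtual std::unique_ptr<Keyboard> createKeyboard() = 0;
};

#if defined(PLATFORM_IOS)
// IOSShell would live in an Objective-C++ (.mm) file and talk to UIKit.
struct IOSShell : Shell { std::unique_ptr<Keyboard> createKeyboard() override; };
#elif defined(PLATFORM_ANDROID)
struct AndroidShell : Shell { std::unique_ptr<Keyboard> createKeyboard() override; };
#else
// Stub implementation for desktop builds and tests.
struct NullKeyboard : Keyboard {
    void show() override {}
    void hide() override {}
};
struct NullShell : Shell {
    std::unique_ptr<Keyboard> createKeyboard() override { return std::make_unique<NullKeyboard>(); }
};
#endif
```

The engine holds a Shell* chosen once at startup; everything past that point goes through the abstract interfaces, so no #ifdefs leak into game code.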
  9. I am currently working on an iPhone game using OpenGL ES, where most of the code is in C++ so that I can port to Android later on. My problem is I'm not quite sure where to start in building a cross-platform layer between the C++ of my engine and the Objective-C of the OS. An example would be accessing the native keyboard of the device: I'd like to be able to call a function in the wrapper which returns a keyboard object that handles the keyboard of that particular OS. I did try once with function pointers but ran into a nasty issue with Objective-C that wouldn't have been trivial to solve. Does anyone have any suggestions, or a good source of material to read on the subject? Ideally I'd like to write something nicely object-oriented and reusable for future projects.
  10. It turns out what I thought was working wasn't actually working. I need to stop thinking "maybe, perhaps, it might look right" and only accept it when I know it looks and behaves 100% as it should. The problem, for those wondering, was these lines:
[code]
mediump mat3 tangentSpace = mat3(inTangent, normalize(cross(inTangent, inNormal)), inNormal);
mediump vec3 light_pos_model = (matInvWorld * vec4(light_pos, 1.0)).xyz;
outLightVec.xyz = tangentSpace * normalize(light_pos_model - inVertex);
[/code]
For some reason it would not work correctly until I turned the above into the following:
[code]
mediump vec3 light_pos_model = (matInvWorld * vec4(light_pos, 1.0)).xyz;
mediump vec3 light_dir_model = light_pos_model - inVertex;
outLightVec.x = dot(inTangent, light_dir_model);
outLightVec.y = dot(inBitangent, light_dir_model); // Also works as TANGENT X NORMAL
outLightVec.z = dot(inNormal, light_dir_model);
[/code]
As far as my understanding went, I thought that the dot product of each component was equal to the matrix multiplication above. Clearly the matrix is not being created the way I expected in the first case.
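For what it's worth, the dot-product form multiplies by the matrix whose rows are tangent, bitangent and normal, while GLSL's mat3(T, B, N) constructor places those vectors as columns, i.e. it builds the transpose. A small C++ check of that relationship (plain structs, nothing PVRT- or GLSL-specific):

```cpp
struct Vec3 { float x, y, z; };

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Multiply v by the matrix whose COLUMNS are t, b, n
// (this is what mat3(t, b, n) * v computes in GLSL).
Vec3 mulColumns(const Vec3& t, const Vec3& b, const Vec3& n, const Vec3& v)
{
    return { t.x*v.x + b.x*v.y + n.x*v.z,
             t.y*v.x + b.y*v.y + n.y*v.z,
             t.z*v.x + b.z*v.y + n.z*v.z };
}

// Per-component dot products, i.e. multiply by the matrix whose ROWS are t, b, n.
Vec3 mulRows(const Vec3& t, const Vec3& b, const Vec3& n, const Vec3& v)
{
    return { dot(t, v), dot(b, v), dot(n, v) };
}
```

For an orthonormal TBN basis the transpose is the inverse, so the row form maps object space into tangent space while the column form maps the other way; that difference is exactly the "inverted" behaviour described in the post.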
  11. I've recreated the problem in RenderMonkey. File is here: [url=""]RenderMonkey File[/url]. As far as I understand, if I rotate/scale and translate an object and use the inverse of this transform on the light position, that should give me the light position in object space. If I have the object space light position, I can get the light direction using the vertices in object space (light_pos_model - vertex_model). Because we are in object space, the normal, binormal and tangent are also already in object space, so everything is in sync, and I should then be able to go from the object space light direction to tangent space. Is this not correct? I am at a bit of a loss as to why it is not working. *Update* Well, after lots of fiddling and tweaking I think I got it to work properly. It may have been working properly this whole time, just not on the cube. Two sides of the cube always seem to be incorrect/unlit, which makes me think maybe it is just the cube mesh. I guess that leads to my next question: how do you know when normal mapping looks right or when something looks off? Is that up to the artist to decide? I mean, on a model I have it looks right enough, but any wrongdoings, much like on the cube, could be the result of bad normals, could they not?
  12. I seem to be having a problem with a normal map shader. It works fine when the object is at the origin without a world transform, i.e. in object space. I can also get it rotating and working correctly by rotating the normals with the rotation/scale portion of the world matrix. But when I move the object away from the origin, i.e. translation, I start to run into problems. I figured what I needed to do was to multiply the light position by the inverse of the world matrix to get the light position in object space, then calculate the direction and convert to tangent space. The problem I am having is that the lighting looks close to right on two faces of a cube I'm testing on, but wrong on the other two. I can't include a screenshot right now, but here is my shader code; it would be helpful if anyone can point out any glaring flaws, as I can't seem to spot them. Any other shader optimization tips would be appreciated also, as this is my first foray into shaders in a commercial title.

Vertex Shader
[code]
uniform highp mat4 matViewProjection;
uniform highp mat4 matWorld;
uniform highp mat4 matInvWorld;
uniform mediump vec3 light_pos;
uniform mediump vec3 light_inner_outer_falloff;

attribute highp vec3 inVertex;
attribute mediump vec3 inNormal;
attribute mediump vec3 inTangent;
attribute mediump vec2 inTexCoord;

varying mediump vec2 outTex0;
varying mediump vec4 outLightVec;

void main(void)
{
    highp vec4 pos = matWorld * vec4(inVertex, 1.0);
    gl_Position = matViewProjection * pos;
    outTex0 = inTexCoord;

    // Calculate light direction in world space for the light attenuation
    mediump vec3 lightDirection = light_pos -;
    float dist = length(lightDirection);
    lightDirection = lightDirection / dist;

    // Calculate light intensity using world space distance
    outLightVec.w = pow(clamp((light_inner_outer_falloff.y - dist) / (light_inner_outer_falloff.y - light_inner_outer_falloff.x), 0.0, 1.0), light_inner_outer_falloff.z);

    // Create tangent space matrix
    mediump mat3 tangentSpace = mat3(inTangent, normalize(cross(inTangent, inNormal)), inNormal);

    // Convert light_pos into object space
    mediump vec3 light_pos_model = (matInvWorld * vec4(light_pos, 1.0)).xyz;

    // Convert light direction into tangent space
    outLightVec.xyz = tangentSpace * normalize(light_pos_model - inVertex);
}
[/code]
Fragment Shader
[code]
uniform sampler2D DiffuseMap;
uniform sampler2D NormalMap;
uniform mediump vec3 light_color;

varying mediump vec2 outTex0;
varying mediump vec4 outLightVec;

void main(void)
{
    mediump vec3 normal = normalize(texture2D(NormalMap, outTex0).rgb * 2.0 - 1.0);
    mediump float diffuseAttn = clamp(dot(normal, outLightVec.xyz), 0.0, 1.0);
    gl_FragColor = clamp(texture2D(DiffuseMap, outTex0) * outLightVec.w * diffuseAttn, 0.0, 1.0);
}
[/code]
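The inner/outer/falloff attenuation from the vertex shader, restated in C++ for clarity (hypothetical helper name, same formula as the pow/clamp line above):

```cpp
#include <algorithm>
#include <cmath>

// Light attenuation as in the vertex shader above:
// 1.0 at or inside the inner radius, 0.0 at or beyond the outer radius,
// and a pow-shaped falloff in between.
float lightAttenuation(float dist, float inner, float outer, float falloff)
{
    const float t = std::min(std::max((outer - dist) / (outer - inner), 0.0f), 1.0f);
    return std::pow(t, falloff);
}
```

With falloff = 1.0 the fade is linear between the two radii; larger exponents pull the light in tighter around the source.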
  13. Major Project Engine Choice

    The idea is basically an RTS game that features a blend of combat and management elements, like Theme Park or RollerCoaster Tycoon. Funnily enough I was just looking at OgreMax this morning and was intrigued by the .scene loading. I'll have a better look at the .scene exporting with user-defined data. Ideally I'd like the designers to be able to build RTS levels and do the user interfaces. The terrain will feature static assets and textures. The terrain would obviously need areas which units can't traverse, and perhaps predefined pathing nodes and territory nodes (to break the terrain into regions). Ideally this information would be stored in the level file and not be insanely tedious to tweak. It would also be good if the designers could work on the GUI elements and script particle effects. We are currently doing a networking and AI project in Ogre, using CEGUI and Winsock2 for the networking, so another plus in using Ogre is that we already have this framework in place and have the experience.
  14. Major Project Engine Choice

    Cheers for the Panda and C4 links, I'll take a look. Yes, but we only have 24 weeks. I mean, I've read some post mortems, one in particular about a team who chewed up most of their project time making a level editor to justify the designers' roles.
  15. Major Project Engine Choice

    I will definitely be keeping an eye on the new App Store rules to see whether Unity and Torque are banned or not, because in the end they compile through Xcode (right?). Ogre3D is probably the most appealing at this early stage from the point of view of us programmers, that is, we will learn a lot about C++ game architecture. But it will also be a lot more work to get things up and running. Like I said, because it is rendering only, we'll need to implement the other components ourselves. Plus Ogre3D has no level editor or any real tools that we could use. Do you think this is a good choice, understanding we will have no tools for designers to use? If we hypothetically used something really high level like Unity 3D, how is this looked upon by employers in regard to programmers?