


#5173038 QT vs. wxWidgets for OpenGL

Posted by NumberXaero on 12 August 2014 - 01:49 AM

wxWidgets does have its own delegates, called events. You can call Connect() (Bind() in newer versions) to connect a handler function to an object id; when an event fires for that object, the function is called. The handler doesn't have to belong to a class derived from the widget, it can belong to a class you already have. wxWidgets also has an event table mechanism that is similar to how MFC works.


class ClassButtonLivesIn : public ...
{
    wxButton* someButtonNotDerived;
    void OnButtonClick(wxCommandEvent& event);
};

// in the ClassButtonLivesIn constructor
someButtonNotDerived->Connect(wxEVT_COMMAND_BUTTON_CLICKED,
    wxCommandEventHandler(ClassButtonLivesIn::OnButtonClick), NULL, this);


and then call Disconnect() later

#5172993 How to calculate camera roll...

Posted by NumberXaero on 11 August 2014 - 08:34 PM

Build qPitch and qYaw, and combine them into qPitchYaw, which now holds the new camera forward after pitching and yawing. Then build qRoll around this new forward direction, and combine qRoll with qPitchYaw to get qPitchYawRoll, the new orientation.

#5168555 Post process order?

Posted by NumberXaero on 22 July 2014 - 08:35 PM

Wondering what people think, or what you do yourself and why; add any I may have missed.


Screen space reflections - haven't done them, not sure where; close to the start?...
Eye adaptation - builds luminance info, probably done early?
Lens flare - get it in before bloom?

Bloom - before DOF??
DOF - you want the bloom 'under' it?

FXAA - you want to AA toward the end, right?

Color correction - you want the final color before correcting...

Tone mapping - fit for LDR

Gamma correction - correct for screen output

Vignette - because it's just black on top of everything else...

#5167043 PhysX - Stick controllers to kinematic actors

Posted by NumberXaero on 15 July 2014 - 01:36 PM

Yeah, the move flags won't be able to solve everything; the horizontal stuff is trickier. Horizontal movement behavior would probably be best handled in PxUserControllerHitReport, which gives you the world normal of the wall you hit and the hit point. Then you can handle custom sliding by reducing the xz velocity based on the angle you hit at, while the move flags would be used to kill vertical movement from hitting above or below, and to provide a hint about side hits.


My velocity use was more limited to vertical movement, for falling and jumping. For horizontal movement I simply passed a move amount, because the horizontal movement was tied to the motion extracted from the walk/run animations. I knew how much to move based on the animation; otherwise the character would look like he was sliding if I moved more (or less) than what looked right for the animation.

#5166930 PhysX - Stick controllers to kinematic actors

Posted by NumberXaero on 15 July 2014 - 01:45 AM

Well, keep in mind these are move flags, returned as the result of a move call. I believe the sides flag means that you hit something, in the direction of the move, around the middle part of the controller shape; you are blocked from moving sideways by something that I'm guessing can't be auto-stepped (PxControllerDesc::stepOffset).

The controller is really just there to keep you from going through the world (although you still do sometimes :-( ).

If you're talking about getting hit from the side by projectiles in a gameplay sense, you might want to catch hits in the callbacks and check the direction between the controller and the actor/shape hitting you, to know exactly which side you were hit from.


For slopes, grab the world info in a PxUserControllerHitReport callback, check the normal of the surface you're standing on, and factor the slope angle and gravity force into your next move adjustment, to slide down the slope if the angle is suitable.


They have changed things in PhysX so often that I can't remember whether there was an auto-slide at one time depending on the slope angle; there might still be, I'm not sure.

#5166799 PhysX - Stick controllers to kinematic actors

Posted by NumberXaero on 14 July 2014 - 02:22 PM

The PhysX controller only exposes a move interface. The way I did it: each frame, calculate a move offset (displacement), pass it to the controller, and get the corrected position back after the update.

I don't pull velocity out of the PhysX controller, but I do use velocity in a way similar to how you're doing it. I use my own force/acceleration/velocity/position to create a displacement amount, pass the position change to the controller, and the controller's move call returns collision flags. If the flags indicate a collision below, you simply zero out the y velocity, and similarly for collisions on the side.

#5166620 PhysX - Stick controllers to kinematic actors

Posted by NumberXaero on 13 July 2014 - 03:22 PM

I also don't have nvidia in this computer, but I just did a test with some kinematic actors (platforms much like yours) and a controller jumping onto them: you need the ride and slide flags.


    physx::PxControllerBehaviorFlags getBehaviorFlags(const physx::PxShape& shape, const physx::PxActor& actor)
    {
        return physx::PxControllerBehaviorFlag::eCCT_CAN_RIDE_ON_OBJECT | physx::PxControllerBehaviorFlag::eCCT_SLIDE;
    }


Without them, my platforms slide out from under the controller.


Edit: you probably want to check that the actor is kinematic before returning those flags; riding dynamics might not work well.

#5166486 PhysX - Stick controllers to kinematic actors

Posted by NumberXaero on 12 July 2014 - 05:28 PM

The bridges demo shows how it works.

It is explained in the sample code, SampleBridgesCCT.cpp: implement PxControllerBehaviorCallback and allow riding between the controller and a touched shape (which you would have to identify as rideable).

#5165725 Public domain fonts for redistribution?

Posted by NumberXaero on 08 July 2014 - 09:54 PM

This might help, from the Google Fonts about page:
A web with web fonts is more beautiful, readable, accessible and open.

Google Fonts makes it quick and easy for everyone to use web fonts, including professional designers and developers. We believe that everyone should be able to bring quality typography to their web pages and applications.

Our goal is to create a directory of web fonts for the world to use. Our API service makes it easy to add Google Fonts to a website in seconds. The service runs on Google's servers which are fast, reliable and tested. Google provides this service free of charge.

Open Source Fonts

All of the fonts are Open Source. This means that you are free to share your favorites with friends and colleagues. You can even customize them for your own use, or collaborate with the original designer to improve them. And you can use them in every way you want, privately or commercially — in print, on your computer, or in your websites.

We are working with designers around the world to publish quality typeface designs that are made for the web. If you are a type designer and would like to discuss this, please get in touch.

Many of Google's own web pages are already using Google Fonts, such as Google's About page and Google's World Wonders Project which use Open Sans.

— The Google Fonts Team


#5163069 Creating an OpenGL context on Windows with Glew.

Posted by NumberXaero on 26 June 2014 - 01:15 PM

Using the code from your first link, try something like

    int attribs[] =
    {
        WGL_CONTEXT_MAJOR_VERSION_ARB, 3,
        WGL_CONTEXT_MINOR_VERSION_ARB, 2,
        WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_CORE_PROFILE_BIT_ARB,
    #ifdef _DEBUG
        WGL_CONTEXT_FLAGS_ARB, WGL_CONTEXT_DEBUG_BIT_ARB,
    #endif
        0
    };

I think "WGL_CONTEXT_PROFILE_MASK_ARB, WGL_CONTEXT_CORE_PROFILE_BIT_ARB," is needed for versions 3.2 and greater.

#5162245 Inverse Bind Pose

Posted by NumberXaero on 22 June 2014 - 10:11 PM

1) It's the transform of the joint at the time it was bound to the mesh, inverted. World coordinates.
2) Yes
3) ....

I use world/local terminology (local = parent relative, world = global = model)

A) Using the joint local transforms, compute each joint's world transform: childJointWorld = parentJointWorld * childJointLocal
-> Convert to matrices first, or use quaternion/vector3 pairs and convert afterward, your choice; in the end you'll need a world matrix to combine with the jointInvBindPose matrix

QWorld = QParentWorld * QLocal                                      // rotation part
TWorld = TParentWorld + (QParentWorld * (SParentWorld * TLocal))    // translation part
SWorld = SParentWorld * SLocal                                      // scale part

-> If a joint has no parent, world = local

B) Now you have all the joint world transforms; let's call this next one jointSkinMatrix = jointWorld * jointInvBindPose
C) The model probably has a skin matrix / bind matrix / bind shape matrix; whatever it's called, it gets applied to all the original vertices, so you can do this once and store the modified model


Long version....

For Each Vertex v
    skinnedVertex = vec3(0, 0, 0)
    For Each Joint j Affecting Vertex v
        skinnedVertex += ((jointWorld[j] * jointInvBindPose[j]) * (bindShapeMatrix * v)) * jointWeight[j]


Again, some of this depends on whether you use matrices or quaternions, and whether you compute on the GPU or CPU, but that's the general idea. Normals: same thing, but they wouldn't be translated, and they would have to be renormalized for correct lighting.

#5161642 AMD GLSL "subtle" or "quiet" error.

Posted by NumberXaero on 20 June 2014 - 12:09 AM

vec4 final;


isn't initialized, so it probably holds garbage, which you then add to. Also, I'm not sure how you're using the 4th component of final.

#5160457 Application/Programming side of Audio

Posted by NumberXaero on 14 June 2014 - 02:22 AM

Typically it's no different from authoring a character or level with textures in the right places. The sound is a resource like a texture, model, level, shader, etc. Where textures and vertex data get passed to a graphics API to render, the same goes for a sound file: you would load sound files and use an audio API of some sort (e.g. FMOD) to play them.

When the sound plays would be largely dependent on the game/engine being used. In general it would be connected to the animation system somehow. If it were a spell being cast, for example, the sound would be played when the spell-cast animation reaches a specific frame (arm fully stretched out?).

If it were something related to physics, say something being hit, the physics system might trigger various events, such as playing a sound on hit; you might even change the sound depending on what's hit. The thing being hit may provide the correct sound to play when it's hit. It all really depends on how the code is set up and what it's capable of doing.

#5156193 Entity component system designing

Posted by NumberXaero on 26 May 2014 - 10:37 PM

Not sure about

handling rendering with Position and rendering with Box2D?


Everything that's drawn needs to know where it should be drawn. Typically a physics engine feeds its transform (position/rotation/scale) data to the object being drawn (I imagine Box2D is similar, although I admit I've never used it). Assuming this, an object that gets drawn has position/rotation/scale data, plus an optional physics component which updates that object's data each frame based on the simulation.

If it doesn't have a physics component, it may be animated in some way (animation component?), or simply driven by the player, or by animation and physics components which talk to each other, or by all of the above.

Specific components would be maintained by the systems they logically belong to, rendering, physics, animation. The trick is getting components to talk.

With entity systems I've read lots of different ways it can be done, so I wouldn't say there is a hard rule, but from what you said, this is how I see it.

#5155514 What is happening here

Posted by NumberXaero on 23 May 2014 - 01:27 PM

A matrix generally has a rotation/scale part and a translation part. In DirectX the translation is in the 4th row; it looks like the code pulls the translation part out of each matrix.