
imoogiBG

Member Since 16 Jan 2011

Topics I've Started

3D Editor Transform Gizmos (Handles)

14 January 2016 - 03:25 AM

I'm about to write one of those Translation/Rotation/Scale handles, but honestly I don't know the math behind those GUI thingies.

The thing I couldn't figure out is how they appear with a constant size on screen. Off the top of my head, I would project the target object's position onto a plane somewhere between the near and the far plane and draw the gizmo at that point, but that won't work well with a perspective projection (EDIT: or will it?).

Another approach is to project the target object's position (objPos) onto the screen, additionally project the points objPos + (1,0,0), objPos + (0,1,0), and objPos + (0,0,1), determine how X, Y, and Z change in screen space, and then draw the gizmo in 2D.

Do you guys know a trick on how to implement this more easily?
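
EDIT: sketching out what I think the constant-size trick would look like (untested, my own illustration; the function name and parameters are mine): keep the gizmo in world space at the object's position, but rescale it every frame based on its distance to the camera so its projected size stays roughly constant.

```cpp
#include <cmath>
#include <glm/glm.hpp>

// Compute a world-space scale so the gizmo covers roughly 'desiredPixels'
// of the viewport, regardless of how far away the selected object is.
// Euclidean distance is an approximation; for an exact result use the
// depth along the camera's forward axis instead.
float ComputeGizmoScale(const glm::vec3& gizmoPos,
                        const glm::vec3& cameraPos,
                        float verticalFovRadians,
                        float viewportHeightPixels,
                        float desiredPixels)
{
    float distance = glm::length(gizmoPos - cameraPos);

    // Height of the view frustum (in world units) at that distance.
    float worldHeightAtDistance = 2.0f * distance * std::tan(verticalFovRadians * 0.5f);

    // Fraction of the viewport the gizmo should occupy, converted to world units.
    return worldHeightAtDistance * (desiredPixels / viewportHeightPixels);
}
```

If that is right, the returned value just goes into the gizmo's model matrix as a uniform scale before the usual view/projection, and no separate 2D pass is needed.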


GUI for tools

10 January 2016 - 12:56 PM

I currently use https://github.com/ocornut/imgui, but it doesn't work well for me. It's a bit of a pain to work with for a more complex GUI hierarchy, and to be honest I can't understand it well enough to extend it (and currently I want to avoid doing that).

 

EDIT (thanks Alberth): The lib must work on Windows, and it has to be renderer independent (as I have multiple rendering APIs).
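
To make the "renderer independent" part concrete, here is a rough sketch of the kind of interface I have in mind (the names are hypothetical, not taken from any existing library): the GUI lib should only emit vertex/index/scissor data, and each rendering backend consumes it however it likes.

```cpp
#include <cstdint>
#include <vector>

struct GuiVertex { float x, y, u, v; uint32_t rgba; };

struct GuiDrawCommand {
    int scissorX, scissorY, scissorW, scissorH; // clip rect in pixels
    uint32_t indexCount;                        // indices consumed by this command
    void* textureHandle;                        // backend-specific texture
};

struct GuiDrawList {
    std::vector<GuiVertex>      vertices;
    std::vector<uint16_t>       indices;
    std::vector<GuiDrawCommand> commands;
};

// Each rendering API (D3D, GL, ...) implements this once.
class IGuiRenderer {
public:
    virtual ~IGuiRenderer() = default;
    virtual void Render(const GuiDrawList& drawList) = 0;
};
```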

What GUI lib do you use for your editors and in-game tools?


D3D Debugging and Profiling..

14 November 2015 - 05:31 PM

After a few hours of profiling draw calls and getting strange numbers, I found out that the "Force On" check was set for the Debug Layer in the DirectX Properties menu :)
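
For anyone hitting the same thing: the debug layer is normally something you opt into explicitly at device creation, and the "Force On" switch enables it regardless of that flag, which is what was skewing my numbers. A minimal sketch (D3D11 assumed; not my exact device-creation code):

```cpp
#include <d3d11.h>
#pragma comment(lib, "d3d11.lib")

// Create the device, requesting the debug layer only in debug builds.
HRESULT CreateDevice(ID3D11Device** outDevice, ID3D11DeviceContext** outContext)
{
    UINT flags = 0;
#if defined(_DEBUG)
    flags |= D3D11_CREATE_DEVICE_DEBUG; // extra validation, noticeable CPU cost
#endif

    return D3D11CreateDevice(
        nullptr,                   // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr,                   // no software rasterizer module
        flags,
        nullptr, 0,                // default feature levels
        D3D11_SDK_VERSION,
        outDevice,
        nullptr,                   // chosen feature level (not needed here)
        outContext);
}
```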


GL vs. D3D Texture/Viewport/Scissors Coordinates

25 October 2015 - 05:19 PM

I've been having this "little problem" for a while:

 

http://s7.postimg.org/k32r23lmz/Diff.png

 

Everything (except the ImGui window) is rendered to a texture, and after that the texture gets rendered to the window.

 

The problem is the way the APIs handle texture coordinates, the viewport, and the scissor rect. Knowing that OpenGL's (0,0) tex-coord is at the bottom-left and that glTexImage2D assumes the image is organized from bottom to top, it was really tempting to try to "force" OpenGL to deal with texture coordinates the way D3D does.

 

This works fine if we ignore framebuffers. So in order to fake that, I've flipped the Y coordinate of the projection matrices and inverted the front-face triangle winding. In that case everything works fine, except for the moment when you actually draw to the window: the end result is flipped in Y.
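
In code, what I'm doing looks roughly like this (a simplified sketch with GLM-style matrices, not my exact code):

```cpp
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <GL/glew.h> // or whatever GL loader you use

// Mirror clip-space Y so an off-screen render target ends up with the same
// top-down image layout that D3D produces.
glm::mat4 MakeD3DStyleProjection(const glm::mat4& proj)
{
    const glm::mat4 flipY = glm::scale(glm::mat4(1.0f), glm::vec3(1.0f, -1.0f, 1.0f));
    return flipY * proj;
}

// The mirror turns counter-clockwise triangles into clockwise ones, so the
// front-face winding has to be swapped to keep back-face culling correct.
void UseMirroredWinding()
{
    glFrontFace(GL_CW); // OpenGL's default front face is GL_CCW
}
```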

 

One way to fix this is to reverse the flip when drawing to the default framebuffer (the screen).
With my custom API I know when a draw call is going to be submitted to the screen; the problem is that in general I don't know the name of the uniform that holds the projection matrix, and in theory the projection matrix could even be hardcoded into the shader code.

 

And basically I'm stuck here. I could hardcode that case, but I really want to find a proper solution to this problem.

...or maybe trying to make OpenGL look more D3D-ish is not the right solution? What is your approach?
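
One idea I'm toying with (an untested sketch, not something I've shipped): leave every off-screen pass with the flipped projection, and undo the flip only in the final full-screen blit by flipping the quad's V texture coordinates, so no projection uniform has to be touched at all.

```cpp
// Full-screen quad used only when the destination is the default framebuffer:
// positions are already in NDC, and V is flipped to undo the earlier mirror.
struct BlitVertex { float x, y, u, v; };

static const BlitVertex kFlipBlitQuad[6] = {
    //   x      y     u     v (flipped)
    { -1.0f, -1.0f, 0.0f, 1.0f },
    {  1.0f, -1.0f, 1.0f, 1.0f },
    {  1.0f,  1.0f, 1.0f, 0.0f },

    { -1.0f, -1.0f, 0.0f, 1.0f },
    {  1.0f,  1.0f, 1.0f, 0.0f },
    { -1.0f,  1.0f, 0.0f, 0.0f },
};
```

Since I already know when a draw call targets the window, only that one blit path would need the flipped quad; every render-to-texture blit keeps the regular coordinates.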

 


Where to start with networking.

24 October 2015 - 04:39 PM

This is a topic I have really minimal experience with (just simple socket communication between two PCs). I have no specific game idea in mind; I just want to learn the basics.

 

I've tried googling, but all the tutorials I find look bad to me (not sure if I'm right about that). Can anybody give me some directions and links?

What are the multiplatform solutions/libraries that are out there?

