# Transformation lags/overtakes vector when trying to deform along path via HLSL


I'm trying to implement animation by deformation along a path in Miku Miku Dance.  This problem is interesting to me, it opens up new options for animators, and it seems like a good way for me to learn more about transformations.  I'm doing my deformation in an HLSL vertex shader, using bones as path nodes and quaternions to create matrices to rotate my vertices, traveling down the path and rotating as I go.  I don't fully understand quaternion math, but I found this code online, and it has worked for me in other places.

It's almost right.  I really think I'm doing the right thing.  Almost.  But my angles aren't right.  Demonstrated in the picture (yes, I'm using an actual arrow model to test).

At 90 degree intervals, the angles are correct.  As I go from no transformation to 90 degrees, the transformation lags the vector.  From 90 degrees to 180 degrees, the transformation overtakes the vector.  This is symmetrical: the transformation lags the -45 degree vector just as it does the +45 degree one.
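That symptom is consistent with the angle formula: `PI*(1.0f-((dot+1.0f)/2.0f))` remaps the dot product linearly, but the true angle between the vectors is `acos(dot)`, and the two agree only at 0, 90, and 180 degrees. A quick Python check (offline math only, not shader code) reproduces the lag below 90 degrees and the overtake above it:

```python
import math

def linear_angle(d):
    # the shader's current formula: PI * (1 - (dot + 1) / 2)
    return math.pi * (1.0 - (d + 1.0) / 2.0)

# true angle vs. what the linear remap produces
for true_deg in (0, 45, 90, 135, 180):
    d = math.cos(math.radians(true_deg))
    print(true_deg, round(math.degrees(linear_angle(d)), 1))
# 45 degrees comes out as ~26.4 (lags); 135 comes out as ~153.6 (overtakes)
```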

Here is the code I've written.  I'm trying to include only relevant bits.  I can include everything if anybody wants, just trying to spare you.  This is for shader model 3.0/DX9.

```hlsl
...
float4 pos0 : CONTROLOBJECT < string name = PATHMODEL; string item = "0"; >;
// left at the origin; marks the beginning of the deformation
float4 pos1 : CONTROLOBJECT < string name = PATHMODEL; string item = "1"; >;
// first node, proceeding from the origin
...

// rotates pos around origin, about axis, by angle in radians, using a quaternion
float3 rotateAxis(float3 pos, float3 origin, float3 axis, float angle) {
    pos -= origin;
    float4 q;
    q.xyz = axis * sin(angle / 2.0f);
    q.w = cos(angle / 2.0f);
    q = normalize(q);
    float3 temp = cross(q.xyz, pos) + q.w * pos;
    pos = cross(temp, -q.xyz) + dot(q.xyz, pos) * q.xyz + q.w * temp;
    pos += origin;
    return pos;
}
...

VS_OUTPUT Basic_VS...
    float4 wPos = mul(Pos, WorldMatrix);
    // primary axis: vertices are deformed as they travel along positive Y
    float3 vec0 = YVEC;
    float3 vec1 = normalize(pos1.xyz - pos0.xyz);
    float extent = wPos.y - pos0.y;
    if (extent > 0.0f) {
        float3 axis = cross(vec0, vec1);
        float angle = PI * (1.0f - (dot(vec0, vec1) + 1.0f) / 2.0f);
        wPos.xyz = rotateAxis(wPos.xyz, pos0.xyz, axis, angle);
    }
    Out.Pos = mul(wPos, ViewProjMatrix);
...
```
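For what it's worth, the quaternion sandwich inside rotateAxis checks out on its own when the axis is unit length. A direct Python port (offline verification only, not shader code) rotates (1,0,0) by 90 degrees about Z onto (0,1,0) as expected:

```python
import math

def rotate_axis(pos, origin, axis, angle):
    # direct port of the HLSL rotateAxis
    px, py, pz = (pos[i] - origin[i] for i in range(3))
    s = math.sin(angle / 2.0)
    qx, qy, qz, qw = axis[0] * s, axis[1] * s, axis[2] * s, math.cos(angle / 2.0)
    n = math.sqrt(qx*qx + qy*qy + qz*qz + qw*qw)   # q = normalize(q)
    qx, qy, qz, qw = qx/n, qy/n, qz/n, qw/n
    # temp = cross(q.xyz, pos) + q.w * pos
    tx = qy*pz - qz*py + qw*px
    ty = qz*px - qx*pz + qw*py
    tz = qx*py - qy*px + qw*pz
    # cross(temp, -q.xyz) + dot(q.xyz, pos)*q.xyz + q.w*temp
    # (note cross(temp, -q.xyz) == cross(q.xyz, temp))
    d = qx*px + qy*py + qz*pz
    rx = (qy*tz - qz*ty) + d*qx + qw*tx
    ry = (qz*tx - qx*tz) + d*qy + qw*ty
    rz = (qx*ty - qy*tx) + d*qz + qw*tz
    return (rx + origin[0], ry + origin[1], rz + origin[2])

r = rotate_axis((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0), math.pi / 2.0)
print(all(abs(a - b) < 1e-9 for a, b in zip(r, (0.0, 1.0, 0.0))))  # True
```

Since the expansion is a correct rotation for a unit quaternion, the remaining suspects are the angle itself and the length of `axis`: `cross(vec0, vec1)` has length sin(angle), not 1.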

Am I misunderstanding the dot product here?  Does my function not do what I think it does?  Something else?  Any help is greatly appreciated.  I'm an amateur, I try to read and learn, but no formal education, no experience, and no people around me studying the same things, and I'm really grateful for the people on this forum that provide help.


After a night's rest, it seems to me that I shouldn't be using angle = PI*(1.0f-((dotProd + 1.0f)/2.0f)); but should instead be using angle = acos(dotProd);.  However, this gives me an apparently identical response through angles up to Pi/2 radians, and breaks down when it reaches something like Pi*4/3.  It seems right theoretically, but looks entirely wrong.
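One plausible reason acos alone still misbehaves: `axis = cross(vec0, vec1)` has length sin(theta) rather than 1, so after `q = normalize(q)` the quaternion encodes a different angle than the one passed in. A Python sketch (assuming the construction in the shader above) of the rotation angle that actually gets applied:

```python
import math

def effective_angle(theta):
    # axis = cross(vec0, vec1) has length sin(theta), not 1,
    # so q.xyz = axis * sin(theta/2) has length sin(theta)*sin(theta/2)
    xyz_len = math.sin(theta) * math.sin(theta / 2.0)
    w = math.cos(theta / 2.0)
    # normalize(q) preserves the xyz/w ratio, so the encoded angle is:
    return 2.0 * math.atan2(xyz_len, w)

for deg in (30, 60, 90, 120, 150):
    print(deg, round(math.degrees(effective_angle(math.radians(deg))), 1))
# only 0 and 90 degrees come through unchanged
```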

The relationship is not a power relationship.  Nevertheless, using the code I provided above but adding dotProd = 1.0f - pow(1.0f-dotProd, 0.25f); gives me something very close to correct.  For now I'm just hacking my way through with this correction, creating a new node at the not-quite-right angle in order to approach the correct path.
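For comparison, the textbook construction is to normalize the cross product before building the quaternion and take acos of the clamped dot. A standalone Python check (offline verification only, not a patch to the shader) shows that combination rotating vec0 exactly onto vec1:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def quat_rotate(p, axis, angle):
    # rotate p about a UNIT axis: v' = v + 2*u x (u x v + w*v)
    s, w = math.sin(angle / 2.0), math.cos(angle / 2.0)
    ux, uy, uz = (c * s for c in axis)
    cx = uy * p[2] - uz * p[1] + w * p[0]
    cy = uz * p[0] - ux * p[2] + w * p[1]
    cz = ux * p[1] - uy * p[0] + w * p[2]
    return (p[0] + 2.0 * (uy * cz - uz * cy),
            p[1] + 2.0 * (uz * cx - ux * cz),
            p[2] + 2.0 * (ux * cy - uy * cx))

vec0 = (0.0, 1.0, 0.0)
vec1 = normalize((1.0, 2.0, -0.5))
d = max(-1.0, min(1.0, sum(a * b for a, b in zip(vec0, vec1))))
angle = math.acos(d)                 # true angle, not a linear remap
axis = normalize((vec0[1] * vec1[2] - vec0[2] * vec1[1],
                  vec0[2] * vec1[0] - vec0[0] * vec1[2],
                  vec0[0] * vec1[1] - vec0[1] * vec1[0]))   # unit axis
r = quat_rotate(vec0, axis, angle)
print(all(abs(a - b) < 1e-9 for a, b in zip(r, vec1)))  # True
```

Clamping the dot before acos guards against NaNs from floating point drift; the parallel and anti-parallel cases, where the cross product is near zero, still need their own special case.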