# taxfromdk

- Rank: Member
- Reputation: 134 Neutral
- Interests: Programming
- Github: taxfromdk


1. I have no clue why you mention gravity; this is only a mapping from the world into an image sensor. I ended up using a non-analytic method where I computed the frame of the mirror as a matrix, so I can easily transform positions and directions into a space aligned with the mirror's axis. Before sampling I add an offset between the point and the Z axis, and after sampling I offset away from the axis again. I discard cases where the offset point hits the mirror within the offset radius. This seems to work :)
2. Hi guys, I am looking into the mathematics of a curved mirror, specifically this one: http://paulbourke.net/dome/LucyCamera/ I don't have it in hand yet, but that doesn't keep me from modelling its behavior, so I have written a simple raytracer for how rays reflect off the surface.

I assume the surface is parabolic, as in z = r² = sqrt(x²+y²)² = x²+y², where z is the height of the mirror and r is the radius in x, y. Using a point P and a direction D I can write:

x = Px + u*Dx
y = Py + u*Dy
z = Pz + u*Dz

Substituting into the surface equation:

Pz + u*Dz = (Px + u*Dx)² + (Py + u*Dy)²
Pz + u*Dz = Px² + u²*Dx² + 2*u*Dx*Px + Py² + u²*Dy² + 2*u*Dy*Py
0 = u²*(Dx² + Dy²) + u*(2*Dx*Px + 2*Dy*Py - Dz) + Px² + Py² - Pz

I end up with a second-degree equation that I can solve using https://en.wikipedia.org/wiki/Quadratic_equation It gives 0, 1 or 2 values of u that I can use to find the x, y and z coordinates. At that point I can find the two tangent vectors and cross them to get a surface normal that I can reflect around. Adjusting the location and size of the camera and image, it produces images like this:

That is great and all, but I feel my mirror spends too many pixels on areas just below the mirror. I would therefore like to make the surface of the mirror steeper by offsetting all points on the surface against the axis of the mirror. The surface should then be something like z = (r + A)², where A is the offset I would add. My problem is that I have no idea how to solve the equation now, as it explodes in complexity.
I tried typing it into Sage:

```
u, px, py, pz, dx, dy, dz, A = var('u px py pz dx dy dz A')
S = solve([(sqrt((px+u*dx)^2+(py+u*dy)^2)+A)^2 == pz+u*dz], u)
print S
```

which prints:

```
[ u == -1/2*(2*dx*px + 2*dy*py - dz + sqrt(-4*A^2*dx^2 - 4*A^2*dy^2 - 4*dy^2*px^2 - 4*dx^2*py^2 - 4*dx*dz*px + dz^2 + 4*(2*dx*dy*px - dy*dz)*py + 4*(dx^2 + dy^2)*pz - 8*(A*dx^2 + A*dy^2)*sqrt((dx^2 + dy^2)*u^2 + px^2 + py^2 + 2*(dx*px + dy*py)*u)))/(dx^2 + dy^2),
  u == -1/2*(2*dx*px + 2*dy*py - dz - sqrt(-4*A^2*dx^2 - 4*A^2*dy^2 - 4*dy^2*px^2 - 4*dx^2*py^2 - 4*dx*dz*px + dz^2 + 4*(2*dx*dy*px - dy*dz)*py + 4*(dx^2 + dy^2)*pz - 8*(A*dx^2 + A*dy^2)*sqrt((dx^2 + dy^2)*u^2 + px^2 + py^2 + 2*(dx*px + dy*py)*u)))/(dx^2 + dy^2) ]
```

Instead of getting two nice closed-form solutions, I get two solutions where u appears on both sides of the equation, which I can't compute. What to do next? Kind regards Jesper
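For the plain surface z = x² + y², the quadratic derived above codes up directly, and one possible reading of the axis-offset trick from the follow-up reply can be layered on top. A rough NumPy sketch — the shift direction and the discard rule are my interpretation, and shifting only the ray origin radially is a reasonable approximation mainly when the ray's horizontal direction is roughly radial:

```python
import numpy as np

def intersect_paraboloid(P, D):
    """Smallest positive u with Pz + u*Dz = (Px + u*Dx)^2 + (Py + u*Dy)^2."""
    a = D[0]**2 + D[1]**2
    b = 2*D[0]*P[0] + 2*D[1]*P[1] - D[2]
    c = P[0]**2 + P[1]**2 - P[2]
    if abs(a) < 1e-12:                       # ray parallel to the axis: linear case
        return -c / b if abs(b) > 1e-12 else None
    disc = b*b - 4*a*c
    if disc < 0:
        return None
    roots = [(-b - disc**0.5) / (2*a), (-b + disc**0.5) / (2*a)]
    roots = [u for u in roots if u > 0]
    return min(roots) if roots else None

def intersect_offset_paraboloid(P, D, A):
    """Approximate hit on the steeper surface z = (r + A)^2: shift the ray
    origin radially away from the axis by A, reuse the plain paraboloid,
    then shift the hit point back toward the axis."""
    radial = P[:2] / np.linalg.norm(P[:2])   # radial direction of the origin
    P2 = np.array(P, dtype=float)
    P2[:2] += radial * A                     # move away from the axis by A
    u = intersect_paraboloid(P2, D)
    if u is None:
        return None
    hit = P2 + u * np.asarray(D, dtype=float)
    if np.linalg.norm(hit[:2]) < A:          # hit within the offset radius: discard
        return None
    hit[:2] -= radial * A                    # move back toward the axis
    return hit
```

A vertical ray down at radius 1 with A = 0.5 lands on z = (1 + 0.5)² = 2.25, matching the target surface.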
3. I need a measure of how close a collection of 2d points is to forming a straight line. Currently I use Principal Component Analysis in 2d and take the mean of the squares along the second axis. It works fine, but I feel it is overkill, and I would like something simpler that I could express in a library like TensorFlow. Is there a simpler way?
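The PCA quantity described above has a closed form in 2d: the mean squared distance to the best-fit line through the centroid is the smaller eigenvalue of the 2x2 covariance matrix, and for a 2x2 matrix that eigenvalue has an explicit formula, so no eigendecomposition call is needed. A sketch in plain NumPy (every step is an elementwise op, so it should translate directly to TensorFlow):

```python
import numpy as np

def line_fit_error(pts):
    """pts: (N, 2) array. Mean squared residual about the best-fit line,
    i.e. the smaller eigenvalue of the 2x2 covariance matrix."""
    x = pts[:, 0] - pts[:, 0].mean()
    y = pts[:, 1] - pts[:, 1].mean()
    sxx, syy, sxy = (x*x).mean(), (y*y).mean(), (x*y).mean()
    # Closed-form smaller eigenvalue of [[sxx, sxy], [sxy, syy]]
    return 0.5*(sxx + syy) - np.sqrt((0.5*(sxx - syy))**2 + sxy**2)
```

The value is 0 exactly when the points are collinear and grows with the spread perpendicular to the line.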
4. I'm at a wedding, but this was my scratchpad solution done over breakfast. It does converge now in some cases, though not very fast. I run it with node and it requires the three.js library:

```javascript
var THREE = require("three");

function printMatrix(m) {
    var e = m.elements; // THREE.Matrix4 stores elements column-major
    console.log(e[0], e[4], e[8], e[12]);
    console.log(e[1], e[5], e[9], e[13]);
    console.log(e[2], e[6], e[10], e[14]);
    console.log(e[3], e[7], e[11], e[15]);
    console.log();
}

// Create matrix from degrees of freedom
function getMatrix(rvec, tvec) {
    var M_r = new THREE.Matrix4().makeRotationFromEuler(rvec);
    var M_t = new THREE.Matrix4().makeTranslation(tvec.x, tvec.y, tvec.z);
    return M_t.multiply(M_r);
}

// Find unit directions to the world points given a camera matrix
function getDirections(worldpoints, matrix) {
    var directions = [];
    for (var i = 0; i < worldpoints.length; i++) {
        var d = worldpoints[i].clone();
        d.applyMatrix4(matrix);
        var l = Math.sqrt(d.x*d.x + d.y*d.y + d.z*d.z);
        if (l > 0.0) {
            d.multiplyScalar(1.0/l);
        }
        d.w = 0.0;
        directions.push(d);
    }
    return directions;
}

// Compare similarity between two sets of directions (sum of squared angles)
function error(d0, d1) {
    var err = 0.0;
    for (var i = 0; i < d0.length; i++) {
        var angle_rad = Math.acos(d0[i].x*d1[i].x + d0[i].y*d1[i].y + d0[i].z*d1[i].z);
        err += angle_rad*angle_rad;
    }
    return err;
}

function clamp(i, m) {
    if (i > m) return m;
    if (i < -m) return -m;
    return i;
}

// Ground-truth pose to recover
var rvec_o = new THREE.Euler(-Math.PI/8, 0.0, 0.0, 'XYZ');
var tvec_o = new THREE.Vector3(0.0, -3.0, -40.0);

// Initial guess
var rvec = new THREE.Euler(0, 0, 0, 'XYZ');
var tvec = new THREE.Vector3(0.0, 0.0, -100.0);

// Create 5 world points on the field
var world_positions = [];
var field_length = 105.0;
var field_width = 105.0;
world_positions.push(new THREE.Vector4(-field_length/2, 0.0, -field_width/2, 1.0));
world_positions.push(new THREE.Vector4(-field_length/2, 0.0, field_width/2, 1.0));
world_positions.push(new THREE.Vector4(0.0, 0.0, field_width/2, 1.0));
world_positions.push(new THREE.Vector4(field_length/2, 0.0, field_width/2, 1.0));
world_positions.push(new THREE.Vector4(field_length/2, 0.0, -field_width/2, 1.0));

var dirs_o = getDirections(world_positions, getMatrix(rvec_o, tvec_o));

// one mm at a time, how many degrees
var N = 100000;
var step = 100.00;
var lasterror = 99999999.0;
var before = new Date().getTime();
for (var i = 0; i < N; i++) {
    var e = error(dirs_o, getDirections(world_positions, getMatrix(rvec, tvec)));

    // Numeric partial derivatives: evaluate the error one step below and
    // above the current value of each of the 6 degrees of freedom
    var r_x_0 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x - step, rvec.y, rvec.z, 'XYZ'), tvec)));
    var r_x_1 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x + step, rvec.y, rvec.z, 'XYZ'), tvec)));
    var r_y_0 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x, rvec.y - step, rvec.z, 'XYZ'), tvec)));
    var r_y_1 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x, rvec.y + step, rvec.z, 'XYZ'), tvec)));
    var r_z_0 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x, rvec.y, rvec.z - step, 'XYZ'), tvec)));
    var r_z_1 = error(dirs_o, getDirections(world_positions, getMatrix(new THREE.Euler(rvec.x, rvec.y, rvec.z + step, 'XYZ'), tvec)));
    var t_x_0 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x - step, tvec.y, tvec.z))));
    var t_x_1 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x + step, tvec.y, tvec.z))));
    var t_y_0 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x, tvec.y - step, tvec.z))));
    var t_y_1 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x, tvec.y + step, tvec.z))));
    var t_z_0 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x, tvec.y, tvec.z - step))));
    var t_z_1 = error(dirs_o, getDirections(world_positions, getMatrix(rvec, new THREE.Vector3(tvec.x, tvec.y, tvec.z + step))));

    var d_r_x = (r_x_1 - r_x_0)/(step*2);
    var d_r_y = (r_y_1 - r_y_0)/(step*2);
    var d_r_z = (r_z_1 - r_z_0)/(step*2);
    var d_t_x = 10.0*(t_x_1 - t_x_0)/(step*2);
    var d_t_y = 10.0*(t_y_1 - t_y_0)/(step*2);
    var d_t_z = 10.0*(t_z_1 - t_z_0)/(step*2);

    if (i % Math.floor(N/10) == 0) {
        console.log("e:", e, " r:", rvec.x - rvec_o.x, ",", rvec.y - rvec_o.y, ",", rvec.z - rvec_o.z, " t", tvec.x - tvec_o.x, ",", tvec.y - tvec_o.y, ",", tvec.z - tvec_o.z);
    }

    // If worse than last iteration, halve the step; otherwise descend
    if (lasterror < e) {
        step *= 0.5;
    } else {
        rvec.x -= d_r_x*step;
        rvec.y -= d_r_y*step;
        rvec.z -= d_r_z*step;
        tvec.x -= d_t_x*step;
        tvec.y -= d_t_y*step;
        tvec.z -= d_t_z*step;
        // Keep rotation in range -PI to PI
        while (rvec.x < -Math.PI) rvec.x += 2*Math.PI;
        while (rvec.y < -Math.PI) rvec.y += 2*Math.PI;
        while (rvec.z < -Math.PI) rvec.z += 2*Math.PI;
        while (rvec.x >= Math.PI) rvec.x -= 2*Math.PI;
        while (rvec.y >= Math.PI) rvec.y -= 2*Math.PI;
        while (rvec.z >= Math.PI) rvec.z -= 2*Math.PI;
        step *= 1.0001;
    }
    lasterror = e;
}
console.log("Time: ", new Date().getTime() - before);
console.log("r:", rvec.x, ",", rvec.y, ",", rvec.z, " t", tvec.x, ",", tvec.y, ",", tvec.z);
```
5. Hi, I have an N-point perspective problem that I could use some pointers to solve. I have a calibrated camera in which I observe 4-6 known world points. Each observation is uniquely tied to a world point and has a 2d coordinate in my image. I know my camera properties, so I can convert each observation to a direction in the camera frame. I have been using OpenCV's solvePnP method and it works like a charm. My problem is that I don't have that luxury anymore, as I need to do the computation in the frontend of a browser alongside the ThreeJS engine. I have tried sending points to a server that returns results via Python's OpenCV bindings, but it's too slow and requires a lot of infrastructure. The OpenCV implementation is open source, but it uses a solver and has a complexity that makes porting it line by line infeasible. Instead I started making an attempt (how hard can it be?) at a solution by defining a function that gives me an extrinsic matrix from three Euler angles and a 3d position (6DOF). I also made an error function that, given a proposed matrix, the world points and the directions from the camera, returns an error. I used the sum of dot products between the direction vectors, so large values are good. The idea from here was to numerically differentiate the error function with respect to the 6DOF variables and gradually take tiny steps uphill on the error function. I am sure the solution will be slow, but I don't mind if it takes a while. My problem is that it does not converge at all. Any ideas why? Or pointers to what I could change? I suspect my error function could be better, but I am unsure what to use instead. Kind regards Jesper
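For completeness, the "convert observations to directions in the camera frame" step might look like this under a plain pinhole model; the intrinsics fx, fy, cx, cy below are hypothetical placeholder values, not values from the actual camera:

```python
import numpy as np

def pixel_to_direction(u, v, fx=800.0, fy=800.0, cx=640.0, cy=360.0):
    """Unit viewing direction in the camera frame for pixel (u, v),
    assuming an undistorted pinhole camera with the given intrinsics."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)
```

A pixel at the principal point maps to the optical axis (0, 0, 1); pixels further out tilt the direction proportionally.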
6. Hi, I have two simultaneous videos from two cameras that are looking at a ball moving around on a table. I do tracking on the cameras, so I know where the ball is when it is in the frames. I would like to estimate the ball's position, but to do so I first need the cameras' positions and rotation matrices. I guess I have around 50 frames where the ball is detected by both cameras. Is that possible with only one marker, or do I need to add more markers? Kind regards Jesper
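Once the camera poses are recovered, the per-frame ball position follows by triangulation; a common sketch is the midpoint of the closest points between the two viewing rays (world-space ray origins and unit-length directions assumed):

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest points between rays o1 + t*d1 and o2 + s*d2.
    d1 and d2 are assumed to be unit vectors; returns None if the rays
    are (nearly) parallel."""
    r = o1 - o2
    b = d1 @ d2          # cosine of the angle between the rays
    d_ = d1 @ r
    e = d2 @ r
    denom = 1.0 - b*b
    if denom < 1e-12:
        return None
    t = (b*e - d_) / denom
    s = (e - b*d_) / denom
    p1 = o1 + t*d1       # closest point on ray 1
    p2 = o2 + s*d2       # closest point on ray 2
    return 0.5 * (p1 + p2)
```

With noisy detections the two rays never quite intersect, and the distance between p1 and p2 is a useful per-frame quality measure.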
7. ## Modelling drag on a mesh analytically - Am I using the right approach?

Hi guys, I am trying to model viscous friction like that seen in Karl Sims' work on underwater virtual creatures. I am therefore trying to model viscous drag forces on a mesh (with a rigidbody). I can write an expression for the force at a single point, and I have been using that in a simplified model where I approximate the effect on a triangle by evaluating the expression at the triangle's center and multiplying by its area. I model the drag from a triangle like this:

F_drag = -Dot(normal, point_velocity)*normal*some_constant

I then apply the result on a rigidbody by applying a force at a position on the rigidbody. (I am using Unity3D, by the way.)

This works somewhat, but I would like to do it in a bit more detail, so I have been attempting to integrate the expression over the surface of the individual triangles that make up the mesh. This presents me with two problems.

1) The first problem is that I cannot just apply the result as a force on the rigidbody. Instead I think I need to write the expression as an effect on the velocity and angular velocity of the rigidbody directly. This has proven harder, as it has forced me to look into how a force is transferred into these two values. I am currently working on this problem and I think I can make it work.

2) The second problem is that the expression turns very ugly when I integrate it. The way I integrate is by expressing the force in the local space of the triangle and splitting the triangle into two right triangles, so I can write a double integral for each half. The problem is that the expression contains a square root, as the distance to the point affects the resulting force, and that makes the expression explode.

I am worried that I am approaching this the wrong way, and I am therefore on the lookout for references to different approaches.

Currently I'm splitting the expression up and solving it for the individual components. I wonder if there is a way to write it in vector form and solve it in vector form, so it is not 6 isolated equations. I assume this must have been done before by others more skilled than me, so I am on the lookout for references to similar work.

Kind regards Jesper Taxbøl
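The simplified single-sample-per-triangle model described above can be sketched like this; DRAG_COEFF stands in for "some_constant" and is an assumed value, with the force evaluated from the triangle's normal and scaled by its area, to be applied at the centroid:

```python
import numpy as np

DRAG_COEFF = 1.0  # stands in for "some_constant" above; an assumed value

def triangle_drag(p0, p1, p2, point_velocity):
    """Drag force for one triangle in the simplified model:
    F = -dot(n, v) * n * area * DRAG_COEFF, applied at the centroid.
    Returns (force, centroid)."""
    n = np.cross(p1 - p0, p2 - p0)       # normal, length = 2 * area
    area = 0.5 * np.linalg.norm(n)
    centroid = (p0 + p1 + p2) / 3.0
    if area < 1e-12:                      # degenerate triangle
        return np.zeros(3), centroid
    n = n / (2.0 * area)                  # unit normal
    force = -np.dot(n, point_velocity) * n * area * DRAG_COEFF
    return force, centroid
```

In Unity terms, each (force, centroid) pair would go into something like `rigidbody.AddForceAtPosition`, with point_velocity taken from the rigidbody's velocity at the centroid.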
8. ## Finding point x units further down nonlinear function

9. ## Finding point x units further down nonlinear function

Actually, I ended up using the sphere sin/cos/normalization, as the arc-length computation exploded in complexity when I tried computing it. But the sphere solution is more correct, and I guess modern GPUs can handle it. I am aware that the deformation is cheating, but that was part of trying it out. Do you think a player will notice? Thanks again for your input. Tax
10. ## Finding point x units further down nonlinear function

Yes. That was the idea I wanted to try out.
11. ## Finding point x units further down nonlinear function

And a link to an example. A flat world using the vertex shader to bend the world under the camera. http://jesper.taxboel.dk/cv/showoff/Curvature/WebPlayer.html Tax
12. ## Finding point x units further down nonlinear function

That was just a crude approximation of a sphere. I wanted to avoid the sphere because it forced me to do extra calculations involving normalization, sin and cos. I have been looking at the arc length now and that expression also gets quite complicated, so my current implementation uses the sphere. It's written in ShaderLab for Unity3D:

```hlsl
Shader "Custom/Curvature" {
    Properties {
        _MainTex ("Base (RGB)", 2D) = "white" {}
        _Distort ("Distort", Vector) = (0,0,0,0)
        _Orig ("Orig", Float) = 0
        _MainColor ("Color (RGBC)", Color) = (1,1,1,0)
    }
    SubShader {
        pass {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            float4 _Distort;
            float _Orig;
            float4 _MainTex_ST;
            float4 _MainColor;
            sampler2D _MainTex;

            struct v2f {
                float4 pos : SV_POSITION;
                float3 normal : TEXCOORD0;
                fixed4 color : COLOR;
                float2 uv : TEXCOORD1;
            };

            v2f vert (appdata_base v) {
                v2f o;
                float4 p2 = mul(_Object2World, v.vertex);
                float3 p = p2.xyz;

                // Define curve origin below the camera
                float3 origin = _WorldSpaceCameraPos;
                origin = float3(origin.x, -100.0, origin.z);

                // Origin-to-vertex vector
                float3 diff = p - origin;

                // Distance in the horizontal plane
                float dist = length(diff.xz);
                float3 up = float3(0.0, 1.0, 0.0);
                float3 dir = normalize(float3(diff.x, 0, diff.z));
                float r = diff.y;
                float Pi = acos(-1.0);
                float limit = Pi*r/2.0;
                if (dist <= limit) {
                    // Wrap the vertex onto a circle of radius r
                    float ang = dist/r;
                    p = origin + dir*r*sin(ang) + up*r*cos(ang);
                } else {
                    // Past a quarter turn, continue straight down
                    p = origin + dir*r - up*(dist - limit);
                }

                o.normal = v.normal;
                v.vertex.xyz = p;
                v.vertex = mul(_World2Object, v.vertex);
                o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                o.uv = TRANSFORM_TEX(v.texcoord, _MainTex);
                return o;
            }

            half4 frag (v2f i) : COLOR {
                half4 texcol = tex2D(_MainTex, i.uv);
                //texcol = half4(i.normal.x, i.normal.y, i.normal.z, 2)*0.5 + half4(0.5,0.5,0.5,0);
                return texcol;
            }
            ENDCG
        }
    }
}
```
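For sanity-checking the bend outside the shader, the vertex displacement can be transcribed into a small Python function (my transcription, not part of the shader; same variables: r is the vertex height above the curve origin, dist the horizontal distance, and it returns the radial and vertical components of the displaced position relative to the origin):

```python
import math

def bend(dist, r):
    """Sphere bend from the vertex shader: wrap horizontal distance `dist`
    onto a circle of radius r (assumed r > 0); past a quarter turn the
    point continues straight down.  Returns (radial, vertical) offsets
    from the curve origin."""
    limit = math.pi * r / 2.0
    if dist <= limit:
        ang = dist / r
        return (r * math.sin(ang), r * math.cos(ang))
    return (r, -(dist - limit))
```

At dist = 0 the vertex stays directly above the origin at height r (i.e. unmoved), which is the property that keeps the world undistorted under the camera.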
13. ## Finding point x units further down nonlinear function

I think the arc length is what I am looking for. Backstory: it all started when I found out I could manipulate the whole game world using a vertex shader. That led me to think: if I made a flat game world where characters teleport to the opposite side when leaving the area, I could have a seemingly infinite game world that is much easier to build and maintain than a sphere-based one. Using a shader to give the planet curvature, I would get the best of both worlds. Thanks for all the feedback, I will try to link some images when I get further. Tax
14. ## Finding point x units further down nonlinear function

Yes, that is true, but that is not what I am looking for. It turns out to be harder to explain. The case is that I want the point on the function that is 5 units from f(0) following the curvature of the line. Imagine drawing the function on the floor and rolling a wheel along it: when the wheel has rolled D units, that is the point I want. I am sorry for having a hard time explaining my problem. Tax
15. ## Finding point x units further down nonlinear function

Hi guys, first of all thank you for your time. I am afraid I have not been clear enough in stating my problem, so I will briefly describe my application. I am working on a vertex shader that bends a flat game world so it appears round, like a small planet. I do this by picking an origin under the observing camera and offsetting all vertices in the world downwards. The offset is chosen by measuring the distance between the vertex and the origin in the horizontal plane and inserting that into my function. The problem is that the offset is based on the X axis, not the actual distance travelled along the line; it is the point based on the distance travelled that I am trying to find. I am not sure the extra math would be noticeable in the shader, but I just got curious about how to solve the problem. I hope this makes things a bit clearer, and thanks again. Tax
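The "distance travelled along the curve" point can at least be found numerically, which is useful for checking any closed-form attempt: integrate the arc length sqrt(1 + f'(x)²) from 0, then bisect for the x where it reaches D. A Python sketch (the example curve at the bottom is an assumed stand-in, not the shader's actual function):

```python
import math

def arc_length(f, x_end, steps=1000):
    """Numerically integrate sqrt(1 + f'(x)^2) from 0 to x_end,
    using midpoint sampling and central-difference slopes."""
    total, h = 0.0, x_end / steps
    for i in range(steps):
        x = (i + 0.5) * h
        slope = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6
        total += math.sqrt(1.0 + slope * slope) * h
    return total

def x_at_arc_length(f, D, x_hi=1000.0):
    """Bisect for the x whose arc length from 0 equals D
    (assumes D is reached somewhere in [0, x_hi])."""
    lo, hi = 0.0, x_hi
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if arc_length(f, mid) < D:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example with an assumed curve f(x) = -0.01*x^2:
x = x_at_arc_length(lambda x: -0.01 * x * x, 5.0)
```

This is far too heavy to run per-vertex in a shader, but it gives reference values to validate a cheaper approximation against.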