About willvarfar

  1. I am trying to compute where a sphere travelling along a line segment will hit a triangle, in Javascript. I have had no luck finding ready-to-use code, converting other people's code, or even understanding the problem. I've got as far as trying to make it from first principles. Here's what I've got:

```javascript
function triangle_sphere_sweep(a, b, c, start, stop, radius, twoSided) { // clockwise winding, I think
    /* RETURNS a tuple of the fraction of the line (start->stop) that the
       sphere moves, and the normal at the point of impact with the triangle,
       or null if no hit */
    // get the normal of the triangle
    var u = vec3_sub(b, a),
        v = vec3_sub(c, a),
        n = vec3_cross(u, v);
    if (n[0] == 0 && n[1] == 0 && n[2] == 0) // triangle is degenerate
        return null;
    var dir = vec3_sub(stop, start),
        j = vec3_dot(n, dir);
    var colinear = float_zero(j); // parallel, disjoint or on plane
    // which side of triangle is the line?
    var start_height = vec3_dot(n, vec3_sub(start, a));
    if (start_height < 0) // line is beneath triangle
        return null;
    var stop_height = vec3_dot(n, vec3_sub(stop, a));
    if (stop_height > start_height) // going away from triangle
        return null;
    // is parallel?
    colinear = colinear || (float_equ(start_height, stop_height) && float_equ(start_height, radius));
    if (!colinear && (start_height > radius)) { // far enough above to hit the triangle itself?
        // get intersect point of ray with triangle plane that is radius above it
        var normal = vec3_normalise(n),
            start_plane = vec3_add(a, vec3_scale(normal, radius)),
            i = -vec3_dot(n, vec3_sub(start, start_plane)),
            k = i / j;
        if (k < 0) // line goes away from triangle
            return null;
        if (k > 1) // too far along the line
            return null;
        // do we hit the triangle radius above?
        var hit = vec3_add(start, vec3_scale(dir, k)), // intersect point of ray and plane at radius above it
            uu = vec3_dot(u, u),
            uv = vec3_dot(u, v),
            vv = vec3_dot(v, v),
            w = vec3_sub(hit, start_plane),
            wu = vec3_dot(w, u),
            wv = vec3_dot(w, v),
            D = uv * uv - uu * vv,
            s = (uv * wv - vv * wu) / D;
        if (s >= 0 && s <= 1) {
            var t = (uv * wu - uu * wv) / D;
            if (t >= 0 && (s + t) <= 1) // hit in triangle
                return [k, normal];
        }
    }
    // we are on the right side of the triangle, but we're not hitting exactly; maybe we hit an edge then?
    var best_pt = null, best_ofs,
        line = [start, stop], line_scale,
        radius2 = radius * radius,
        sides = [[a, b], [b, c], [c, a]];
    for (var side in sides) {
        side = sides[side];
        var nearest_ofs = line_line_closest_point_ofs(line, side), // [ofs_on_line, ofs_on_side]
            nearest_pt = [vec3_lerp(start, stop, nearest_ofs[0]),
                          vec3_lerp(side[0], side[1], nearest_ofs[1])],
            nearest_dist2 = vec3_distance_sqrd(nearest_pt[0], nearest_pt[1]);
        if (nearest_dist2 <= radius2) {
            // we want to move our hit centre-point so it's radius distance from the intersection point on the triangle edge
            line_scale = line_scale || 1 / vec3_length(vec3_sub(stop, start)); // lazy compute
            var ofs = nearest_ofs[0] - (Math.sqrt(radius2 - nearest_dist2) * line_scale);
            // is this the first, or the best, hit?
            if ((ofs > 0) && (best_pt == null || ofs < best_ofs)) {
                best_pt = nearest_pt[1]; // just keeping this for test
                best_ofs = ofs;
            }
        }
    }
    if (best_pt != null) { // hit edge
        // let's do a little test; this is often FAILING
        var hit = vec3_lerp(start, stop, best_ofs),
            hit_dist2 = vec3_distance_sqrd(hit, best_pt);
        assert(float_equ(hit_dist2, radius2),
               "wrong distance! " + hit_dist2 + " != " + radius2 + " (" + (hit_dist2 - radius2) + ")");
        return [best_ofs, vec3_normalise(vec3_sub(hit, best_pt))];
    }
    // no hit
    return null;
}

function line_line_closest_point_ofs(line1, line2) {
    var v1 = vec3_sub(line1[1], line1[0]), // should be normalised?
        v2 = vec3_sub(line2[1], line2[0]), // should be normalised?
        v1dotv2 = vec3_dot(v1, v2),
        denom = v1dotv2 * v1dotv2 - 1;
    if (float_zero(denom)) {
        assert(false, "lines are parallel"); // todo: how far
    } else {
        var c = 1 / denom,
            dot1 = vec3_dot(vec3_sub(line2[0], line1[0]), v1),
            dot2 = vec3_dot(vec3_sub(line2[0], line1[0]), v2),
            t1 = c * (-1 * dot1 + v1dotv2 * dot2),
            t2 = c * (-v1dotv2 * dot1 + 1 * dot2);
        t1 = Math.min(Math.max(0, t1), 1); // if v1 is normalised, what
        // would I do to map it to the real magnitude of line1?
        t2 = Math.min(Math.max(0, t2), 1);
    }
    return [t1, t2];
}
```

Now, when intersecting with the edge of a triangle, it often puts the sphere centre too close to or too far from the edge. Here's a screenie: the blue sphere is the start of the sphere's path (the green line segment). The green sphere (drawn inverted so you can see in the middle) is where the intersection is computed; but it's clearly too close to the triangle. The triangle has normal arrows drawn from each corner just so I know which side is up on it.
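The comments in line_line_closest_point_ofs ask whether v1 and v2 should be normalised; they would need to be, since the denominator v1dotv2 * v1dotv2 - 1 only holds for unit-length directions, and that mismatch would produce exactly this too-close/too-far symptom. For comparison, here is a sketch of a segment-segment closest-point routine that works with unnormalised directions and clamps both offsets to the segments. The helper names here are hypothetical and self-contained, not taken from the code above:

```javascript
// Minimal vector helpers for the sketch (hypothetical, self-contained).
function dot3(a, b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }
function sub3(a, b) { return [a[0]-b[0], a[1]-b[1], a[2]-b[2]]; }
function clamp01(x) { return Math.min(Math.max(0, x), 1); }

// Returns [t1, t2]: the parametric offsets (0..1) along line1 and line2
// (each an array [startPoint, endPoint]) of the mutually closest points.
// Standard clamped-quadratic formulation; no unit-vector assumption.
function closestPointOffsets(line1, line2) {
    var d1 = sub3(line1[1], line1[0]),
        d2 = sub3(line2[1], line2[0]),
        r  = sub3(line1[0], line2[0]),
        a = dot3(d1, d1),   // squared length of segment 1
        e = dot3(d2, d2),   // squared length of segment 2
        f = dot3(d2, r),
        c = dot3(d1, r),
        b = dot3(d1, d2),
        denom = a * e - b * b; // always >= 0; zero when segments are parallel
    var t1 = denom !== 0 ? clamp01((b * f - c * e) / denom) : 0;
    var t2 = (b * t1 + f) / e;
    // if t2 fell outside the segment, clamp it and recompute t1 against it
    if (t2 < 0)      { t2 = 0; t1 = clamp01(-c / a); }
    else if (t2 > 1) { t2 = 1; t1 = clamp01((b - c) / a); }
    return [t1, t2];
}
```

Because the directions keep their true magnitudes, the returned offsets are already in terms of each segment's real length, so no separate line_scale remapping is needed for the offsets themselves.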
  2. I am doing exactly this with my 'barebones' skeleton for my Ludum Dare entries; that'll be on github this weekend. I am using only the subset of OpenGL that is OpenGLES-compatible, i.e. all shaders and attribute arrays. My utility code to compile a shader, for example, sometimes puts a "precision lowp float;" on the front before compiling; that kind of platform adaptation. The game code itself is 100% portable. Because I'm keen to compile for Google Native Client too, I'm having to abstract file IO and make it all async; that's the biggest deal.
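As a sketch of the shader-source adaptation described above (the function name and the already-declared guard are assumptions for illustration, not the actual utility code):

```javascript
// Hypothetical sketch: adapt a desktop-GL fragment shader source string for
// GLES by prepending a default float precision, as the post describes.
function adaptFragmentSource(src, isGLES) {
    if (!isGLES) return src;
    // GLES 2.0 fragment shaders require a default precision for float;
    // skip the prefix if the source already declares one.
    if (/precision\s+(lowp|mediump|highp)\s+float/.test(src)) return src;
    return "precision lowp float;\n" + src;
}
```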
  3. Fwiw, I was not calling glViewport() in my NaCL-specific resize handler.
  4. Your cloud's bounds and centre (the cheapest kind of average, and likely good enough for your purposes) can all be computed in the cloud's relative coordinates, and then that single point can be projected.

    > currently I'm rendering a point cloud by calling glVertex3f(p.x, p.y, p.z); on each point p inside a glBegin(GL_POINTS) block

    is likely your next target for optimisation ;)
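A minimal sketch of that suggestion, computing the centre and bounds once in cloud-local coordinates (the function name is hypothetical; points are plain [x, y, z] arrays):

```javascript
// Accumulate the axis-aligned bounds and the mean position of a point cloud
// in its own coordinate space, so only the resulting points need projecting.
function cloudCentreAndBounds(points) {
    var min = [Infinity, Infinity, Infinity];
    var max = [-Infinity, -Infinity, -Infinity];
    var sum = [0, 0, 0];
    for (var p of points) {
        for (var i = 0; i < 3; i++) {
            min[i] = Math.min(min[i], p[i]);
            max[i] = Math.max(max[i], p[i]);
            sum[i] += p[i];
        }
    }
    var centre = sum.map(function (s) { return s / points.length; });
    return { centre: centre, min: min, max: max };
}
```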
  5. I'm appealing to people's psychic debugging skills: I have an OpenGL app that uses shaders and attributes in uploaded VBOs exclusively. It runs fine on the desktop (Linux with an integrated Intel card). It compiles for and runs on Chrome NaCL, which is OpenGLES 2.0. It does not draw anything, however (Vista, an ATI card). The glClearColorf() and glClear() are working, and by picking a random clear colour I can get a nice flicker going, so I know I've got the whole draw and flush cycle going fine in Chrome NaCL. However, as I said, my program doesn't draw any of its content. I have liberal glGetError() checks everywhere, and at first when porting they found lots of stuff, such as needing to specify a float precision in shaders and not using full ints for element arrays. But now there are no errors reported anywhere. I have printf debugging, so I know my code-path is right and that everything I can think of is asserted. The NaCL "tumbler" demo does run; apart from their codebase being massively smaller, I can't spot what they are initialising that I'm not. Anyone got any ideas what classic thing I've run into?
  6. Vertex seams

    Is that line on a seam between two triangles/primitives that are not sharing vertices? Is the texture mipmapped? Are you using GL_CLAMP or GL_CLAMP_TO_EDGE?
  7. Isn't this easier to do CPU side on the bitmap data itself?
  8. Tricky question on shader

    Try GL_CLAMP_TO_EDGE instead of GL_CLAMP on the texture.
  9. Blending woes

    You can do this by drawing the main scene, then blending the ambient light and spotlight bitmaps over it, finally perhaps with a post step to saturate the middle of the light. The alternative, better approach would be to just use GL lighting.
  10. spherical billboards

    My issue with the normals was that when I uploaded my normal map I made two elementary mistakes: I had swapped the y and z, and I had not moved from -1:1 to 0:1 (I just had to +1, /2 to get them into the right range). My normal mapping is now working.

    This leaves me struggling to set the z-depth correctly. I do not yet think I have a problem with z-fighting, in that I'm not yet worried that my near and far are too far apart; but I do have them at the wrong z. My current math seems wrong; I have popping. It's as though they are on the wrong plane, or all on the same plane or something; I think they might be very near z_near, since they draw in front of my terrain even when they should be behind it.

```glsl
float depth = z - (texel.z - 0.5) * radius;
gl_FragDepth = (z_near * z_far) / (z_far - depth * (z_far - z_near));
```

    (where z is the incoming vertex shader's gl_Position.z, and not gl_FragCoord.z) What is the proper way to set the true frag z?
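The range fix described above, as a sketch (function names are hypothetical): pack each component of a unit normal into the 0..1 range a texture stores, and unpack it back to -1..1 where it is consumed:

```javascript
// -1..1 -> 0..1, the "+1, /2" fix from the post, applied per component.
function encodeNormal(n) {
    return n.map(function (c) { return (c + 1) / 2; });
}

// 0..1 -> -1..1, the inverse mapping done when sampling the normal map.
function decodeNormal(texel) {
    return texel.map(function (c) { return c * 2 - 1; });
}
```

Any axis swap (the y/z mistake) would be applied on top of this, before encoding.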
  11. spherical billboards

    I am encoding the radius in the texture coords rather than computing the vertex positions (as rotated to face the camera) in the calling code. I think this is a fairly standard approach for billboards? (At least, I adapted it from the billboard code I absorbed on the 'net.)

    > I'd recommend outputting the normal map as the colour from the fragment shader until that looks correct, before trying to do anything with lighting / z / alpha.

    Yes, if I paint the normal map as colour without lighting it looks great. So I have a normal map that has convincing (but I am easily convinced) values in it, and a single light.
  12. spherical billboards

    Thank you, that's very encouraging and a great looking demo! The GL_POINT_SPRITE hint will be well worth investigating when I've got the depth and normals right. I'll hold fire on moving to GL_POINT_SPRITE(S) because my quads are at this point working.

    So now I'm focused on my normals map. I have created a texture where the normals are in the RGB and the A holds a coverage; if the (interpolated) A is 0, then the fragment is discarded; if it is 1, then it should be shaded solid and the z set. If it's in-between, it's on the edge: the z should be the same as the billboard's and the pixel should be blended by A. The code to generate it is:

```cpp
const int R = 31, SZ = power_of_two(R*2);
std::vector<vec4_t> p;
for(int y=0; y<SZ; y++) {
    for(int x=0; x<SZ; x++) {
        const float rx = (float)(x-R)/R,
            ry = (float)(y-R)/R,
            d = rx*rx+ry*ry;
        if(d >= 1) { // outside sphere
            p.push_back(vec4_t(1,0,0,0));
        } else {
            vec3_t normal(rx,sqrt(1.-d),ry);
            p.push_back(vec4_t(normal,1));
        }
    }
}
```

    My focus is trying to get the normals and the right z set; what I have written is, first off, not working:

```glsl
[[vertex shader]]
varying vec3 light_dir;
varying float radius;
varying float z;
void main() {
    vec4 wpos = vec4(, 1.0);
    vec4 epos = gl_ModelViewMatrix * wpos;
    z = epos.z;
    light_dir = gl_LightSource[0];
    epos.xy += gl_MultiTexCoord0.xy;
    gl_Position = gl_ProjectionMatrix * epos;
    radius = abs(gl_MultiTexCoord0.x);
    vec2 tex = gl_MultiTexCoord0.xy / radius;
    tex *= 0.5;
    tex += 0.5;
    gl_TexCoord[0].xy = tex;
};

[[fragment shader]]
varying vec3 light_dir;
varying float radius;
varying float z;
uniform sampler2D tex;
uniform float z_near;
uniform float z_far;
void main() {
    vec4 texel = texture2D(tex,gl_TexCoord[0].xy);
    if(0.0 == texel.a)
        discard;
    vec3 colour = vec3(1.0,1.0,0.5);
    float intensity = max(dot(-normalize(light_dir),texel.rgb),0.0);
    gl_FragColor = vec4(colour * intensity,texel.a);
    float depth = z - texel.z * radius;
    gl_FragDepth = (z_near * z_far) / (z_far - depth * (z_far - z_near));
};
```

    I'm not convinced my normals are coloured right. The depth is wrong; they are generally drawn in roughly the right place, but there is popping between intersecting spheres when the camera moves. I've found several competing definitions of gl_FragDepth.
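For comparison with the gl_FragDepth expressions above: under a standard OpenGL perspective projection (glFrustum/gluPerspective) with the default depth range [0, 1], window-space depth relates to eye-space z as below. This is a sketch for checking values on the CPU, assuming eye-space z is negative in front of the camera; it is not taken from the shaders above:

```javascript
// Window-space depth for an eye-space z under a standard perspective
// projection: zEye = -near maps to depth 0, zEye = -far maps to depth 1.
function eyeZToWindowDepth(zEye, near, far) {
    var ndcZ = (far + near) / (far - near)
             + (2 * far * near) / ((far - near) * zEye);
    return 0.5 * ndcZ + 0.5; // NDC -1..1 -> window 0..1
}
```

The sphere's displaced eye-space z would be fed through this mapping before being written to gl_FragDepth.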
  13. I want to draw *lots* of balls, so I've billboarded them, with the rotating to face the camera done in a vertex shader:

```glsl
void main() {
    vec4 wpos = vec4(, 1.0);
    vec4 epos = gl_ModelViewMatrix * wpos;
    epos.xy += gl_MultiTexCoord0.xy;
    gl_Position = gl_ProjectionMatrix * epos;
    float radius = abs(gl_MultiTexCoord0.x);
    vec2 tex = -gl_MultiTexCoord0.xy / radius;
    tex *= 0.5;
    tex += 0.5;
    gl_TexCoord[0].xy = tex;
}
```

    For each ball, I have the 'center' in the vertex attribute (that's duplicated; each vertex gets a copy) and the radius and corner in the texture coordinate:

```cpp
for(boards_t::const_iterator i=boards.begin(); i!=boards.end(); ++i) {
    vbo.push_back(tex_vec_t(vec2_t(-i->radius, i->radius),i->center));
    vbo.push_back(tex_vec_t(vec2_t(-i->radius,-i->radius),i->center));
    vbo.push_back(tex_vec_t(vec2_t( i->radius,-i->radius),i->center));
    vbo.push_back(tex_vec_t(vec2_t( i->radius, i->radius),i->center));
}
```

    The vbo is then an interleaved array with GL_T2F_V3F. Is there a more speed-efficient, or space-efficient-yet-not-massively-slower, approach? My vertex shader knows the center and radius of the sphere, so presumably I can pass that to the fragment shader as a varying, since interpolation won't alter the value when all four vertices in each ball's quad have the same values. Can I somehow use this to produce proper spherical depth and normals in the fragment shader? Should I have separate normal and depth textures, or is it cheaper to compute in the shader? And what would the math be?
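The interleaved GL_T2F_V3F layout described above, sketched as a flat array build (the function name and the {radius, center} shape of boards are assumptions for illustration):

```javascript
// Build a flat interleaved array (u, v, x, y, z per vertex), four corners
// per billboard, matching the GL_T2F_V3F layout: signed radius as the
// texture coordinate, duplicated centre as the position.
function buildBillboardVBO(boards) {
    var data = [];
    for (var b of boards) {
        var corners = [
            [-b.radius,  b.radius],
            [-b.radius, -b.radius],
            [ b.radius, -b.radius],
            [ b.radius,  b.radius],
        ];
        for (var corner of corners) {
            data.push(corner[0], corner[1],
                      b.center[0], b.center[1], b.center[2]);
        }
    }
    return new Float32Array(data);
}
```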
  14. MD2 vertex array problem

    Put the vertex values for each frame of animation into a separate VBO. Once it's working, you can play with interpolating between the two adjacent sets of vertices for each tween frame of animation, using the mix() function.
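GLSL's mix(a, b, t) is a plain per-component lerp; as a CPU-side sketch of the suggested tweening (function name hypothetical; frameA and frameB are flat vertex arrays of equal length, t in [0, 1]):

```javascript
// Linearly interpolate between two keyframes' flat vertex arrays,
// exactly what mix(frameA, frameB, t) does per component in the shader.
function mixFrames(frameA, frameB, t) {
    var out = new Array(frameA.length);
    for (var i = 0; i < frameA.length; i++) {
        out[i] = frameA[i] + (frameB[i] - frameA[i]) * t;
    }
    return out;
}
```

In the shader version, the two frames would be bound as two position attributes and t passed as a uniform, so no CPU work is done per vertex.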
  15. Performance data on OpenGL ES

    The biggest problem I foresee is that OpenGL ES 2.0 often runs on chips that have deferred rendering strategies (e.g. PowerVR, used on the iPhone, iirc). So how fast you can dispatch ops to them is not the same as how fast they can render them.