
lucky6969b

Member
  • Content count

    1653
  • Joined

  • Last visited

Community Reputation

1332 Excellent

5 Followers

About lucky6969b

  • Rank
    Contributor

Personal Information

  • Interests
    Art
  1. If you have 40x40 cells in the map and 10x10 cells per cluster, then there are 4 clusters per side and 4x4 = 16 clusters in total. If one cell is 10 meters, the maximum extent is 400x400 meters, so the bigger the cell size, the bigger the extents; it also depends on the maximum number of cells allowed. If the size of North America is 24.71 million km² (2.471 × 10^13 m²), how would you choose a good cell size for it, given that the number of cells is probably limited by the computation time complexity? Thanks, Jack
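To put numbers on that question: given a total area and a budget for how many cells the search can afford, the minimum cell size falls out directly as sqrt(area / maxCells). A minimal sketch (the area figure comes from the post; the cell budget is an assumption for illustration):

```cpp
#include <cmath>

// Given a total map area (m^2) and a cap on how many cells the search can
// afford, the smallest usable square cell side is sqrt(area / maxCells):
// any smaller cell would make the grid exceed the cell budget.
double minCellSide(double areaM2, long long maxCells)
{
    return std::sqrt(areaM2 / static_cast<double>(maxCells));
}
```

For the post's numbers: a 400 m × 400 m map with a 1600-cell budget gives exactly the 10 m cells described above, while North America's 2.471 × 10^13 m² with an (assumed) budget of 10^8 cells forces cells of roughly 500 m per side, which is why continent-scale maps push you toward hierarchical clustering.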
  2. Now the question is how to accurately plot out the portals. Whether I define a cluster as one "tile" (as Mikko described) or as one "extent" per cluster, either way I have to define the portals. Some people suggested making use of the contours generated by Recast, or of the tiles themselves, while I am interested in generating portals as simple AABBs. I thought I would have to scan along the clusters' shared border, check both sides for obstacles, find the longest unobstructed line along the border, and push that into an array... I don't know if that works. Thanks, Jack
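That border-scan idea can be sketched on a grid: walking along the shared edge of two clusters, a cell is a portal candidate only if it is walkable on both sides, and each maximal run of such cells becomes one portal (its midpoint is a common choice for the entrance node). A minimal sketch on boolean walkability arrays (all names are illustrative, not from any particular library):

```cpp
#include <vector>

// One portal = a maximal run [begin, end) of border cells that are
// walkable on both sides of the cluster boundary.
struct Portal { int begin; int end; };

// Scan along a cluster border. sideA[i] / sideB[i] say whether the i-th
// cell on each side of the boundary is walkable.
std::vector<Portal> findPortals(const std::vector<bool>& sideA,
                                const std::vector<bool>& sideB)
{
    std::vector<Portal> portals;
    const int n = static_cast<int>(sideA.size());
    int runStart = -1;                          // -1 means "not inside a run"
    for (int i = 0; i <= n; ++i) {
        const bool open = i < n && sideA[i] && sideB[i];
        if (open && runStart < 0)
            runStart = i;                       // a run of open cells begins
        else if (!open && runStart >= 0) {
            portals.push_back({ runStart, i }); // run ended: emit one portal
            runStart = -1;
        }
    }
    return portals;
}
```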
  3. That's what is meant by storing "every" portal ever visited in the open list: we can come back to any of them when we need to.
  4. Oh, maybe I've forgotten some properties of A* already (because this is a little bit different from normal A*). Do you mean that if the A* search finds that D is a dead end, the next iteration will choose portal #1 as an alternative path (where it starts searching again)? Oh, my silliness... Thanks, Jack
  5. Let's say, on the abstract level, the path A->B->C->D->E is valid, but the agent must choose portal #1 to reach E. Presumably the agent has chosen portal #2, gone through B, C and D, and finally finds itself stuck at D, unable to move over to E... The whole computation is wasted. How do I avoid this problem? Thanks, Jack
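The point being made in #3-#5 is that A* never commits to portal #2: every expanded node stays reachable through the open list, so when the branch through D turns out to be a dead end the search simply resumes from the cheapest remaining open node (e.g. portal #1), and no special recovery step is needed. A minimal uniform-cost sketch on an abstract portal graph (node labels and costs are made up for illustration):

```cpp
#include <queue>
#include <vector>
#include <limits>
#include <utility>

// Edge in the abstract portal graph: destination node and traversal cost.
struct Edge { int to; double cost; };

// Uniform-cost search (A* with a zero heuristic). Branches that reach a
// dead end are simply abandoned; the search continues from the cheapest
// node still on the open list, so a "wrong portal" costs extra expansions
// but never wrecks the result.
double shortestCost(const std::vector<std::vector<Edge>>& graph,
                    int start, int goal)
{
    const double INF = std::numeric_limits<double>::infinity();
    std::vector<double> best(graph.size(), INF);
    using Item = std::pair<double, int>;      // (cost so far, node)
    std::priority_queue<Item, std::vector<Item>, std::greater<Item>> open;
    best[start] = 0.0;
    open.push({ 0.0, start });
    while (!open.empty()) {
        auto [cost, node] = open.top();
        open.pop();
        if (node == goal) return cost;
        if (cost > best[node]) continue;      // stale open-list entry
        for (const Edge& e : graph[node])
            if (cost + e.cost < best[e.to]) {
                best[e.to] = cost + e.cost;
                open.push({ best[e.to], e.to });
            }
    }
    return INF;                               // goal unreachable
}
```

In the post's terms: node 0 is the start, the cheap-looking branch 0->1->2->3 dead-ends at 3, and the search still finds the direct edge 0->4 because node 4 was pushed onto the open list when node 0 was expanded.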
  6. I call the shatter function and it now has a series of chunks stored, which I can hand over to the main physics system. But do I hide the main object and reconstruct the fragment pieces as brand-new game objects of some sort? Thanks, Jack
  7. The technique for generating infinite ground is simple: I calculate the view distance from the camera to whatever location it is looking at, then scale the ground geometry up or down accordingly. It's working quite well. But if I had calculated the navigation data beforehand, will it be affected in any way? Currently I assume there is no problem, because once the navigation data is calculated it is decoupled from the geometry. But when I raise the camera and put a new "tile" on an area, there initially was no navigation data there; and because the ground is rescaled every frame, the navigation-mesh subsystem is probably grabbing the geometry data while the camera is still turning, which is quite dangerous! Secondly, the textures are all screwed up/stretched... those are the only problems. Thanks, Jack
  8. No, the view matrix is retrieved directly from the camera and passed right down. Oh, do I need to inverse-transpose it? Thanks, Jack
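Yes: whenever the matrix contains non-uniform scale (or shear), normals must be transformed by the inverse transpose of its upper-left 3x3, otherwise they stop being perpendicular to the surface. A minimal sketch for the easy diagonal (pure-scale) case, where the inverse transpose is just the reciprocal of each scale factor (the struct and function names are illustrative):

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// For a pure scale matrix diag(sx, sy, sz), the inverse transpose is
// diag(1/sx, 1/sy, 1/sz): positions are multiplied by the scale, but
// normals must be multiplied by its reciprocal and renormalized to stay
// perpendicular to the scaled surface. (For a general matrix you would
// invert and transpose the full upper-left 3x3, e.g. via
// D3DXMatrixInverse followed by D3DXMatrixTranspose.)
Vec3 transformNormalByScale(Vec3 n, float sx, float sy, float sz)
{
    Vec3 t = { n.x / sx, n.y / sy, n.z / sz };
    const float len = std::sqrt(t.x * t.x + t.y * t.y + t.z * t.z);
    return { t.x / len, t.y / len, t.z / len };
}
```

Sanity check: the plane x + y = 0 has normal (1,1,0)/√2; after scaling positions by (2,1,1) the plane becomes x/2 + y = 0, whose normal is proportional to (0.5,1,0), which is exactly what the reciprocal scale produces.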
  9. The sky texture is a polar-distorted map with planar mapping. When I multiply the UV by a constant or a time factor, the texture just goes crazy; it spins so fast you'd get nausea from it. How do I make it spin more decently? Thanks, Jack

    float4 SkyDomePS(float2 tex0 : TEXCOORD0) : COLOR
    {
        float2 rotator = float2(0.0, 0.0);
        float2 panner0 = float2(0.0, 0.0);
        float2 panner = float2(0.0, 0.0);
        float2 star = float2(0.0, 0.0);
        float4 horizonColor = float4(0.0, 0.0, 0.0, 0.0);
        float t = 0.0;
        //rotator = skyRotation * localTime * tex0;
        rotator = tex0 + localTime;   // this pans the texture, it does not rotate it
        //rotator.y = rotator.y + skyRotation;
        float4 sample0 = tex2D(skyTexSampler, rotator);
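Adding (or multiplying in) the time value pans the UVs off the texture every frame, which is why it "goes crazy". To spin a polar-mapped sky, rotate each UV around the texture center (0.5, 0.5) by a small angle such as rotationSpeed * time. Here is a minimal C++ sketch of the math the pixel shader needs (in HLSL you would do the same with sin/cos of skyRotation * localTime, reusing the names from the post):

```cpp
#include <cmath>
#include <utility>

// Rotate a UV coordinate around the texture center (0.5, 0.5) by `angle`
// radians. Keeping `angle` small and proportional to elapsed time gives a
// slow, steady spin instead of a runaway pan.
std::pair<float, float> rotateUV(float u, float v, float angle)
{
    const float cu = u - 0.5f, cv = v - 0.5f;      // move origin to center
    const float s = std::sin(angle), c = std::cos(angle);
    return { cu * c - cv * s + 0.5f,               // standard 2D rotation
             cu * s + cv * c + 0.5f };
}
```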
  10. I want it to stay in one place, say at the position given by the world-matrix transformation only. But I don't think it will work if I simply drop the view and projection matrices. Right now the thing is moving with the camera.

    outVS.position = mul(mul(float4(position.xyz, 1.0), modelViewMatrix), projectionMatrix);

    Thanks, Jack
  11. Having a divine spike, when:

    for (int i = 0; i < numVerts; i++)
    {
        D3DXVECTOR3 v = pVerts[i].pos - vCent;
        D3DXVec3Normalize(&v, &v);
        pVerts[i].norm = v;
        pVerts[i].tu = 0.5f + (atan2(v.z, v.x) / (2 * D3DX_PI));
        pVerts[i].tv = 0.5f - (asin(v.y) / D3DX_PI);
    }
  12. After generating a spherically distorted (or just polar-distorted) map in Photoshop and creating a planar-mapped mesh in Max, when I match the parts up I get a cracking sky right at the top. I assume the texture is good enough (you can find the tutorial on the web), so the problem probably lies in the u,v of the mesh. Since we are not artists, I don't want to always bake the u,v out of the DCC; I wish I could write two functions that create standard planar-mapping u,v and spherical-mapping u,v on the fly, built right into my application. More often than not, the standard ones are good enough. But how do I find the math? This picture exhibits the cracking situation:
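The spherical-mapping math is already in the loop from item 11 (longitude from atan2, latitude from asin, each normalized into [0, 1]); note that meshes with triangles straddling the atan2 seam or the poles need duplicated vertices there, which is one common cause of a "crack" at the top. Planar mapping is even simpler: project onto a plane and normalize against the bounding box. A minimal sketch, assuming Y is the up axis (the struct and function names are illustrative):

```cpp
#include <vector>
#include <algorithm>

struct Vec3 { float x, y, z; };
struct UV   { float u, v; };

// Planar-map a mesh: project vertices onto the XZ plane and normalize
// against the mesh's bounding box, so u and v each run from 0 to 1.
std::vector<UV> planarMapXZ(const std::vector<Vec3>& verts)
{
    float minX = verts[0].x, maxX = verts[0].x;
    float minZ = verts[0].z, maxZ = verts[0].z;
    for (const Vec3& p : verts) {
        minX = std::min(minX, p.x); maxX = std::max(maxX, p.x);
        minZ = std::min(minZ, p.z); maxZ = std::max(maxZ, p.z);
    }
    std::vector<UV> uvs;
    uvs.reserve(verts.size());
    for (const Vec3& p : verts)
        uvs.push_back({ (p.x - minX) / (maxX - minX),
                        (p.z - minZ) / (maxZ - minZ) });
    return uvs;
}
```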
  13. Does u,v,w texture mapping exist?

    Sorry, I overwrote the texture buffer instead, and I've already cloned the mesh... but a new problem arises: when I map the volume texture over a box, what I get is a "block" of fire. Do I need to change the vertex buffer to get a better-looking fire inferno? Thanks, Jack
  14. Does u,v,w texture mapping exist?

    But how do I tell the difference between u,v and u,v,w in this case?

        // note: D3DFVF_TEX0 actually declares *zero* texture-coordinate sets;
        // for one set of 3D (u,v,w) coordinates the FVF would be
        // D3DFVF_TEX1 | D3DFVF_TEXCOORDSIZE3(0)
        DWORD VertexPosNormTexFVF = D3DFVF_XYZ | D3DFVF_NORMAL | D3DFVF_TEX0;

    When I clone the mesh to create room for texture coordinates and specify this, it gives me u,v only. When I lock the vertex buffer and look inside, I find the VB somehow gets overrun? Thanks, Jack