IndakungWoo

Members
  • Content count: 8
Community Reputation: 101 Neutral

About IndakungWoo
  • Rank: Newbie
  1. I downloaded SC2 and tried its editor. Pressing Ctrl + Alt + H shows the navmesh, and placing some buildings on it clearly shows the navmesh being regenerated dynamically as buildings are constructed and destroyed. You can also add polygons directly. It's a Delaunay algorithm, isn't it? But my question still stands: how do you deal with agents of different sizes? The editor doesn't show whether it generates separate layers for different agent sizes.
  2. I found that the buildings in SC2 are placed on a grid, so it seems they combine both a grid and a navmesh? I can't open the zip either; could anybody help?
  3. [quote name='Steadtler' timestamp='1335027897' post='4933550'] CoH's solution seems nice. If you dont need the information to be so dense, you can use a quadtree instead of a dense grid. That would reduce the memory, and the cost of pathfinding, line-of-sight checks, etc. Im not sure why your string pulling is so expensive, it should be very cheap to do on a grid, as all you need to do is intersecting a vector with axis-aligned lines. If you are sure your maths are optimal, then there is also the fact that you dont need to smooth the whole path at once. You really only need to smooth up to the next "corner", its useless to smooth a whole path that is likely to change. [/quote]
     I think my problem has been solved. I found a precise way to find the corners, and it's not as expensive as I thought. I've decided not to use a navmesh; HPA* with grids is enough. No need to smooth the whole path at once, got it. Thank you again. (A grid line-of-sight sketch along these lines is included after this list.)
  4. [quote name='IADaveMark' timestamp='1334937109' post='4933258'] Poly B in that example is horribly inappropriate anyway. It shouldn't connect to the top and bottom walls at all. Instead, the corners of the blue boxes should be connected to each other making area B represent the area between them. That allows you to simply mark poly B as being off-limits for agents that can't use it. [/quote]
     Thank you, IADaveMark. I have thought about this situation, but even if poly B is generated as you say, the problem still exists. Or would the easier way be to run some algorithm on top of an appropriate generation?
     [quote name='JTippetts' timestamp='1334927815' post='4933185'] You might want to take a look at the [url="http://code.google.com/p/recastnavigation/"]Recast and Detour[/url] library, which provides automatic generation of navmeshes. A part of the parametric structure that you populate to generate the navmesh includes a member for agent radius and another for agent height, which affect the generation of the navmesh. A solution to the problem of agents of multiple sizes is to generate multiple navmeshes, and roughly classify the agents into a fixed number of size categories equal to the number of navmeshes. This way, you don't have to complicate the actual pathfind with additional logic to cull polygons based on agent size. [/quote]
     [quote name='Steadtler' timestamp='1334968276' post='4933406'] Always shrink your navmesh for agent size. We have tested this problem at work and we found that generating several navmeshes - as JT suggest - is far more efficient (less memory AND less cpu cost) than generating one navmesh that supports several agent radii. [/quote]
     Thank you for the advice about multiple navmeshes, Steadtler and JTippetts (a small size-class selection sketch follows after this list). But here comes another problem. The project I'm working on is an RTS game, so it involves dynamically adding and removing obstacles such as buildings, and regenerating multiple navmeshes dynamically in an efficient way would be complicated for me. I've read about the [url="http://aigamedev.com/premium/presentations/dealing-with-destruction/"]pathfinding of Company of Heroes[/url]: its obstacles change frequently and it uses grids, not navmeshes, which is the approach I used in the past too. The reason I want to replace it with a navmesh is that I couldn't find an efficient and precise way to smooth the path, meaning the path should turn only at the corners of obstacles (I'm using a line-of-sight test to every waypoint, chosen by a bisection method). The effect in StarCraft 2 is what I want to achieve (obviously the buildings in StarCraft 2 are on a grid, yet the path of a single unit is very smooth and precise), not flocking behavior, just one unit, one path. Could you give me some advice?
  5. [attachment=8361:navMeshQues.png] As shown in the image above, without considering the agent's size, the resulting path from polygon A to polygon C would be A->B->C. But once size is taken into account, because the minimum passable width from A to C through B is less than the agent's diameter, the search should avoid crossing B from A to C, and the resulting path should be A->B->D->E->F->B->C. The question is: how do I know which polygon, or which exit border of a polygon, should be avoided during pathfinding? (See the portal-width sketch after this list.)
  6. [quote name='MJP' timestamp='1329113969' post='4912506'] [quote name='Indakung' timestamp='1329105128' post='4912460'] Thank you. But how could the program predict the device's losing? Or just copy the data every frame? It seems not so efficient. [/quote] I don't think you can in a reliable manner. You can detect a focus change, but I don't know if the device will already be considered "lost" at that point. If you want, you can detect if your app is running on Vista or Win7 and create D3D9Ex interfaces. If you use those, you won't get a lost device scenario. Doesn't help you at all for XP, obviously. [/quote]
     Thank you. As a matter of fact, losing the device is not a common occurrence, and I think most people would ignore the resetting of the particles in that case, so I'd better ignore this problem too.
  7. [quote name='MJP' timestamp='1329075007' post='4912317'] Once you lose the device, you lose all of your render targets. You would have to copy the data before you lose the device. [/quote]
     Thank you. But how could the program predict that the device is about to be lost? Or should it just copy the data every frame? That doesn't seem very efficient.
     [quote name='mhagain' timestamp='1329079901' post='4912336'] Why not just re-render what was originally rendered into your render targets? [/quote]
     Because those render targets are used to store the state of a GPU-based particle system. I think it would look strange if all the particles disappeared, or were reset to their initial positions or other states, after the device is lost.
  8. To reset the device after it has been lost, all D3DPOOL_DEFAULT resources must be released first. I want to restore some render-target textures after the device is reset, so I'm trying to save their data to a D3DPOOL_SYSTEMMEM texture when the device is lost, but the call to IDirect3DDevice9::GetRenderTargetData fails because the device is already lost. The DX debug output says: "Failing copy from video-memory surface to system-memory or managed surface because device is lost. UpdateSurface returns D3DERR_DEVICELOST". Is there another way to restore D3DPOOL_DEFAULT resources after the device is lost? (A snapshot/restore sketch is included after this list.)
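
A minimal sketch of the grid string-pulling idea from post 3 above (Steadtler's point that line of sight on a grid is just intersecting a segment with axis-aligned cells, and that you only need to smooth up to the next corner). The Grid struct and the function names hasLineOfSight / nextVisibleCorner are illustrative, not from any particular engine:

[code]
#include <cmath>
#include <cstdlib>
#include <utility>
#include <vector>

// Minimal grid: blocked[y * width + x] is non-zero for obstacle cells.
struct Grid
{
    int width = 0, height = 0;
    std::vector<char> blocked;

    bool isWalkable(int x, int y) const
    {
        return x >= 0 && y >= 0 && x < width && y < height &&
               !blocked[static_cast<size_t>(y) * width + x];
    }
};

// Bresenham-style line-of-sight test: walks the cells along the segment
// (x0,y0)->(x1,y1) and fails as soon as a blocked cell is hit.  Plain
// Bresenham can step diagonally past two cells that touch only at a corner;
// use a supercover traversal instead if that must be disallowed.
bool hasLineOfSight(const Grid& grid, int x0, int y0, int x1, int y1)
{
    int dx = std::abs(x1 - x0), dy = std::abs(y1 - y0);
    int sx = (x0 < x1) ? 1 : -1;
    int sy = (y0 < y1) ? 1 : -1;
    int err = dx - dy;

    for (int x = x0, y = y0;;)
    {
        if (!grid.isWalkable(x, y)) return false;
        if (x == x1 && y == y1)     return true;
        int e2 = 2 * err;
        if (e2 > -dy) { err -= dy; x += sx; }
        if (e2 <  dx) { err += dx; y += sy; }
    }
}

// Index of the farthest waypoint on 'path' (from 'start' onward) still visible
// from 'from'.  The agent only needs to steer toward that corner, so the whole
// path never has to be smoothed at once.
size_t nextVisibleCorner(const Grid& grid, std::pair<int, int> from,
                         const std::vector<std::pair<int, int>>& path, size_t start)
{
    size_t best = start;
    for (size_t i = start; i < path.size(); ++i)
    {
        if (!hasLineOfSight(grid, from.first, from.second,
                            path[i].first, path[i].second))
            break;
        best = i;
    }
    return best;
}
[/code]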
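
Post 4 above leans on the suggestion of building one navmesh per agent-size class (as with Recast's agent-radius build parameter). A rough sketch of only the selection step, assuming the per-radius meshes already exist; NavMeshSet and meshFor are hypothetical names:

[code]
#include <algorithm>
#include <vector>

// Hypothetical navmesh handle; with Recast/Detour this would be a dtNavMesh
// (plus its query object) built with a specific agent radius.
struct NavMesh;

// One navmesh per agent-size class: each entry was built, offline or whenever
// the obstacle set changes, with its own agent radius, so the query itself
// needs no extra per-polygon width logic.
struct NavMeshSet
{
    std::vector<float>          radii;   // sorted ascending, e.g. {0.5f, 1.0f, 2.0f}
    std::vector<const NavMesh*> meshes;  // meshes[i] was built for radii[i]

    // Pick the smallest size class whose build radius covers this agent.
    const NavMesh* meshFor(float agentRadius) const
    {
        auto it = std::lower_bound(radii.begin(), radii.end(), agentRadius);
        if (it == radii.end())
            return meshes.back();        // larger than any class: use the widest clearance
        return meshes[static_cast<size_t>(it - radii.begin())];
    }
};
[/code]

Fixing the size classes in advance is what makes this cheaper than one mesh that supports several radii, which is the memory/CPU point Steadtler makes in the quoted reply.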
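
For the question in post 5 (which exit border of a polygon should be avoided), one common partial answer is to compare the width of the shared portal edge against the agent's diameter while the search expands a polygon. The structures below (Vec2, Portal, NavPoly) are made up for illustration, and as the comment notes the check is necessary but not sufficient:

[code]
#include <cmath>
#include <cstddef>
#include <vector>

struct Vec2 { float x, y; };

struct Portal              // shared edge between two adjacent polygons
{
    Vec2   left, right;    // endpoints of the shared edge
    size_t neighbour;      // index of the polygon on the other side
};

struct NavPoly
{
    std::vector<Portal> portals;
};

static float portalWidth(const Portal& p)
{
    float dx = p.right.x - p.left.x;
    float dy = p.right.y - p.left.y;
    return std::sqrt(dx * dx + dy * dy);
}

// During A* expansion of 'poly', only yield neighbours whose connecting edge
// is at least as wide as the agent's diameter.  This is a necessary check,
// not a sufficient one: a polygon's interior can still pinch tighter than any
// of its portal edges, which is why one mesh per agent radius (previous
// sketch) is usually the more robust answer.
void expandNeighbours(const NavPoly& poly, float agentDiameter,
                      std::vector<size_t>& out)
{
    out.clear();
    for (const Portal& p : poly.portals)
        if (portalWidth(p) >= agentDiameter)
            out.push_back(p.neighbour);
}
[/code]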
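
For posts 6-8, a minimal sketch of the "copy before you lose it" approach MJP describes: keep a D3DPOOL_SYSTEMMEM copy of the render target while the device is still healthy (GetRenderTargetData fails once the device is lost), then upload it after a successful Reset. The RenderTargetSnapshot class and the capture cadence are assumptions; only the D3D9 calls themselves are real API:

[code]
#include <windows.h>
#include <d3d9.h>

// Keeps a CPU-side copy of a render target so its contents can be restored
// after a device reset.  Error handling is trimmed to the essentials.
class RenderTargetSnapshot
{
public:
    // Call periodically (or whenever the particle state changes significantly)
    // while the device is healthy.  Once the device is actually lost,
    // GetRenderTargetData fails with D3DERR_DEVICELOST, which is exactly the
    // situation described in post 8.
    HRESULT capture(IDirect3DDevice9* device, IDirect3DSurface9* renderTarget)
    {
        D3DSURFACE_DESC desc;
        HRESULT hr = renderTarget->GetDesc(&desc);
        if (FAILED(hr)) return hr;

        if (!m_sysmemCopy)
        {
            hr = device->CreateOffscreenPlainSurface(desc.Width, desc.Height, desc.Format,
                                                     D3DPOOL_SYSTEMMEM, &m_sysmemCopy, nullptr);
            if (FAILED(hr)) return hr;
        }
        // Video memory -> system memory; only valid while the device is not lost.
        return device->GetRenderTargetData(renderTarget, m_sysmemCopy);
    }

    // Call after a successful Reset(), once the default-pool render target has
    // been recreated.
    HRESULT restore(IDirect3DDevice9* device, IDirect3DSurface9* newRenderTarget)
    {
        if (!m_sysmemCopy) return E_FAIL;
        // System memory -> default pool.  If the driver rejects updating a
        // render target directly, stage through a default-pool offscreen
        // surface and StretchRect it into the render target instead.
        return device->UpdateSurface(m_sysmemCopy, nullptr, newRenderTarget, nullptr);
    }

    ~RenderTargetSnapshot()
    {
        if (m_sysmemCopy) m_sysmemCopy->Release();
    }

private:
    IDirect3DSurface9* m_sysmemCopy = nullptr;
};
[/code]

How often capture is affordable depends on the render target size; capturing only on a focus change, as discussed in post 7, may already be too late, which is why simply re-seeding the particles (the conclusion in post 6) is a reasonable trade-off. On Vista/Win7, D3D9Ex avoids the lost-device scenario entirely, as MJP notes.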