Manpreet Singh

  1. Space Shooter Enemies

    Hi, for complex enemy paths, Bezier curves and/or splines might be a good starting point. I made a basic shmup in Unity and used Bezier curve components for the paths. What I did:
    - Add child objects with the spline component to the enemy spawners.
    - Enemies that need to travel along a path have a "MoveAlongPath" component holding a reference to the spline component on the spawner's child object. The reference is assigned as soon as the spawner instantiates the enemy.
    - After that, just keep incrementing the spline parameter by a hardcoded or configurable fraction each frame (I went with 1.0/duration) and get the position vector on the spline from that parameter.
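    For what it's worth, here's a minimal C++ sketch of the "increment the spline parameter each frame" idea, assuming a single cubic Bezier segment. Only the "MoveAlongPath" name comes from my setup above; everything else is illustrative, not the actual Unity component:

```cpp
#include <cmath>

struct Vec2 { float x, y; };

// Evaluate a cubic Bezier curve at parameter t in [0, 1].
inline Vec2 bezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, float t) {
    float u = 1.0f - t;
    float b0 = u * u * u, b1 = 3 * u * u * t, b2 = 3 * u * t * t, b3 = t * t * t;
    return { b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x,
             b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y };
}

// MoveAlongPath-style component: advance the parameter by dt / duration each
// frame, so the whole path takes `duration` seconds to traverse.
struct MoveAlongPath {
    Vec2 p0, p1, p2, p3;  // control points copied from the spawner's spline
    float duration;       // seconds for the full traversal
    float t = 0.0f;       // current spline parameter

    Vec2 update(float dt) {
        t += dt / duration;
        if (t > 1.0f) t = 1.0f;  // clamp at the end of the path
        return bezier(p0, p1, p2, p3, t);
    }
};
```

    Each frame you set the enemy's transform to the result of `update(dt)`; at `t == 1` the enemy sits exactly on the last control point, which is where you'd despawn it or hand it the next segment.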
  2. Hi, I use a method similar to tiago's, but extremely simplified. I use Bullet physics as my collision detection system and artemis-cpp for the ECS architecture.
    Bullet has a function performDiscreteCollisionDetection() (or stepSimulation() if you want dynamics as well). After calling it, all overlapping pairs (i.e. colliding objects) are available in an array that Bullet maintains internally. Loop over this array (note: it contains btCollisionObjects, so use the userPointer field to get the entity) and add a Collided component to the entities in each pair. The Collided component on one entity stores the id of the other entity. Here is the system logic:
    - All entities having the Collided component as well as a Health component have their health reduced by the amount in the other entity's Damage component.
    - All entities having the Collided component as well as a Damage component just destroy themselves.
    For your case, you can store additional info in the Collided component and use it to update the transforms of your entities so they no longer overlap. Of course this might be a poor way to do it, but hopefully you can see how a collision detection system can cooperate with entities and systems.
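    To make the wiring concrete, here's a stripped-down C++ sketch of that system logic. The Collided/Health/Damage names follow the description above; the map-based World is purely illustrative (artemis-cpp stores components differently, and the real entity ids would come from Bullet's userPointer):

```cpp
#include <unordered_map>
#include <vector>

using EntityId = int;

struct Collided { EntityId other; };  // who did I hit?
struct Health   { int value; };
struct Damage   { int amount; };

struct World {
    std::unordered_map<EntityId, Collided> collided;
    std::unordered_map<EntityId, Health>   health;
    std::unordered_map<EntityId, Damage>   damage;
    std::vector<EntityId> destroyed;

    // System logic from the post:
    //  - Collided + Health -> lose health equal to the other entity's Damage.
    //  - Collided + Damage -> destroy self (e.g. a bullet).
    void runCollisionSystems() {
        for (auto& [id, c] : collided) {
            if (auto h = health.find(id); h != health.end())
                if (auto d = damage.find(c.other); d != damage.end())
                    h->second.value -= d->second.amount;
            if (damage.count(id))
                destroyed.push_back(id);
        }
        collided.clear();  // collision events are consumed each frame
    }
};
```

    E.g. a player (Health 100) overlapping a bullet (Damage 25) ends the frame with 75 health, and the bullet queues itself for destruction.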
  3. Slow down: Artemis C++ Port & SDL 2

    hi, I'm using artemis-cpp as well. In my case, once the number of entities passes a certain threshold the app starts slowing down (this is on a 4th-gen i5 with a GTX 860M). I suspect it's because this implementation does not take a proper struct-of-arrays approach. Check the addComponent function in EntityManager.cpp: you can clearly see it uses a Bag<Component*>. A Bag defines its storage as E* data, where E can be any type. Effectively, you get
    Bag<Component*> components; // which, inside EntityManager, translates to Component** data;
    See? It's an array of pointers (EDIT: technically not an array of pointers but a pointer to a pointer; I say "array" because it is later allocated and used as a normal array). Now, I may be completely off here, but it looks to me like this makes the code jump around in memory when dereferencing those pointers, which is exactly what a struct-of-arrays layout aims to avoid. I'm at work so I can't try it, but IMO using an array of plain component data instead of an array of pointers might help.
    EDIT2: A bit off-topic, but you can get rid of that if statement by using a delayedEntityProcessingSystem, which removes the branch.
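    A tiny illustration of the difference (my own sketch, not Artemis code): iterating a bag of pointers dereferences a separately allocated object per element, while iterating plain data walks memory linearly:

```cpp
#include <vector>

struct Position { float x, y; };

// Layout I'm complaining about: a bag of pointers. Each component lives in
// its own heap allocation, so iterating chases a pointer per entity.
std::vector<Position*> pointerBag;

// The suggested alternative: plain component data stored contiguously, so a
// linear sweep touches consecutive cache lines.
std::vector<Position> contiguous;

float sumPointers() {
    float s = 0;
    for (Position* p : pointerBag) s += p->x;  // one indirection per element
    return s;
}

float sumContiguous() {
    float s = 0;
    for (const Position& p : contiguous) s += p.x;  // sequential memory access
    return s;
}
```

    Both loops compute the same thing; the second is just far friendlier to the cache once the element count gets large.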
  4. You could take a look at Artemis (the C++ port). What they do is similar to how phil_t described it earlier.
    - Each entity has an id, a type bitstring, and a system bitstring.
    - Every time you add a component to an entity, its type bits are updated, but not its system bits. That entity is also added to a "refreshed" array.
    - At the start of your main loop, you call world.loopStart(), which iterates over all the entities in the refreshed array.
    - In loopStart, each refreshed entity is checked against all the systems registered with the world object.
    - For each system, the system's type bits are compared with the entity's type bits. (EDIT: to be precise, each system has a system bit identifying that particular system, plus type bits identifying which components an entity must have to be acted upon by that system.)
    - If the entity matches that system (i.e. the corresponding type bits are set), its system bitstring is updated and the entity is added to that system's actives list.
    - When you call any system's update() method, it only iterates over the entities in its actives list.
    The only downside I see here is that your changes to an entity will not be visible in the same loop iteration. This is actually a good thing, as explained here.
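    The bit matching in the steps above can be sketched like this in C++ (the names and layout are mine, not the actual Artemis API):

```cpp
#include <cstdint>
#include <vector>

using Bits = std::uint64_t;

// One bit per component type (illustrative component set).
constexpr Bits POSITION = 1 << 0;
constexpr Bits VELOCITY = 1 << 1;
constexpr Bits SPRITE   = 1 << 2;

struct Entity { int id; Bits typeBits = 0; };

struct System {
    Bits required;             // components an entity must have for this system
    std::vector<int> actives;  // entities this system will update

    void check(const Entity& e) {
        // The entity qualifies when every required component bit is set.
        if ((e.typeBits & required) == required)
            actives.push_back(e.id);
    }
};
```

    A movement system requiring POSITION | VELOCITY picks up an entity with both components and skips one that only has POSITION; update() then walks only `actives`.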
  5. Basic enemy design

    How about this?
  6. Entity, Components: Issue in immediate refresh

    Thanks for the reply! Actually, this is how Artemis deals with entity changes by default: you add a component, then call refresh() on that entity. refresh() just adds the entity to an array. At the beginning of the next frame you call loopStart(), which takes the refreshed array from the previous frame and updates all the entities in it.
    The reason I asked about immediate updates is that the order in which I call my updates and handle the data in the systems was giving me trouble with deferred updating. I think I'll try to rectify that problem now instead of hacking my way around deferred updating.
    Cheers!
  7. My question: how do you handle adding new entities/components on the fly?
    Background: I'm using the C++ port of the Artemis framework to architect my game objects. In Artemis, whenever you create a new entity or add a component to an entity, you have to call refresh() on that same entity for the changes to take effect:
    entity->addComponent(xyz); entity->refresh();
    Entities in Artemis have three attributes: a unique id, a type bitstring (identifies which components it has), and a system bitstring (identifies which systems will process it). Calling addComponent updates the type bitstring, but not the system bitstring. Calling refresh adds the entity to a "refreshed" array. At the start of each game-loop iteration, a function loopStart() goes over all the objects in the "refreshed" array and updates their system bits by looking at their type bitstrings.
    That last sentence makes it clear that all new updates are only reflected in the next iteration. If I shoot a bullet, it appears one frame later. If I want to process collisions, they start one frame later. How would you suggest I tackle this? If I add a component, I would like that entity to be processed by the new system immediately.
    What I've done so far:
    - Change the order of system updates: call the removal system first, before any other, so that all entities tagged with a 'destroyed' component during the previous frame are destroyed at the beginning of the current frame. Problem: even removal needs a refresh, which does not happen immediately. This causes invalid pointer accesses, because the subsequent systems expect to see components that removal has already deleted.
    - Instead of refresh, call EntityManager::refreshEntity directly. This function is the one that actually updates the system bits, so the entity is acted upon by any new system as soon as a component is added, and the result is exactly how I want it. Problem: the function contains a loop that updates the bits for every registered system, which can get expensive if tons of entities are updated each frame. That's why, design-wise, it is called inside loopStart, only at the beginning.
    I recently came across entity state machines. Any inputs? Thanks for reading!
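    The deferred refresh() path versus updating the system bits immediately can be sketched like this in C++ (only the refreshEntity/refresh/loopStart names come from Artemis; the rest is my simplification):

```cpp
#include <cstdint>
#include <vector>

using Bits = std::uint64_t;

struct Entity { int id; Bits typeBits = 0; Bits systemBits = 0; };

struct SystemInfo { Bits bit; Bits required; };  // one entry per registered system

struct EntityManager {
    std::vector<SystemInfo> systems;
    std::vector<Entity*> refreshed;  // filled by refresh()

    // Immediate path: recompute which systems want this entity, right now.
    // This is the loop I worry about -- it runs once per registered system.
    void refreshEntity(Entity& e) {
        e.systemBits = 0;
        for (const SystemInfo& s : systems)
            if ((e.typeBits & s.required) == s.required)
                e.systemBits |= s.bit;
    }

    // Deferred path: just queue the entity for the next frame.
    void refresh(Entity& e) { refreshed.push_back(&e); }

    // Called at the start of the next frame; drains the queue.
    void loopStart() {
        for (Entity* e : refreshed) refreshEntity(*e);
        refreshed.clear();
    }
};
```

    With refresh() the system bits stay stale until the following loopStart(); calling refreshEntity() directly makes the entity visible to its new systems within the same frame, at the cost of the per-system loop.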
  8. Single player boss battles?

    If you're not restricting yourself to 3D examples, you can take a look at these:
    - the Mega Man series
    - Alien Soldier, Contra: Hard Corps, Gunstar Heroes (the "Seven Force" boss in both Gunstar Heroes and Alien Soldier is badass)
    - the Castlevania series
    2D games (especially older ones) seem to have the coolest boss fights.
  9. To be more specific, real-time rendering. Most of the academic groups I checked are engaged in mathematical and visualisation work, but very few seem to be doing any sort of research in real-time graphics. I also noticed that the majority of recent publications have come from the industry. Did this kind of situation exist before as well and just go unnoticed?
  10. By this, he means that your calculations should be consistent: both the normals and the light direction should lie in the same coordinate frame/space. If you're transforming your normals to view space (i.e. multiplying them by the view matrix), then you must also transform the light vector from world space to view space. Your normals are in world space, all right, but as c0lumbo said, make sure that your light vector is also in world space. Maybe you accidentally multiplied it by the view matrix? (Just guessing.)
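    A small C++ sketch of why this consistency matters: the diffuse term N·L is unchanged when both vectors are rotated into view space together, but transforming only one of them gives a different (wrong) value. All names here are illustrative:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Mat3 { float m[3][3]; };  // e.g. the upper-left 3x3 (rotation) of a view matrix

Vec3 mul(const Mat3& a, const Vec3& v) {
    return { a.m[0][0]*v.x + a.m[0][1]*v.y + a.m[0][2]*v.z,
             a.m[1][0]*v.x + a.m[1][1]*v.y + a.m[1][2]*v.z,
             a.m[2][0]*v.x + a.m[2][1]*v.y + a.m[2][2]*v.z };
}

float dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Consistent: both vectors left in world space.
float lambertWorld(const Vec3& n, const Vec3& l) {
    return std::fmax(dot(n, l), 0.0f);
}

// Also consistent: BOTH vectors rotated into view space.
float lambertView(const Mat3& view, const Vec3& n, const Vec3& l) {
    return std::fmax(dot(mul(view, n), mul(view, l)), 0.0f);
}
```

    Since a rotation preserves dot products, `lambertWorld` and `lambertView` agree; mixing a view-space normal with a world-space light is the mismatch that produces wrong shading.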
  11. Can't get ssao to work properly

    Update: my normals were being stored incorrectly. After fixing that, I got this. Oh, and by the way, I'm not doing any kind of blurring yet, so you can see the noise. Am I on the right track, or is there still something horribly wrong here? (Mentioning again that I'm using an R32G32B32A32 buffer to store my normals and an identical buffer for view-space positions, not reconstructing from depth.) Thanks for reading!
  12. Can't get ssao to work properly

    Me again. Sorry about not providing enough information; I wrote that in a bit of a hurry. OK, so I tried messing around some more, and I *think* this has to do with my render-target size. I'm creating the position and normal buffers at 2048x2048 each, whereas my backbuffer is always resized to the application window (maximum size 1600x1200). Should the render targets be the same size as the backbuffer?
    Off topic: when viewing a view-space normal buffer, is it normal (no pun intended) for the scene to change color depending on the view direction? When I look at the scene head on, everything is blue; when I turn left, everything facing me starts turning pinkish.
  13. Edit: forgot to mention, I'm using SlimDX with WPF (if it matters). Hi folks, I'm trying to get a simple SSAO shader to work, using this site as reference: http://www.john-chap...ontent.php?id=8 And naturally I've run into problems again. I'm getting this output (the first one is with a smaller radius and the next one with a larger radius; see the shader). Unfortunately, I can't use PIX, since it crashes my application before it even starts. I realize it may be harder to debug without seeing the entire code, so first, here's my main SSAO shader. It is almost identical to the one used on the aforementioned site, except that I'm using a position buffer instead of reconstructing position from depth.
    [CODE]
    float3 currentPosition = PositionTex.Sample(SSAOSampler, input.TexC).xyz;          // current view-space fragment position from the position buffer
    float3 currentNormal = normalize(NormalTex.Sample(SSAOSampler, input.TexC).xyz);   // current view-space fragment normal

    float2 NoiseScale = BufferSize / SSAO_noiseSize;
    float3 randomVec = (RandomTex.Sample(SSAOSampler, input.TexC * NoiseScale).xyz * 2.0f) - 1.0f; // random noise texture for kernel rotation

    float3 tangent = normalize(randomVec - (currentNormal * dot(randomVec, currentNormal)));
    float3 bitangent = cross(currentNormal, tangent);
    float3x3 tbn = float3x3(tangent, bitangent, currentNormal);

    float occ = 0.0f;
    float Radius = 50.0f / BufferSize;
    for (int i = 0; i < SSAO_kernelSize; i++)
    {
        float3 vec = AO_SampleVectors[i].xyz;
        float3 sample = mul(vec, tbn);
        sample = sample * Radius + currentPosition;

        float4 offset = float4(sample, 1.0f);
        offset = mul(offset, Projection);
        offset.xy /= offset.w;
        offset.x = offset.x * 0.5 + 0.5;
        offset.y = -offset.y * 0.5 + 0.5;

        float sampleDepth = PositionTex.Sample(SSAOSampler, offset.xy).z;
        float range_check = abs(currentPosition.z - sampleDepth) < Radius ? 1.0 : 0.0;
        occ += (sampleDepth <= sample.z ? 1.0 : 0.0) * range_check;
    }
    occ /= SSAO_kernelSize;

    float4 Color = { 1.0f, 1.0f, 1.0f, 1.0f };
    Color.rgb = 1 - occ;
    return Color;
    [/CODE]
    To get the normal and position buffers:
    [CODE]
    // Vertex shader
    matrix viewproj = mul(View, Projection);
    input.pos = mul(input.pos, rotation);              // note: the rotation matrix == world matrix in this case, just so you don't confuse it
    input.norm = normalize(mul(input.norm, rotation));
    output.pos1 = mul(input.pos, View);                // pos1 is the view-space position
    output.pos = mul(input.pos, viewproj);
    output.uv = input.uv;
    output.norm = normalize(mul(input.norm, View));    // norm is the view-space normal
    return output;

    // Pixel shader
    output.Position = input.pos1;
    output.Normal = normalize(input.norm);
    return output;
    [/CODE]
    I'll post pics of my position and normal buffers if those are required. Thanks!
  14. I'm currently trying to implement variance shadow maps, from the original paper and the NVIDIA slides. I'm well aware of their light-bleeding drawback and somewhat sure about how and why it happens. In my app, I'm just applying a 5x5 Gaussian filter over the two-channel depth map, with anisotropic filtering enabled, simply following the steps in the paper. I get the following: one result with standard PCF, and one with VSM and the 5x5 Gaussian blur. I've seen several screenshots where users applied 3x3 or 5x5 Gaussian blurs and still got dark shadows (albeit with the light-bleeding problem, but I'm not worried about that for now). I need to know: is this a somewhat common problem alongside light bleeding, or have I done something so terrible that the light bleeding is getting amplified? Without the blur I get blocky shadows (somewhat smoothed by the hardware filtering), but they're almost as dark as the PCF ones, of course again with the light bleeding, but like I said, I want to focus on this problem first. Also note: if you compare the screenshots side by side, the second one is a little darker overall. Now for some code:
    [CODE]
    float4 PS_VSM( PS_IN input ) : SV_Target
    {
        //...
        // fancy-shmancy light stuff
        //...
        float2 ProjectedCoord;
        ProjectedCoord.x =  (input.LightSpacePos.x / input.LightSpacePos.w) / 2 + 0.5f;
        ProjectedCoord.y = -(input.LightSpacePos.y / input.LightSpacePos.w) / 2 + 0.5f;

        float4 Moments = SSTex.Sample(ShadowSampler2, ProjectedCoord);
        float M1 = Moments.r;
        float M2 = Moments.g;

        float Lit = (input.Dist < M1);
        float t = M1 - input.Dist;
        // I was using max((M2 - (M1*M1)), 0.00001f) earlier; in my attempt to fix
        // this I replaced it with something I came across in numerous forum posts:
        float variance = min(max(M2 - (M1 * M1), 0.0) + 0.00003, 1.0);
        float P = variance / (variance + (t * t));

        float4 Output = max(Lit, P);
        Output.a = 1.0f;
        return (Color * Output + ambient) * Tex.Sample(Sampler, input.uv);
    }
    [/CODE]
    I checked my depth buffer and it seems to be getting blurred fine. I initially thought it might be a normalization problem with the Gaussian kernel (even though I'm using the normalization constant and the weights do approximately add up to 1). But anyway, here is the blurring code just in case:
    [CODE]
    float4 Blur_SinglePassH(ScreenSpaceOutput input) : SV_TARGET
    {
        float4 avg = 0.0f;
        for (int i = 0; i < kernelSize; i++)
            avg += SSTex.Sample(ShadowSampler2, input.TexC + float2(Offsets[i] / BufferSize, 0.0f)) * GaussianWeights[i];
        return avg;
    }

    float4 Blur_SinglePassV(ScreenSpaceOutput input) : SV_TARGET
    {
        float4 avg = 0.0f;
        for (int i = 0; i < kernelSize; i++)
            avg += SSTex.Sample(ShadowSampler2, input.TexC + float2(0.0f, Offsets[i] / BufferSize)) * GaussianWeights[i];
        return avg;
    }

    technique10 BlurH
    {
        pass P0
        {
            SetGeometryShader( 0 );
            SetVertexShader( CompileShader( vs_4_0, ScreenQuadVS() ) );
            SetPixelShader( CompileShader( ps_4_0, Blur_SinglePassH() ) );
        }
    }

    technique10 BlurV
    {
        pass P0
        {
            SetGeometryShader( 0 );
            SetVertexShader( CompileShader( vs_4_0, ScreenQuadVS() ) );
            SetPixelShader( CompileShader( ps_4_0, Blur_SinglePassV() ) );
        }
    }
    // Before you mention it: yes, I tried using two passes inside a single technique
    // instead of two techniques, but couldn't get it to work, so I stuck with this.
    [/CODE]
    Thank you!