

Member Since 01 Sep 2011
Last Active Aug 03 2016 05:57 PM

#5127672 Need help with effects system

Posted by on 31 January 2014 - 01:39 AM

I've been working on getting an effects system set up but I'm having some trouble figuring out how to implement some parts of it.  I've made a sort of outline of the key parts of it, how I think some things should work, and what things I don't know how to do.  If anyone could look over it and comment on whether these are good ideas or not, or how to do some of these things, I'd appreciate it.  If anything in my rather abbreviated list doesn't make sense I can explain it better, if need be.

2 types of effects: active, reactive (not a code-based differentiation)
    active: DoTs and instants, stat changes.  Not interested in other effects, just applies some kind of stat change
    reactive: shields, effect modifiers.
        Change effects when they update?
        Somehow need to pass computed change from active effects to reactive ones for modification
            How does sorting play into this? Does effect type (additive, multiplicative, exclusive) matter?
    effects have a priority number manually assigned as well as a type (additive, multiplicative, exclusive)
    exclusive effects apply before additive/multiplicative and prevent same-priority effects from applying (how?)
        possibly sort, then group same-priority effects together, stop processing group after hitting exclusive effect
    each tick, effect list is sorted
    loop over list, call update on each effect
    remove effects with remaining duration <= 0
    when multiple instances of a single effect are active, need to apply the visual only once (how?)
    visual probably can't be applied every update, needs to be applied once then removed, which means something has to hold a reference to the VFX object
        what holds the reference?  can't be held by the effect instance
    how do you ensure that as one effect wears off, the visual stays active until all effects of that type have worn off
    when are packets sent? after each stat change? after each effect is processed? after all effects are processed?
    is there any per-update data involved besides stat changes?
    for things like stun, root, etc., have a handful of hidden bool stats that determine state.  The input system can check these to selectively disable input
        likely needs to use same system as visuals to ensure the state remains set so long as any effect is active that applies the state
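The tick loop sketched above (sort by priority, group same-priority effects, let an exclusive effect suppress the rest of its group, refcount visuals on the entity) could look something like this. This is only a sketch of the outline's ideas; every name here is invented for illustration:

```cpp
#include <algorithm>
#include <map>
#include <string>
#include <vector>

// Hypothetical effect instance, simplified to a single stat-change payload.
struct Effect {
    std::string id;    // effect type, e.g. "burn" (also keys the VFX)
    int priority;      // manually assigned, per the outline
    bool exclusive;    // exclusive effects block same-priority effects
    float duration;    // remaining seconds
    float statDelta;   // simplified "stat change"
};

// Visual refcounts, owned by the entity rather than any effect instance,
// so the VFX stays alive until the last effect of that type wears off.
using VisualRefs = std::map<std::string, int>;

float ProcessEffects(std::vector<Effect>& effects, float dt, VisualRefs& visuals) {
    // Sort by priority; within a priority group, exclusives come first so
    // they can suppress the rest of the group.
    std::stable_sort(effects.begin(), effects.end(),
        [](const Effect& a, const Effect& b) {
            return a.priority != b.priority ? a.priority < b.priority
                                            : (a.exclusive && !b.exclusive);
        });

    float totalDelta = 0.0f;
    for (size_t i = 0; i < effects.size(); ) {
        int prio = effects[i].priority;
        bool blocked = false;
        // Walk one same-priority group; stop applying after an exclusive.
        for (; i < effects.size() && effects[i].priority == prio; ++i) {
            if (blocked) continue;
            totalDelta += effects[i].statDelta;
            if (effects[i].exclusive) blocked = true;
        }
    }

    // Tick durations and drop expired effects, releasing visual refs.
    // Assumes each effect incremented visuals[id] when it was applied.
    for (auto it = effects.begin(); it != effects.end(); ) {
        it->duration -= dt;
        if (it->duration <= 0.0f) {
            if (--visuals[it->id] == 0) visuals.erase(it->id); // remove VFX here
            it = effects.erase(it);
        } else {
            ++it;
        }
    }
    return totalDelta;
}
```

The refcount map also answers the stun/root question in the same way: keep a per-state counter instead of a bool, and the state stays set as long as any effect holding it is alive.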

#5099260 Custom text renderer performance

Posted by on 06 October 2013 - 10:19 PM

I'm trying to implement text outlines in Direct2D using a custom text renderer. The problem is that the performance hit from merely calling GetGlyphRunOutline is beyond absurd. Forget about drawing: just calling that function for a dozen or so TextLayouts costs about 30 ms. It takes only about 0.02 ms to render those same TextLayouts using the standard DrawTextLayout call. All the examples I've seen call it every time DrawGlyphRun is called.

I tried to cache the resulting PathGeometry but Direct2D seems to create a new GlyphRun object every time TextLayout.Draw is called and I don't see any other parameters that I could use as an identifier. Although even if that worked, it would be useless for dynamic text. How are we supposed to have any remotely-decent render times with a custom text renderer?

#5093883 how to write to constant buffer?

Posted by on 13 September 2013 - 03:39 PM

If I have defined 3 cbuffers, are they automatically assigned slots 0, 1, and 2, so that I would call
SetConstantBuffer(0, ...)
SetConstantBuffer(1, ...)
SetConstantBuffer(2, ...)
to update all three? If so, are they assigned in the order they are declared?
When accessing the variable in the vertex shader, do I still only access it by name even if the buffer is in slot1?

I believe that's correct. I don't know if there's any documentation on how it assigns them, but it most likely just starts at 0 and assigns in order, skipping over any already-used registers. Like I said, if you don't want to rely on what the compiler assigns, you can assign them manually using the register keyword. And yes, you access them by name in HLSL; it doesn't matter what buffer they're assigned to or what slot the buffer is in. Also keep in mind that there's a SetConstantBuffers function that lets you set multiple buffers in one call.
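For reference, explicit slot assignment with the register keyword looks like this (buffer and variable names here are made up for illustration):

```hlsl
// Explicit slots: these no longer depend on declaration order.
cbuffer PerFrame  : register(b0) { float4x4 viewProj; };
cbuffer PerObject : register(b1) { float4x4 world;    };
cbuffer Material  : register(b2) { float4   tint;     };

// In the shader body you still refer to the variables by name, e.g.:
//   output.pos = mul(mul(input.pos, world), viewProj);
```

With these declarations, SetConstantBuffer(0, ...), (1, ...), (2, ...) map to b0, b1, b2 no matter what order the cbuffers are declared in.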

#5093714 how to write to constant buffer?

Posted by on 12 September 2013 - 09:58 PM

The association is made when you set the constant buffer: Device.ImmediateContext.VertexShader.SetConstantBuffer

The first parameter of that function is the register slot. If you don't specify the slot in the hlsl, it'll be automatically assigned one, which you can find from looking at the compiled shader code. See here for info on the register keyword: http://msdn.microsoft.com/en-us/library/windows/desktop/dd607359(v=vs.85).aspx

#5092416 Particle Collision

Posted by on 08 September 2013 - 12:31 AM

I suppose I could try that out. It may end up being rather expensive though considering how many draw calls it would need to draw all the terrain.

#5092352 Particle Collision

Posted by on 07 September 2013 - 03:39 PM

How would you create such a texture? Scanning all the blocks on the CPU would probably be rather slow. Rendering it out with the GPU might be quicker but I don't know how you'd map 1 block to 1 pixel.

#5089647 Question about downsampling

Posted by on 27 August 2013 - 05:29 PM

I did a few tests. Doing linear sampling ended up being very slightly faster than generating mipmaps. Also doing 4 linear samples in 1 pass instead of 1 sample in 2 passes was again very slightly faster.

#5089310 Question about downsampling

Posted by on 26 August 2013 - 04:50 PM

A lot of times, usually in bloom/HDR samples, I see people taking a fullscreen image, rendering it to a 1/2 sized RT, then rendering that to a 1/4 sized RT. The pixel shader for these 2 passes is nothing more than a single linear sample of the input.

Is this done because the linear sampler returns a 2x2 area sample and you don't want to lose any information by downsampling straight to a 1/4 sized RT? How bad would it be to just skip straight to the 1/4th size? And what about upsampling then? I've seen the same process done for that case as well.

Also, a side question: If I need to bring a fullscreen image down to 1/8th size, would it be faster to just ask DX to generate mipmaps and use the mipmap of the size I need rather than doing 3 downscale passes? The answer will probably be "try it yourself" but I figured I'd ask anyway...
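The information-loss question can be checked with plain arithmetic: two successive 2x2 box filters average all 16 source pixels, while a single bilinear tap at 1/4 resolution only averages the centre 2x2 and ignores the other 12. A small sketch (C++, not from the post; a bilinear fetch at the exact centre of a 4x4 texture blends the four centre texels with equal weights):

```cpp
#include <array>

// (a) Two passes of 2x2 averaging (what the bloom samples do):
// touches every one of the 16 source pixels.
float TwoPassDownsample(const std::array<float, 16>& img) {
    float q[4];  // 2x2 intermediate RT: one 2x2 average per quadrant
    for (int qy = 0; qy < 2; ++qy)
        for (int qx = 0; qx < 2; ++qx) {
            float s = 0.0f;
            for (int y = 0; y < 2; ++y)
                for (int x = 0; x < 2; ++x)
                    s += img[(qy * 2 + y) * 4 + (qx * 2 + x)];
            q[qy * 2 + qx] = s / 4.0f;
        }
    return (q[0] + q[1] + q[2] + q[3]) / 4.0f;  // full 16-pixel average
}

// (b) One bilinear fetch at the image centre when going straight to 1/4
// size: averages only the centre 2x2 block.
float SingleBilinearTap(const std::array<float, 16>& img) {
    return (img[1 * 4 + 1] + img[1 * 4 + 2] +
            img[2 * 4 + 1] + img[2 * 4 + 2]) / 4.0f;
}
```

A bright pixel in a corner survives (a) but vanishes entirely in (b), which is why skipping straight to 1/4 size loses information; taking 4 linear taps spread across the source in one pass covers all 16 pixels again.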

#5087425 Determining effect order

Posted by on 19 August 2013 - 05:22 PM

If I could come up with an example of how it should work, the problem would be solved. All I know is that there are some effects where the order will matter, and in such cases, there needs to be a way to specify the order. I think it would be a nightmare if I tried to just specify that manually for each effect, so I want some more general way of doing it.

I agree that there probably isn't a perfect solution to this. But just because I can't think of one doesn't mean one doesn't exist. I wanted to see if anyone could come up with "better" solutions. DrEvil's idea is pretty good and I may end up going with that.

#5087201 Determining effect order

Posted by on 18 August 2013 - 10:55 PM

So kind of like a more flexible version of solution A? The dependencies would be applied to the categories themselves, and anyone could add new categories? Seems pretty good, but it might make it easier to end up creating circular dependencies. Also, it would still have the problem of not being able to insert an effect between two effects that share the same categories.

#5086877 Determining effect order

Posted by on 17 August 2013 - 03:55 PM

If you have impossible ordering constraints (A before B, but B before A), it's a mistake regardless of how they are represented.

Of course. I didn't state this, but in that example, effects A and C don't actually care what order they're in; since the system would require a priority be assigned for all effects, they were arbitrarily assigned those orders. When just A and C were used, it made no difference, but when B came into the equation, it caused a problem.

Of the three representations in the original posts, the third one is by far the most appropriate one because it represents an order relationship correctly; in particular, new "effects" can be added anywhere in the dependency graph, without running into trouble with missing gaps between "layers" or "priorities".

It isn't awful; realistically, a few dependencies imply a node's position with respect to all nodes that matter. For example, many different attack types, with various precedences among themselves, can be specified to happen before explosion rendering; then if you say that the two "final" steps of creating debris objects and swapping in damaged sprites and models take place after explosions, they automatically take place after all attacks too.

Correct me if I'm wrong, but I think we're thinking about where the order is assigned in two different ways. You seem to be thinking of it in terms of the spell specifying the order of each of its effects. I was thinking about it in more global terms, as in "effect C" always occurs before "effect B" regardless of whether it's applied by "spell A", "spell B" or "spell C". I'm not sure there would be a case where you'd want the order to be one way for one spell and then entirely different for another spell given the same effects.

Also, I think the order is far more important when considering the effects currently on an entity. Entities can have effects like "damage reduction" or "fire resistance" applied to them from separate spells. Now when the entity is attacked, these effects will inspect the incoming damage and modify it. This is where I feel the order is important.

Consider a zombie that has the following two effects applied to it: "Converts healing spells to 'harm' spells, and 'harm' spells to healing spells" and "All damage is reduced by 50%". If you cast a 'harm' spell on the zombie, one of two things can happen, depending on the order those effects are applied. If the damage reduction is applied first, the zombie will be healed for 50%. If the conversion is applied first, then the damage reduction won't apply (since it's now a heal instead of a damage spell), and the zombie will be healed for 100%.

How would you specify the order in this case?
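The zombie example can be made concrete with two modifier functions run in different orders (C++; the types, names and numbers are all illustrative, not from the posts):

```cpp
#include <functional>
#include <vector>

// An incoming spell hit is reduced to a kind and an amount.
enum class Kind { Damage, Heal };
struct Hit { Kind kind; float amount; };

using Modifier = std::function<Hit(Hit)>;

// "All damage is reduced by 50%" - only touches damage.
Hit ReduceDamage(Hit h) {
    if (h.kind == Kind::Damage) h.amount *= 0.5f;
    return h;
}

// "Converts 'harm' spells to healing spells (and vice versa)".
Hit InvertKind(Hit h) {
    h.kind = (h.kind == Kind::Damage) ? Kind::Heal : Kind::Damage;
    return h;
}

// Run an incoming hit through the entity's modifiers in list order.
Hit Apply(Hit h, const std::vector<Modifier>& mods) {
    for (const auto& m : mods) h = m(h);
    return h;
}
```

With a 100-point harm spell, the order {ReduceDamage, InvertKind} produces a 50-point heal, while {InvertKind, ReduceDamage} produces a 100-point heal, which is exactly the ambiguity the question is about: nothing in either effect alone tells the system which list order is correct.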

#5086715 Determining effect order

Posted by on 17 August 2013 - 03:34 AM

A fourth approach would be to ask for a total order statement. I.e. all possible effects are presented in a chain (or perhaps a tree), and a new effect needs to be inserted into this chain to become available in the game. This pushes the responsibility to the designer.

What makes this different from solution 3? And how does it handle the case where Mod A adds 5 effects and then Mod B adds another 10 effects. It seems like each mod would have to know about every other mod.

Now, because you can't have both A before C and A after C, you perhaps wanted to say that the order of A and C is arbitrary until B comes into play. This may be expressed by letting A and C have the same priority first, but requires automatic priority adaptation.

That is exactly the situation I wanted to express with that example. How would you code/design such an automatic priority adaptation system? It seems to just end up as solution 3.

#5086704 Determining effect order

Posted by on 17 August 2013 - 02:54 AM

I'm having a hard time trying to come up with a way to determine the order to apply/process effects on an entity. The ideas I've come up with all have downsides to them, so I'm wondering if anyone can provide some insight on this problem.

First idea is to have a preset list of categories. Every effect/spell is assigned to a category and the categories are all processed in a certain order. There are two downsides to this: if I have an effect that needs to be applied between "category A" and "category B", there's no way to do that without introducing another category. The other issue is that if two effects are both in "category B" and I now have an effect that needs to happen between them, I'm out of luck.

Second idea is to assign a priority to each effect/spell as a float value. Effects/spells with a higher value are applied first. The problem with this one is that if I have "effect B" that needs to be applied after "effect A" and before "effect C", but A has a value of 5, and C has a value of 10, no value assigned to B will create that order.

And the last idea, which is downright awful, is to have each effect (optionally) define every effect that needs to be applied before (or after) it. Aside from the issue of creating circular dependencies, this would really get out of hand if you tried to make an effect that always has to be applied last. Either it would have to specify every single effect must occur before it, or every other effect would have to specify that that effect must come after it.

I feel like the 2nd idea is the most workable, but I want to be as accommodating to modding as possible, so people don't run into issues like "oh well, they set the priority of this spell too high, there's no way to have my effect run in the proper order". Basically all of these work in a closed system when you know every spell/effect that could be applied, but are problematic when mixed with mods. Thoughts?
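For what it's worth, the third idea is usually implemented as a topological sort over pairwise "X before Y" constraints, which also turns circular dependencies into a detectable error instead of silent misbehavior. A minimal Kahn's-algorithm sketch (C++; all identifiers invented for illustration):

```cpp
#include <map>
#include <queue>
#include <set>
#include <string>
#include <vector>

// Order effects from pairwise "first before second" constraints using
// Kahn's algorithm. Returns an empty list if the constraints contain a
// cycle, so impossible orders fail loudly.
std::vector<std::string> OrderEffects(
    const std::vector<std::string>& effects,
    const std::vector<std::pair<std::string, std::string>>& before) {
    std::map<std::string, std::set<std::string>> out;  // edges: X -> things after X
    std::map<std::string, int> indegree;               // constraints pointing at each effect
    for (const auto& e : effects) indegree[e] = 0;
    for (const auto& [first, second] : before)
        if (out[first].insert(second).second) ++indegree[second];

    // An ordered queue gives effects with no constraints between them a
    // stable (alphabetical) order instead of an arbitrary one.
    std::priority_queue<std::string, std::vector<std::string>,
                        std::greater<std::string>> ready;
    for (const auto& [e, d] : indegree)
        if (d == 0) ready.push(e);

    std::vector<std::string> order;
    while (!ready.empty()) {
        std::string e = ready.top();
        ready.pop();
        order.push_back(e);
        for (const auto& next : out[e])
            if (--indegree[next] == 0) ready.push(next);
    }
    if (order.size() != indegree.size()) order.clear();  // cycle detected
    return order;
}
```

This plays reasonably well with mods: each effect declares only the constraints it actually cares about, unconstrained effects can be inserted anywhere, and a mod never has to enumerate every other mod's effects.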

#5085296 Need some help with networking entity-component system

Posted by on 12 August 2013 - 03:48 PM

How would the receiving end know what to do with the exploding component? Does the exploding component's constructor examine the entity object it's being attached to, find the Projectile component and hook itself up to the Collided event? That seems like a bit too much interconnection. Does that mean someone has to make a "mid air explosion" component now if they want to have the arrow explode on a different event?

The system doesn't have the ability to serialize a whole entity across the network, only the update state, because of complications like this. I wasn't sure how to deal with serializing these one-time setup things (such as hooking up events), so it was designed to send only the needed info, like position, rotation, owner and entity ID, then call most of the same code the server just ran using that info. Clearly this has a bit of a flexibility problem though, as I'm now realizing.

#5081019 GPU particles

Posted by on 27 July 2013 - 03:10 PM

I may give that a try but I'm not so sure it'll help. I've been using the effects system for all my shaders so far with no problem. When I get a chance I'll set up the particle system in a separate project and switch between DX11 and 11.1 and see if that makes any difference. Thanks for trying though unbird, I appreciate it.