
C0lumbo

Member Since 02 Nov 2012

#5198081 Rendering an atmosphere

Posted by C0lumbo on 14 December 2014 - 01:32 AM

My approach for rendering the atmosphere is to take the mesh (in my case, it's a subdivided cube warped into a sphere shape), calculate the silhouette by comparing adjacent faces and finding edges where one face points toward the camera and the other points away, then extrude out geometry for the atmosphere.

 

[Image: Clipboard11.png]
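As a rough illustration of the silhouette test (a minimal sketch with made-up types and names, not the actual game code):

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

struct Edge
{
    int v0, v1;        // endpoint vertex indices
    int faceA, faceB;  // the two faces sharing this edge
};

// An edge lies on the silhouette when exactly one of its two faces
// points toward the camera; those edges get extruded outward.
bool IsSilhouetteEdge(const Edge& e,
                      const Vec3* faceNormals,
                      const Vec3* faceCentres,
                      const Vec3& cameraPos)
{
    Vec3 toCamA = { cameraPos.x - faceCentres[e.faceA].x,
                    cameraPos.y - faceCentres[e.faceA].y,
                    cameraPos.z - faceCentres[e.faceA].z };
    Vec3 toCamB = { cameraPos.x - faceCentres[e.faceB].x,
                    cameraPos.y - faceCentres[e.faceB].y,
                    cameraPos.z - faceCentres[e.faceB].z };
    bool aFront = Dot(faceNormals[e.faceA], toCamA) > 0.0f;
    bool bFront = Dot(faceNormals[e.faceB], toCamB) > 0.0f;
    return aFront != bFront;
}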

 

It's a lot of work to be doing on the CPU, but it means I'm filling a lot fewer pixels when I draw the atmosphere, and the technique also works with the oddly shaped planets I have in my game:

 

[Image: Clipboard04.png]

 

 

I think phil_t's approach is more elegant though.




#5197226 Planetary Annihilation networking

Posted by C0lumbo on 09 December 2014 - 12:53 PM

> 1. They mention the server ticks at 10fps. Do they mean that network updates are sent 10 times per second, or do they mean the server simulation loop runs at 10fps (and the network update is coupled with it)?
 
The latter: the server's simulation loop (the server's frame rate) runs at 10Hz and network updates are coupled with it. However, they go to some effort to make sure they don't send updates for any entities that haven't changed.
 
> 2. I think the concept of "curves" is pretty cool... when applied to positions for example, they only send *new* keyframes and avoid sending redundant data. However, after reading I wasn't sure if they generate new keyframes at 10fps too, or if they have a different rate for generating those. I know special events like collisions can also generate keyframes, but I didn't understand if keyframe creation was generally done on each iteration of the game loop.
 
They create new keyframes at 10fps, sometimes more than one per tick (as in the bouncing ball example), but for many (probably most) objects there won't be a new keyframe to generate. An object standing still sends no keyframes at all.
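To make that concrete, here's a minimal sketch of what a position "curve" might look like; the names and layout are my guesses, not PA's actual code:

#include <vector>

struct Keyframe { float time; float position; };

struct Curve
{
    std::vector<Keyframe> keys;

    // Called at most once per 10Hz server tick (more for events like bounces).
    // A value that hasn't changed adds no keyframe, so nothing gets sent.
    void AddKeyIfChanged(float time, float position)
    {
        if (keys.empty() || keys.back().position != position)
            keys.push_back({time, position});
    }
};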
 
> 3. How does a client smoothly go from one position to the next? The server generates keyframes, but by the time the client receives them, they're old. Let's say the last packet on the client was generated by the server at t = 0.5 and received at t = 0.6. When the client's game loop runs, t = 0.67. How does the client compute the position at 0.67? Is it just extrapolating using the line equation (where the "true" line segment is from t = 0.4 to t = 0.5)?
 
The situation where t=0.67 when the latest update from the server is t=0.5 shouldn't happen in the general case. The client will try to keep its local time behind the server by the average latency plus a bit extra, so most of the time you're playing you will be interpolating toward the latest data received from the server. Occasionally, with a network hiccup, you might accidentally catch up and overtake; I don't know what they do in that case. Probably they allow a little bit of running ahead with extrapolation, but if you end up too far ahead, the client's clock probably slows and then stops so you're not seeing too much made-up data.
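A sketch of the client side, under the same made-up names (keys must be non-empty and sorted by time): render a little behind the newest received keyframe and lerp between the two keyframes that bracket the render time.

#include <vector>

struct Keyframe { float time; float position; };

float RenderTime(float newestKeyframeTime)
{
    const float extraBuffer = 0.05f; // the "bit extra" behind the newest data
    return newestKeyframeTime - extraBuffer;
}

float Sample(const std::vector<Keyframe>& keys, float t)
{
    if (t <= keys.front().time) return keys.front().position;
    if (t >= keys.back().time)  return keys.back().position; // or extrapolate briefly
    for (size_t i = 1; i < keys.size(); ++i)
    {
        if (keys[i].time >= t)
        {
            float u = (t - keys[i - 1].time) / (keys[i].time - keys[i - 1].time);
            return keys[i - 1].position + u * (keys[i].position - keys[i - 1].position);
        }
    }
    return keys.back().position;
}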
 
This also answers questions 4, 5, and 6, I think.
 
@Conk - From the article, it sounds like a pretty different approach from Supreme Commander's. It isn't lockstepped at all; it's completely client-server in its approach.
 
PS - Good article, thanks for sharing it, gfxgangsta.



#5196833 How is a game engine made?

Posted by C0lumbo on 07 December 2014 - 02:12 PM

If you want a good introduction to the topic, you could do worse than read Game Engine Architecture by Jason Gregory.




#5195449 Some link to list of game sales?

Posted by C0lumbo on 30 November 2014 - 12:29 AM

There's this from the TIG forums (sticky post in their business forum): http://forums.tigsource.com/index.php?topic=35689.0




#5195259 Particle Systems & Handling Transparency

Posted by C0lumbo on 28 November 2014 - 04:20 PM

 

 

> > (it is possible to draw additive and alpha blended particles in a single draw call).
>
> How do you do that? (I'm an OpenGL beginner)
>
> I suppose with the blend function: Dest * (1 - SrcAlpha) + Src, aka premultiplied alpha.
>
> If SrcAlpha is 0, it simply reduces to additive blending. If SrcAlpha is between 0 and 1 you can achieve alpha blending, but you have to premultiply your textures by the alpha value first.
>
> This also allows you to seamlessly blend from explosion to smoke and other cool effects.

 

 

Yep, that's it.
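For reference, the corresponding GL state is just this (a fragment assuming a current GL context and textures premultiplied offline):

// Result = Src + Dest * (1 - SrcAlpha)
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
// SrcAlpha == 0 -> pure additive; SrcAlpha > 0 -> alpha blend,
// provided each texture's RGB was multiplied by its alpha beforehand.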




#5195115 Particle Systems & Handling Transparency

Posted by C0lumbo on 28 November 2014 - 01:30 AM

Once you start looking for it, you'll see incorrectly sorted particles in a lot of very high quality AAA games; it's surprising how much incorrectness you can get away with.

 

One solid solution which I favour is to have all your particle textures on a single texture sheet. It's a bit limiting in terms of texture space (less so if your target platforms support texture arrays), but it means you can have a single correct sort for all your particles and render the lot with a single draw call (it is possible to draw additive and alpha blended particles in a single draw call). Particles can be perfectly sorted with one another, but aren't sorted with other transparent objects in the scene.

 

The more common solution is to just make do with a coarse sort. Rather than managing your particles purely by their type, you separate them according to instance, sort the instances, and render them according to their depth. So the particles from explosion #1 are managed separately from the particles from explosion #2. Arranging things this way might make more sense when you want to manipulate the particles, e.g. immediately destroy particles from explosion #1, or attach explosion #2 to a moving object. Plus, by managing by instance this way, it becomes feasible to frustum cull particle effects.
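A minimal sketch of that coarse sort (types and names are illustrative):

#include <algorithm>
#include <vector>

struct ParticleEffectInstance
{
    float viewDepth; // distance from the camera, updated each frame
    // ... particles, emitter state, etc.
};

// Sort whole effect instances back to front, then draw each as a group.
void SortEffectsBackToFront(std::vector<ParticleEffectInstance*>& effects)
{
    std::sort(effects.begin(), effects.end(),
              [](const ParticleEffectInstance* a, const ParticleEffectInstance* b)
              {
                  return a->viewDepth > b->viewDepth; // farthest first
              });
}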

 

PS: Your summary of the rest of the pipeline seems pretty good. I tend to make a distinction between 3 types of transparency: decals (i.e. transparent stuff that's rendered just over opaque stuff, like blob shadows and bullet holes), 1-bit (e.g. foliage that uses alpha discard but no translucency), and 8-bit (e.g. particles). Then you render in the order opaque -> decals -> 1-bit -> 8-bit. Only the 8-bit stuff needs to be sorted by depth.
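As a render-loop skeleton, that ordering looks something like this (function names are illustrative):

void RenderScene()
{
    DrawOpaque(); // z-write on, front to back if you like
    DrawDecals(); // blended, sits just over the opaque surfaces
    Draw1Bit();   // alpha discard, no translucency, no sort needed
    Draw8Bit();   // the only pass that needs a back-to-front depth sort
}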




#5194727 Projectile logic

Posted by C0lumbo on 26 November 2014 - 12:39 AM

I think googling "projectile motion equations" is going to direct you more toward equations that answer 'where will it land?' or 'what launch angle should I use?' kinds of questions; they'll be useful for AI aiming, no doubt.

 

If you want the sort of projectile movement you see in DDTank (which, from a cursory googling, looks like a Worms clone to me), then you're probably more interested in frame-by-frame movement, which is even simpler.

 

You're going to have 3 vectors:

 

Acceleration (gravity)

Velocity

Position

 

Each update you need to know the duration of the frame/tick (dt = delta time).

 

void UpdateProjectile(float dt)
{
   Position += Velocity * dt;      // move using the current velocity
   Velocity += Acceleration * dt;  // then apply gravity to the velocity
   if (CollisionTest())
   {
      Kaboom();
   }
}

 

Updating the positions/velocities of physics objects is known as integration, and that function is the simplest, most intuitive approach (explicit Euler); it should work fine for your needs. There are better approaches which reduce calculation error slightly (Verlet integration, for example), but that's probably not necessary at this point.
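For later reference, a minimal sketch of the velocity Verlet variant; with a constant acceleration like gravity it amounts to adding the 0.5*a*dt^2 term:

void UpdateProjectileVerlet(float dt)
{
   // position advances using the average of the old and new velocity
   Position += Velocity * dt + Acceleration * (0.5f * dt * dt);
   Velocity += Acceleration * dt;
}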




#5194686 how to get the same repeating numbers for any of my imput number?

Posted by C0lumbo on 25 November 2014 - 04:37 PM

You want the modulo operator, which essentially gives you the remainder of a division (although there are some subtle differences for negative numbers).

 

It might look like 'mod' or '%' depending on the language.

 

For your specific case, try: ((n - 1) % 4) + 1
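A quick sanity check of the formula (prints 1 2 3 4 1 2 3 4):

#include <cstdio>

int main()
{
    for (int n = 1; n <= 8; ++n)
        std::printf("%d ", ((n - 1) % 4) + 1);
    return 0;
}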




#5194554 Blurring a specific object in the scene

Posted by C0lumbo on 25 November 2014 - 12:43 AM

There's no reason your plan couldn't work, but there are any number of mistakes that could have occurred along the way to make it not work. Maybe check your blend modes when you're rendering the light?

 

Would it not be simpler to make the glow texture offline in Photoshop/GIMP and slap that on a sprite instead, though?




#5194367 Reverse normal in ray-plane intersection

Posted by C0lumbo on 24 November 2014 - 02:21 AM

If the dot product between the ray direction and the plane normal is greater than zero, it indicates that if you're going to collide with the plane at all, you'll be hitting the back of it.
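In code, the test looks something like this (a sketch with minimal types; the plane is (normal, d) with points satisfying Dot(normal, p) + d = 0):

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b)
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Returns true on a hit; backFace reports whether we struck the reverse side.
bool RayPlane(const Vec3& origin, const Vec3& dir,
              const Vec3& normal, float d, float& t, bool& backFace)
{
    float denom = Dot(dir, normal);
    if (denom == 0.0f)
        return false;            // ray is parallel to the plane
    backFace = denom > 0.0f;     // the case discussed above
    t = -(Dot(origin, normal) + d) / denom;
    return t >= 0.0f;            // hit only if the plane is in front of us
}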

 

Exactly what behaviour you want when a ray hits a back face is up to you. By reversing it, you're essentially making your plane double-sided, which might be desirable; it probably depends on your use case. You might want to be able to see through the back faces, or to ignore them because you have another plane at the same location but pointing the other way with different material properties.

 

Taking steps like precalculating dot product results because your ray's direction is always (0, 0, 1) sounds like it might be a premature optimization at this stage. I would get things working before putting in special optimizations that take advantage of your ortho projection.




#5194085 Properly handling mixed languages at run-time

Posted by C0lumbo on 21 November 2014 - 11:53 PM

To be honest, I'd just replace any unsupported characters with something like an underscore _ or the white rectangle glyph ▯ and be done with it. I don't think it would adversely affect a significant number of users, and if you get any support requests raised, you can just tell them that they ought to set their in-game language to match their locale setting.




#5193830 Soft Particles and Alpha Blending

Posted by C0lumbo on 20 November 2014 - 11:34 AM

Not much info to go on here.

 

Soft particles compare against the scene depth value, which controls a fade-to-transparent as they get close to intersecting geometry.
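The fade itself boils down to something like this (a sketch; fadeRange is an artist-tuned distance and the names are illustrative):

#include <algorithm>

float SoftParticleFade(float sceneDepth, float particleDepth, float fadeRange)
{
    float fade = (sceneDepth - particleDepth) / fadeRange;
    return std::clamp(fade, 0.0f, 1.0f); // 0 at the intersection, 1 in the clear
}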

 

If your transparent meshes were to write to the depth buffer that the soft particles are comparing against, then it'd probably work. Note that the depth buffer the transparent meshes need to write to may or may not be your main scene z-buffer. It may be part of a G-buffer or something.




#5193656 Compression questions

Posted by C0lumbo on 19 November 2014 - 01:29 PM

This is a nice article about compressing normals (in a g-buffer): http://aras-p.info/texts/CompactNormalStorage.html

 

In general, the trend on modern hardware seems to be that math gets cheaper and cheaper while bandwidth gets (relatively) more expensive. So compression at the cost of a few ops is often worthwhile. It does depend on the specific hardware and use case though (and I'm no expert).

 

I think half floats would struggle to cover 1000m at 0.1m intervals. A half float is only 16 bits, so it has at most 65536 distinct values, and most of its precision is concentrated close to zero: with a 10-bit mantissa, the spacing between representable values is already 0.5m by the time you reach 1000m, so it's probably not appropriate for position data. Half floats are probably fine for direction and colour though.
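If the range really is a known 0..1000m, plain 16-bit fixed point would do it with a uniform step of about 0.015m (an illustrative sketch, not a recommendation for every case):

#include <cstdint>

uint16_t EncodePosition(float metres) // assumes 0 <= metres <= 1000
{
    return (uint16_t)(metres * (65535.0f / 1000.0f) + 0.5f);
}

float DecodePosition(uint16_t q)
{
    return q * (1000.0f / 65535.0f); // step is ~0.01526m everywhere
}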




#5189217 Spherified cube

Posted by C0lumbo on 26 October 2014 - 07:04 AM

> Hi!
>
> Just a quick question regarding the process of converting a cube to a sphere, like this:
>
> [Image: planets-1-cubemap.png]
>
> ...am I right in guessing that this could be done in a vertex shader by taking each vertex, finding a vector from the vertex to the centre of the cube and moving the vertex along that vector until its length matches the desired sphere radius? Or is it more complicated than that?
>
> Thanks a lot :)

 

That'll work pretty well.
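In code, that naive projection is just (a sketch, assuming the cube is centred on the origin):

#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 Spherify(const Vec3& v, float radius)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    float s = radius / len; // push the vertex out to the sphere surface
    return { v.x * s, v.y * s, v.z * s };
}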

 

If you want to reduce the distortion slightly (i.e. reduce the difference in size between the largest quads and the smallest quads), then there is another more complicated method you might want to look at (I use it in Rapture): http://mathproofs.blogspot.co.uk/2005/07/mapping-cube-to-sphere.html
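For reference, the mapping from that page as I remember it (input is a point on the unit cube, components in [-1, 1]; double-check against the article before relying on it):

#include <cmath>

struct Vec3 { float x, y, z; };

Vec3 CubeToSphere(const Vec3& p)
{
    float x2 = p.x * p.x, y2 = p.y * p.y, z2 = p.z * p.z;
    return {
        p.x * std::sqrt(1.0f - y2 * 0.5f - z2 * 0.5f + y2 * z2 / 3.0f),
        p.y * std::sqrt(1.0f - z2 * 0.5f - x2 * 0.5f + z2 * x2 / 3.0f),
        p.z * std::sqrt(1.0f - x2 * 0.5f - y2 * 0.5f + x2 * y2 / 3.0f)
    };
}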

 

I would only do the more complex approach if the distortion is a problem though.




#5188169 C++ header inclusion question

Posted by C0lumbo on 20 October 2014 - 11:46 AM

My guess is that you've accidentally used the same name for the include guards in both header1.h and header2.h.
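To illustrate the suspected bug: if both headers share one guard name, whichever is included second gets silently skipped (COMMON_H is a made-up example name):

// header1.h
#ifndef COMMON_H
#define COMMON_H
struct Foo { int x; };
#endif

// header2.h
#ifndef COMMON_H   // already defined by header1.h, so this file's contents vanish
#define COMMON_H
struct Bar { int y; };
#endif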

 

(As an aside, you should always use forward declarations when you can get away with it, not just as a last resort.)





