
gfxgangsta

Member Since 06 Dec 2011
Offline Last Active Dec 19 2014 10:41 AM

#5197206 Planetary Annihilation networking

Posted by gfxgangsta on 09 December 2014 - 11:01 AM

I recently came across Forrest Smith's excellent blog post:
 
 
The post covers several of the networking concepts, algorithms, and data formats used in the game, but some aspects weren't immediately clear to me. I was wondering if someone could provide some additional insight.
 
1. They mention the server ticks at 10fps. Do they mean that network updates are sent 10 times per second, or do they mean the server simulation loop runs at 10fps (and the network update is coupled with it)?
 
2. I think the concept of "curves" is pretty cool... when applied to positions for example, they only send *new* keyframes and avoid sending redundant data. However, after reading I wasn't sure if they generate new keyframes at 10fps too, or if they have a different rate for generating those. I know special events like collisions can also generate keyframes, but I didn't understand if keyframe creation was generally done on each iteration of the game loop.
 
3. How does a client smoothly go from one position to the next? The server generates keyframes, but by the time the client receives them, they're old. Let's say the last packet on the client was generated by the server at t = 0.5 and received at t = 0.6. When the client's game loop runs, t = 0.67. How does the client compute the position at 0.67? Is it just extrapolating using the line equation (where the "true" line segment is from t = 0.4 to t = 0.5)?
 
4. What if a unit stops? Based on the client's keyframes, the unit would continue to move. I'm thinking they can send at most one redundant keyframe (so the curve is now a line with two identical y-values), or they could send a "stop" pulse curve (one of those immediate events they talk about). Thoughts?
 
5. Speaking of pulse curves (instantaneous events), it seems like the client would just receive N of these during a server update, and process all of them on the next tick (because they're instantaneous on the server, but they're already old when they're received on the client). Or, is the server somehow generating these in advance? For example, when a unit will fire, the server knows that the unit fires two shots, so it can send both shot events right away. But the first shot would still be old, wouldn't it? And, depending on lag & the time between shots, the second one might be old too.
 
6. Based on the previous "shooting" example, how would health be in sync with the attack? If health is a separate curve, the health packets and the shot packets might reach the client at different times (or, if the client plays the shot animation but has already received the health keyframe)... this means health number and visual attack won't be in sync. Would they wait and send the health keyframe along with the attack keyframe? Or, would they send both in advance, and the client is responsible for "playing" them at the same time?
 
The original post is great, I'm just fairly new to networking (but fascinated by it) and would appreciate any help in making more sense out of this.



#5057142 Game Jams, when can I participate?

Posted by gfxgangsta on 26 April 2013 - 09:07 PM

Overall, I would say there is no level of knowledge *required*. You might even learn something new (e.g., I got started with Unity at a jam). But I think it's important to communicate your experience level to potential teammates. Usually there's a bit of mingling during the first 30-60 minutes of a jam, and you can use that time to find a team you fit into.




#5047683 Article Monthly Themes

Posted by gfxgangsta on 28 March 2013 - 10:29 AM

I really like the idea of a monthly theme.

 

+1 on ByteTroll's request for topics like advanced rendering techniques... sometimes these are covered in different forum posts, but it would be great to see articles.

 

Other theme ideas:

 

-experimental gameplay

-native code for mobile

-a specific OS

-build systems (assets/data/code)

-use of scripting languages

-script language compiler implementation

-scripting integration with game engines

-making a game in a tool not meant for games (like Excel or Powerpoint)

-monetization

-"three-chord song" (many musicians write songs that just use different combinations of the same three chords.. perhaps three simple mechanics could be mixed and matched to make a great game that seems complex, but is really made up of just those three basic elements)

-animation (UV, vertex, skeletal, etc.)

-texturing (programmers could chime in with procedural texture generation or articles about packing textures into an atlas)

-specific game genres (racing, shooters, etc)

-postmortem month (what went right/wrong/etc during someone's project)

-hardware (something you tried with Sifteo Cubes or Oculus Rift)

-motion control (gameplay opportunities, filtering motion input, etc)




#5006151 Feasibility of writing android apps purely through the NDK

Posted by gfxgangsta on 01 December 2012 - 06:48 PM

Do you think it is possible to write an entire android app with either C or C++ (with the exception of the lines required to initialise your native code)?


I'm not sure how mature the NDK is at this point (how many features it supports), but you can already implement your Activity class, access sensors and do rendering in native code.

Here's a blog post that shows how to implement a basic native Activity: http://www.altdevblogaday.com/2012/02/28/running-native-code-on-android-part-2/


What problems do you think someone would come across?


Unsupported features, losing access to a useful set of Java UI elements, the possibility of getting into trouble with native memory management, etc.

Are there any apps that somewhat bizarrely use this approach?


I bet there are a few utility/enterprise apps that use the NDK, but I've only seen games use it.

Is there really any speed benefit? (I would normally expect there to be but apparently there are some weird quirks with how "native" code runs on android)


I would love to measure this at some point. Some algorithms probably benefit more than others. I think there is a speed increase, but it may not be huge. A big reason why people use the NDK is ease of porting (not having to translate to Java from C/C++).

Wouldn't you lose support for x86 and MIPS handsets?


You would have to make separate builds for those architectures. The NDK documentation (http://developer.android.com/tools/sdk/ndk/index.html) mentions x86 and mips are supported (but for API level 9 and greater):

"Removed arch-x86 and arch-mips headers from platforms/android-[3,4,5,8]. Those headers were incomplete, since both X86 and MIPS ABIs are only supported at API 9 or higher."


Also, from http://stackoverflow.com/questions/5089783/producing-optimised-ndk-code-for-multiple-architectures :
It looks like if you specify "APP_ABI := all", it builds for all architectures (NDK7 or greater).
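If I'm reading that right, the minimal setup would be something like this (a hypothetical Application.mk, just illustrating the APP_ABI line; check the NDK docs for the exact ABI list your NDK version supports):

```make
# Application.mk -- sits next to Android.mk in the jni/ directory
APP_ABI := all    # NDK r7+: build the module for every supported ABI
```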


#4994208 Android, C++, C4droid, and makefiles

Posted by gfxgangsta on 26 October 2012 - 11:21 AM

This is an Android.mk makefile taken from the NATIVE_ACTIVITY.HTML help doc:

8/ Create an Android.mk file in the jni/ directory of your project to describe your native module
to the build system. An Android.mk file is essentially a snippet of a GNU Make file. For
example:

LOCAL_PATH := $(call my-dir)
include $(CLEAR_VARS)
LOCAL_MODULE := my_native_module
LOCAL_SRC_FILES := my_native_code.c
include $(BUILD_SHARED_LIBRARY)


To add more .c or .cpp files, you can add them to the LOCAL_SRC_FILES section like so:

LOCAL_SRC_FILES := my_native_code.c my_native_code_2.c my_native_code_3.c my_native_code_N.c


OR like so:

LOCAL_SRC_FILES := my_native_code.c \
                   my_native_code_2.c \
                   my_native_code_3.c \
                   my_native_code_N.c




There's also the "wildcard" keyword so you can avoid listing all the files manually:

http://stackoverflow.com/questions/8350878/using-makes-wildcard-function-in-android-mk
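Something like this (untested, adapted from that Stack Overflow answer):

```make
# Pick up every .c file in the jni/ directory automatically.
# The subst strips the LOCAL_PATH prefix, because the NDK build
# system expects source paths relative to LOCAL_PATH.
LOCAL_SRC_FILES := $(subst $(LOCAL_PATH)/,, $(wildcard $(LOCAL_PATH)/*.c))
```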


Hope this helps!


#4932487 Help with d3d->CreateDevice

Posted by gfxgangsta on 18 April 2012 - 08:46 AM

Your problem is this line:

if (d3d = NULL)

...

it assigns NULL to your d3d pointer instead of comparing... what you want is:

if (d3d == NULL)

...


#4915985 Deferred Rendering

Posted by gfxgangsta on 23 February 2012 - 03:47 PM

Can you post the code for directionalLightEffect and combineFinalEffect?

And as for "but I believe things which are within the 0.0 to 1.0 range don't have to be normalized, since that's basically what normalizing does"...


Normalizing causes a vector to become unit length. Even if the individual XYZ components are in the 0.0 to 1.0 range, that doesn't mean the vector's length == 1.


#4915667 Deferred Rendering

Posted by gfxgangsta on 22 February 2012 - 03:28 PM

I assume the QuadRenderer works correctly. Shouldn't the light direction vector be normalized?


#4906455 Water Reflection and Opacity

Posted by gfxgangsta on 26 January 2012 - 10:09 AM

I found the following resource particularly useful, though you would have to convert it to OpenGL and use shaders:

http://www.riemers.net/eng/Tutorials/XNA/Csharp/Series4/The_water_technique.php

The explanation is pretty good though :)

