

sheep19

Member Since 20 Jul 2007
Offline Last Active Mar 25 2015 01:39 PM

Topics I've Started

Sending 'command duration' from client to server

25 March 2015 - 12:44 PM

I am making a client-server multiplayer game, where the server is authoritative.

 

I am currently at the point where I must implement client-side interpolation.

 

The client sends Input packets to the server. Each has a unique, increasing ID.

The server responds with WorldState packets. Each carries the ID of the latest input packet the server has processed.
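Roughly, the two packet types would look like this (the field names here are only illustrative, not the exact code):

#include <cstdint>

// Illustrative packet layouts (not the exact code).
struct InputPacket {
    uint32_t id;        // unique, increasing per input sent by the client
    bool     button_a;  // true while the button is held, false on release
    // uint32_t duration_ms;  // per-command duration, not implemented yet
};

struct WorldStatePacket {
    uint32_t last_input_id;  // ID of the latest input packet the server processed
    float    x, y;           // authoritative player state (simplified)
};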

 

 

Now when the client receives a state update, it needs to interpolate, starting from the latest received state, and apply the inputs that have not yet been acknowledged by the server.

According to this, each input command also contains its duration in milliseconds (I haven't implemented this yet).
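Something along these lines is what I have in mind (a rough sketch with made-up names, assuming the unacknowledged inputs are kept in a queue and each carries its duration):

#include <cstdint>
#include <deque>

struct PendingInput {
    uint32_t id;
    bool     button_a;
    float    duration_ms;  // how long this command was applied locally
};

struct PlayerState { float x = 0, y = 0; };

// Placeholder local simulation step: apply one command to a state.
void ApplyInput(PlayerState& s, const PendingInput& in) {
    if (in.button_a)
        s.x += 0.01f * in.duration_ms;  // e.g. move right while A is held
}

// When a WorldState arrives: drop the inputs the server has acknowledged,
// then re-apply everything it has not seen yet on top of the server state.
PlayerState Reconcile(PlayerState serverState, uint32_t lastAckedId,
                      std::deque<PendingInput>& pending) {
    while (!pending.empty() && pending.front().id <= lastAckedId)
        pending.pop_front();

    PlayerState predicted = serverState;
    for (const PendingInput& in : pending)
        ApplyInput(predicted, in);
    return predicted;
}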

 

The client can have a button pressed for some duration.

So when button A is pressed, an input packet is sent with the button_a flag set to true.

When the button is released, another input packet is sent with the button_a flag set to false.

 

My question is:

Is the way I handle inputs correct? The way I am doing it, if one of the two packets takes longer to arrive than the other, the server will apply the input for more or less time than the actual duration of the command.

Also, how can I measure command duration this way? I need it for client-side interpolation. As it stands, a duration only makes sense for the release, not for the button press.

 

What I could do is send a packet only when the button is released, not when it is pressed. That way I can measure the milliseconds for which it was held and send it as a single command. The downside is that if the client presses a button and does not release it for a long time, the server receives the input with a correspondingly large delay.
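In code, roughly this (only a sketch; std::chrono stands in for whatever clock the game actually uses, and SendInputPacket is a made-up placeholder):

#include <chrono>
#include <cstdint>

using Clock = std::chrono::steady_clock;

static Clock::time_point g_pressStart;

// Button A goes down: just remember when it happened (nothing is sent yet).
void OnButtonAPressed() {
    g_pressStart = Clock::now();
}

// Button A goes up: measure how long it was held and send a single
// command with that duration.
void OnButtonAReleased() {
    auto held = std::chrono::duration_cast<std::chrono::milliseconds>(
                    Clock::now() - g_pressStart).count();
    uint32_t duration_ms = static_cast<uint32_t>(held);
    // SendInputPacket(nextInputId++, /*button_a=*/true, duration_ms);
    (void)duration_ms;
}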

 

Or, instead of sending the command duration, maybe I could send the time (in milliseconds) elapsed since the game started? That way it should be easy to calculate the command duration, if I am not mistaken.
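That is, each input packet would carry the elapsed game time, and the server would just subtract the press and release timestamps (again only a sketch with made-up names):

#include <cstdint>

// Each input packet carries the client's elapsed game time in milliseconds.
struct TimestampedInput {
    uint32_t id;
    bool     button_a;      // true on press, false on release
    uint32_t game_time_ms;  // milliseconds since the game started
};

// On the server, once both packets have arrived, the command duration is
// just the difference between the release and press timestamps.
uint32_t CommandDuration(const TimestampedInput& press,
                         const TimestampedInput& release) {
    return release.game_time_ms - press.game_time_ms;
}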

 

So, what is a good way to handle inputs that are held for a period of time?


[SDLNet] Server does not receive incoming connection from client

25 October 2014 - 02:40 PM

I am making a simple Client-Server application.

For now, I am trying to create a server and a client, and make the client connect to the server. When this is done, the server should print a message.
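The structure of both programs follows the usual SDL_net TCP pattern; the actual code is in the pastebins below, but the server side is roughly this (port 2000 is just an example, and the client does the mirror image: SDLNet_ResolveHost with the server's address and the same port, then SDLNet_TCP_Open):

// Server side: listen on a port and wait for one incoming connection.
#include <SDL.h>
#include <SDL_net.h>
#include <cstdio>

int main(int argc, char* argv[]) {
    (void)argc; (void)argv;

    if (SDL_Init(0) < 0 || SDLNet_Init() < 0) {
        std::printf("Init error: %s\n", SDLNet_GetError());
        return 1;
    }

    IPaddress ip;
    // NULL host means "listen on all local interfaces"; 2000 is an example port.
    if (SDLNet_ResolveHost(&ip, NULL, 2000) < 0) {
        std::printf("SDLNet_ResolveHost: %s\n", SDLNet_GetError());
        return 1;
    }

    TCPsocket server = SDLNet_TCP_Open(&ip);
    if (!server) {
        std::printf("SDLNet_TCP_Open: %s\n", SDLNet_GetError());
        return 1;
    }

    // SDLNet_TCP_Accept is non-blocking: poll until a client shows up.
    TCPsocket client = NULL;
    while (!client) {
        client = SDLNet_TCP_Accept(server);
        SDL_Delay(100);
    }
    std::printf("Client connected!\n");

    SDLNet_TCP_Close(client);
    SDLNet_TCP_Close(server);
    SDLNet_Quit();
    SDL_Quit();
    return 0;
}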

 

-- Server --

http://pastebin.com/xFT3Eau4

 

-- Client --

http://pastebin.com/24Trvf3E

 

First I run the server.

Then I run the client; it prints 0.0.0.0, but the server does not seem to receive the connection request from the client. No errors are printed, however (really strange...).

 

If I change the IP that the client connects to to the local IP of the PC the server is running on (192.168.x.y), or to localhost, I get "Couldn't connect to remote host" from the client... But why? Shouldn't these work the same way as 0.0.0.0?

 

What troubles me is that with 0.0.0.0 the client seems to connect to the server (no errors on the client side), but the server never receives the request. Maybe it's blocked by a firewall, or am I doing something really wrong?


Having a phone interview with a game company

01 September 2014 - 02:15 PM

Hello,

 

I have a technical phone interview coming up (about 30 minutes).

 

To prepare for it, I am going to:

  • Refresh my C++ knowledge (virtual functions, templates, operator overloading, etc.)
  • Revisit some of my old game projects, so I will be able to answer questions about projects I have made
  • Study some algorithms (but what kind of algorithms?) and 3D math (dot/cross product; a quick refresher is below)
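For the 3D math part I mainly mean the basic formulas, e.g. something like this as a quick refresher:

#include <cstdio>

struct Vec3 { float x, y, z; };

// Dot product: |a||b|cos(theta). Positive = roughly the same direction,
// zero = perpendicular, negative = opposite directions.
float Dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Cross product: a vector perpendicular to both a and b, with length
// |a||b|sin(theta); useful for surface normals and left/right tests.
Vec3 Cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

int main() {
    Vec3 x{1, 0, 0}, y{0, 1, 0};
    std::printf("dot = %.1f\n", Dot(x, y));                 // 0.0 (perpendicular)
    Vec3 z = Cross(x, y);                                   // (0, 0, 1)
    std::printf("cross = (%.1f, %.1f, %.1f)\n", z.x, z.y, z.z);
    return 0;
}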

 

The job title is "Junior Programmer".

 

What kinds of questions should I expect? It's a technical interview so I hope there won't be any stupid questions.

What else would you recommend to study for the interview?

 

Thank you,

Sheep19


[Flocking] How to improve performance?

12 April 2014 - 04:18 PM

I'm doing a simulation of flying birds for my thesis.

 

I am using the usual steering behaviors: Cohesion, Separation, Velocity Match and Obstacle Avoidance (to avoid terrain).

The program is written in C# and uses Unity3D. I have disabled physics for the boids.

 

I'm also using alglib to calculate approximate K-nearest neighbors (AKNN); alglib uses k-d trees for this.
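The per-frame work is basically the following (written as a plain C++ sketch with a brute-force neighbor search standing in for alglib's k-d tree query; the real program is C#/Unity and the names here are made up):

#include <algorithm>
#include <vector>

struct Vec3 { float x, y, z; };

Vec3 operator+(Vec3 a, Vec3 b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec3 operator*(Vec3 a, float s) { return {a.x * s, a.y * s, a.z * s}; }

struct Boid { Vec3 position; Vec3 velocity; };

// Stand-in for the k-nearest-neighbor query (done with alglib's k-d tree in
// the real program); brute force here only to keep the sketch self-contained.
std::vector<int> KNearest(const std::vector<Boid>& boids, int self, int k) {
    std::vector<int> idx;
    for (int j = 0; j < (int)boids.size(); ++j)
        if (j != self) idx.push_back(j);
    auto dist2 = [&](int j) {
        Vec3 d = boids[j].position - boids[self].position;
        return d.x * d.x + d.y * d.y + d.z * d.z;
    };
    std::sort(idx.begin(), idx.end(),
              [&](int a, int b) { return dist2(a) < dist2(b); });
    if ((int)idx.size() > k) idx.resize(k);
    return idx;
}

// One simulation step: for every boid, query its k neighbors and accumulate
// the steering behaviors over that neighborhood.
void UpdateFlock(std::vector<Boid>& boids, int k, float dt) {
    for (int i = 0; i < (int)boids.size(); ++i) {
        std::vector<int> nbrs = KNearest(boids, i, k);  // one query per boid per frame

        Vec3 cohesion{0, 0, 0}, separation{0, 0, 0}, velocityMatch{0, 0, 0};
        for (int j : nbrs) {
            cohesion      = cohesion + (boids[j].position - boids[i].position);
            separation    = separation - (boids[j].position - boids[i].position);
            velocityMatch = velocityMatch + (boids[j].velocity - boids[i].velocity);
        }
        if (nbrs.empty()) continue;
        float inv = 1.0f / (float)nbrs.size();

        // Weights for the individual behaviors are tuned elsewhere; equal here.
        Vec3 steer = (cohesion + separation + velocityMatch) * inv;
        boids[i].velocity = boids[i].velocity + steer * dt;
        boids[i].position = boids[i].position + boids[i].velocity * dt;
    }
}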

 

So here are the fps I get:

100 boids = 60 fps

200 boids = 35 fps

300 boids = 23 fps

500 boids = 13 fps

1000 boids = 5 fps

 

As the number of boids increases, performance degrades rapidly. I'd like to do something to improve it, but sadly I am using the free version of Unity3D, which does not have a profiler.

 

Is the performance I'm getting good? I haven't found results online for this.

Should I find another library instead of alglib?

 

By the way, you can see the program online here. It requires the Unity3D plugin.


[Flocking] Avoid many "sub-flocks" in Cohesion

05 April 2014 - 08:20 AM

Hi, I am developing a simulation of flocking for my thesis.

 

I am using Unity3D and C#.

 

For fast calculation of K-nearest neighbors I use the alglib library, which provides methods for it (uses a K-D tree). It also supports approximate knn for faster results, which is what I'm using since I don't need 100% accuracy.

 

 

In my tests I have 500 boids (birds) flying around. They are spawned near each other, so they form a single swarm. However, after a few seconds this big swarm begins to break into smaller ones. I'm 99% certain this is because for Cohesion's a-knn I use k = 30, which means each boid takes into account its 30 "closest" neighbors (not exactly the 30 closest, since I'm using aknn).

 

This has the effect I described above: many small swarms of about 30 boids each, which is very logical, since each boid considers a neighborhood of the 30 closest boids for Cohesion.
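Concretely, the Cohesion rule is essentially this (a plain C++ sketch, not the actual C#/Unity code):

#include <vector>

struct Vec3 { float x, y, z; };

// Steer towards the center of mass of the k (approximately) nearest neighbors.
// Since only k neighbors are ever considered, each boid effectively "sees" a
// group of about k boids, which matches the sub-flocks of ~30 I'm getting.
Vec3 CohesionForce(const Vec3& self, const std::vector<Vec3>& kNearestPositions) {
    Vec3 center{0, 0, 0};
    for (const Vec3& p : kNearestPositions) {
        center.x += p.x;
        center.y += p.y;
        center.z += p.z;
    }
    if (kNearestPositions.empty())
        return {0, 0, 0};
    float inv = 1.0f / (float)kNearestPositions.size();
    // Direction from the boid towards the local center of mass.
    return { center.x * inv - self.x,
             center.y * inv - self.y,
             center.z * inv - self.z };
}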

 

----------------

 

When I change the k for Cohesion to bigger numbers (e.g. 100), fewer swarms are formed. If I set k to 500, only one is formed. But this has a very big downside:

For k = 30, I get ~20 fps

For k = 500, I get ~3 fps

(a-knn epsilon is 50.0 for both)

 

Is there a solution to this? I'd like fewer swarms but better performance.

Is alglib slow? I couldn't find anything else... I'm not doing anything special in my code, just the usual behaviors: Cohesion, Separation and Velocity Match. Plus one more: Cohesion towards an anchor object, so the flock follows a path I choose.

 

Thanks in advance.

 

[Attached screenshot: 2eas4sh.png]

