
Heg

Posted 22 September 2012 - 04:35 AM

I've got one additional quick question:

Best practice is to send out game updates from the server and input from the client at a fixed rate, say every 100ms. If we define "effective latency" as the time between issuing a command and receiving the server's response, this technique increases the latency quite a bit, doesn't it? For example:

Server: |----------|----------|----------|-----

Client: --|----------|----------|---------|----

The | marks are the sending points, and the time between them is 100ms. If the player on the client issues a command right after input has been sent to the server, it can take almost 200ms to get a response, even when the packets arrive instantly. The first X marks the input time, the second the response from the server:

Server: |----------|----------|----------|-----

Client: --|-X--------|---------X|---------|----

This is noticeable even on a listen server. I fear that, once there is actual travel time between the server and the client on top of this, the impact will be too big. What are your thoughts on that? Is a 100ms interval simply too big?
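The worst case described above can be checked with a toy simulation. This is a sketch under my own assumptions (zero network travel time, both schedules fixed at 100ms, the client's send schedule offset from the server's by an arbitrary amount, and the server only answering on its next tick after the input arrives); the interval and offset values are illustrative, not from the post:

```python
import math

CLIENT_INTERVAL = 100.0  # ms between client input sends (assumed)
SERVER_INTERVAL = 100.0  # ms between server state broadcasts (assumed)
CLIENT_OFFSET = 20.0     # ms the client's schedule lags the server's (arbitrary)

def effective_latency(t: float) -> float:
    """Delay (ms) from issuing a command at time t to the server's
    response, assuming zero network travel time."""
    # The input is queued until the client's next scheduled send point.
    k = math.ceil((t - CLIENT_OFFSET) / CLIENT_INTERVAL)
    send = CLIENT_OFFSET + k * CLIENT_INTERVAL
    # The server answers on its first tick strictly after the input arrives.
    response = (math.floor(send / SERVER_INTERVAL) + 1) * SERVER_INTERVAL
    return response - t

# Input issued just after the client send at t=20 waits nearly a full
# client interval, then up to a full server interval on top of that.
print(effective_latency(21.0))  # 179.0 ms with zero network latency
print(max(effective_latency(float(t)) for t in range(1000)))
```

Sweeping the input time confirms the post's intuition: the added delay is bounded by roughly one client interval plus one server interval, so with 100ms on both sides it approaches 200ms before any network latency is counted.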
