My multiplayer game ran quite well until I increased the view range for the forests quite a bit.
When testing this, I got incredibly high packet loss.
Usually this would not confuse me, since I'm using UDP and packet loss is to be expected.
The problem is that:
- The loss is quite high: about 17% of the packets are lost (without the increased view range it was at most 0.8%, even over the internet; a sketch of how such a rate can be measured follows this list)
- I'm running server and client on the same machine, so how can this loss even occur?
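For context, one common way to get such loss numbers is to put a sequence number in every packet and count the gaps on the receiving side; the following sketch is illustrative only, and names like expectedSeq and lostCount are placeholders, not my exact code:

// Illustrative sketch: counting loss via per-packet sequence numbers
int expectedSeq = 0;
int lostCount = 0;

void onReceive(java.net.DatagramPacket packet) {
    // assume the first 4 bytes of each packet carry a sequence number
    int seq = java.nio.ByteBuffer.wrap(packet.getData(), 0, packet.getLength()).getInt();
    if (seq > expectedSeq) {
        lostCount += seq - expectedSeq; // these packets never arrived
    }
    expectedSeq = Math.max(expectedSeq, seq + 1);
    // loss rate so far: lostCount / (double) expectedSeq
}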
I thought the client's receive buffer might not be big enough, but changing its size from 8 kB to 50 kB doesn't have any effect on the loss...
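Roughly what I tried, assuming the OS-level socket buffer via setReceiveBufferSize is the right knob (note the OS may silently clamp the requested value, so I read it back):

// enlarge the socket's OS receive buffer from the default to 50 kB
// (this is only a hint; the OS may clamp it, so verify the effective size)
socket.setReceiveBufferSize(50 * 1024);
System.out.println("effective receive buffer: " + socket.getReceiveBufferSize());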
So I cannot think of a way a UDP packet can be lost here, let alone that many.
I'm programming in Java, using the standard DatagramSocket: http://docs.oracle.com/javase/1.4.2/docs/api/java/net/DatagramSocket.html
My initialization code (client) looks like this:
import java.net.DatagramSocket;

// CPORT is the port I'm using on the client
socket = new DatagramSocket(CPORT);
// make receive() block for at most 1 ms (throws SocketTimeoutException on timeout)
socket.setSoTimeout(1);
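The matching receive call runs in my game loop and looks roughly like this (simplified; handlePacket and the buffer size are placeholders, not my exact code):

// called every game tick; the 1 ms timeout keeps receive() from stalling the loop
// (DatagramPacket and SocketTimeoutException come from java.net)
byte[] buf = new byte[8 * 1024];
DatagramPacket packet = new DatagramPacket(buf, buf.length);
try {
    socket.receive(packet);  // blocks at most 1 ms (see setSoTimeout above)
    handlePacket(packet);    // placeholder for the actual packet handling
} catch (SocketTimeoutException e) {
    // nothing arrived within 1 ms; carry on with the game loop
}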
So, do you have any idea how those UDP packets can get lost?
Thanks in advance!