C# Server isn't reading Java bytes correctly?

15 comments, last by jeskeca 8 years, 8 months ago

Hey guys!

I'm writing a multiplayer game and I'm moving on to the networking now, but I've run into an issue.

My C# server is not reading the bytes sent from my Java client correctly.

This is the code on both sides:


Client:
public void sendPacket() {
	try {
		dos = new DataOutputStream(server.getOutputStream());
		dos.writeBytes(packet.getPacketID());
		dos.flush();
	} catch (IOException e) {
		e.printStackTrace();
	}
}


Server:
NetworkStream playerStream = client.GetStream();
			
int recv;
byte[] data = new byte[client.ReceiveBufferSize];
			
Console.WriteLine("Player has connected: {0}", client.Client.RemoteEndPoint);
if (client.Client.Available > 0) {
	PacketReader s = new PacketReader(data);
	s.s(); // This does "ReadString()" from BinaryReader.
}

My strings show up blank in the console. Any help?

getPacketID suggests you're sending an integer, but you're reading in a string instead? On top of that, you need to be extra mindful about text encodings and endianness. Take the following examples:


// The 32-bit integer, 4779:
Little endian:  AB 12 00 00
Big endian:     00 00 12 AB


// The string, "Kosher-Yosher™":
UTF-8:
4B 6F 73 68 65 72 2D 59 6F 73 68 65 72 E2 84 A2

UTF-16, big endian, with byte order mark: 
FE FF 00 4B 00 6F 00 73 00 68 00 65 00 72 00 2D
00 59 00 6F 00 73 00 68 00 65 00 72 21 22

UTF-16, little endian, with byte order mark:
FF FE 4B 00 6F 00 73 00 68 00 65 00 72 00 2D 00
59 00 6F 00 73 00 68 00 65 00 72 00 22 21
And so on...
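If you want to see these layouts for yourself, here's a small C# sketch that dumps them (GetBytes does not emit the FE FF / FF FE byte order marks shown above, so those two bytes won't appear in its output):

// C#
using System;
using System.Text;

class ByteLayoutDemo
{
	static void Main()
	{
		// 4779 == 0x12AB; on a little-endian machine this prints AB-12-00-00.
		Console.WriteLine(BitConverter.ToString(BitConverter.GetBytes(4779)));

		// The same string, byte-for-byte, in three different encodings.
		string s = "Kosher-Yosher™";
		Console.WriteLine(BitConverter.ToString(Encoding.UTF8.GetBytes(s)));
		Console.WriteLine(BitConverter.ToString(Encoding.BigEndianUnicode.GetBytes(s)));
		Console.WriteLine(BitConverter.ToString(Encoding.Unicode.GetBytes(s))); // UTF-16 LE
	}
}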
BinaryReader.ReadString expects a string in a different format than what Java is writing.

Java's writeBytes function writes a sequence of bytes, where each byte is the low 8 bits of each char in the string. So basically, it'll work on ASCII strings. However, C#'s BinaryReader.ReadString expects the string's length in VLQ form ( https://en.wikipedia.org/wiki/Variable-length_quantity ) to appear before the characters.
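To see what ReadString actually expects on the wire, here's a small sketch (the packet id "LOGIN" is just a made-up example) that dumps what C#'s own BinaryWriter.Write(string) produces -- note the leading length byte, which Java's writeBytes never sends:

// C#
using System;
using System.IO;
using System.Text;

class LengthPrefixDemo
{
	static void Main()
	{
		var ms = new MemoryStream();
		var writer = new BinaryWriter(ms);
		// Write(string) prefixes the UTF-8 bytes with a 7-bit-encoded (VLQ) length.
		writer.Write("LOGIN");
		writer.Flush();
		// Prints: 05-4C-4F-47-49-4E -- one length byte, then the characters.
		Console.WriteLine(BitConverter.ToString(ms.ToArray()));
	}
}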

You can use C#'s BinaryReader.ReadChars, or BinaryReader.ReadBytes followed by Encoding.ASCII.GetString instead to avoid that problem. However, you will need to write the length manually, otherwise you won't know how many chars to read. So your code would look something like this:


// Java
String s = packet.getPacketID();
dos.writeInt(s.length());
dos.writeBytes(s);
dos.flush();


// C#
int length = packetReader.ReadInt32();
string s = Encoding.ASCII.GetString(packetReader.ReadBytes(length));

Alternatively, if you want to use an integer as your packet id instead of a string, it's much easier:



// Java
int packetId = whatever;
dos.writeInt(packetId);

// C#
int packetId = packetReader.ReadInt32();
In addition, a TCP stream does not guarantee that all the bytes written at a particular time arrive in a single packet -- just that they arrive in order.
So, if you write the six bytes (H E L L O !) on the sending side, you may receive (H E) at one time, and (L L O !) a little bit later.
Thus, you have to build up a buffer of all received data, and only start decoding a packet when you have enough for a full packet.
To make this possible, you typically send a packet length before the actual packet (say, a two-byte integer, for a max packet length of 65535 bytes.)
Your receive function would then not do anything if it has less than 2 bytes in the input buffer. Once it has two bytes, it would "peek" in the buffer, assemble the length, and see if there are 2+N bytes in the buffer; if not, do nothing, else decode the packet, remove it from the buffer, and repeat.
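As a rough sketch of that receive-side buffering (the 2-byte big-endian length prefix and the class name here are illustrative, not part of the original code):

// C#
using System;
using System.Collections.Generic;

class PacketBuffer
{
	private readonly List<byte> buffer = new List<byte>();

	// Append whatever chunk the socket handed you, however small.
	public void Append(byte[] data, int count)
	{
		for (int i = 0; i < count; i++)
			buffer.Add(data[i]);
	}

	// Returns one complete packet body, or null if a full packet hasn't arrived yet.
	public byte[] TryReadPacket()
	{
		if (buffer.Count < 2)
			return null; // not even the length prefix yet

		int length = (buffer[0] << 8) | buffer[1]; // peek at the 2-byte big-endian length
		if (buffer.Count < 2 + length)
			return null; // body not fully received yet

		byte[] packet = buffer.GetRange(2, length).ToArray();
		buffer.RemoveRange(0, 2 + length); // consume prefix + body, then repeat on the next call
		return packet;
	}
}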


The server still prints out a blank string even I use it exactly how you did...

Can you show us all of your code between the point where you read the packetId and print it out to the console? Your original post doesn't contain anything that will try to print the packetId to the console.

Server:
public class PacketReader : BinaryReader
{
	public PacketReader(NetworkStream str) : base(str) {
	}

	public void s() {
		int le = ReadInt32();
		Console.WriteLine(le); // Prints a big int.
		string m = Encoding.ASCII.GetString(ReadBytes(le)); // Doesn't print anything, and causes an exception when I close the client.
		Console.WriteLine(m);
	}
}

Client:
public void sendPacket() {
	try {
		dos = new DataOutputStream(server.getOutputStream());
		dos.writeInt(packet.getPacketID().length());
		dos.writeBytes(packet.getPacketID());
		dos.flush();
	} catch (IOException e) {
		e.printStackTrace();
	}
}

These two show how the bytes are sent, and how the server tries to read them.

And...


Server:
public void handleConnection() {
	NetworkStream playerStream = client.GetStream();

	int recv;
	byte[] data = new byte[1024];

	Console.WriteLine("Player has connected: {0}", client.Client.RemoteEndPoint);
	if (client.Client.Available > 0) {
		PacketReader s = new PacketReader(playerStream);
		s.s();
	}
}

This is currently how it checks if the data is available and uses "s.s" to print it out.

What does WriteLine(le) print out? Is it correct?

[screenshot of the server's console output]

The server prints out a much bigger length than the client does when testing...

OK, this looks like an Endian problem.

The integer 1 is encoded as the bytes [00 00 00 01] in big-endian and [01 00 00 00] in little-endian. But if you read [00 00 00 01] in little endian, you get 16777216.

In other words, it appears Java is writing the integer in big endian mode, and C# is reading in little endian mode. This is what fastcall22 was warning about in his post.

The way I usually deal with this is to make sure that both sides agree on a specific endianness for the network traffic. I don't know if C# or Java have functions like C's host-to-network and network-to-host functions, but you can define them yourself. In C# you can use BitConverter.IsLittleEndian to determine the endianness of the computer your code is running on, and then decide whether to reverse the byte order when reading/writing data. You can probably do something similar in Java, but I'm not as familiar with it.
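For reference, .NET does ship ntohl/htonl-style helpers (IPAddress.NetworkToHostOrder and IPAddress.HostToNetworkOrder), and Java's DataOutputStream always writes big-endian, so only the C# side needs converting. A minimal sketch, assuming the value was written with Java's writeInt:

// C#
using System.IO;
using System.Net;

static class BigEndianReadHelpers
{
	// Java's DataOutputStream.writeInt always writes big-endian, but
	// BinaryReader.ReadInt32 interprets the bytes as little-endian.
	// Assembling the int by hand avoids the mismatch on any host.
	public static int ReadInt32BigEndian(BinaryReader reader)
	{
		byte[] b = reader.ReadBytes(4);
		return (b[0] << 24) | (b[1] << 16) | (b[2] << 8) | b[3];
	}

	// Alternative on little-endian hosts: read little-endian, then byte-swap
	// with the framework's ntohl equivalent.
	public static int ReadInt32NetworkOrder(BinaryReader reader)
	{
		return IPAddress.NetworkToHostOrder(reader.ReadInt32());
	}
}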

This topic is closed to new replies.
