Luringrock

Unity Loading a 16-bit RAW heightmap from a bytearray

This topic is 2083 days old which is more than the 365 day threshold we allow for new replies. Please post a new topic.

If you intended to correct an error in the post then please contact us.

Recommended Posts

Hi.

I'm currently working on an MMO and I ran into a bit of a problem. My server is made in Java using the Netty framework (netty.io) and the client is made in Unity3d (unity3d.com) using TcpClient and BigEndian versions of the C# BinaryReader/BinaryWriter. My heightmaps were made using World-Machine (world-machine.com) and exported as 16-bit RAW files.

Right now I'm loading the heightmaps into a byte array via the Java server and sending it to the client, but I don't know how to use that array to create a heightmap (using Unity's TerrainData.SetHeights). Since Java bytes are signed while bytes are unsigned by default in .NET, the byte array in the client is actually an sbyte array.

If anyone could lend me a hand on this or point me in the right direction it would be greatly appreciated and considered a personal favour. Thank you very much in advance.

TL;DR

I need to know how I'd turn the binary content of a 16-bit RAW heightmap into an actual heightmap on a Terrain using Unity.


I have used the undocumented terrain functions quite extensively in one of my older games. I don't remember all the details now and I don't have the code at hand, but:

 

When you use the SetHeights function, you are actually setting an array of floats with values in the [0..1] range: 0 is the lowest terrain and 1 is the highest. So, as you read the server response, you need to convert those bytes (two at a time, since they are 16-bit integers) into floats, mapping the [0..65535] integer range onto the [0..1] float range. All of this goes into a two-dimensional array, which you can then feed to the terrain engine.
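A minimal sketch of that conversion, assuming the server sends big-endian unsigned 16-bit samples and a square map. `heightmapBytes` matches the name used in the thread; the `HeightmapDecoder` class is hypothetical:

```csharp
using System;

public static class HeightmapDecoder
{
    // Decodes big-endian unsigned 16-bit samples into the [0..1]
    // float[,] that TerrainData.SetHeights expects.
    public static float[,] Decode(byte[] heightmapBytes)
    {
        int size = (int) Math.Sqrt(heightmapBytes.Length / 2.0);
        float[,] heights = new float[size, size];

        for (int y = 0; y < size; y++)
        {
            for (int x = 0; x < size; x++)
            {
                int i = (y * size + x) * 2;
                // Big-endian: high byte first, then low byte.
                ushort sample = (ushort) ((heightmapBytes[i] << 8) | heightmapBytes[i + 1]);
                // Map [0..65535] onto [0..1]; divide as floats, not integers.
                heights[y, x] = sample / (float) ushort.MaxValue;
            }
        }
        return heights;
    }
}
```

The result would then be fed to `terrain.GetComponent<Terrain>().terrainData.SetHeights(0, 0, ...)` as in the code later in the thread.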



Thank you very much for your reply.

 

What I do now is read the bytes as shorts into a short array, then put those shorts into a 2D float array as 0–1 values, as the SetHeights function requires, but it doesn't quite work yet.

int size = (int) Math.Sqrt((double) (heightmapBytes.Length / 2));
short[] sdata = new short[heightmapBytes.Length / 2];
Buffer.BlockCopy(heightmapBytes, 0, sdata, 0, heightmapBytes.Length);
float[,] heightmap = new float[size, size];

for (int y = 0; y < size; y++) {
    for (int x = 0; x < size; x++) {
        heightmap.SetValue((float) (sdata[y * size + x] / short.MaxValue), y, x);
    }
}

terrain.GetComponent<Terrain>().terrainData.SetHeights(0, 0, heightmap);

The result:

[Screenshot: the resulting terrain]

This is obviously not the desired landscape and doesn't quite correspond with my data.

 

The World-Machine and Unity Terrain settings might be relevant;

World-Machine export settings:

[Screenshot: World-Machine export settings]

The total map is 32 km by 32 km and is cut into 100 tiles of 3200 m by 3200 m, each with a resolution of 512 by 512.

 

Unity Terrain settings:

[Screenshot: Unity Terrain settings]

To my knowledge the settings of the Unity Terrain and the World-Machine heightmap match so I don't see what's wrong here.

 

Thank you very much in advance.


Have you tried to set an actual heightmap (like a grayscale image or something) in Unity and see if it works that way? I mean setting a static example first and then trying to change the values through code.



Yes, I have tried that and it worked.

 

I do see what's wrong now, and it makes sense.

(float) (short / short) performs integer division first, truncating the result to 0 or 1 before the cast ever happens.

((float) short / (float) short) divides as floats and works just fine.
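For anyone else who hits this, a tiny illustration with a hypothetical mid-range sample value:

```csharp
short sample = 16384; // hypothetical raw height, about half of short.MaxValue

// Division happens first, as integer division, truncating to 0; the cast comes too late.
float wrong = (float) (sample / short.MaxValue); // 0f

// Casting one operand first promotes the whole division to float division.
float right = (float) sample / short.MaxValue;   // ≈ 0.5f
```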

[Screenshot: the terrain after the fix; only the lower heights appear]

This is what the result currently looks like. It only produces heights below a certain point, which I think is because of the World-Machine height settings.

I'm gonna try to fiddle with that a bit and see if it works.

 

Thank you very much for your help.


Are you sure you don't need to use unsigned shorts (ushort in C#) instead?

 

If you do, you need to change both of these lines:

 

ushort[] sdata = new ushort[heightmapBytes.Length / 2];
Buffer.BlockCopy(heightmapBytes, 0, sdata, 0, heightmapBytes.Length);
float[,] heightmap = new float[size, size];

for (int y = 0; y < size; y++) {
    for (int x = 0; x < size; x++) {
        // Cast before dividing, as you found above, and divide by ushort.MaxValue.
        heightmap[y, x] = sdata[y * size + x] / (float) ushort.MaxValue;
    }
}

 

Endianness of the 16 bit data may be an issue as well (perhaps).
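To see why signedness matters here, consider a hypothetical sample at exactly half the range, with the byte order made explicit (assuming big-endian data, as Java writes it):

```csharp
byte[] sample = { 0x80, 0x00 }; // big-endian sample at exactly half range (32768)
int raw = (sample[0] << 8) | sample[1];

short asSigned = (short) raw;     // wraps around to -32768
ushort asUnsigned = (ushort) raw; // stays 32768

float wrongHeight = asSigned / (float) short.MaxValue;    // ≈ -1.0, clamped away by the terrain
float rightHeight = asUnsigned / (float) ushort.MaxValue; // ≈ 0.5, the intended height
```

Every sample above half range goes negative the same way, which would match a terrain where only the lower heights survive.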



Thank you very much, the signedness of the shorts was indeed the problem. Everything works just fine now.
