Unity Optimizing 2D Tile Map


Recommended Posts

Hello GameDev community!


How many here have valuable experience with 2D tile maps?


I have a question that may be valuable to anyone reading this page, and it should give you a good opportunity to share your expertise.


To begin, I have what I call an 'Action Screen,' which loads a matrix of what I call 'Maps.' Each map is 100 * 100 tiles, and in this implementation the tiles also contain objects.


The Action Screen works great, and allows a player to travel across any number of maps.


The challenge is getting it to save and load maps efficiently. Currently, maps are serialized and deserialized as needed, along with a vector of Tile objects within them. 


Rendering and displaying are both very fast. The challenge currently lies in loading the maps from a directory (which takes a lot of time, due to the number of objects within them).


One solution I've considered is to serialize and deserialize tiles individually, which would work very well for loading, but horribly for saving new maps, which contain unique tiles. Also, that is an ungodly amount of individual files.



Is there anyone out there who might have what it takes to field this one, or who can point me to some relevant resources?



Thanks in advance!


It's all going to come down to how big those map files are, how large their dependencies (if any) are, and what their structure is. In general, I wouldn't have thought they'd be large enough to form a bottleneck, so I would guess that either you are storing a lot of things in the file, storing it inefficiently, or perhaps evicting the dependencies of previously loaded maps and then reloading them anew, which, if true, is probably the real source of the inefficiency -- I think your last paragraph indicates that this is what you're doing.


I would absolutely keep tile images separate from their maps. There may be tiles that are unique to a map, but it's still probably better to store them separately. You certainly want to share common tiles between maps, and once you're doing that, making a special case to support tiles also coming from within the map itself is just extra work for no real gain. But you can do that if it floats your boat.


If you only have map data inside the file and it's still a bottleneck, then you probably want to look first at whether you are storing your maps efficiently -- preferably in something that can be loaded directly into memory with minimal translation. Text formats like XML or JSON are right out as a runtime format (though they might be appropriate during production), and even binary formats that require fine-grained parsing can be troublesome.
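For illustration, here's a minimal sketch of that kind of load-straight-into-memory layout. The MapHeader and TileRecord names and fields are placeholders, not your actual data:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <vector>

// Hypothetical fixed-layout records: plain old data, so a whole map can be
// read with a single fread and used with no per-field parsing.
#pragma pack(push, 1)
struct MapHeader  { std::uint32_t version; std::uint16_t width; std::uint16_t height; };
struct TileRecord { std::uint16_t tileId;  std::uint16_t flags; };
#pragma pack(pop)

bool LoadMap(const char* path, MapHeader& header, std::vector<TileRecord>& tiles)
{
    std::FILE* f = std::fopen(path, "rb");
    if (!f) return false;
    bool ok = std::fread(&header, sizeof(header), 1, f) == 1;
    if (ok) {
        tiles.resize(static_cast<std::size_t>(header.width) * header.height);
        ok = std::fread(tiles.data(), sizeof(TileRecord), tiles.size(), f) == tiles.size();
    }
    std::fclose(f);
    return ok;
}
```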


If you're sure that the size and the shape of your data aren't the problem, then you'll want to look at breaking the map into even smaller sections -- maybe 64x64, 50x50, or 32x32 (by the way, one of the advantages of moving tile data out of the map file itself is that you can break the map into smaller sections without worrying about which file those unique tiles land in). Even if you end up loading the same amount of the map, you can load the nearest parts first and continue on your way.
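As a rough sketch of the "nearest parts first" idea -- the 50x50 chunk size and the names here are arbitrary, not a recommendation:

```cpp
#include <algorithm>
#include <cstdlib>
#include <vector>

// Hypothetical chunking scheme: a 100x100 map split into 50x50 chunks,
// ordered nearest-first so the player's surroundings load before the rest.
constexpr int kChunkSize = 50;

struct ChunkCoord { int cx, cy; };

std::vector<ChunkCoord> ChunksNearestFirst(int mapTilesW, int mapTilesH,
                                           int playerTileX, int playerTileY)
{
    std::vector<ChunkCoord> chunks;
    for (int cy = 0; cy < mapTilesH / kChunkSize; ++cy)
        for (int cx = 0; cx < mapTilesW / kChunkSize; ++cx)
            chunks.push_back({cx, cy});

    const int pcx = playerTileX / kChunkSize;
    const int pcy = playerTileY / kChunkSize;
    std::sort(chunks.begin(), chunks.end(), [&](ChunkCoord a, ChunkCoord b) {
        // Manhattan distance to the player's chunk decides load order.
        return std::abs(a.cx - pcx) + std::abs(a.cy - pcy) <
               std::abs(b.cx - pcx) + std::abs(b.cy - pcy);
    });
    return chunks; // load (or queue) one chunk file per entry, in this order
}
```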


Finally, since it's often faster to load and decompress compressed data than it is to read the uncompressed data from a disk, compression is another avenue you could explore, either on its own with larger maps or combined with smaller maps for even further reduced load times. Very simple run-length encoding will do pretty well; Huffman encoding will do even better -- it's more complex, but there are very good free libraries for it. You could also use a .zip archive library for all of your game assets and let it handle everything -- again, there are good free libraries for this.
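Run-length encoding really is only a few lines. A minimal sketch over raw tile IDs (the (id, count) pair layout is just one way to do it):

```cpp
#include <cstdint>
#include <utility>
#include <vector>

// Minimal run-length encoding over tile IDs: each pair stores (tileId, runLength).
// This pays off when maps have long runs of the same tile (grass, water, trees).
std::vector<std::pair<std::uint16_t, std::uint16_t>>
RleEncode(const std::vector<std::uint16_t>& tiles)
{
    std::vector<std::pair<std::uint16_t, std::uint16_t>> runs;
    for (std::uint16_t id : tiles) {
        if (!runs.empty() && runs.back().first == id && runs.back().second < UINT16_MAX)
            ++runs.back().second;
        else
            runs.push_back({id, 1});
    }
    return runs;
}

std::vector<std::uint16_t>
RleDecode(const std::vector<std::pair<std::uint16_t, std::uint16_t>>& runs)
{
    std::vector<std::uint16_t> tiles;
    for (const auto& run : runs)
        tiles.insert(tiles.end(), run.second, run.first);
    return tiles;
}
```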



If you can describe the contents/format/size of your map files in more detail, we can help you out more, or I can give you a better idea of what parts of the above are most relevant to your situation.


Load speeds mostly come down to how your file format is laid out, and the amount of content being loaded.


What do your map files look like? Are they text files? If so, you can pretty much get an instant speed boost by using binary files instead.


Servant of the Lord, yes, they are text files, although by using Boost, switching to binary is very simple.
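For reference, the switch is roughly the following, assuming the map type already has a serialize() function for Boost.Serialization (written as templates here so the sketch stands alone):

```cpp
#include <fstream>
#include <string>
#include <boost/archive/text_oarchive.hpp>
#include <boost/archive/binary_oarchive.hpp>

// The existing text-archive save...
template <class Map>
void SaveMapText(const Map& map, const std::string& path)
{
    std::ofstream ofs(path);
    boost::archive::text_oarchive oa(ofs);
    oa << map;
}

// ...becomes binary just by swapping the archive type and opening the
// stream in binary mode (loading changes the same way with binary_iarchive).
template <class Map>
void SaveMapBinary(const Map& map, const std::string& path)
{
    std::ofstream ofs(path, std::ios::binary);
    boost::archive::binary_oarchive oa(ofs);
    oa << map;
}
```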


Ravyne, the maps do contain a lot of data. Again, there are 10,000 tiles in each (I know, an atrocity), as well as all of the objects. One example is a forest map (100 * 100) containing about 5000 trees. Currently, the whole purpose of the tiles is simply rendering: if I know which tiles are active, then I can render just the objects within those tiles.


I will look into compressing files, as that relates to one major question I had: having a store that allows files to be loaded as needed while still keeping them compartmentalized.


One other direct application of the same functionality was to create a 300 * 300 tile vector in the Action Screen and pass all of the objects into it. This way, if an AI unit wants to travel from map to map, it won't actually have to load very many objects (just things like buildings and geographical data) in order to determine travel time and long-term pathfinding.


Understanding that this is not comprehensive, I will give you the data here, so you may be able to help me.


Each map is currently 170 KB, and removing the Tile portion of each will reduce that to about 100 KB. Saving objects separately will reduce it even more dramatically.


The maps contain a few private integers and a string variable, as well as a private vector of objects. When a map is saved, it clears a global vector of objects, passes its objects into it, and then serializes all of the data.


The funny thing is that I don't even use tile images yet!


Will change serialization to binary, save objects in a folder separate from the map itself, and implement object passing to the Action Screen Tile vector.


Thank you for your insights. Just with this, the bottleneck will be almost gone, apart from still needing to load all of a map's objects.

Have you actually measured what's being slow?

For example, a gzip-compressed text file can easily load far faster than certain binary formats. The CPU can do an awful lot of decompression and text parsing in the time it takes to load one file off of disk. Just switching to a binary file may help, but not as much as just gzipping your current file (text files tend to compress better than binary ones, though you can of course also compress binary files and compare the two to see which works better for your data).
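If you want to try that, a minimal sketch using zlib's gz* API looks something like this (assuming zlib is available to link against):

```cpp
#include <vector>
#include <zlib.h>

// Write a buffer as a gzip file and read it back using zlib's gz* API.
// Decompression is usually cheap enough that this beats reading the same
// data uncompressed from a slow disk.
bool WriteGz(const char* path, const std::vector<char>& data)
{
    gzFile f = gzopen(path, "wb");
    if (!f) return false;
    int written = gzwrite(f, data.data(), static_cast<unsigned>(data.size()));
    gzclose(f);
    return written == static_cast<int>(data.size());
}

std::vector<char> ReadGz(const char* path)
{
    std::vector<char> out;
    gzFile f = gzopen(path, "rb");
    if (!f) return out;
    char buf[64 * 1024];
    int n;
    while ((n = gzread(f, buf, sizeof(buf))) > 0)
        out.insert(out.end(), buf, buf + n);
    gzclose(f);
    return out;
}
```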

Time how long it takes to read the data. Time how long it takes to parse that data. Time how long it takes to populate the game objects in your world. The actual slowdown could be something silly like your "insert physics object" call taking >1ms each (I've seen it happen in weird cases on engines that have shipped far larger AAA games... weird stuff happens and you need to be experienced at profiling to find it).
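Something as simple as the sketch below is enough to split those stages apart; ReadMapFile/ParseMap/SpawnObjects are placeholders for whatever your loader actually does:

```cpp
#include <chrono>
#include <cstdio>

// Time each stage separately so you know whether disk I/O, parsing, or
// populating the world is the real cost.
template <class F>
double TimeMs(F&& step)
{
    const auto start = std::chrono::steady_clock::now();
    step();
    const auto end = std::chrono::steady_clock::now();
    return std::chrono::duration<double, std::milli>(end - start).count();
}

void ProfileMapLoad()
{
    // ReadMapFile / ParseMap / SpawnObjects stand in for your real loader steps.
    std::printf("read : %.2f ms\n", TimeMs([] { /* ReadMapFile(...)  */ }));
    std::printf("parse: %.2f ms\n", TimeMs([] { /* ParseMap(...)     */ }));
    std::printf("spawn: %.2f ms\n", TimeMs([] { /* SpawnObjects(...) */ }));
}
```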

Ravyne's advice is sound, but it's an awful lot of work that might be unnecessary to get your game to an acceptable performance envelope. Figure out your actual problem then just fix that until you have the time to fully revisit your asset conditioning and content loading pipelines.


SeanMiddleditch, thank you for your input. All things considered, compression is an obvious benefit in this scenario. I've tested all of the approaches and functions, and it is most definitely the loading from disk.


ServantoftheLord, thank you as well. I would not have imagined that changing strings into binary format would be another approach.


Obviously, I have much to learn about this. Everyone here has given me a wealth of information on the subject, all of it useful and overall a better approach than what I was using. Not only that, it meets all of my optimization goals (even the lofty ones).


I will leave an update after everything is switched over, with timing differences, just for reference's sake.


Thanks again!


Ravyne's advice is sound, but it's an awful lot of work that might be unnecessary to get your game to an acceptable performance envelope. Figure out your actual problem then just fix that until you have the time to fully revisit your asset conditioning and content loading pipelines.


In this case, it's sounding more and more like compression would act as sort of a magic bullet. If OP is okay taking a compression library as a dependency, it might really be the path of least resistance while also giving the biggest gains -- and as you said, if the data is semi-regular in either form, compressed ASCII ought to be just a bit larger than compressed binary (the number of dictionary entries should be about the same, but the entries themselves will likely be a bit larger). I generally assume that people want to put off dependencies if they can, which is where the other advice is relevant.


I think the thing I would do in either case is to just evaluate whether OP's current format is wasteful -- are you writing 32-bit values to disk when the data would fit in 16 bits? Are you padding ASCII strings with additional "empty" characters to make parsing easier? Stop doing those kinds of things first. Also pre-allocate large data structures and profile for unexpected bottlenecks (like in your physics-system insert example) -- then, if performance still leaves something to be desired at that point, compression is the next step.
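As a concrete (and purely hypothetical) example of that first point, trimming a record from 32-bit to 16-bit fields halves it on disk:

```cpp
#include <cstdint>

// Hypothetical per-tile record as it might be written today...
struct TileOnDiskWasteful {        // 12 bytes per tile
    std::int32_t tileId;
    std::int32_t flags;
    std::int32_t objectCount;
};

// ...and the same information in fields sized to the actual value ranges.
struct TileOnDiskCompact {         // 6 bytes per tile
    std::uint16_t tileId;
    std::uint16_t flags;
    std::uint16_t objectCount;
};

static_assert(sizeof(TileOnDiskCompact) == sizeof(TileOnDiskWasteful) / 2,
              "compact record should be half the size");
```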


Another thing that might be affecting you if you're a Visual Studio user is _ITERATOR_DEBUG_LEVEL -- in release mode this defaults to level 0, which is no checks, and in debug mode it defaults to level 2, which is full iterator debugging. Iterator performance is markedly different between these two settings; I've seen differences of as much as an order of magnitude. You can also set it to level 1, which performs basic iterator checks that cost less than level 2. You can't use level 2 in release mode.


To set these to another value, for each configuration of the solution:

  • In Property Pages / Configuration Properties / C/C++ / Preprocessor / Preprocessor Definitions, add "_ITERATOR_DEBUG_LEVEL=<LEVEL>", where <LEVEL> is the level of iterator debugging that you want for that configuration (a quick compile-time check is sketched below).
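If you want to confirm which level a given configuration actually builds with, one MSVC-specific sanity check is to print the macro's value at compile time:

```cpp
// MSVC-specific sanity check: any standard library header defines
// _ITERATOR_DEBUG_LEVEL, so the build log will show the level actually in use.
#include <vector>

#define IDL_STR2(x) #x
#define IDL_STR(x) IDL_STR2(x)
#pragma message("_ITERATOR_DEBUG_LEVEL = " IDL_STR(_ITERATOR_DEBUG_LEVEL))
```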
