
HTTP for gamelogic... are you insane ?

Ashaman73


Well, maybe a little bit. In my answer to this post over here, I mentioned that I use HTTP as a guideline for method/function calls in my game. I expected that some developers would think I'm insane, and that might be true to some degree. Still, I had my reasons to go this way, and I want to talk about them.

[sharedmedia=gallery:images:6362]
When you develop, you need to consider many aspects. Clean code, bug-free behavior and performance seem obvious, but you also need to take many non-functional requirements into account: budget, time, maintainability and more. It is often simply too expensive, and clearly unnecessary, to write the best code ever.

Being the sole developer on a hobby project puts a lot of pressure on you if you want to reach the goal of finishing your game in your lifetime. I've learned that code which is easy, fast to write and maintainable is a key element in sustaining a hobby project of this scale. One reason to add (fake) HTTP requests to my game was their easy maintainability. I don't use the full HTTP technology stack, just the general syntax, which is a little reminiscent of REST. Parsing the HTTP string is arguably unnecessary, and to be honest there are better and faster ways to achieve the same result. Additionally, I use it only for gamelogic events, which are executed quite seldom. The main benefit, and the first reason to add it, was to support multiple Lua VMs, though this could have been handled in other ways.
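To make the idea concrete, here is a minimal sketch of what decoding such a (fake) HTTP-style event string might look like. The author's actual decoder is written in Lua and its request format is not shown in the post, so the function name, the example request and the split into method/path/query below are my assumptions, illustrated in Python rather than the game's Lua:

```python
from urllib.parse import parse_qsl

def decode_event(request):
    """Split a fake-HTTP event string into method, path segments and params.

    Hypothetical example input: "POST /entity/42/attack?target=7&damage=12"
    """
    # "POST" | "/entity/42/attack?target=7&damage=12"
    method, _, rest = request.partition(" ")
    # "/entity/42/attack" | "target=7&damage=12"
    path, _, query = rest.partition("?")
    # Drop empty strings produced by leading/trailing slashes
    segments = [s for s in path.split("/") if s]
    # Query string becomes a plain dict of string values
    params = dict(parse_qsl(query))
    return method, segments, params

method, segments, params = decode_event("POST /entity/42/attack?target=7&damage=12")
# method   -> "POST"
# segments -> ["entity", "42", "attack"]
# params   -> {"target": "7", "damage": "12"}
```

Only the general shape matters here: a verb, a path identifying the entity and action, and named parameters, so events can be stored, logged and routed between Lua VMs as plain strings.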

Nevertheless, it was my decision, and I believed it would be fast enough in my special case to justify its use. So, here are some performance measurements. I tested with 100 active game entities and measured 24330 frames. The decoding function (written in Lua) was called 94091 times, that is roughly 4 times per frame for 100 entities (roughly the maximum number of active entities in a game session). That is the first piece of evidence that it is called quite seldom. All the decodes together used 0.09% of the total frame time (~33 ms total frame time). Keep in mind that the measurement itself costs some time (I think a large chunk of it is sacrificed to querying the OS timer twice per call) and that the decoder is written in Lua instead of C++. The decoding is neither rocket science nor of rocket speed, something you should review if you need to call it more often. Nevertheless, the key point is that this kind of processing is called quite seldom in my case.
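The point about timer overhead deserves a small illustration. Wrapping a cheap function with two timer queries per call means the measurement itself is a nontrivial part of the recorded time. The following is a hypothetical harness, in Python rather than the game's Lua/C++ profiler, showing how to estimate that overhead by timing an empty body with the same wrapping:

```python
import time

def measure(fn, calls=10000):
    """Time fn with two timer queries per call, mirroring a per-call profiler."""
    total = 0.0
    for _ in range(calls):
        start = time.perf_counter()  # first timer query
        fn()
        total += time.perf_counter() - start  # second timer query
    return total

def timer_overhead(calls=10000):
    # Timing an empty body captures roughly the cost of the queries themselves
    return measure(lambda: None, calls)
```

Comparing `measure(decode, n)` against `timer_overhead(n)` gives a rough idea of how much of the recorded 0.09% is the timer rather than the decoding work.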

Still, this is not meant to be a general rule. You must evaluate the context you are developing in. Are you coding your hobby game at night and on weekends, or are you working on a top AAA title with whole teams of coders working on single parts of the game? Does your game need every bit of CPU power it can get, or can you afford the luxury of easing your development time? With the right reasoning everything should get a chance, even (fake) HTTP requests in your game ;-)

I've added a screenshot of the debug output of a single entity; as you can see, the whole entity behavior is stored in character attributes using HTTP syntax.