About NetrickPL

  1. I'm currently making an RPG and everything is going great. I've been working on it for 6 months and I've come to the point where I need a scripting language for easily writing quests and AI and for configuring the GUI. I don't have time to write a big editor that would allow creating quests etc. without scripting.

  However, I'm stuck between Lua and Squirrel. Lua is much more widely used in the industry, but Squirrel's syntax (C++-like) is nicer to me. Both have good C++ binders, OOLua and Sqrat. Both languages are very well maintained, but on the binder side I found that almost all Squirrel binders are abandoned (they target version 2 only, and version 3 is current). Only Sqrat has been updated for 3, but it's still not that actively maintained and I'm not sure about it. Lua has a lot of active binding tools.

  On the other hand, I've heard that Lua's garbage collection is very bad in terms of performance and can stall my application every 30-60 seconds, and I need good performance all the time, without slowdowns. In fact, one of the reasons Squirrel was created was to fix Lua's garbage collection. Lua 5.2 did change something about the garbage collector, but many people say it is still bad...

  Based on your experience and knowledge, which language should I choose? I really don't know. And if you know a language better than these two, tell me about it.
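  One mitigation worth knowing about for the GC-pause concern: Lua lets you drive the collector manually in small slices each frame instead of letting a full cycle run at once (with real Lua this is `lua_gc(L, LUA_GCSTEP, 0)`, which returns 1 when a collection cycle completes). A minimal sketch of that per-frame budgeting idea, with `collect_step` as a hypothetical stand-in for the real call:

```cpp
#include <functional>

// Illustrative only: amortize garbage collection across frames by running
// small GC slices until either the collector reports a finished cycle or a
// per-frame budget is exhausted. With real Lua, collect_step would wrap
// lua_gc(L, LUA_GCSTEP, 0), which returns 1 when a collection cycle ends.
// Returns the number of slices actually run this frame.
int gc_slice_per_frame(int max_slices, const std::function<bool()>& collect_step) {
    for (int i = 0; i < max_slices; ++i) {
        if (collect_step())      // cycle finished early: stop for this frame
            return i + 1;
    }
    return max_slices;           // budget exhausted: resume next frame
}
```

  Whether this eliminates visible stalls depends on the allocation rate, but it spreads collection cost across frames rather than paying it all at once.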
  2. Hey, I'm implementing Valve's networking model for my simple top-down game, but I have some design problems and I just can't think of good solutions. One of the main ideas is that there is a render time separate from the simulation time. For example, I have an interpolation delay of 50ms (since my server update rate is 20Hz), so on the client my simulation time is 1000ms while my render time is 950ms. That leads to three problems.

  First: when the user presses a key (for example, to cast a spell), the logical choice is to trigger it at the render time (950ms), but that's impossible since my simulation is already at 1000ms. I think the solution may be to trigger input using simulation time and rely on "lag compensation": if the spell's cast time is 100ms and the user casts it when render time is at 950ms, it starts in the simulation at time 1000ms as a 50% completed cast. However, I'm not sure; is that a good solution?

  The second problem is about sending packets. If my solution to the first problem is good, then I will send the newest packet from the simulation (and not bother with render time when sending packets). Is that right, or should I send packets using render time (i.e., when simulation time is 1000ms, send the server the packet from 950ms)?

  The third problem is about receiving packets. When I receive a packet while the simulation time is 1000ms, should I record it as received at 1000ms or at 950ms? In the Valve networking model (which, as I said, I'm using), when the server receives "started casting spell" at server time 250ms, it is compensated by rtt/2 AND by the client's interpolation delay... So I don't know how Valve clients handle input; it looks like they handle it at render time, which I don't understand. Anyway, since I want to use my own solution for handling input, I think I shouldn't compensate for interpolation on the server, should I? Or should the server also compensate by the client's interpolation delay?

  I hope you can help me with my 3 questions. Thanks in advance!
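  The arithmetic in questions 1 and 3 can be sketched with two tiny helpers (the names are mine, not from any Valve code): one for how far along a cast already is once simulation time catches up to the render-time input, and one for the Valve-style server rewind the post cites (rtt/2 plus the client's interpolation delay):

```cpp
// Illustrative helpers, not from any real networking library.

// Client: the player sees render time = sim time - interp delay. If a cast
// takes cast_ms, then by the time the simulation processes the input the
// cast is already interp_ms into its duration.
int cast_progress_percent(int interp_ms, int cast_ms) {
    return 100 * interp_ms / cast_ms;   // 50ms into a 100ms cast = 50%
}

// Server: to evaluate an action against what the client actually saw on
// screen, rewind world state by half the round-trip time plus the client's
// interpolation delay (the compensation the Valve model describes).
int lag_compensation_rewind_ms(int rtt_ms, int interp_ms) {
    return rtt_ms / 2 + interp_ms;
}
```

  With interp = 50ms and cast = 100ms this gives the 50%-completed cast from the post; with rtt = 100ms the server would rewind 100ms total.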
  3. Hey! I have a question about fixed-step game loops. Is it better to check input (mouse, keyboard, and packets) in the fixed-step loop or in the render loop? I mean:

  #1:
  while (true) // main loop
  {
      while (fixed step loop) // i.e. 50/100 times a second
      {
          check input
          do logic
      }
      render
  }

  or rather

  #2:
  while (true) // main loop
  {
      check input
      while (fixed step loop) // i.e. 50/100 times a second
      {
          do logic
      }
      render
  }

  I think approach #1 is better, especially at lower framerates: you check input exactly when the logic needs it, whereas in case #2 at 5fps you could press a key for a short moment and it could be processed 20 times by the logic loop. Could you advise me which is better? Because I'm not sure. Which one is used in games? Thanks
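  Approach #1 can be sketched with the standard fixed-timestep accumulator; `poll_input` and `do_logic` here are placeholder hooks, not from any engine. Input is polled once per logic tick, so a brief key press is consumed by exactly one tick even when several ticks run per rendered frame:

```cpp
#include <functional>

// Fixed-timestep loop body (approach #1). frame_ms is the wall time elapsed
// since the last frame; step_ms is the fixed logic step (e.g. 20ms for 50Hz).
// Leftover time stays in accumulator_ms for the next frame.
// Returns the number of logic ticks run this frame.
int run_fixed_steps(int frame_ms, int step_ms, int& accumulator_ms,
                    const std::function<void()>& poll_input,
                    const std::function<void()>& do_logic) {
    accumulator_ms += frame_ms;
    int ticks = 0;
    while (accumulator_ms >= step_ms) {
        poll_input();            // input checked inside the fixed loop
        do_logic();
        accumulator_ms -= step_ms;
        ++ticks;
    }
    return ticks;
}
```

  At 5fps with a 20ms step, one 205ms frame runs 10 logic ticks and 10 input polls, carrying the 5ms remainder into the next frame.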
  4. Hey. I have a strange problem. I play Guild Wars 1 on Windows XP and the behaviour is like this:

  1) Fullscreen + vsync = input lag
  2) Windowed + vsync = no input lag (seen when dragging and moving windows) (why?)

  Why is there no input lag in case 2)? I often hear that before Vista there is no true vsync for windowed apps on Windows, so I think Guild Wars' vsync in windowed mode is just limiting to 60 fps using internal timers (which does not cause input lag, because buffer swapping isn't delayed until the next refresh). So there is no true vsync for windowed apps on Windows XP, yes? If there is, why is the behaviour so strange on Windows XP?

  I want to know the answer because the game I developed for Linux with vsync has input lag in windowed mode too (so the input lag appears with both fullscreen and windowed vsync)... I just want to know whether that is normal, because Linux, unlike Windows XP, uses true vsync in windowed mode? Thanks
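  The "internal timer instead of true vsync" theory can be illustrated with a minimal sketch (names are my own, and this is only an assumption about what Guild Wars does): cap the frame rate by sleeping until the next fixed deadline and then swap immediately. The swap itself is never stalled waiting for vblank, so this limits fps without the extra latency real vsync adds:

```cpp
#include <chrono>
#include <thread>

// Illustrative timer-based frame limiter: sleep until the next 1/fps
// deadline, then let the caller swap buffers right away. Unlike true vsync,
// the swap is never blocked on the monitor's refresh.
class FrameLimiter {
public:
    explicit FrameLimiter(double fps)
        : interval_(std::chrono::duration<double>(1.0 / fps)),
          next_(std::chrono::steady_clock::now()) {}

    // Call once per frame, just before swapping buffers.
    void wait() {
        next_ += std::chrono::duration_cast<
            std::chrono::steady_clock::duration>(interval_);
        std::this_thread::sleep_until(next_);
    }

private:
    std::chrono::duration<double> interval_;
    std::chrono::steady_clock::time_point next_;
};
```

  Using a fixed deadline (`next_ += interval_`) rather than "sleep for interval after rendering" keeps the average rate at the target even when individual frames take varying time.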