Will Atwood

Latency in Wireless Connections

7 posts in this topic

We have been investigating wireless latency (WiFi and cellular) and its effect on real-time multiplayer games.

 

First, we have some questions about real-world wireless latency.  It seems that limited latency testing in the wild is not sufficient for preparing a game for real-world use.  Ookla's Speedtest is the most widely used network testing tool, at about 5 million tests a day, but we found that it reports only the best of 10 samples, which naturally gives highly optimistic results.  What are better sources of real-world latency data?
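The best-of-10 effect is easy to quantify. The sketch below is illustrative only: it assumes a lognormal latency distribution (a common heavy-tailed shape, not Ookla's actual data) and compares the mean of raw pings against the mean of best-of-10 reports:

```python
import random
import statistics

random.seed(1)

# Illustrative model only: one ping as a lognormal draw.
# The real latency distribution varies by network and location.
def ping_ms():
    return random.lognormvariate(3.9, 0.5)  # median around 49 ms

all_pings = [ping_ms() for _ in range(100_000)]

# "Best of 10" reporting, as described above.
best_of_10 = [min(ping_ms() for _ in range(10)) for _ in range(10_000)]

print(f"mean of all pings:  {statistics.mean(all_pings):6.1f} ms")
print(f"mean of best-of-10: {statistics.mean(best_of_10):6.1f} ms")
```

With this assumed distribution, best-of-10 reporting understates typical latency by roughly half, which is the optimism the question is pointing at.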

 

Looking at latency data, what’s a good way to decide the range of latency a game should accommodate? For example, you could look at the minimum, maximum, and standard deviation, the mean, or the tail (e.g., the 95th percentile).
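For illustration, those statistics can all be computed from one sample set; the round-trip times below are made up, and real numbers would come from your own measurements:

```python
import statistics

# Hypothetical round-trip times in ms, sorted ascending.
samples_ms = sorted([38, 41, 45, 45, 47, 50, 52, 55, 61, 62,
                     70, 88, 95, 110, 130, 150, 180, 240, 310, 420])

def percentile(sorted_data, p):
    """Nearest-rank percentile of an already-sorted list."""
    k = max(0, min(len(sorted_data) - 1,
                   round(p / 100 * len(sorted_data)) - 1))
    return sorted_data[k]

print("min :", samples_ms[0], "ms")
print("mean:", round(statistics.mean(samples_ms), 1), "ms")
print("sdev:", round(statistics.stdev(samples_ms), 1), "ms")
print("p95 :", percentile(samples_ms, 95), "ms")
```

Even with these made-up samples the choice matters: the min and mean look playable while the 95th percentile is several times worse.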

 

What is the best practice to actually test the effect of latency on a game?  Is it better to test with deterministic or random latencies?  For example, for deterministic testing you might use a constant latency at the 95th percentile.  Deterministic latencies make it easier to find issues in a controlled and repeatable manner, while random latencies are closer to what your game will see out in the real world.
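The two styles can be sketched as one latency injector with a mode switch. Every name below is hypothetical (this is not any particular engine's API), and the random mode is seeded so even "random" runs are repeatable:

```python
import heapq
import itertools
import random

class LatencyInjector:
    """Delay message delivery by either a fixed or a random latency.

    Hypothetical sketch of the two testing styles discussed above.
    """

    def __init__(self, mode="fixed", fixed_ms=150.0,
                 mean_ms=150.0, jitter_ms=40.0, seed=42):
        self.mode = mode
        self.fixed_ms = fixed_ms        # e.g. a constant 95th-percentile value
        self.mean_ms = mean_ms
        self.jitter_ms = jitter_ms
        self.rng = random.Random(seed)  # seeded: "random" runs repeat exactly
        self._heap = []                 # (deliver_at_ms, seq, message)
        self._seq = itertools.count()   # tie-breaker for equal deadlines

    def delay_ms(self):
        if self.mode == "fixed":
            return self.fixed_ms
        # Gaussian jitter around the mean, clamped to be non-negative.
        return max(0.0, self.rng.gauss(self.mean_ms, self.jitter_ms))

    def send(self, now_ms, message):
        heapq.heappush(self._heap, (now_ms + self.delay_ms(),
                                    next(self._seq), message))

    def deliver_due(self, now_ms):
        """Pop every message whose delivery time has arrived."""
        out = []
        while self._heap and self._heap[0][0] <= now_ms:
            out.append(heapq.heappop(self._heap)[2])
        return out
```

A deterministic regression run would use mode="fixed" with fixed_ms set to a measured 95th-percentile value; a randomized run would reuse the same seed across runs so any failure it uncovers reproduces.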

 

We have posted more details about this topic and are very interested in what other developers have to say on the matter.  If you are interested in reading more about our thought process, check out our post about mobile latency here.

I'm not aware of any good, thorough latency data. Your best bet is probably to try a number of different devices in different service areas. For example, a home 100 Mbps connection in Korea, a 4G MiFi WiFi adapter in the US, a 20 Mbps DSL connection in Europe, and a 1 Mbps satellite-downlink/modem-uplink connection may all perform very differently in loss, latency, jitter, and bandwidth.

If you're designing your game to be played well, then you should design for at least the 99th percentile, if not the 99.9th or better. Percentiles are also quite hard to pin down, though -- latency and jitter may vary more over macro time (time of day, day of week, internet weather) than they vary from packet to packet.
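One way to see why far-tail percentiles are hard to pin down: under an assumed heavy-tailed latency model (illustration only), an estimate of the 99.9th percentile from a small sample set is dominated by a handful of extreme draws and jumps around from run to run:

```python
import random

random.seed(0)

# Assumed heavy-tailed latency model, purely for illustration.
def sample_latency_ms():
    return random.lognormvariate(4.0, 0.6)

def p99_9(n):
    """Nearest-rank 99.9th percentile from n fresh samples."""
    xs = sorted(sample_latency_ms() for _ in range(n))
    return xs[int(0.999 * n)]

for n in (1_000, 10_000, 100_000):
    print(f"p99.9 estimated from {n:>7} samples: {p99_9(n):7.1f} ms")
```

From 1,000 samples the "99.9th percentile" is literally the single worst sample, so field measurements aiming at that tail need tens of thousands of observations per condition before the number stabilizes.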

I have a strange habit of assuming a nearly-worst-case scenario for end-user situations (hardware, internet, and so on) and aiming to make everybody as happy as possible, without disappointing those who are more invested in their setups.

The worst case is something like 33 kbps of bandwidth and 1,800 milliseconds of latency (GPRS, satellite, etc.).
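To make that worst case concrete: at 33 kbps, serialization delay alone is substantial, before the 1,800 ms of propagation is even counted. A back-of-the-envelope check (the packet size is an assumption):

```python
# Serialization delay: the time to clock one packet onto a 33 kbps link,
# before any propagation or queueing delay is added on top.
link_bps = 33_000        # 33 kbps, the worst case quoted above
packet_bytes = 1_200     # assumed size of one game snapshot plus headers

serialize_ms = packet_bytes * 8 / link_bps * 1000
print(f"{serialize_ms:.0f} ms just to serialize one packet")
```

Roughly 290 ms per packet on the wire means state snapshots have to stay small, or the link saturates before latency compensation even gets a chance to help.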

Thanks for your responses, guys.

 

I'm not aware of any good, thorough latency data. Your best bet is probably to try a number of different devices in different service areas. For example, a home 100 Mbps connection in Korea, a 4G MiFi WiFi adapter in the US, a 20 Mbps DSL connection in Europe, and a 1 Mbps satellite-downlink/modem-uplink connection may all perform very differently in loss, latency, jitter, and bandwidth.

If you're designing your game to be played well, then you should design for at least the 99th percentile, if not the 99.9th or better. Percentiles are also quite hard to pin down, though -- latency and jitter may vary more over macro time (time of day, day of week, internet weather) than they vary from packet to packet.

 

hplus, how does one go about field testing on all the different connections you mentioned?  Would one use a third-party testing company for field testing? It seems expensive and time-consuming for a typical studio to do this kind of testing and make it statistically meaningful. It also appears hard to test and troubleshoot latency issues using only field tests during development.

 

Our opinions below are informed by experience in other fields. We are hoping to understand current best practices in game development. 

 

A little more background information about our perspective: personal computing has been moving from hard-wired to wirelessly connected devices.  Portable devices (smartphones, tablets, and laptops) use only wireless connections, and even consoles and PCs increasingly use wireless connections. Real-time multiplayer action games work best on hard-wired connections, so this transition from hard-wired to wireless connections requires developers to address higher and more variable latency.

 

We’d like to start off by making a distinction between field tests in the wild and lab tests during development.  During development, there is a need for regular regression testing of builds; this would include unit, integration, functional, and performance tests. Regression tests are most useful when automated (run without human input).

 

Lab tests, both automated and live-player, require setting up repeatable yet representative network conditions. This allows direct comparison of test results to assess software quality and uncover bugs. In contrast, when comparing field tests in the wild, both the network conditions and the quality of the software are variable.

 

How does one know where to test, and under what conditions?  And whether one has covered the conditions that customers are actually going to see?  If we spend most of our time testing the tails of the distribution, we may not have confidence in the outcomes the majority of players will experience.  The further out into the tail you test (i.e., a higher percentile) in field tests, the more tests you have to run for the data to be statistically significant (adding to the cost).  Field tests are more expensive than lab tests, so it is better to use fewer field tests and more extensive lab tests.

 

Network engineers have been concerned with similar issues for a long time. IETF RFC 3393 addresses measurement of packet delay variation for IP performance metrics; interestingly, it deprecates use of the term “jitter” due to ambiguity. It views latency as a distribution whose statistics are inferred from measurements.  Network emulators are commonly used in the development of networking products: statistical parameters of the measured distribution are used to configure the emulated test conditions. Testing confidence is then improved by running tests long enough and by importance sampling (stress testing). Repeatability in statistical tests can be achieved by using pseudo-random sequences.
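A minimal sketch of that workflow, under loud assumptions: the measured samples below are hypothetical, and a Gaussian model is used for simplicity even though real latency traces are usually heavier-tailed. Parameters are fitted from the field data, and a fixed seed makes the pseudo-random delay sequence identical on every run:

```python
import random
import statistics

# Hypothetical field measurements in ms; real values would come from traces.
measured_ms = [48, 52, 55, 61, 70, 85, 90, 110, 140, 200]

mu = statistics.mean(measured_ms)
sigma = statistics.stdev(measured_ms)

# Drive the emulated channel from the fitted parameters, with a fixed seed
# so every test run sees the identical pseudo-random delay sequence.
rng = random.Random(12345)
trace_ms = [max(0.0, rng.gauss(mu, sigma)) for _ in range(1000)]

print(f"emulated mean  {statistics.mean(trace_ms):6.1f} ms (target {mu:.1f})")
print(f"emulated stdev {statistics.stdev(trace_ms):6.1f} ms (target {sigma:.1f})")
```

This is the repeatability-via-pseudo-random-sequences idea in miniature: statistically representative conditions, yet bit-for-bit reproducible between runs.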

Edited by SugarcaneGames

Would one use a third-party testing company for field testing?


Note that wireless LANs (WiFi) and wireless access (Mobile, Microwave, Satellite) are different kinds of beasts.
Testing over WiFi is reasonably simple, as all you need is a few kinds of hardware adapters and access points, covering the main chipsets and vendors in the market.
Testing wireless last-mile connectivity is harder, because it varies so much based on locale.

Many game developers who target console games let the platform owners (Microsoft, Sony, Nintendo) do the testing/validation. If you enter the developer program for those platforms, you will receive recommendations that they have developed, and that they will test your game against to certify it for each platform. Because of the gatekeeper function of those companies, it's their requirements, not on-the-ground requirements, that matter.

Testing cellular is harder, although you can get a few different phones or cell modems with different technologies (GPRS, 3G, 4G, LTE, WiMax, etc.), drive around town to find different levels of reception, and then run tests through those setups.
You may also be able to use crowd-sourcing solutions, like odesk, freelancer, etc, to run tests for you if you can clearly define the tasks and reports you need.

I doubt most game developers go to that length, though. The big ones know their networking works because they certify for the consoles and likely use very similar code for PC. The small ones have bigger problems to worry about, and probably make do with a software network emulator like netem or wanem or gns3 or whatever. Make some educated guesses (up to 500 ms round-trip latency, up to 100 ms jitter, 10% packet loss, bandwidth capped at 256 kilobits, or whatever) and call it good when it works.
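Those educated guesses translate almost directly into a netem configuration. A sketch, assuming a Linux box with the sch_netem module and root access; "eth0" is a placeholder for your actual interface:

```shell
# netem shapes egress only, so applied on one host this adds the whole
# round-trip budget to outbound packets: ~500 ms delay with ~100 ms of
# normally distributed jitter, 10% loss, and a 256 kbit bandwidth cap.
tc qdisc add dev eth0 root netem \
    delay 500ms 100ms distribution normal \
    loss 10% \
    rate 256kbit

# Inspect the emulated conditions, then remove them when done:
tc qdisc show dev eth0
tc qdisc del dev eth0 root
```

Splitting the delay across both endpoints (250 ms each) is closer to reality when jitter on each direction matters independently.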

Thanks again for your insight into the industry.

 

Latency on WiFi connections is actually just as high and variable as on cellular networks.  This is due to the quality of the radio channel (fading) and the traffic, and has little to do with the hardware involved.  Recent data collected over a 6-month period by OpenSignal demonstrates this point and is summarized here: http://opensignal.com/blog/2014/03/10/lte-latency-how-does-it-compare-to-other-technologies/

 

OpenSignal collects data from over 1.2 billion WiFi points and 800,000 cell towers.

 

Our limited testing confirms this as well:  http://sugarcanegames.com/latencysummary2.html


This is due to the quality of the radio channel (fading) and the traffic, and has little to do with the hardware involved


I agree -- the fact that it's radio doesn't change between vendors. Unfortunately, it is also the case that some vendors' hardware, drivers, and firmware are better than others', and the race to the bottom in cost that drives the US market means that a vendor that was great last year may have a dud the next year :-(
If you then want to support Linux, your woes multiply.
