[web] My server speed

This topic is 3380 days old, which is more than the 365-day threshold we allow for new replies. Please post a new topic.


Hello all! I'm currently running some speed/load/stress tests on a server (VPS) that I am leasing, but I've realised that my 2 Mbit/s connection is limiting how far I can actually go. If anyone is on a faster connection, would they be able to assess the speed of my server for me? (It would be handy if I had a 100 Mbit connection, but unfortunately I don't!) Here is the IP. If anyone does do any testing, please let me know the parameters and results you get! Thanks!

That site has a 379-byte "coming soon" page with a 6.8 kB image on it, which is not exactly useful for "testing" anything. It certainly shows up as "instantaneous".

The only thing one could really test would be serving static content out of the buffer cache... which means nothing.

If you want to test whether your host is "cheating" you on bandwidth, you'll need something bigger to download (50-100 MB), but you risk being rate-limited just for that if enough people download it. I wouldn't ask potentially 10,000 people to download 50+ MB unless I was 100% sure it would be fine with my bandwidth quota.
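If you do go that route, here is a sketch of such a test (the file name is hypothetical, and SERVER_IP stands in for your server's address): generate an incompressible file on the server, then time the download from the client with curl.

```shell
# On the server: create a 50 MB file of random bytes in the web root
# (random data so on-the-fly compression can't skew the numbers)
dd if=/dev/urandom of=testfile.bin bs=1M count=50

# On the client: download it, discard the body, and print the average speed.
# Uncomment and substitute your server's address:
# curl -o /dev/null -w 'average: %{speed_download} bytes/sec\n' http://SERVER_IP/testfile.bin
```

Averaging over 50 MB smooths out TCP slow-start, which dominates the timing of a 379-byte page.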
You can ping the host to get a rough idea of the network quality, but then again this means nothing. Not only does it not tell you anything about your server's performance, but also ping datagrams may be routed differently (and probably are!) than your TCP traffic.

If you want to know the "general performance" of your web server, it gets much more complicated, as there are different metrics and different usage scenarios. Do you need the fastest possible response times, or maximum throughput, or both? Or does neither matter, and you only want to serve as many people as possible in "acceptable" time?
Do you serve static content or dynamic? Few or many "extra requests" for CSS and images? Does the static content fit into the buffer cache? Are there database lookups involved?

You'll need something that is representative of whatever you want to do if the results are to tell you anything. Obviously, serving a static 500-byte page from cache is different from serving a 50 MB file from disk, different from serving a page generated by a PHP script with 10 database lookups, and different again from streaming a movie.

Hello, thanks for your reply.

Yes, there are certainly a lot of variables and scenarios when it comes to load testing and the like. The reason I suggested that page is because I have been using JMeter to test certain parts of my site, and it could serve that page approx. 25 times a second (with 40 users making concurrent, simultaneous requests). What I began to notice is that all my tests seem to peak at 160-210 kB/s, and the more images on a page, the lower the overall kB/s, which I'm assuming is down to the overhead of having to request separate files. As my net speed is about 220 kB/s, I realised that I cannot really trust my test results, as the bottleneck could be my own connection speed, which is why I am looking for someone with a faster connection than mine to test the page and report back the maximum achieved kB/s (and what their connection is).
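For what it's worth, a rough back-of-the-envelope check supports that suspicion (assuming each page served is the 379-byte HTML plus the ~6.8 kB image):

```shell
# 25 pages/sec, each page = 379 B of HTML + ~6800 B of image
awk 'BEGIN {
  bytes_per_page = 379 + 6800
  rate_kb = 25 * bytes_per_page / 1024
  printf "%.1f kB/s\n", rate_kb    # -> 175.3 kB/s
}'
```

That lands right inside the 160-210 kB/s plateau, so the client link being the bottleneck is a plausible explanation.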

I am very new to this, so please tell me if this is a silly way to do it!

If you have many images on your page, then you should consider keep-alive connections. Normally a client makes one connection per request, but with keep-alive it will make one connection and download several resources over that same connection.

Care has to be taken before doing any such "optimisations", though... it is not trivial at all and requires you to read the documentation very carefully.
For example, it is tempting to set the keep-alive timeout to some large value like 20-30 seconds with the idea that you'll save on extra connections. While that's true, it also blocks one server process for that time. So, if you have 500 concurrent visitors, you may have 499 processes waiting for nothing (if the maximum number of connections is that high).
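For Apache, these are the directives involved (the directive names are the real httpd ones; the values shown are just the stock defaults, not recommendations):

```apache
# httpd.conf sketch -- illustrative values only
KeepAlive On
MaxKeepAliveRequests 100   # requests allowed per persistent connection
KeepAliveTimeout 5         # seconds an idle worker waits for the next request;
                           # a large value here ties up workers as described above
```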

For what it's worth, here are some benchmarks with 5, 10, 50, and 100 concurrent connections; the server seems to top out at roughly 150 requests per second (one full listing, and just the summary figures for the others).

ab -n 500 -c 5
This is ApacheBench, Version 2.3 <$Revision: 655654 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Finished 500 requests

Server Software: Apache-Coyote/1.1
Server Hostname:
Server Port: 80

Document Path: /
Document Length: 379 bytes

Concurrency Level: 5
Time taken for tests: 29.247 seconds
Complete requests: 500
Failed requests: 0
Write errors: 0
Total transferred: 301000 bytes
HTML transferred: 189500 bytes
Requests per second: 17.10 [#/sec] (mean)
Time per request: 292.473 [ms] (mean)
Time per request: 58.495 [ms] (mean, across all concurrent requests)
Transfer rate: 10.05 [Kbytes/sec] received

Connection Times (ms)
min mean[+/-sd] median max
Connect: 137 151 189.2 138 3135
Processing: 138 140 2.2 140 156
Waiting: 138 140 2.2 139 156
Total: 276 291 189.2 278 3275

Percentage of the requests served within a certain time (ms)
50% 278
66% 279
75% 279
80% 279
90% 282
95% 287
98% 291
99% 294
100% 3275 (longest request)

ab -n 500 -c 10

Requests per second: 35.06 [#/sec] (mean)
Time per request: 285.237 [ms] (mean)
Time per request: 28.524 [ms] (mean, across all concurrent requests)
Transfer rate: 20.61 [Kbytes/sec] received

ab -n 500 -c 50
Requests per second: 142.55 [#/sec] (mean)
Time per request: 350.748 [ms] (mean)
Time per request: 7.015 [ms] (mean, across all concurrent requests)
Transfer rate: 83.81 [Kbytes/sec] received

ab -n 1000 -c 100
Requests per second: 153.57 [#/sec] (mean)
Time per request: 651.152 [ms] (mean)
Time per request: 6.512 [ms] (mean, across all concurrent requests)
Transfer rate: 90.28 [Kbytes/sec] received
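As a quick sanity check that these figures are internally consistent (assuming every response transfers the same 602 bytes as in the c=5 run, i.e. Total transferred / Complete requests = 301000 / 500):

```shell
# transfer rate = requests/sec * bytes per request
awk 'BEGIN {
  bytes_per_req = 301000 / 500
  rate = 153.57 * bytes_per_req / 1024
  printf "%.2f Kbytes/sec\n", rate   # matches the 90.28 ab reported for c=100
}'
```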

But remember... this really tells you nothing. There's a huge difference between serving one page which is practically guaranteed to be in cache and having to seek out one of 15,000 pages on disk, then running a script and doing some database lookups (on another server, maybe).
Requesting 300-byte files isn't any good as a bandwidth measure either.

OK, thanks a lot for taking the time to do that; I've noted your comments.

Keep-alive is already in place. Is that benchmark just the page without the image? Also, what speed is your net connection?

Really, I'm just trying to learn about this process. I didn't think it was going to be as confusing as it is! I want to make sure I am getting the most out of my server for my web app, and learn how to scale efficiently. It's going to take time, I think!

Thanks again.

Original post by rackham
Is that benchmark just the page without the image? Also what speed is your net connection?
Yes, just the page, and 20 MBit.
