# [web] My server speed

## Recommended Posts

Hello all! I'm currently running some speed/load/stress tests on a server (VPS) that I am leasing, but I've realised that my 2 Mbit/s connection is limiting how far I can actually go. If anyone is on a faster connection, would they be able to assess the speed of my server for me? (It would be handy if I had a 100 Mbit connection, but unfortunately I don't!) Here is the IP: http://216.154.221.138/. If anyone does do any testing, please let me know the parameters and results you get. Thanks!

##### Share on other sites
That site has a 379-byte "coming soon" page with a 6.8 kB image on it, which is not exactly useful for "testing" anything. It certainly shows up as "instantaneous".

The only thing one could really test would be serving static content out of the buffer cache... which means nothing.

If you want to test whether your hosting provider is "cheating" you on bandwidth, you'll need something bigger to download (50-100 MB), but you'll risk being rate-limited just for that if enough people download it. I wouldn't ask potentially 10,000 people to download 50+ MB unless I was 100% sure it would be OK with my bandwidth quota.
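A minimal sketch of such a raw-bandwidth test, assuming shell access on the server; the file name and example URL are placeholders, not anything from this thread:

```shell
# On the server: create a 50 MB file of zeros in the web root
# (zeros compress well, so disable compression on the server for a fair test).
dd if=/dev/zero of=bigfile.bin bs=1M count=50 2>/dev/null
wc -c < bigfile.bin   # prints 52428800 (50 * 1048576 bytes)

# On the client: download it, discard the body, and let curl report
# the average transfer speed (bytes per second):
# curl -o /dev/null -w 'average: %{speed_download} bytes/s\n' http://example.com/bigfile.bin
```

The `curl` line is commented out here because it needs a live server; `%{speed_download}` is a standard curl write-out variable.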
You can ping the host to get a rough idea of the network quality, but then again this means nothing. Not only does it tell you nothing about your server's performance, but ping datagrams may also be routed differently (and probably are!) than your TCP traffic.

If you want to know the "general performance" of your web server, it gets much more complicated, as there are different metrics and different usage scenarios. Do you need the fastest possible response times, or maximum throughput, or both? Or does it not matter at all, and you only want to serve as many people as possible in "acceptable" time?
Do you serve static content or dynamic? Few or many "extra requests" for CSS and images? Does the static content fit into the buffer cache? Are there database lookups involved?

You'll need a test that is representative of whatever you actually want to do if the results are supposed to tell you anything. Obviously, serving a static 500-byte page from cache is different from serving a 50 MB file from disk, different from serving a page generated by a PHP script with 10 database lookups, and yet again different from streaming a movie.

##### Share on other sites

Yes, there are certainly a lot of variables and scenarios when it comes to load testing and the like. The reason I suggested that page is because I have been using JMeter to test certain parts of my site, and it could serve that page approximately 25 times a second (with 40 users making concurrent, simultaneous requests). What I began to notice is that all my tests seemed to peak at 160-210 kB/s; the more images on a page, the lower the overall kB/s, which I'm assuming is down to the overhead of having to request separate files. As my net speed is about 220 kB/s, I realised that I cannot really trust my test results, as the bottleneck could be my own connection speed, which is why I am looking for someone with a faster connection than mine to test the page and report back the maximum achieved kB/s (and what their connection is).
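That suspicion can be sanity-checked with quick arithmetic: if the page plus its image is roughly 7.2 kB (379 B of HTML plus the 6.8 kB image mentioned earlier, rounded) and JMeter measured about 25 pages per second, the implied throughput sits right around the 220 kB/s downlink, so the client connection plausibly is the bottleneck. The byte count here is a rounded assumption, not an exact measurement:

```shell
page_bytes=7200   # ~379 B HTML + ~6.8 kB image (rounded assumption)
rps=25            # pages per second reported by JMeter
echo "implied throughput: $(( page_bytes * rps / 1000 )) kB/s"
# prints: implied throughput: 180 kB/s
```

180 kB/s falls squarely inside the observed 160-210 kB/s plateau.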

I am very new to this, so please tell me if this is a silly way to go about it!

##### Share on other sites
If you have many images on your page, then you should consider keepalive connections. Normally a client makes one connection for one request, but with keepalives, it will make one connection and download several resources via the same connection.

Care has to be taken before doing any such "optimisations", though... it is not trivial at all and requires you to read the documentation very carefully.
For example, it is tempting to set the keepalive timeout to some large value like 20-30 seconds with the idea that you'll save on extra connections. While that's true, it will also tie up one server process for that long. So, if you have 500 concurrent visitors, you may have 499 processes waiting for nothing (if the maximum number of connections is that high).
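For illustration, the relevant directives on Apache httpd look roughly like this (the values are illustrative, not recommendations; note that the benchmark output below reports Apache-Coyote, i.e. Tomcat, which configures keepalive differently, via connector attributes in server.xml):

```apache
# Illustrative keepalive settings in httpd.conf
KeepAlive On
# How many requests one connection may serve before it is closed
MaxKeepAliveRequests 100
# Seconds an idle connection is held open; keep this small to avoid
# tying up server processes, as described above
KeepAliveTimeout 5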

For what it's worth, here are some benchmarks with 5, 10, 50, and 100 concurrent connections; the server seems to top out at roughly 150 requests per second (one full listing, and just the summary figures for the others).
```
ab -n 500 -c 5 http://216.154.221.138/

This is ApacheBench, Version 2.3 <$Revision: 655654$>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking 216.154.221.138 (be patient)
Completed 100 requests
Completed 200 requests
Completed 300 requests
Completed 400 requests
Completed 500 requests
Finished 500 requests

Server Software:        Apache-Coyote/1.1
Server Hostname:        216.154.221.138
Server Port:            80

Document Path:          /
Document Length:        379 bytes

Concurrency Level:      5
Time taken for tests:   29.247 seconds
Complete requests:      500
Failed requests:        0
Write errors:           0
Total transferred:      301000 bytes
HTML transferred:       189500 bytes
Requests per second:    17.10 [#/sec] (mean)
Time per request:       292.473 [ms] (mean)
Time per request:       58.495 [ms] (mean, across all concurrent requests)
Transfer rate:          10.05 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:      137  151 189.2    138    3135
Processing:   138  140   2.2    140     156
Waiting:      138  140   2.2    139     156
Total:        276  291 189.2    278    3275

Percentage of the requests served within a certain time (ms)
  50%    278
  66%    279
  75%    279
  80%    279
  90%    282
  95%    287
  98%    291
  99%    294
 100%   3275 (longest request)

ab -n 500 -c 10 http://216.154.221.138/

Requests per second:    35.06 [#/sec] (mean)
Time per request:       285.237 [ms] (mean)
Time per request:       28.524 [ms] (mean, across all concurrent requests)
Transfer rate:          20.61 [Kbytes/sec] received

ab -n 500 -c 50 http://216.154.221.138/

Requests per second:    142.55 [#/sec] (mean)
Time per request:       350.748 [ms] (mean)
Time per request:       7.015 [ms] (mean, across all concurrent requests)
Transfer rate:          83.81 [Kbytes/sec] received

ab -n 1000 -c 100 http://216.154.221.138/

Requests per second:    153.57 [#/sec] (mean)
Time per request:       651.152 [ms] (mean)
Time per request:       6.512 [ms] (mean, across all concurrent requests)
Transfer rate:          90.28 [Kbytes/sec] received
```
But remember... this really tells you nothing. There's a huge difference between serving one page that is practically guaranteed to be in cache and having to seek one out of 15,000 pages on disk, then running a script and doing some database lookups (maybe on another server).
Requesting ~300-byte files isn't any good as a bandwidth measure either.
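To see why, take the numbers from the c=5 run above: 301,000 bytes transferred over 500 requests is about 602 bytes on the wire per request, and each request took a mean of roughly 292 ms, almost all of it connection setup and server latency rather than data transfer:

```shell
bytes_per_req=602   # Total transferred / Complete requests = 301000 / 500
ms_per_req=292      # mean "Time per request" from the c=5 run
echo "effective per-connection rate: $(( bytes_per_req * 1000 / ms_per_req )) bytes/s"
# prints: effective per-connection rate: 2061 bytes/s
```

About 2 kB/s per connection, times 5 concurrent connections, matches the ~10 kB/s "Transfer rate" ab reported: latency dominates, bandwidth is barely exercised.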

##### Share on other sites
OK, thanks a lot for taking the time to do that; I've noted your comments.

Keep-alive is already in place. Is that benchmark just the page without the image? Also, what speed is your net connection?

Really I'm just trying to learn about this process; I didn't think it was going to be as confusing as it is. I want to make sure I am getting the most out of my server for my web app, and learn how to scale efficiently. It's gonna take time, I think!

Thanks again!

##### Share on other sites
Quote:
Original post by rackham
Is that benchmark just the page without the image? Also what speed is your net connection?
Yes, just the page, and 20 MBit.
