Ever since I built my cluster, people have been asking me why I used Apache and not Nginx. I started using Apache because I was just used to it. People say it's slow and takes up too much memory, but I find that with a little tuning it can perform quite well.
Still, Nginx does have a reputation for being fast. I wanted to see which web server would be best for my cluster, so I installed Apache on one Pi and Nginx on another.
I used two Raspberry Pi Model B servers, each with an identical SanDisk 4GB class 4 SD card. In raspi-config, I overclocked each Pi to 800MHz and allocated 16MB of memory to the GPU. I used the same version of Raspbian (released on 2013-09-25) on both servers, and exactly the same test data and scripts on each Pi.
Follow this link to see how I set up Nginx and uWSGI.
I tuned Apache by removing modules and increasing the number of server processes. These tuning techniques don't apply to Nginx.
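For reference, the tuning was roughly along these lines on the Apache 2.2 install that ships with Raspbian. The module names and process counts below are illustrative, not the exact values I used:

# Disable modules that aren't needed (the module names here are just examples):
sudo a2dismod autoindex status

# Increase the number of pre-forked server processes in /etc/apache2/apache2.conf;
# the numbers are illustrative, available RAM on the Pi is the real limit:
#   <IfModule mpm_prefork_module>
#       StartServers       10
#       MinSpareServers    10
#       MaxSpareServers    20
#   </IfModule>

sudo service apache2 restart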
I tested each server with three different types of request: a static HTML file, a simple CGI script, and a complex CGI script. The HTML file is a cached page from my Content Management System (the CMS doesn't need to execute for cached pages to be served; they can be served by the web server as normal HTML files). The simple script just prints an HTTP header, prints "Hello World!", and exits.
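A minimal version of the simple script looks something like this (the exact contents aren't critical, anything that prints a header and a short body will do):

#!/usr/bin/env python
# Print an HTTP header, a blank line to end the headers, then the body, and exit.
print("Content-Type: text/plain")
print("")
print("Hello World!")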
The complex script used in these tests was the CMS that this site is built on. I disabled page caching on both Pi servers, so that pages had to be generated dynamically by the CMS. When a page is served, the CMS script has to parse two XML files to get meta data, read several snippets of HTML from the file system, and print them to a socket.
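I won't post the CMS code here, but the work it does per request is roughly along these lines. This is only an illustrative sketch, not the real script; the file names and structure are made up:

#!/usr/bin/env python
# Illustrative sketch of the per-request work: parse XML metadata,
# read HTML snippets from disk, and write the page to stdout (the socket under CGI).
import sys
import xml.etree.ElementTree as ET

def render_page(slug):
    # Parse two XML files for page and site metadata (hypothetical paths)
    page_meta = ET.parse('meta/%s.xml' % slug).getroot()
    site_meta = ET.parse('meta/site.xml').getroot()
    title = page_meta.findtext('title') or site_meta.findtext('title')

    # Read several snippets of HTML from the file system
    parts = []
    for name in ('header.html', 'nav.html', '%s.html' % slug, 'footer.html'):
        with open('snippets/' + name) as f:
            parts.append(f.read())

    # Print the HTTP header and the assembled page
    sys.stdout.write('Content-Type: text/html\n\n')
    sys.stdout.write('<title>%s</title>\n' % title)
    sys.stdout.write(''.join(parts))

if __name__ == '__main__':
    render_page('spec')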
Requests were generated with Apache Bench using a command like this:
ab -n 1000 -c 250 http://192.168.0.21/spec.html
where 1000 is the number of requests to issue, and 250 is the number of concurrent users.
The Raspberry Pi running Nginx had IP address 192.168.0.21, and the Pi running Apache had 192.168.0.22. I tested each server over a range of concurrent users for each type of request.
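Covering a range of concurrent users just means re-running ab with a different -c value each time. For the static-file tests, a loop like this one covers the range (the output file names are only for illustration):

for c in 50 100 150 200 250; do
    ab -n 1000 -c $c http://192.168.0.21/spec.html > nginx_static_c$c.txt
    ab -n 1000 -c $c http://192.168.0.22/spec.html > apache_static_c$c.txt
done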
Static files
Static files are easy to serve, so I used a range of 50 to 250 concurrent users in these tests. Apache handled 220 connections per second, while Nginx handled around 300 connections per second.
Nginx came out ahead on these tests.
Dynamic content tests
Simple script
In these tests I used ab to request this URL: http://192.168.0.21/cgi-bin/hello.py. I set the number of requests to 100, and tested over a range of 10 to 50 concurrent users.
Apache handled 4.78 connections per second and Nginx 4.65, but the mean time per request was lower for Nginx than for Apache, so Apache was slower overall in this test. The difference was not very pronounced under a low load, but it grew as the load increased.
Complex script
The URL used in these tests was http://192.168.0.21/spec.html. This test is the most CPU intensive, so I used 5 to 25 concurrent users.
Under a low load, Apache performed slightly better than Nginx, but only by a very slim margin. As the load increased the gap reversed: with 25 concurrent users, Nginx was noticeably faster than Apache.
Conclusions
There are many variables involved in server performance. These tests don't definitively determine which server is 'better', they just helped me decide which one is best for my specific needs.
Most of the pages on my site are static, and Nginx is faster when it comes to static pages. It looks like Nginx is a better choice for me.