Load Distribution with Nginx and Cloudflare

Nginx is a popular reverse proxy application that is very efficient at serving static content and forwarding requests to other webservers.  It can provide a much needed performance boost for websites that have a lot of visitors and static content like images, videos, and PDF files, while dynamic content like PHP, Python, or Ruby scripts is passed off to an interpreter.  This is usually an Apache webserver, which receives a request for dynamic content such as a PHP script and renders it for the user.  When scaling these services, it is important to note that Apache uses a lot of memory to serve these requests, so optimizing content delivery matters.  This is where Nginx is very handy, as it serves static content like images very quickly with a minimal memory footprint.  By combining the two you can serve a lot more traffic.

If you choose to use Nginx as a reverse proxy, you’ll also be able to customize where content is delivered from.  For example, you can serve images from one cluster of servers and videos from another:

[Diagram: an Nginx proxy routing image requests to one server cluster and video requests to another]

This helps to optimally scale your servers and minimize idling.

For our example, suppose we use Nginx on 192.34.56.28.  The DNS record would look like this:

domain.com.            300     IN      A       192.34.56.28

Keeping the TTL small, say 300 seconds (5 minutes), allows you to scale your infrastructure horizontally fairly quickly, but those IPs are best reserved for front-facing Nginx proxies.  These proxies, in turn, can have as many upstream webservers as you’d like handling the actual traffic.  This shields the webservers from direct exposure to DDoS attacks, and also lets you optimize delivery by routing different types of content to different destinations.

Here is a snippet of Nginx configuration for multiple upstream IPs that you can place on the main entry point:

location / {
    proxy_pass         http://LOAD-BALANCED-IPS;
    proxy_redirect     off;
    proxy_set_header   Host             $host;
    proxy_set_header   X-Real-IP        $remote_addr;
    proxy_set_header   X-Forwarded-For  $proxy_add_x_forwarded_for;
}

upstream LOAD-BALANCED-IPS {
    #LBENTRYPOINT
    server 192.34.56.29:80 max_fails=1 fail_timeout=1;
    server 192.34.56.30:80 max_fails=1 fail_timeout=1;
}

This forwards all requests for domain.com arriving at the Nginx proxy (192.34.56.28) to 192.34.56.29 and 192.34.56.30, evenly distributing requests between the two upstream servers.  The best part about this setup is that if an upstream server goes down, Nginx stops sending visitors to it, so no error page is shown.  After the fail_timeout period, Nginx will try the server again, and once that upstream is back online, traffic to it resumes.

Placing a tag like “#LBENTRYPOINT” allows you to write a script that inserts or deletes a line based on the IP address of your webserver. You can use command-line tools like sed to accomplish this on Linux.

Once we have added our SSH key to the Nginx proxy, we can run a command that inserts a new upstream server entry (192.34.56.31) on the Nginx proxy (192.34.56.28):

sed -i '/#LBENTRYPOINT/a\server 192.34.56.31:80 max_fails=1 fail_timeout=1;' /etc/nginx/nginx.conf && service nginx reload

This assumes your configuration file is at /etc/nginx/nginx.conf; on an Nginx compiled from source it could live under /usr/local/nginx or /usr/share/nginx.  Make sure to tailor the path to your own system.

After the line is inserted, it is also prudent to check for duplicate entries and remove them: load balancing distributes traffic evenly among all ‘server’ entries in the list, so a duplicated entry would receive a double share of requests.  A sketch of an insert that guards against duplicates is shown below.
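To keep the insert idempotent, you can guard the sed call with a quick grep.  This is a minimal sketch under the same assumptions as above (config at /etc/nginx/nginx.conf, the #LBENTRYPOINT tag present); the IP is the illustrative 192.34.56.31 from the example:

#!/bin/sh
# Illustrative values from the example above; adjust for your own setup.
NEW_IP="192.34.56.31"
CONF="/etc/nginx/nginx.conf"

# Insert the upstream entry only if it is not already present,
# so re-running the script never creates duplicate 'server' lines.
if ! grep -q "server ${NEW_IP}:80" "$CONF"; then
    sed -i "/#LBENTRYPOINT/a\\server ${NEW_IP}:80 max_fails=1 fail_timeout=1;" "$CONF"
    service nginx reload
fi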

The following command would remove this upstream server (192.34.56.31) from Nginx:

sed -i "/$192.34.56.31/d" /etc/nginx/nginx.conf && service nginx reload

 

With these simple tools you can now automate the process of cloning a VM and placing it into the proxy server’s upstream rotation.  This is essentially scaling your proxy server vertically: one front-facing proxy, with more and more webservers behind it.

To add additional proxy servers and scale horizontally, we need a DNS manager with an API.  Cloudflare offers just such a solution.  Click Account in the Cloudflare dashboard and copy your API key.

Cloudflare allows you to modify DNS records with three API commands: rec_new, rec_edit, and rec_delete.  Their documentation covers each in greater detail.

For a quick example, we will create a new subdomain for our images using Cloudflare’s API.  We’ll call this subdomain images.domain.com and give it a 300 second TTL (5 minutes):

[root@web ~]# curl "https://www.cloudflare.com/api_json.html?a=rec_new&tkn=62a946da58115cc89cff61f84b4a6c8f401b3&email=root@domain.com&z=domain.com&type=A&name=images&ttl=300&content=192.34.56.28"

{"request":{"act":"rec_new","a":"rec_new","tkn":"62a946da58115cc89cff61f84b4a6c8f401b3",
"email":"root@domain.com","z":"domain.com","type":"A","name":"images","ttl":"300",
"content":"192.34.56.28"},"response":{"rec":{"obj":{"rec_id":"32696770",
"rec_tag":"b469d45498fc38d7792a46bafcff0136","zone_name":"domain.com",
"name":"images.domain.com","display_name":"images","type":"A","prio":null,
"content":"192.34.56.28","display_content":"192.34.56.28","ttl":"300","ttl_ceil":86400,
"ssl_id":null,"ssl_status":null,"ssl_expires_on":null,"auto_ttl":0,"service_mode":"1",
"props":{"proxiable":1,"cloud_on":1,"cf_open":0,"ssl":0,"expired_ssl":0,"expiring_ssl":0,
"pending_ssl":0,"vanity_lock":0}}}},"result":"success","msg":null}

Adding more proxies is as simple as running the same command with the IP address of a new proxy; the result is round-robin DNS load balancing.  You can also control how requests are handled by the DNS server using service_mode: setting service_mode to 1 routes requests through the ‘orange’ cloud of Cloudflare’s CDN proxies, while setting it to 0 points the A record directly at the IP address you specified.  A sketch of toggling this setting follows.
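For example, to switch the images record to point directly at the proxy IP (service_mode 0), a rec_edit call along these lines should work.  This is a sketch based on the legacy client API; the record id (32696770) comes from the rec_new response above, and the token and email are the same example values:

curl "https://www.cloudflare.com/api_json.html?a=rec_edit&tkn=62a946da58115cc89cff61f84b4a6c8f401b3&email=root@domain.com&z=domain.com&id=32696770&type=A&name=images&content=192.34.56.28&ttl=300&service_mode=0"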

[Screenshot: the A record for images.domain.com with the Cloudflare proxy (orange cloud) toggle]

Now we can modify this record and add new Nginx proxies to scale horizontally.  The entire process can be automated using Bourne shell, PHP, Python, Ruby, and so on; a small shell sketch follows.
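As a rough sketch of that automation, the snippet below wraps the rec_new call so a freshly provisioned proxy can be registered with one command; the token, email, and zone are placeholders you would substitute with your own values:

#!/bin/sh
# Usage: ./add-proxy.sh 192.34.56.40   (hypothetical new Nginx proxy IP)
PROXY_IP="$1"
CF_TOKEN="YOUR_API_KEY"        # copied from the Cloudflare Account page
CF_EMAIL="root@domain.com"
ZONE="domain.com"

# Add another A record for images.domain.com; with several A records present,
# DNS answers rotate through them (round-robin load balancing).
curl -s "https://www.cloudflare.com/api_json.html?a=rec_new&tkn=${CF_TOKEN}&email=${CF_EMAIL}&z=${ZONE}&type=A&name=images&ttl=300&content=${PROXY_IP}"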

How Does Nginx Speed Up Server Performance?

Nginx, a web server exported from chilly Russia, was found by Netcraft in June 2012 to be the second most popular webserver for active websites. In terms of overall market share, the study placed it third, behind Apache and Microsoft IIS. However, considering that the server’s first public release was in 2004, the software’s growth has been nothing short of exponential.

Nginx now serves over 60 million domains and powers some of the internet’s most trafficked sites, such as Hulu, Facebook, Netflix, Instagram, and WordPress. Its rapidly growing popularity and great reception make it a force to be reckoned with for competitors and a great help to developers looking to set up a fast and efficient site.

How exactly does nginx achieve these blazing speeds? It sets itself apart from the other top web server powerhouses (such as Apache) by being able to handle an enormous number of concurrent connections: nginx’s architecture does not require each connection to spin up a new process.

Instead of Apache’s threaded method, Nginx speeds up server performance by using an asynchronous, event-driven approach. The software responds quickly to server requests and serves static files at a brisk pace: JavaScript, CSS, various media, and movies charge through the nginx infrastructure at high speed.
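That concurrency model is visible right in the main configuration: a small, fixed pool of workers multiplexes all connections. A minimal sketch (the numbers are illustrative, not recommendations):

# nginx.conf (illustrative values)
worker_processes  4;              # typically one worker per CPU core

events {
    worker_connections  4096;     # each worker multiplexes thousands of
                                  # connections instead of forking a new
                                  # process or thread per connection
}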

Nginx is also easy to scale. When it works as a reverse proxy, the program can be a powerful load balancer, spreading connections out across as many available servers as needed. This increases the speed of each server and beefs up security: nginx offers sophisticated denial-of-service protection by separating harmful traffic from harmless traffic and dealing with each separately.

All of this highlights one of nginx’s greatest strengths: its flexibility. Depending on the situation, nginx can either act as an independent web server or be configured as a reverse proxy with an application server behind it to process the dynamic code (like php, python, perl, and others).

While the program is nearly unparalleled in the way that it serves static files, making it into a reverse proxy can increase speed even more. Combining nginx with an application server behind it (Apache is a great choice for the job) allows nginx to continue serving site visitors while the application server works through the meatier back-end processing.

What started eight years ago as a way for the giant Russian site Rambler to process its hundreds of millions of daily requests has become a sleek, efficient, useful program that can speed up the processes of any server. It may only serve around 10% of the internet’s domains now, but nginx moves quickly, and that percentage can only keep zooming up.

5 Ways To Speed Up Your Website With Nginx Web Server

We’ve been using the Nginx server for nearly 5 years now at ServerStack with amazing results. Our favorite implementation is something we’ve nicknamed “Apachix,” an Nginx reverse proxy in front of an Apache server. This provides a transparent configuration that requires no code changes to your application, allows you to continue using .htaccess files, and delivers all of the performance boosts associated with Nginx.

Recently we ran a set of benchmarks to gauge how much Nginx can outperform Apache. The best results were delivered by Nginx combined with PHP-FPM, tested against Apache running mod_php: an overall 300% increase in site performance, measured by successful concurrent requests per second and the total number of requests served during the test period.

Nginx consistently beats Apache in high-concurrency environments, and not only does it deliver faster performance, it also uses less memory. This means you can do more with the exact same dedicated server you already have. It is worth repeating: instead of upgrading processors or memory, or adding another web server to your account, you can simply switch your web server over to Nginx and instantly gain the performance advantages.

Our fully managed hosting means you’ll gain all of our expertise in running and configuring Nginx server for high performance hosting environments. Here I’ll detail the top 5 configurations that will dramatically improve your website performance.

1. Use Nginx server as a static web server

Since Nginx runs as a high-performance web server utilizing very few resources, it is ideal for delivering static content instead of Apache. Not only does Nginx use very few resources, its code is designed to allow a very large number of simultaneous (concurrent) connections. It is also very good at answering slower clients, like mobile phones, without tying up the server. Overall you’ll see a 2-3x improvement in website speed if your static content is served via Nginx.
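A minimal static-content server block might look like the following; the hostname and document root are illustrative, not prescriptions:

server {
    listen       80;
    server_name  static.domain.com;   # hypothetical static-content hostname
    root         /var/www/static;

    # Cache static assets aggressively on the client side and skip access
    # logging for them, keeping the server doing as little work as possible.
    location ~* \.(jpg|jpeg|png|gif|ico|css|js|pdf)$ {
        expires     30d;
        access_log  off;
    }
}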

2. Use Nginx server as a reverse proxy to Apache web server

This is our favorite upgrade because it requires no code changes: all of your applications continue to work exactly as they did, including .htaccess files. One of the hardest things we’ve heard from our customers about switching to Nginx is having to rewrite all of their rewrite rules, which can be the trickiest part of the migration. With Nginx proxying requests to Apache on the back end, your application runs exactly the same way it did under Apache. All of the performance without any of the pain. A minimal sketch of the proxy side follows.
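As a sketch, the proxy side of an “Apachix” setup can be as small as the block below, assuming Apache has been moved to an internal port such as 8080:

server {
    listen       80;
    server_name  domain.com;

    location / {
        # Hand every request to Apache; .htaccess files and rewrite rules
        # keep working unchanged on the Apache side.
        proxy_pass         http://127.0.0.1:8080;
        proxy_set_header   Host             $host;
        proxy_set_header   X-Real-IP        $remote_addr;
        proxy_set_header   X-Forwarded-For  $proxy_add_x_forwarded_for;
    }
}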

3. Use Nginx with FastCGI PHP server

Nginx combined with a FastCGI PHP server like PHP-FPM produces the most powerful PHP server available, and it is the most effective way to serve PHP-based sites. The most popular software packages that run on top of this stack are WordPress, Drupal, Joomla, OpenX, and ExpressionEngine. In this environment Nginx serves all of your static content while the PHP-FPM daemon processes all of your dynamic PHP requests. Overall you’ll see a 300-400% increase in performance compared to a traditional Apache + mod_php setup.
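A typical split looks roughly like this; the document root and PHP-FPM address are assumptions to adapt to your own install:

server {
    listen       80;
    server_name  domain.com;
    root         /var/www/html;
    index        index.php index.html;

    # Nginx serves static files directly; anything else falls through to PHP.
    location / {
        try_files $uri $uri/ /index.php?$args;
    }

    # Dynamic requests are handed to the PHP-FPM daemon over FastCGI.
    location ~ \.php$ {
        include         fastcgi_params;
        fastcgi_param   SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass    127.0.0.1:9000;   # or a unix: socket, per your PHP-FPM config
    }
}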

4. Use Nginx server as a cache proxy

Instead of dynamically generating every request that comes to your website, Nginx has a built-in cache system. This allows dynamic responses, whether PHP, Java, or any other dynamic page, to be cached directly on the server, reducing the load on your CPU and the system in general. Your server won’t have to generate the exact same dynamic page for two different users if there is no difference in content.
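A minimal caching sketch, assuming a back-end application server on an internal port; the cache path, zone name, and lifetimes are illustrative:

# In the http block: where cached responses live and how much they may grow.
proxy_cache_path  /var/cache/nginx  levels=1:2  keys_zone=dynamic_cache:10m
                  max_size=1g  inactive=60m;

server {
    listen       80;
    server_name  domain.com;

    location / {
        proxy_pass         http://127.0.0.1:8080;   # hypothetical back end
        proxy_cache        dynamic_cache;
        proxy_cache_valid  200 302  10m;   # reuse successful pages for 10 minutes
        proxy_cache_valid  404      1m;
        proxy_set_header   Host     $host;
    }
}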

5. Use Nginx server as the origin for your CDN

We’ve already covered that Nginx is fantastic at serving static content. But what can you do to make your site even faster? Layer a CDN on top of the server to distribute your content across the globe. A content distribution network (CDN) replicates your media, pictures, movies, and other large objects to a network of thousands of servers worldwide. Depending on where a visitor is coming from, they’ll pull media objects from the closest physical CDN location instead of reaching all the way back to your server. And for really intensive sites, Nginx can deliver so much static content to a CDN that we’ve been able to feed a global site pushing over 100 Gbps with just 2 back-end Nginx servers.

These are the most popular configuration options for Nginx. If your server is starting to struggle with the load you’re generating, or you’re getting too much traffic, it may be time to switch to Nginx. Selecting a fully managed hosting partner can make that transition a lot easier, since all of the configuration and prep work is taken care of. All that’s left for you to do is watch your site performance improve and serve far more users from the same existing server.