Varnish vs. Nginx

Caching plays a major role in website speed and latency, which makes choosing the right proxy server for your website's needs crucial. Varnish and Nginx are two such services, each with distinct strengths and weaknesses that could make one a better fit for your website.

Frank Garland - Technical Researcher

Speed is king for websites. Something as simple as the time it takes a webpage to load affects visitor retention rates and acts as a signal of the website's quality. A key ingredient in website speed is caching, along with the proxy servers and reverse proxies that websites use to handle it. In this article we'll discuss the similarities and differences between two popular options, Varnish and Nginx, and hopefully provide some insight into which one might best suit your website's needs.


First released in 2006, Varnish immediately differentiated itself from competitors by focusing primarily on HTTP acceleration: speeding up the delivery of web content over HTTP. That laser focus has drawn in big-name companies such as Pinterest and Twitch, which incorporate Varnish's HTTP acceleration into their server stacks. It is important to note that Varnish cannot run an entire application on its own, since it is an accelerator and not a complete web server. You still need to run a separate web server package; Varnish sits in front of it and accelerates the delivery of its HTTP responses.
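To make that accelerator-in-front-of-a-server arrangement concrete, here is a minimal, illustrative VCL configuration pointing Varnish at a web server on the same host. The address and port are assumptions for the sketch, not recommendations:

```vcl
vcl 4.1;

# Varnish receives HTTP traffic and forwards cache misses
# to the "real" web server running behind it.
backend default {
    .host = "127.0.0.1";   # hypothetical backend address
    .port = "8080";        # e.g. Nginx or Apache listening here
}
```

In practice Varnish listens on the public port (often 80) while the backing web server is moved to an internal port like the one above.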


A few years senior to Varnish, Nginx is an open-source web server whose development began back in 2002, with its first public release following in 2004. Built to address architectural concurrency problems (famously, the "C10k" challenge of handling ten thousand simultaneous connections), Nginx has become a popular solution for many of the world's most heavily trafficked websites, including Netflix, Twitter, and Facebook. Nginx's focus on concurrency led to an asynchronous, event-driven architecture that continuously processes and monitors events across the system. That may sound a tad technical, but the important thing is that this design allows Nginx to field many connections through the same worker process, rather than spawning a new thread to match each new request.


Varnish and Nginx have a lot of overlap, and there are several things that both software packages handle quite well. Both can be configured to serve as reverse proxies and load balancers, regulating and balancing incoming server traffic. Both provide caching capabilities, and each includes its own built-in security measures, including DDoS protection tools. Both also have commercial options for companies willing to pay for advanced support and features (Varnish Enterprise and Nginx Plus, respectively). When you move past the similarities, many of the differences in functionality, while interesting to dive into, don't speak to the quality of one service over the other. For instance, Nginx supports, and relies heavily on, SSL/TLS termination. Standard open-source Varnish doesn't have that capability and is typically paired with a separate TLS proxy. Varnish, on the other hand, natively processes ESI (Edge Side Includes) requests, which Nginx can't interpret without a plugin. You can quibble over which setup is better, but this kind of differentiation really just falls under programmer preference. That doesn't mean, however, that this is an all-products-are-equal scenario. Varnish and Nginx do have a handful of more pronounced differences that we'll dive into… now.
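As a brief illustration of how ESI works in Varnish: the backend emits a tag such as `<esi:include src="/fragments/header" />` in its HTML, and Varnish, once ESI parsing is enabled for that response, replaces the tag with the separately cached fragment. A minimal sketch in VCL (the URL check is a hypothetical example, not a required pattern):

```vcl
sub vcl_backend_response {
    # Only parse ESI tags on pages that actually use them,
    # e.g. the illustrative front page below.
    if (bereq.url == "/") {
        set beresp.do_esi = true;
    }
}
```

This lets a mostly static page embed fragments with different cache lifetimes, each fetched and cached independently.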

Why Varnish?

Varnish is a very streamlined service, and that sleek design offers a high level of flexibility. One perk of that flexibility is support for a more distributed server configuration. Greater distribution simplifies server maintenance and performance monitoring, allowing admins to rewrite URLs and adjust caching policies on the fly. Nginx follows a more centralized configuration model, which can make it a little more rigid in the face of rapid administrative changes. Another perk of choosing Varnish is its grace mode feature. To prevent caches from becoming overstuffed with old data, every cached object has a TTL (Time To Live) expiration time. Once an object reaches that point, it is purged to free up space and can no longer be served locally. Unless you have Varnish. Varnish's grace mode overcomes this TTL barrier, allowing cached data that has already expired to keep being served. This is particularly useful during maintenance or an unexpected system failure, when specific expired content may have no channel to be re-cached. Beyond all the nice features, Varnish's greatest selling point is its singular focus as a web accelerator. Varnish focuses on one thing and does it quite well, offering a truly impressive level of caching and purging flexibility.
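The grace mode behavior described above is set per object in VCL. A minimal sketch (the TTL and grace values here are illustrative, not recommendations):

```vcl
sub vcl_backend_response {
    set beresp.ttl = 5m;     # serve this object as "fresh" for 5 minutes
    set beresp.grace = 1h;   # after the TTL expires, keep serving the stale
                             # copy for up to 1 hour while Varnish fetches
                             # a fresh version in the background
}
```

If the backend goes down during that grace window, visitors keep receiving the stale (but usable) content instead of an error page.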

Why Nginx?

While Nginx can't always keep up with Varnish's flexibility, it makes up for it in speed. Its centralized configuration allows it to service all requests from a single directory tree, drastically cutting down on response delays. This speed makes Nginx extremely resilient in high-traffic situations. Servers handling unexpected surges of requests can't afford even the slightest delay in response time, and Nginx offers stability during those critical moments, thanks not only to its single-directory lookups but also to its state-of-the-art load balancing. These days any modern caching service is expected to provide its own load balancing, but Nginx goes above and beyond: supporting Layer 7 (HTTP/HTTPS) balancing, offering geographic traffic distribution, and generally providing a truly impressive breadth of capabilities for a package of its kind. Compared to Varnish, Nginx has a much more complex infrastructure. This can make it harder to configure, but it gives Nginx a greater capacity for complex tasks like backend node monitoring. Nginx is also particularly efficient at handling static material. Its event-driven, single-threaded worker model lets it use less memory for static caching, boosting response and recovery times.
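As a rough sketch of the load balancing and static caching described above, an `http`-context fragment might look like the following. The server names, addresses, paths, and cache sizes are all made up for illustration:

```nginx
# Hypothetical upstream pool; least_conn sends each request
# to the backend with the fewest active connections.
upstream app_servers {
    least_conn;
    server 10.0.0.1:8080;
    server 10.0.0.2:8080;
}

# Shared cache zone for proxied responses.
proxy_cache_path /var/cache/nginx keys_zone=static_cache:10m;

server {
    listen 80;

    # Dynamic requests: balanced across the pool, not cached.
    location / {
        proxy_pass http://app_servers;
    }

    # Static assets: served from the cache when possible.
    location /static/ {
        proxy_cache       static_cache;
        proxy_cache_valid 200 10m;
        proxy_pass        http://app_servers;
    }
}
```

The same server block acts as reverse proxy, load balancer, and cache at once, which is a big part of Nginx's appeal as a single consolidated front end.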

Our Take On Caching

If you're looking for something a little newer, our Azion product suite is fully equipped to handle all of your caching needs while providing built-in compatibility with the Azion edge network. Our Layer 7 load balancer is state of the art, with a focus on content-aware balancing that adjusts the flow of traffic based on the kind of content received. What's more, by moving your caching to our distributed edge network, you'll be able to take advantage of the low-latency connections our edge nodes provide. Their distributed setup means they are likely to be much closer to your users than conventional servers, speeding up your data flow by shortening the distance it travels. In particular, if your system relies on connections with smaller, low-bandwidth devices, such as IoT hardware, you'll probably want to move your caching to an edge environment like Azion to take full advantage of that proximity. Inspired by the flexibility of Varnish and the speed of Nginx, we've built upon the groundwork laid by these top-tier caching services to create a unique and comprehensive product suite that offers a whole new take on caching.


Ultimately, Nginx and Varnish are both great pieces of software, at the top of their field, and you can use either (or both) to great effect. What you choose should depend on the specific needs of your digital platform. If you're looking for user-friendly administrative flexibility through a distributed configuration, you should probably check out Varnish. If your priorities lean more toward raw speed and stability, then Nginx might be more up your alley. And if you're after a modern approach that brings to bear the full power of the global edge, Azion may be just what you're looking for. Whatever you choose, remember to take some time out of your day to give some love to the proxy servers in your life; they're hard-working little guys, and your website can't function without them.
