AEM With Varnish

What is Adobe Experience Manager?

In today’s digital world, a digital presence is essential for any organization: it helps generate new business, attract new customers and, more importantly, retain existing ones. An effective digital presence requires digital content in the form of sites, images, videos and other assets.

This is where a Content Management System (CMS) such as Adobe Experience Manager comes into the picture. Adobe Experience Manager (AEM) is a comprehensive content management solution for building websites, mobile apps and forms. In addition, it provides enhanced capabilities for managing content and digital assets.


Why do we need caching in AEM?

Just like any other traditional web application, AEM benefits from caching in the following ways:

  • It decreases network congestion and cost by serving content from the cache.
  • It improves performance: content is served faster, which increases the responsiveness of the website.
  • It ensures content availability during failure scenarios. Since the content is served from the cache, it can still be delivered to the end user even while the application is suffering downtime.


What does Adobe recommend for Caching?

Any standard AEM architecture has four layers of caching: the web browser, a CDN, the Dispatcher and the AEM instances themselves. Of these four layers, web browsers and AEM instances provide inherent caching, whereas the CDN and Dispatcher are added as additional caching layers. In this blog post, we are going to focus on the Dispatcher. We will look at its advantages and disadvantages, and also analyze Varnish as an alternative to the Dispatcher.


What is dispatcher?

The Dispatcher is Adobe’s caching and/or load-balancing tool that helps realize a fast and dynamic web authoring environment. For caching, the Dispatcher works as part of an HTTP server, such as Apache, with the aim of storing (or “caching”) as much of the static website content as possible.

For caching, the Dispatcher relies on the web server’s inherent ability to serve static content: it places the cached documents in the document root of the web server.


As highlighted in the Adobe documentation, the Dispatcher has two primary methods for updating cached content when changes are made to the website:

  • Content updates remove the changed pages and their directly associated files from the cache.
  • Auto-invalidation flags relevant pages as being out of date, without deleting anything.


Is dispatcher caching perfect?

No, it is not. Dispatcher caching rules are often inflexible. The Dispatcher performs cache invalidation on an entire directory and all of its sub-trees. This results in unnecessary load on your AEM instance, because even a single invalidation request forces all child files and folders to be re-fetched.

There are a few other limitations of the Dispatcher as well:

  • It holds onto the cached document (the file written to the web server’s document root).
  • It does NOT hold onto the associated HTTP response, so response headers are not preserved alongside the cached content.


Is there an alternative?

Yes, there is. According to the official documentation, Varnish Cache is a web application accelerator, also known as a caching HTTP reverse proxy. You can install it in front of any HTTP server and configure the necessary caching policies. Varnish Cache is very fast and, depending on your architecture, can speed up content delivery by a factor of 300 to 1000.

When a web page is accessed for the first time, the web server processes that request as usual, but Varnish keeps a copy of what is returned to the user. The next time the same page is requested, Varnish recognizes that it has served this request before and responds immediately with the cached version of the result.
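Notably, Varnish’s built-in (default) VCL already implements exactly this behaviour; the only thing you are required to supply is a backend definition. A minimal sketch, assuming Varnish 4.x syntax and a web server on a local port (both host and port here are illustrative assumptions):

```vcl
vcl 4.0;

# Minimal configuration: with only a backend defined, Varnish's
# built-in VCL caches the first cacheable response and serves
# subsequent identical requests from the cache.
backend default {
    .host = "127.0.0.1";   # assumed address of the web server
    .port = "8080";        # assumed port; adjust to your setup
}
```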


Benefits of Varnish cache

One of the most important features of Varnish Cache, in addition to its performance, is the flexibility of its configuration language, VCL. VCL (Varnish Configuration Language) defines the rules for how incoming requests are received and handled. You can define which content to serve, where that content should be fetched from, and how to alter requests and responses. There are also many run-time parameters that help control limits such as timeouts, thread counts, etc.
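To illustrate that flexibility, the sketch below tunes per-backend timeouts and applies simple request-handling rules. The host, port and URL patterns are assumptions for illustration, not a recommended production policy:

```vcl
vcl 4.0;

# Hypothetical AEM publish instance; host and port are assumptions.
backend aem_publish {
    .host = "127.0.0.1";
    .port = "4503";
    .connect_timeout       = 5s;    # time allowed to open a connection
    .first_byte_timeout    = 60s;   # time allowed until the first byte
    .between_bytes_timeout = 10s;   # max silence mid-response
}

sub vcl_recv {
    # Never cache system paths (illustrative rule).
    if (req.url ~ "^/system/") {
        return (pass);
    }
    # Strip cookies on static assets so they become cacheable.
    if (req.url ~ "\.(css|js|png|jpg|gif|svg|woff2?)$") {
        unset req.http.Cookie;
    }
}
```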

A few other prominent benefits of Varnish Cache include:

  • Granular cache invalidation
  • Exclusion of specific content from the cache
  • Invalidation of a page using “smart bans”
  • Invalidation from the AEM publisher via a modification listener
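The “smart ban” pattern deserves a sketch. A ban expressed against response metadata (rather than request properties) can be evaluated by Varnish’s background ban-lurker, so invalidation does not block request handling. The following is a minimal illustration; the `invalidators` ACL entries and the `x-url` header name are assumptions:

```vcl
vcl 4.0;

# Restrict invalidation to trusted hosts (assumed addresses).
acl invalidators {
    "localhost";
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "BAN") {
        if (!client.ip ~ invalidators) {
            return (synth(405, "Not allowed"));
        }
        # "Smart" ban: match against metadata recorded on the cached
        # object below, so the ban-lurker can process it asynchronously.
        ban("obj.http.x-url ~ " + req.url);
        return (synth(200, "Ban added"));
    }
}

sub vcl_backend_response {
    # Record the URL on the cached object for later ban matching.
    set beresp.http.x-url = bereq.url;
}

sub vcl_deliver {
    # Do not leak the internal bookkeeping header to clients.
    unset resp.http.x-url;
}
```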


VCL – Varnish Configuration Language

Every inbound request flows through Varnish, and you can influence how each request is handled by changing the VCL code. You can direct requests to different backends, modify requests and responses, and have Varnish take various actions depending on arbitrary properties of the request or the response. This makes Varnish an extremely powerful HTTP processor.

Varnish compiles the VCL code (via C) into binary code, which is then executed whenever a request arrives. VCL files are organized into subroutines, and the different subroutines are executed at different stages of the request-handling cycle.


As mentioned earlier, Varnish works with the help of pre-defined subroutines. Three of the most important subroutines executed during Varnish processing are:

  • vcl_recv: The first subroutine to be called. It decides whether the content should be looked up in the cache or fetched from the backend server.
  • vcl_fetch/vcl_backend_response (since Varnish 4.0): Responsible for processing the response received from the backend.
  • vcl_deliver: Responsible for delivering the response back to the browser. It can be used to add or adjust header information.

There are other subroutines that can be overridden as well, such as vcl_hash, vcl_purge, vcl_pipe, etc.


Sample Varnish Configuration File
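The following is a sketch of a configuration that ties the pieces above together. It assumes Varnish sits in front of an AEM Dispatcher (or publish instance) on localhost; the host, port, URL patterns, TTL and grace values are all assumptions to adapt to your environment:

```vcl
vcl 4.0;

# Assumed topology: Varnish in front of the web server / Dispatcher.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Only GET and HEAD are cacheable; pass everything else through.
    if (req.method != "GET" && req.method != "HEAD") {
        return (pass);
    }
    # Bypass the cache for system paths (illustrative list).
    if (req.url ~ "^/(system|crx|bin)/") {
        return (pass);
    }
    # Drop cookies on static assets so they can be cached.
    if (req.url ~ "\.(css|js|png|jpg|jpeg|gif|svg|ico|woff2?)(\?.*)?$") {
        unset req.http.Cookie;
    }
    return (hash);
}

sub vcl_backend_response {
    # Apply a default TTL when the backend sends no caching headers.
    if (beresp.ttl <= 0s) {
        set beresp.ttl = 5m;
    }
    # Grace mode: keep serving stale content while the backend is down.
    set beresp.grace = 1h;
}

sub vcl_deliver {
    # Expose hit/miss status so caching behaviour is easy to verify.
    if (obj.hits > 0) {
        set resp.http.X-Cache = "HIT";
    } else {
        set resp.http.X-Cache = "MISS";
    }
}
```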

