Forum Discussion

AvinashP_138219
Nimbostratus
Nov 27, 2013

pro-actively check for specific updated content

Hi,

I'm using a Web Acceleration profile (configured from the web interface) to do caching on a BIG-IP 3600 LTM running version 11.2.0 Build 2451.0 Hotfix HF1. That works fine.

The issue arises when content on the web servers is updated at any given time; the BIG-IP has no built-in method to check for updated content. It just expires the cache after a given time (aging timer etc.).

Question: how can I get the BIG-IP to pro-actively check for specific updated content on the web servers?

5 Replies

  • Hi, You can't do exactly what you are asking, however you do have several options:

    • Trigger a 'cache invalidation' from the external process that creates new content - this is the most powerful method
    • Place a low TTL on the cached content (see the sketch just after this list)
    • Use IBR (Intelligent Browser Referencing)
    • Let the application use an IBR-like process by adding a 'cache-buster' query parameter to each request
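
    For the low-TTL option, besides lowering the maximum age in the Web Acceleration profile itself, one way is to stamp a short Cache-Control max-age onto the responses you want refreshed quickly. A minimal iRule sketch, assuming the profile is set to honour origin Cache-Control headers; the text/html match and the 60-second value are just examples:

    when HTTP_RESPONSE {
        # Hypothetical example: mark HTML responses as cacheable for only 60 seconds,
        # so the cache re-fetches them from the origin shortly after a change.
        if {[HTTP::header Content-Type] starts_with "text/html"} {
            HTTP::header remove Cache-Control
            HTTP::header insert Cache-Control "max-age=60"
        }
    }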

    Are you using a CMS, an off-the-shelf app or a bespoke app?

    Do you have a use-case? Otherwise I can give an invalidation trigger example when I wake up (yawn).

    • AvinashP_138219
      Nimbostratus
      Hi, thanks for replying on this thread. I'm working with a Windows engineer (the web servers are Windows running IIS) on the first option: a 'cache invalidation' method using the HTTP_LAST_MODIFIED flag in the HTTP header. I'll also look into the low TTL, thanks for that suggestion. Could you elaborate some more on IBR and the 'cache-buster' query parameter suggestions? I haven't come across those as of yet. The application running on the web servers is a CMS called Tridion. There's a piece of software that parses an Excel sheet on every request for certain web pages and presents the parsed output to the client, which generates CPU load on the machines and high response times for the client. Caching the parsed output on the LB significantly reduces the CPU load on the boxes.
  • Sorry, that screenshot didn't really work; here's what it was meant to show:

    Request Header Matching Criteria

    Name (Ordinal)   Value
    Host             IS invalidate.in.test.com
    Path             STARTS WITH /inv
    Client IP        MATCHES 61\.9\.61\.*

    Cached Content to Invalidate

    Name (Ordinal)   Value
    Host             IS www.test.com
    Path             Source: path [Query Parameter]
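
    Reading that policy, the external process that publishes new content would invalidate a cached page on www.test.com by sending a request like the one below to the virtual server, from a source address matching 61.9.61.*. The query parameter name ('path') is taken from the 'Source' line above, so treat it as a placeholder and adjust it to whatever the policy actually uses:

    GET /inv?path=/some/updated/page.html HTTP/1.1
    Host: invalidate.in.test.com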
    
  • Cache busting means adding 'version tag' query parameters to differentiate versions of static content (js/css/images). Whatever generates the pages needs to do this. Sometimes the version tag is created every time a new package is released, sometimes when the web server restarts (we do it both ways).

    You end up with this kind of thing when the page loads:

    https://test.com/theme/js/ajaxportlet.js?t=1385450658000
    https://test.com/theme/css/stuff.css?t=1385450658000
    

    This way you can have far-future proxy/client Cache-Control headers on your static content.
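
    For instance, since every new version gets a new URL, the static assets could then safely be served with a long lifetime, something like:

    Cache-Control: public, max-age=31536000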

    I'd google it and talk to your app guys about whether they feel like doing it.

  • Another thing to think about is setting the http-acceleration profile to cache pages indefinitely, and then using an iRule to invalidate the cache (making sure you restrict who can invalidate) and force the request to be sent to the origin web server, e.g.:

    when HTTP_REQUEST {
        # Check for the expiration header sent by the CMS
        set fExpirePage 0
        if {[HTTP::header "CMS-Expire"] ne "" && [IP::client_addr] eq "10.10.10.10"} {
            # Expiration header is present and the request came from the CMS host
            set fExpirePage 1
        }
    }
    when CACHE_REQUEST {
        # Expire this object so the request is passed to the origin web server
        if {$fExpirePage} {
            CACHE::expire
        }
    }
    

    You could use this in conjunction with the cache-buster query parameters.
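
    Just to illustrate how this would be used: with the sketch above (the CMS-Expire header name and the 10.10.10.10 source address are only the placeholders from the iRule, and the path below is made up), the CMS publishing step could refresh a page by requesting it once with that header set. Any non-empty value works, since the iRule only checks that the header is present:

    GET /products/widget.html HTTP/1.1
    Host: www.test.com
    CMS-Expire: 1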