Forum Discussion

Darren_Person_2
Nimbostratus
Oct 02, 2007

Duplicate Cache Entries for Firefox & IE?

Hi All,

We recently upgraded to 9.4.1 and have noticed an issue in RAMCache. If I click on a page in IE, I see the page get put into RAMCache. If I then request that exact same URL in Firefox, I see a duplicate entry in RAMCache with a different timestamp. Subsequent requests to that content increase the RAMCache count for each object individually. How is this possible, and do you know how to fix it?

Thanks!!

7 Replies

  • Colin_Walker_12
    Historic F5 Account
    This sounds more like a product support question than an iRules development issue. Are you using iRules to decide what to cache?

    If not, I'd recommend opening a ticket with Technical Support and reporting the issue there, as they're far better equipped to handle support requests.

    Colin
  • bl0ndie_127134
    Historic F5 Account
    Well, if the response header contains 'Vary: User-Agent', we have to cache the content separately (even if it's the same URI). Servers add this header when they know that the content needs to be rendered differently on different browsers, e.g. JavaScript files.
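    To illustrate the effect (a minimal Python sketch of Vary-based cache keying in general, not of how TMOS implements RAMCache; the `cache_key` helper is hypothetical), folding the request's User-Agent into the key yields distinct entries for IE and Firefox even at the same URI:

    ```python
    # Minimal sketch of Vary-based cache keying (illustrative only, not
    # TMOS internals). The cache_key helper is hypothetical.
    def cache_key(uri, request_headers, vary):
        """Build a cache key from the URI plus every header named in Vary."""
        key = [uri]
        for field in vary.split(","):
            field = field.strip().lower()
            if field:
                key.append(request_headers.get(field, ""))
        return tuple(key)

    ie = {"user-agent": "Mozilla/4.0 (compatible; MSIE 7.0)"}
    ff = {"user-agent": "Mozilla/5.0 (Windows; rv:2.0) Firefox"}

    # With "Vary: User-Agent", the same URI produces two distinct entries.
    k_ie = cache_key("/index.html", ie, "User-Agent")
    k_ff = cache_key("/index.html", ff, "User-Agent")
    ```

    With an empty Vary value the two requests would collapse to one key, which is exactly the duplicate-entry behavior described above.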
  • Hi Bl0ndie,

    I think I understand what you are saying, but the Cache API has a way to set the user agent through an iRule (CACHE::useragent). The wiki says that it helps eliminate duplicate requests, but there is no example of how to use this - can you please provide an example so that we can test it?

    We send out XHTML-compliant pages that should not change based on browser (that is handled through CSS), so I'm puzzled as to why we have multiple copies of the same object in cache.

    I'd appreciate your feedback and any example you can provide ASAP! Thanks!
  • The CACHE::useragent command is a way to override the client user agent string in the cache system without actually changing the User-Agent header.

    So, if you wanted all IE browsers to be cached with the same agent string:

    when HTTP_REQUEST {
      if { [HTTP::header User-Agent] contains "MSIE" } {
        CACHE::useragent "MSIE"
      }
    }

    If you wanted to have all clients cached with the same user agent, you could do something like this:

    when HTTP_REQUEST {
      CACHE::useragent "GENERIC-USER-AGENT"
    }

    Hope this helps... I'll update the wiki with some examples.

    -Joe
  • By the way, you might wonder what the difference between the CACHE::useragent and CACHE::userkey values is. CACHE::userkey allows users to add data to the cache key. It's so similar to CACHE::uri that we have stopped asking people to use it any more. CACHE::useragent, on the other hand, only adds the data to the key if the content is user-agent specific. Otherwise it is a no-op.

    Hope that helps clear things up a bit more...

    -Joe
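    That distinction can be sketched in plain Python (illustrative only, not TMOS internals): a userkey-style component always extends the key, while a useragent-style component is folded in only when the content actually varies by user agent:

    ```python
    # Illustrative sketch (not TMOS internals) of the difference described
    # above: a userkey-style value always extends the cache key; a
    # useragent-style value only does so when the content is user-agent
    # specific, otherwise it is a no-op.
    def build_key(uri, userkey=None, useragent=None, varies_by_ua=False):
        key = [uri]
        if userkey is not None:
            key.append(userkey)        # always part of the key
        if varies_by_ua and useragent is not None:
            key.append(useragent)      # only when content varies by UA
        return tuple(key)

    # Content that is NOT user-agent specific: useragent is a no-op,
    # so both browsers share one cached copy.
    k1 = build_key("/app.js", useragent="MSIE", varies_by_ua=False)
    k2 = build_key("/app.js", useragent="Firefox", varies_by_ua=False)
    ```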
  • Hi Joe,

    First off, thank you very much for your help with this. Unfortunately, we implemented it last night and are still seeing many copies of the same file in cache (we use iControl to pull out the listing of items). I even tried swapping useragent with userkey, but neither seemed to help eliminate the duplicates.

    We have gzip compression turned on, so the most I should have seen was 2 objects per page (one zipped & one non-zipped). Instead I saw about 8 different objects in cache: 4 of them had the exact same size (64k - non-zipped) and the others were 21k (zipped).

    Any other thoughts would be greatly appreciated. I also opened a ticket with support, but they suggested using the userkey, which also did not work.
  • Just as a follow up on this issue - we are still working through it with support, but here are the findings.

    1. Multiple copies are present due to the combination of RAMCache and Gzip compression.

    2. Apparently, compression happens before the object is cached and uses the browser's Accept-Encoding header. Since browsers set this header differently (e.g. "deflate, gzip", "gzip, deflate", etc.) and the F5 stores each combination, we get tons of copies of the same object.

    3. We created an iRule to try to minimize the number of potential copies, but it doesn't appear to be working 100% (hence it is with the Development team).

    Here is the example iRule that we tried:

    
    when HTTP_REQUEST {
      set var_accept_encoding [string tolower [HTTP::header "Accept-Encoding"]]
      if { $var_accept_encoding contains "gzip" } {
        CACHE::accept_encoding "gzipped"
      } else {
        CACHE::accept_encoding "not-gzipped"
      }
    }

    I'll keep you updated as we find out more from Dev.
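    The normalization the iRule above attempts can be sanity-checked with a small Python sketch (illustrative of the bucketing logic only, not of BIG-IP behavior):

    ```python
    # Illustrative check of the Accept-Encoding normalization idea from
    # the iRule above: collapse every header variant into one of two
    # buckets so the cache holds at most two copies per object.
    def encoding_bucket(accept_encoding):
        if "gzip" in accept_encoding.lower():
            return "gzipped"
        return "not-gzipped"

    # Browsers word the header differently, but ordering and case no
    # longer matter once bucketed.
    variants = ["deflate, gzip", "gzip, deflate", "GZIP", "identity", ""]
    buckets = {encoding_bucket(v) for v in variants}
    ```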