Forum Discussion

Dianna_129659
Nimbostratus
Sep 30, 2013

iRule to limit request rate of search engines?

Is there sample code I could look at to help write an iRule to throttle search engines as they crawl? I think I can do this by user-agent. The goal is to allow Google and others to crawl while preventing them from consuming all of our server memory. I appreciate any suggestions, pointers to sample code, or other ideas. Many thanks! Dianna


1 Reply

  • Take a look at this rule: https://devcentral.f5.com/wiki/iRules.ControllingBots.ashx


    You create two pools that contain the same servers, but one of them is a 'slow' pool whose members have connection limits.


    Once the aggregate connection limit across those members is reached, additional connections are dropped. A rough sketch of the user-agent routing is below.


    Hope that helps.
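
    As a minimal sketch (not the wiki rule itself), an iRule along these lines can steer crawler traffic to the slow pool based on the User-Agent header. The pool names pool_slow and pool_fast and the crawler strings are placeholders to adjust for your own configuration:

        when HTTP_REQUEST {
            # Normalize the User-Agent header for case-insensitive matching.
            switch -glob [string tolower [HTTP::header "User-Agent"]] {
                "*googlebot*" -
                "*bingbot*" -
                "*slurp*" {
                    # Known crawlers go to the connection-limited "slow" pool.
                    pool pool_slow
                }
                default {
                    # Everyone else uses the normal pool.
                    pool pool_fast
                }
            }
        }

    With connection limits set on the slow pool's members, crawler connections beyond those limits are dropped as described above, while regular visitors continue to use the unrestricted pool.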