Forum Discussion

Ruggerfly1
Nimbostratus
Sep 21, 2016

ASM: throttle automated sessions. Humans get priority.

Fairly new to ASM - we're interested in identifying and throttling bots (automated sessions). At the moment we're using the default Web Scraping Rapid Surfing settings (5 over 1000 ms) in transparent mode, and we're also reviewing Session Opening and Session Anomalies - is there a best-practice combination of these settings for throttling?

 

The website in question provides statistics for individual members behind a login. There is an automated service members can use that comes in daily and collects all the information for anyone signed up. I'd like this bot to get lower priority - restrict its transactions per second while still letting it complete its job.

 

1 Reply

  • Tuning WebScraping can be quite a difficult task, and the actual settings will heavily depend on your web application. You will inevitably go through many cycles of trial and error until you get this right.

     

    However, in your case I think the WebScraping protection feature might not work at all. I might be wrong, but it appears to me that you do not want to BLOCK the bots - you just want to slow them down, because your bots are legitimate.

     

    WebScraping protection in ASM won't let you do that - it drops connections from whatever it identifies as a bot once pre-configured thresholds are exceeded. If you can't drop and block connections (because it would affect the business, for example), then WebScraping is not for you - you will need an iRule that identifies your legitimate bots somehow (for example, by User-Agent header) and then applies rate shaping to spoon-feed them the data they are after; something along the lines of the sketch below.
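
     

    A minimal sketch of such an iRule, assuming the bot identifies itself with a hypothetical "statsbot" token in its User-Agent and that a rate class named bot_rate_shaper has already been created under Rate Shaping:

        # Identify the legitimate statistics bot by its User-Agent and
        # assign its connections to a pre-created rate class so it is
        # slowed down rather than blocked.
        when HTTP_REQUEST {
            if { [string tolower [HTTP::header "User-Agent"]] contains "statsbot" } {
                # bot_rate_shaper is a placeholder name - create the rate
                # class first and tune its base/ceiling rates to taste
                rateclass bot_rate_shaper
            }
        }

    The User-Agent match is only an illustration - if the bot can't be relied on to send a distinctive User-Agent, you could key off a source address list or a dedicated service account instead.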

     

    If you are OK with dropping connections and just need to do it for overzealous bots, then you can use Client Side Integrity Defense - it injects a piece of JavaScript into your pages that tracks things like mouse movements and key presses (which humans produce but scripts don't).

     

    Re: tuning the settings of Bot Detection - there is a good article on DevCentral which explains the settings in detail here: https://devcentral.f5.com/articles/more-web-scraping-bot-detection

     

    Hope this helps,

     

    Sam