Forum Discussion

John_Geddes_295
Nimbostratus
Sep 11, 2006

iRule optimization

The iRule below was created by a former employee. After reading the manual and some posts in this forum, I gather that this rule is not really optimized: because it runs in HTTP_REQUEST, it is processed for every hit after the first one, including images.

I basically want to route bot traffic to another pool and let everything else come in to the primary pool. Any suggestions?

when HTTP_REQUEST {
  if { [matchclass [IP::remote_addr] equals $::blacklisted_clients] } {
    pool Bots
  } elseif { [matchclass [HTTP::header User-Agent] contains $::blacklisted_useragents] } {
    pool Bots
  } elseif { [string first "bot" [string tolower [HTTP::header User-Agent]]] >= 0 } {
    pool Bots
  } else {
    pool MainPool
  }
}

1 Reply

  • Deb_Allen_18
    Historic F5 Account
    It seems this rule will accomplish your goal by filtering off 3 suspicious groups of traffic while allowing other traffic through to the main pool.

    Each request coming in to the virtual server where the iRule is applied is evaluated in the HTTP_REQUEST event before the iRule can determine where to send it, so yes, the rule runs for every request, including images.

    You could optimize slightly by catching the IP address match when the client first connects, so blacklisted addresses are dealt with once per TCP connection instead of on every request:
     when CLIENT_ACCEPTED {
       # Send known bad source IPs to the Bots pool as soon as the TCP
       # connection is accepted, before any HTTP parsing happens.
       if { [matchclass [IP::remote_addr] equals $::blacklisted_clients] } {
         pool Bots
         # Stop processing any further events (including HTTP_REQUEST)
         # for this connection.
         event disable all
       }
     }
     when HTTP_REQUEST {
       # Everyone else is checked per request against the User-Agent header.
       if { [matchclass [HTTP::header User-Agent] contains $::blacklisted_useragents] } {
         pool Bots
       } elseif { [string first "bot" [string tolower [HTTP::header User-Agent]]] >= 0 } {
         # Tcl's "string first" has no -nocase option, so lowercase the
         # header before looking for "bot".
         pool Bots
       } else {
         pool MainPool
       }
     }
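
    For what it's worth, the rule assumes the blacklisted_clients and blacklisted_useragents data groups (classes) already exist on the LTM. As a rough sketch only, assuming v9-style class definitions in bigip.conf (the entries below are placeholders and the exact syntax can vary by version), they might look something like:

     class blacklisted_clients {
        host 192.0.2.10
        network 198.51.100.0 mask 255.255.255.0
     }
     class blacklisted_useragents {
        "BadBot"
        "EvilCrawler"
     }

    The address entries feed the "matchclass ... equals" test on the client IP, and the string entries are the substrings that "matchclass ... contains" looks for in the User-Agent header.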

    HTH

    /deb