Forum Discussion

Matt_McCullagh_
Jun 24, 2013

Session Filter

Hi All,

 

Jumping in here at the deep end and completely lost as to where to start looking! Any help most appreciated!

 

 

I have been asked to evaluate whether our F5s (already deployed as load balancers) could be used to discard duplicate session requests during a limited time period.

 

Example:

 

Known subscriber clicks on a "purchase" button.

 

Response time may periodically be slow because a credit check is carried out in the background, so multiple systems are involved.

 

I would like to be able to discard multiple identical requests (the subscriber clicking again and again) during a 5-second window.

 

If another duplicate request is received after the 5 seconds have elapsed, that one should be processed, and any further requests during the following 5 seconds filtered/discarded.
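
For what it's worth, the windowing behaviour described here can be sketched outside of iRules. A minimal Java model (class and method names are my own invention, not anything F5 provides) might look like:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the first request for a key is accepted and opens a
// fixed window; duplicates inside the window are discarded; the next
// duplicate after the window expires is accepted and opens a new window.
public class RequestWindow {
    private final long windowMillis;
    private final Map<String, Long> windowStart = new HashMap<>();

    public RequestWindow(long windowSeconds) {
        this.windowMillis = windowSeconds * 1000L;
    }

    // Returns true if the request should be processed, false if discarded.
    public synchronized boolean accept(String key, long nowMillis) {
        Long start = windowStart.get(key);
        if (start == null || nowMillis - start >= windowMillis) {
            windowStart.put(key, nowMillis); // open a new window for this key
            return true;
        }
        return false; // duplicate inside the window: discard
    }
}
```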

 

I am assuming some form of iRule could be written for this, but I wanted to reach out to see if anyone has implemented this type of filtering yet, and whether there are any gotchas.

 

Is there a way to estimate the resource impact a rule like this could impose?

Many thanks

Matt

5 Replies

  • Been going through all the tech notes etc. and wondered: would this be a valid approach?

    Thanks

    Matt

    
    when RULE_INIT {
      # Set a 5-second window, measured from the first request received
      # from a client; any additional request during this period is refused.
      set static::windowSecs 5
    }

    when HTTP_REQUEST {
      if { [HTTP::method] eq "POST" } {
        if { ! [HTTP::header exists Authorization] } {
          HTTP::respond 429 content "Rejected due to large volume of duplicate requests from same client"
          return
        }
        # Extract the unique user ID from the Basic Authorization header.
        set myUserID [getfield [b64decode [substr [HTTP::header "Authorization"] 6 end]] ":" 1]
        # MaxPOSTRate is looked up per user in case I need to add different
        # scenarios for different providers.
        set myMaxRate [findclass $myUserID $::MaxPOSTRates "1"]
        if { $myMaxRate ne "" } {
          # Reject any duplicate request once this user's entry count in the
          # table exceeds the allowed rate; entries expire after windowSecs.
          set reqnum [table incr "req:$myUserID"]
          set tbl "countpost:$myUserID"
          table set -subtable $tbl $reqnum "ignored" indef $static::windowSecs
          if { [table keys -subtable $tbl -count] > $myMaxRate } {
            HTTP::respond 303 Location http://server.to.post.to/
            return
          }
        }
      }
    }
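
If it helps to reason about the iRule above, here is a rough Java model (names are mine, not an F5 API) of what the subtable is doing: each POST adds an entry that expires after the window, and the request is rejected once the count of live entries exceeds the per-user max rate.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical model of the iRule's counting subtable: each POST records a
// timestamp that "expires" after windowSecs, and the request is allowed
// only while the number of live entries stays within maxRate.
public class PostRateCounter {
    private final long windowMillis;
    private final int maxRate;
    private final Deque<Long> timestamps = new ArrayDeque<>();

    public PostRateCounter(long windowSeconds, int maxRate) {
        this.windowMillis = windowSeconds * 1000L;
        this.maxRate = maxRate;
    }

    // Returns true if the POST may proceed, false if it should be rejected.
    public synchronized boolean allow(long nowMillis) {
        // Drop entries whose lifetime has elapsed, like table entries expiring.
        while (!timestamps.isEmpty() && nowMillis - timestamps.peekFirst() >= windowMillis) {
            timestamps.removeFirst();
        }
        timestamps.addLast(nowMillis);
        return timestamps.size() <= maxRate;
    }
}
```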
     
  • You can do this, but that doesn't mean you should. The ASM module has similar functionality to prevent DDoS attacks, and I've found it useful, but it comes with costs.

     

     

    For a good user experience it's better to disable the button in the browser whilst the request is being processed; this is a common pattern. To protect your backend service, it's better to limit the total number of requests in the backend code. In Java you'd use an atomic integer and a token algorithm to do this.
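
One reading of that backend-side guard is a simple in-flight cap built on AtomicInteger. A minimal sketch, with hypothetical class and method names:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical sketch of a backend admission guard: at most maxInFlight
// purchase requests are processed concurrently; the rest are rejected
// immediately rather than queued.
public class InFlightLimiter {
    private final int maxInFlight;
    private final AtomicInteger inFlight = new AtomicInteger(0);

    public InFlightLimiter(int maxInFlight) {
        this.maxInFlight = maxInFlight;
    }

    // Returns true if a slot was acquired; the caller must release() later.
    public boolean tryAcquire() {
        while (true) {
            int current = inFlight.get();
            if (current >= maxInFlight) {
                return false; // reject: too many requests already in flight
            }
            if (inFlight.compareAndSet(current, current + 1)) {
                return true;
            }
        }
    }

    public void release() {
        inFlight.decrementAndGet();
    }
}
```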

     

     

    At the F5 layer you don't have much context, and your choice of user actions is limited to redirecting to a busy page or returning a reset, neither of which is very user friendly.

     

     

    That's my two cents, from someone who's been there.

     

     

    Peter
  • Thanks for the feedback, Peter.

     

    Unfortunately the site that is causing the problems is external (third party to our client) and at present I have no way to force them to clean up their code.

     

    The problem here is that the end user could, in theory, end up being charged multiple times which is why I am looking for a way to discard subsequent attempts during a limited time period.

     

    I think the worst case reported so far was 100 duplicate transactions in 1 second - which does sound rather like an attack, doesn't it!

     

    Another option may be to "cache" the other 99 requests and send back the same response to each of them once the first has completed the transaction, but I am not sure that would be very clean either.
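
That caching idea is sometimes called request coalescing or single-flight: duplicate requests for the same key attach to the first in-flight computation and all receive its response. A rough Java sketch, with invented names and no claim about how it would map onto BIG-IP:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Supplier;

// Hypothetical single-flight sketch: the first caller for a key starts the
// work; concurrent duplicates get the same future (and thus the same
// response); the entry is removed once the work completes.
public class SingleFlight<T> {
    private final ConcurrentMap<String, CompletableFuture<T>> inFlight =
            new ConcurrentHashMap<>();

    public CompletableFuture<T> run(String key, Supplier<T> work) {
        return inFlight.computeIfAbsent(key, k ->
                CompletableFuture.supplyAsync(work)
                        .whenComplete((result, error) -> inFlight.remove(k)));
    }
}
```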

     

    What do you think would be the lesser evil?

     

    Thanks

     

Matt
  • I'd need to know a lot more to have a useful opinion. Feel free to email me or find me on LinkedIn.

     

     

    Peter_booth-AT-me-dot-com
  • Regarding set myMaxRate [findclass $myUserID $::MaxPOSTRates "1"]: since 9.4.4, the $:: prefix is no longer required to reference a data group. Data groups are now CMP compatible.

     

     

    Class / Data Group List References

     

    https://devcentral.f5.com/wiki/iRules.cmpcompatibility.ashx

     

     

    Regarding set reqnum [table incr "req:$myUserId"]: I am not sure, but I think the duplicate requests may arrive on different TCP connections (before the response to the first request has been received). If so, I don't think we need to generate reqnum; just adding the client IP and port number to the table may be enough.

     

     

    just my 2 cents.