If you have been following the iRules Tech Tips here on DevCentral lately, you’ll see more and more use of the Session table for data storage and retrieval.  Colin recently put up a few articles around building Heatmaps with iRules.  In those examples, he uses the session table to store all his geolocation data for later reporting.

And George has been having fun with the session table as well in a few of his recent articles.

Not sure how the table command works?  There is a great series of articles on the Table Command giving in-depth details on how it works and how to use it effectively.
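As a quick refresher, storing an entry in a subtable and reading it back is just a pair of commands (a minimal sketch; the subtable and key names here are hypothetical):

```tcl
when HTTP_REQUEST {
  # Store a value under a key in a subtable named "mydata" (hypothetical names)
  table set -subtable mydata "some_key" "some_value"

  # ...and read it back later
  set val [table lookup -subtable mydata "some_key"]
}
```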

The Problem

The Session table is a great and wonderful thing, but it does have one weakness – it resides in memory and is not persistent across server restarts.  If you have an HA pair of servers, the session data will be replicated between them, but there will be those edge cases where you need to take both down and will lose anything you have stored in the session table.  You may also want to analyze that data in an external program.

There needs to be some way of archiving that data off of the BIG-IP!

The Solution

Well, now there is a way – thanks to iRules.  I’m going to limit this article to exporting “subtable” data, as most of the examples we are seeing nowadays are focused on using subtables to segment the data within the session table.  Also, it doesn’t hurt that there is a nifty “table keys” command to return all the entries in a specified subtable.

The solution is actually quite simple and will qualify for Colin’s 20-Lines-Or-Less series.

when HTTP_REQUEST {
  switch -glob [string tolower [HTTP::uri]] {
    "/exporttable/*" {
      set csv "Table,Key,Value\n"
      set tname [getfield [HTTP::uri] "/" 3]
      foreach key [table keys -subtable $tname] {
        set val [table lookup -subtable $tname $key]
        append csv "$tname,$key,$val\n"
      }
      set filename [clock format [clock seconds] -format "%Y%m%d_%H%M%S_${tname}.csv"]
      HTTP::respond 200 content $csv \
        "Content-Type" "text/csv" \
        "Content-Disposition" "attachment; filename=$filename"
    }
  }
}

The iRule looks for a request on the virtual server to the URL “http://hostname/exporttable/tablename”.  The URI is split apart and the table name portion is extracted with the getfield command.

At this point, I call the “table keys” sub-command to get a list of all the keys in that sub-table.  A variable is created to store the resulting output.  In this example, I opted to go with a simple Comma Separated Values (CSV) format but it would be trivial to convert this into XML or any other format you care to use.  The list of keys is then iterated through with the “table lookup” sub-command and the resulting record is appended to the output.  A unique file name is created with the TCL “clock” command to include the date and time along with the requested table name.
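To make the format concrete, here is what the generated output would look like for a hypothetical subtable named “users” containing two entries (the keys and values shown are made up for illustration):

```
Table,Key,Value
users,alice,10.1.1.5
users,bob,10.1.1.9
```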

Finally, the output is returned to the client with the correct MIME type of “text/csv” along with a Content-Disposition header that tells the browser the file name and indicates that it should attempt to save it to disk.
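The response the iRule builds would look roughly like the following (status line abbreviated; the filename shown is for a hypothetical export of a “users” subtable):

```
HTTP/1.0 200 OK
Content-Type: text/csv
Content-Disposition: attachment; filename=20101014_120000_users.csv

Table,Key,Value
...
```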

Fancying It Up A Bit

I could have stopped there with the archiving, but I’m going to go a step further and add a user interface to this export iRule.

when HTTP_REQUEST {
  switch -glob [string tolower [HTTP::uri]] {
    "/exporttable" {
      HTTP::respond 200 content {
<html><head><title>iRule Table Exporter</title>
<script language="JavaScript"><!--
function SubmitForm()
{
  var submit = false;
  var value = document.getElementById("table_name");
  if ( null != value )
  {
    if ( "" != value.value )
    {
      document.export_table_form.action = "/exporttable/" + value.value;
      submit = true;
    }
    else
    {
      window.alert("Please enter a table name");
      value.focus();
    }
  }
  return submit;
}
//--></script></head>
<body><h1>iRule Table Exporter</h1>
<form method="get" name="export_table_form" action="">
<table border='1'>
<tr>
<td>Table Name</td>
<td><input type="text" id="table_name" value=""></td>
<td><input type="submit" value="Submit" onclick="javascript:return SubmitForm()"></td>
</tr>
</table>
</form>
</body></html>
}
    }
  }
}

Now, if the user requests the URL http://hostname/exporttable, a form will be displayed allowing the user to enter the desired table name and then click Submit to download the archive.  Since I wanted the table name in the URL path and not in the query string parameters, I had to do some JavaScript fun to manipulate the “action” of the HTML form (in case you were wondering what the SubmitForm function was for).

Be on the lookout for an upcoming article where I’ll illustrate how to reverse this process and import an archived file back into the session table.

The Full Example

You will have to combine both sections above to get the fully functional iRule, or you can check it out in the iRules CodeShare under SessionTableExport.




Technorati Tags: table, iRules, Exporting, Joe Pruitt
Comments on this Article
Comment made 14-Oct-2010 by Jason Rahm
Right on, Joe! I was starting the approach to forms for my provisioning tech tip for multi-pool member enable/disable, but was stuck on how to make it work. Now I know who to consult for part 3.