I'm sure there are few people who have escaped some form of reporting or metrics collection in their careers. Pondering my various roles and responsibilities for a moment, I think I've measured almost everything: spam emails caught, megabytes browsed, web-cache hits, IOPS, virtual machine density, tweets and re-tweets... In most cases, the value of certain metrics over others has been clear to me. As a contractor I’ve measured hours worked - that’s how I got paid. As an engineer I’ve measured web usage to justify organizational spend on campus access. In sales, I’ve worked with customers to build metrics-based business cases that justify both investment and changes in practice. However, I’ll be honest with you: there were a few metrics that left me scratching my big bald head…

This brings me on to ‘Big Data’, “an all-encompassing term for any collection of data sets so large and complex that it becomes difficult to process them using traditional data processing applications.” Sounds interesting, right? But who’s actually analyzing this data to drive business value? What are their use cases? Or, from a different angle, who’s misusing the data to get the answers they want, instead of the guidance they need? Jason Spooner of Social Media Explorer grabbed my attention with his article titled, “BIG DATA IS USELESS: unless you have a plan for how to use it”, highlighting that, “There is no inherent value to data. The value comes from the application of what the data says.” My thoughts exactly!

While I know I’ve come across a little on the skeptical side today, it’s my endless questioning and curiosity that’s kept me employable thus far. Consequently, I feel compelled to ask: is there a danger that Big Data is merely fueling the behavioral addicts among us? Is this an enterprise-grade obsessive-compulsive disorder? Or maybe it’s skeptics like me who will inhibit its future application. J. Edgar Hoover was heavily criticized over his fingerprint database… ok, not my finest example.

In my current role I’m focused heavily on Software-Defined Networking. Unfortunately, SDN has been largely driven by the desire to solve implementation issues – how quickly an organization can deploy a new network and improve its time to market for new applications and services. However, I believe there is far more to gain from applying software-defined principles to post-deployment problems. Consider the benefits of a network topology driven by real-time data analysis. A network that can adapt based on its own awareness. Now that would be cool!
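To make that idea a little more concrete, here is a minimal sketch of what such a feedback loop might look like: a controller polls link telemetry and flags congested links whose flows should be steered elsewhere. Every name here (Link, REROUTE_THRESHOLD, rebalance) is hypothetical - this is not a real controller API, just an illustration of a network reacting to its own data.

```python
from dataclasses import dataclass


@dataclass
class Link:
    """A hypothetical link with its current telemetry reading."""
    name: str
    utilization: float  # fraction of capacity in use, 0.0 to 1.0


# Assumed policy: drain any link running hotter than 80% utilization.
REROUTE_THRESHOLD = 0.8


def rebalance(links):
    """Split links into (congested, spare) based on current telemetry.

    A real controller would then reroute flows from the congested set
    onto the spare set; here we just return the two lists.
    """
    congested = [l.name for l in links if l.utilization > REROUTE_THRESHOLD]
    spare = [l.name for l in links if l.utilization <= REROUTE_THRESHOLD]
    return congested, spare


links = [Link("core-1", 0.95), Link("core-2", 0.40), Link("core-3", 0.65)]
congested, spare = rebalance(links)
print("drain:", congested)   # links the policy would steer traffic off
print("absorb:", spare)      # candidate links to receive rerouted flows
```

Run in a loop against live telemetry, a policy like this is the seed of the "self-aware" adaptation described above - the interesting engineering is in the telemetry pipeline and in making the policy itself data-driven.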

I appreciate that the control-plane abstraction driven by SDN is step one: it lets us break away from management silos and steer toward a policy-driven network. However, there is still far more to gain from a software-defined approach. I, for one, look forward to the day when these data center network policies are driven by data analysis, both historical and real-time, to deliver better business services. Dare I call it DC Agility 2.0…?