At Web 2.0 Expo, Microsoft essentially stole the show with the introduction of its Live Mesh platform. Live Mesh is, at heart, an integration hub that uses open standards to incorporate and manage Internet-connected devices that today are unrelated and managed individually.

Microsoft might not like the term "integration hub", but that's basically what it is. Yes, on the surface it's a platform that enables inter-device communication and seamless access to a variety of services, but under the hood it has to be doing some pretty complex integration work. While Microsoft plans on using open standards, that doesn't mean those standards allow devices to communicate directly. My BlackBerry, Outlook, and storage devices don't use the same interface, after all; in order to communicate and share data across disparate devices, there must be some kind of integration (i.e. translation of data formats) going on.
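To make that concrete, here's a minimal sketch of the kind of format translation an integration hub ends up doing. It is entirely hypothetical (Python, made-up formats and function names; nothing here comes from Live Mesh itself): a contact record in an Outlook-style XML shape gets translated through a neutral representation into a BlackBerry-style JSON shape.

```python
# Hypothetical sketch of hub-style data-format translation between devices.
# The formats and names are invented for illustration; this is not Live Mesh code.

import json
import xml.etree.ElementTree as ET


def contact_from_outlook_xml(xml_payload: str) -> dict:
    """Translate a (hypothetical) Outlook-style XML contact into a neutral dict."""
    root = ET.fromstring(xml_payload)
    return {
        "name": root.findtext("DisplayName"),
        "phone": root.findtext("MobilePhone"),
    }


def contact_to_blackberry_json(contact: dict) -> str:
    """Translate the neutral contact into a (hypothetical) BlackBerry-style JSON record."""
    return json.dumps({"fullName": contact["name"], "cell": contact["phone"]})


if __name__ == "__main__":
    outlook_record = (
        "<Contact><DisplayName>Lori</DisplayName>"
        "<MobilePhone>555-0100</MobilePhone></Contact>"
    )
    neutral = contact_from_outlook_xml(outlook_record)
    print(contact_to_blackberry_json(neutral))
```

The point of the neutral, in-between representation is that the hub only needs one translation per device format, not one per pair of devices; that's the integration work hiding under the "it just works" surface.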

From Microsoft's Live Mesh Blog

At the core of Mesh is the concept of a customer’s mesh, or collection of devices, applications and data that an individual owns or regularly uses.


The Live Mesh platform provides the ability for applications to connect to any other device, regardless of network topology (network transparency), within the mesh.

Now that sounds cool. Who wouldn't be interested in being able to copy data from remote or local storage over to their digital picture frame? Or from their camera to their picture frame and their remote storage for backup?
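The quote above describes a programming model, not an API, so the following is only a rough sketch of what network transparency could mean to an application: you ask for a copy by device identity, and the platform worries about how to reach the device. Every name here (MeshDevice, copy_between_devices, the routing labels) is made up for illustration and is not the actual Live Mesh API.

```python
# Hypothetical sketch of network transparency from the application's point of view:
# devices are addressed by mesh identity, and the platform resolves how to reach
# them (LAN, relay, or a cloud copy). Not the real Live Mesh API.

from dataclasses import dataclass


@dataclass
class MeshDevice:
    device_id: str      # stable identity within the customer's mesh
    reachable_via: str  # resolved by the platform: "lan", "relay", or "cloud"


def copy_between_devices(source: MeshDevice, target: MeshDevice, path: str) -> None:
    """The application requests a copy by device identity; the mesh picks the route."""
    print(f"copy {path}: {source.device_id} -> {target.device_id} via {target.reachable_via}")


if __name__ == "__main__":
    camera = MeshDevice("camera", "lan")
    frame = MeshDevice("picture-frame", "relay")
    backup = MeshDevice("remote-storage", "cloud")
    copy_between_devices(camera, frame, "/photos/2008-04")
    copy_between_devices(camera, backup, "/photos/2008-04")
```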

This is one heck of an initiative, and it sounds hella cool. But (and there's always a but) it's going to take a massively resilient and flexible infrastructure to ensure that all the disparate points in the mesh are always available, secure, and performing up to consumer expectations. Microsoft is right that consumers (the market at which Live Mesh is targeted) just want things to work.

The danger in these kinds of initiatives is the same danger that's always existed: if you fail to ensure that your infrastructure can handle a new application, you may be faced with a massive, hurried "fix-it" project after the fact. First impressions are the most important, and if you get it wrong the first time it's very difficult to convince potential customers you'll get it right the next time.

From Microsoft's Live Mesh Blog

The mesh is the foundation for a model where customers will ultimately license applications to their mesh, as opposed to an instantiation of Windows, Mac or a mobile account or a web site. Such applications will be seamlessly installed and run from their mesh and application settings persisted across their mesh.

This is where things are going to get ugly. Microsoft can certainly ensure that its own services are available, secure, and fast within the mesh. But if this platform is successful, customers will want to include services from other vendors in their mesh (which is exactly what Microsoft anticipates, and that's all good), and that means other services, over which Microsoft has no control, may negatively impact consumer perception of the mesh. That means Microsoft will need to impose some kind of service level agreement on providers who wish to make their services available in "the mesh", and those providers are going to have to ensure that their application delivery infrastructure can provide fast, secure, and always-available access to their services.

Acceleration, security, optimization, intelligent routing...all these capabilities will be necessary to ensure that every service in the mesh is available, secure, and fast, as well as highly scalable. How about secure remote access so you can reach your "devices" at work while you're at home, and vice versa? The possibilities here are endless, but they are going to require a lot of infrastructure support to make them all reality.

If - and I do mean if - this takes off as well as Microsoft hopes, it certainly could be the killer app for application delivery.

Imbibing: Coffee