Application Streaming – Why Bother…

Some general thoughts and ramblings on application streaming - where it is better than web applications and where it might not be.

Application streaming is an interesting technology - you can create a client-rich application with sophisticated graphics and processing and yet have a high degree of security and the benefits of server-side manageability. In my mind this is the best of both worlds: on the one hand you can leverage the full strength of the latest processors and graphics capabilities, and on the other you can manage security and upgrades quickly and efficiently.

The application doesn't go through an install process on the client, so you eliminate some of the problems associated with different people installing the same application differently. The installation can be "isolated" to protect against conflicts (in some cases this even provides backwards compatibility), although isolation also creates some challenges for integrating multiple applications on the same device.

Upgrades are simple and guaranteed - since you only upgrade the server, anyone using that application gets the update at next use, and the same is true for security patches. For those using the applications offline (which you can do - try that with a web app), the update arrives the next time they connect to the network.
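As a rough sketch of how that reconnect check might work - the manifest format and names here are my own invention, not any particular product's:

```python
def needs_update(local_version: str, server_manifest: dict, app_name: str) -> bool:
    """Compare the version cached on the client against the server's manifest;
    if the server has been upgraded, the client re-streams the package."""
    return server_manifest[app_name]["version"] != local_version

# Hypothetical manifest the streaming server might publish on each connect
server_manifest = {"ExpensiveCADApp": {"version": "2.4.1"}}

print(needs_update("2.3.0", server_manifest, "ExpensiveCADApp"))  # True: re-stream the app
print(needs_update("2.4.1", server_manifest, "ExpensiveCADApp"))  # False: cached copy is current
```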

Streaming (some products, anyway) provides a means of license management: by tracking concurrent usage and preventing over-subscription, you may find you don't need to own as many licenses as you thought. This can be important for some expensive purchased applications.
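Just to make the idea concrete, here is a minimal sketch of a concurrent-license check of the sort a streaming product might perform; the names (LicensePool, acquire, release) and the two-license limit are hypothetical, not any vendor's actual API:

```python
import threading

class LicensePool:
    """Hypothetical concurrent-license tracker: a streaming server could
    consult something like this before delivering an application."""

    def __init__(self, app_name: str, max_concurrent: int):
        self.app_name = app_name
        self.max_concurrent = max_concurrent
        self._in_use: set[str] = set()  # user ids currently holding a license
        self._lock = threading.Lock()   # guard against simultaneous launches

    def acquire(self, user_id: str) -> bool:
        """Grant a license if one is free; refuse if the pool would be over-subscribed."""
        with self._lock:
            if user_id in self._in_use:
                return True                          # already licensed, e.g. a relaunch
            if len(self._in_use) >= self.max_concurrent:
                return False                         # all licenses busy: deny the launch
            self._in_use.add(user_id)
            return True

    def release(self, user_id: str) -> None:
        """Return a license to the pool when the user closes the application."""
        with self._lock:
            self._in_use.discard(user_id)

# Example: 2 purchased licenses, 3 users trying to launch
pool = LicensePool("ExpensiveCADApp", max_concurrent=2)
print(pool.acquire("alice"))  # True
print(pool.acquire("bob"))    # True
print(pool.acquire("carol"))  # False - would over-subscribe
pool.release("alice")
print(pool.acquire("carol"))  # True - a license freed up
```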

Streaming applications are also not subject to the multitude of exploits written to attack web browsers and web applications. I believe that for corporate applications they are safer and easier to protect. That alone may be reason enough to justify moving in this direction.

One area where web based applications COULD be better is if they were written to work on multiple platforms with multiple browsers (such as Windows and OS X). In practice, though, this seems to be seldom done; most apps are still written for one environment or the other, and it's more a matter of chance whether the application works in the other environments. This could be a big plus if developers would truly develop for the heterogeneous world we live in.

Another is that with client-rich applications there is often more database traffic being routed over the network between the client and the server infrastructure, whereas in a web application the database traffic can be kept between the application server and the database server. This puts the onus on the application developer to take network traffic into account when architecting the application. It can be done efficiently, but it does raise that "old" client/server argument and problem.
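To show what "taking it into account" means in practice, here is a small sketch (using an in-memory SQLite table and made-up names) contrasting a chatty client that pulls the data row by row with a coarse-grained query that lets the database do the work in one round trip:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "acme", 100.0), (2, "acme", 250.0), (3, "globex", 75.0)])

def chatty_customer_total(customer: str) -> float:
    """Anti-pattern: one query per row, so every row crosses the network
    between the client and the database."""
    ids = [row[0] for row in
           conn.execute("SELECT id FROM orders WHERE customer = ?", (customer,))]
    total = 0.0
    for order_id in ids:  # N additional round trips
        (amount,) = conn.execute(
            "SELECT total FROM orders WHERE id = ?", (order_id,)).fetchone()
        total += amount
    return total

def coarse_customer_total(customer: str) -> float:
    """Better: push the aggregation to the server - one round trip, one row back."""
    (total,) = conn.execute(
        "SELECT SUM(total) FROM orders WHERE customer = ?", (customer,)).fetchone()
    return total

print(chatty_customer_total("acme"))  # 350.0, but N+1 queries
print(coarse_customer_total("acme"))  # 350.0 in a single query
```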

So perhaps it is time to look at how we develop applications, and rather than swinging the pendulum back to all client-rich applications, maybe we should be looking for a better balance, leveraging the best technology for the requirements.

Just a thought