The Future of Autonomic Computing Innovation

The computer industry is filled with pundits, speculators, visionaries, salespeople, brilliant architects and professors. Each provides invaluable insight into their experience, their intelligence, their alma mater, their ticker symbol, their ego and what’s next. Some win the “what’s next” lottery; others labor brilliantly for years in relative obscurity.

Seemingly, a world that has deployed over 1 billion devices a year for the last 3 years is incapable of understanding the gravity of a new programming model, a new hardware architecture, a sleek new design that delivers on a vision Gene Roddenberry imagined in the 1960s or Da Vinci in the 15th century. What is old is new… and let me tell you why. It will revolutionize the industry (not evolutionize… a term reserved for slower-growing industries that require government assistance every decade or so), transform your environment and provide freedoms you had only hoped to enjoy… and we invented it 40 years ago. Does any of this sound familiar?

It should. These are the paraphrased slogans of an industry in transition. Real products matter, product differentiation matters, standards matter, interoperability matters… and shareholders pay for future expectations.

The future of computing is NOW. The future of the computer industry is NOW. The next generation of computer programming, software architectures and transformational technologies is NOW. As an industry we have finally begun to embrace interface, architectural and software programming standards to usher in a new era of interoperability and scalability. Behind us are the days of “proprietary interfaces” (what does that actually mean, other than “I am going to sell you some extra accessories that will be worthless in 2 years”?) that do not provide a differentiated performance/cost advantage. Gone are the days of developing programming languages that lock customers in to individual companies, whether those vendors innovate or not. These rules of the past are slowly melting away, allowing the entire industry to embrace interoperability and standards at the highest level in history. Industry diversity is healthy and ensures that the most innovative and technologically relevant companies will “win” most of the time, allowing the first billion and the next billion customers of the world to enjoy the best interface technology yet developed: each other. It also provides us with a unique ability to move to the next phase in our dynamic industry’s growth: autonomic instrumentation.

At Intel, we are constantly working to develop the next great performance architecture, filled with innovative new “goodies,” as our Chief Virtualization Architect Rich Uhlig calls them. These “goodies” (a technical term that Rich borrowed from his nephew, I believe) come in the form of virtualization technologies (Intel VT-x, Intel VT-d and Intel VT-c), security technologies (Intel LT-SX), performance technologies (Hyper-Threading, Turbo Boost) and energy efficiency instrumentation (Node Manager and Data Center Manager). Soon they will also include differentiated services in the cloud that facilitate ease of use and growth for a host of vertical industries in need of innovation. The resulting architectures will be instrument-rich, feature-capable and as scalable as users are willing to pay for.

Why is this important? Instrumentation matters. As we apply business and personal rules to our growing compute environments, it has become increasingly clear that the more tools we make available to users, the better informed their decisions become. The more disclosure we provide to investors through the use of autonomic programming architectures, the better informed their investing decisions will be.

How can you day trade $1B in 35 different stocks without clear autonomic controls in your data center, your database, your application and your client devices?

How can you move 450 million people efficiently throughout a country for 2 weeks without autonomic controls on transportation: planes, trains, boats and automobiles, as China does during its Spring Festival?

How can you process 1 billion text messages a day without clear business rules? What happens when these messages also flow from machine to machine, modifying databases, applications and clients?

As humans, we must set guidelines, much like laws, for our machines to act on when we are asleep, when we are tired, when we are not present, when we are simply being human… too slow to react to a rapidly changing environment.
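To make that concrete, here is a minimal sketch of the kind of guideline a machine might enforce while we sleep: a monitor-analyze-act loop around a power cap. The sensor, actuator, threshold and interval are all illustrative stand-ins, not a real platform API.

```python
import random
import time

# Hypothetical guideline: if a node's power draw exceeds a cap while no
# operator is present, throttle the workload instead of waiting for a human.
POWER_CAP_WATTS = 350.0      # illustrative threshold, not a real platform limit
CHECK_INTERVAL_SECONDS = 1.0

def read_power_draw() -> float:
    """Simulated sensor read; a real system would query platform instrumentation."""
    return random.uniform(250.0, 450.0)

def throttle_workload() -> None:
    """Simulated actuation; a real system might lower a frequency or power ceiling."""
    print("power cap exceeded: throttling workload")

def autonomic_loop(cycles: int) -> None:
    # Monitor, analyze, act: continuously, with no human in the loop.
    for _ in range(cycles):
        watts = read_power_draw()
        if watts > POWER_CAP_WATTS:
            throttle_workload()
        time.sleep(CHECK_INTERVAL_SECONDS)

if __name__ == "__main__":
    autonomic_loop(cycles=5)
```

The point is not the particular rule; it is that the rule runs, and acts, at machine speed.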

The innovators of the computer industry understand this NOW. We do not need to discuss a vision from 40 years ago without a plan to act NOW. Claiming ideas without action is dishonorable at best, criminal at worst. The innovators of today must build products and services that help solve the problems of today. We do not need to look to 2050 without a plan to act NOW. The visionaries of tomorrow are… not born. The visionaries of today… can call me in 10 years.

Autonomic controls are in place today, machine-to-machine computer architectures are here today, scalable compute engines are here today. Are they perfect? No. Are they effective? Yes. The design architects, product engineers and systems designers of today need to address these concerns. Autonomic instrumentation delivers control to the administrator, the user and the developer. Rules engines can be modified to maximize efficiency, minimize consumption and increase productivity, as in the sketch below. All of these will lead to increased shareholder (read: not just people who buy shares of stock) value across your enterprise, your school, our hospitals, our governments and your home.
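As a sketch of what “modifying a rules engine” can mean in practice, consider rules as data that an administrator retunes without redeploying code. The rule names, metrics and thresholds here are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Rule:
    """A single autonomic rule: a named metric, a threshold, and an action."""
    metric: str
    threshold: float
    action: str

# Illustrative defaults an administrator could retune at runtime.
rules = {
    "power": Rule(metric="watts", threshold=350.0, action="throttle"),
    "memory": Rule(metric="utilization", threshold=0.90, action="migrate_vm"),
}

# Tightening the power rule to favor efficiency over peak performance.
rules["power"].threshold = 300.0
```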

When executed properly, autonomic controls should be able to deliver 20-25% performance and efficiency increases with each new generation of Moore’s Law. In some cases, as with the Intel Xeon® 5500 series, these increases have exceeded 150% in virtualization performance, a combination of software architecture enhancements and silicon optimization. In other cases, the gains will come through the dedicated hard work of increasing the instrumentation capability of a processor platform, at the same price as the previous generation, through energy efficiency and memory controls.
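To put the per-generation figure in perspective, a sustained gain compounds quickly; the four-generation horizon below is purely illustrative, not a claim from Intel’s roadmap.

```python
# Compounding a per-generation gain: a sustained 25% improvement per
# generation yields roughly 2.4x over four generations (1.25**4 ~= 2.44).
per_generation_gain = 0.25   # upper end of the 20-25% figure cited above
generations = 4              # illustrative horizon

cumulative = (1 + per_generation_gain) ** generations
print(f"cumulative improvement: {cumulative:.2f}x")  # -> 2.44x
```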

Autonomic controls will also allow end users to avert disasters in our data centers, our homes and in our hands. Autonomic instrumentation design frameworks allow users to set parameters on data migrations, data backup, security, memory access, power consumption and virtual machine architectures.
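As one way to picture those user-set parameters, here is a hedged sketch of a policy object covering a few of the categories above. Every field name and value is hypothetical; it is not the interface of any shipping framework.

```python
from dataclasses import dataclass

@dataclass
class AutonomicPolicy:
    """Hypothetical user-set parameters for an instrumentation framework."""
    migrate_vm_above_cpu_utilization: float   # trigger live migration
    backup_window_start_hour: int             # run backups off-peak
    power_cap_watts: float                    # per-node power ceiling
    block_unsigned_firmware: bool             # a simple security posture rule

# One possible policy an administrator might set; every value is illustrative.
policy = AutonomicPolicy(
    migrate_vm_above_cpu_utilization=0.85,
    backup_window_start_hour=2,      # 2 a.m. local time
    power_cap_watts=400.0,
    block_unsigned_firmware=True,
)
```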

At Intel, our new Xeon® 5500 series processor family and our recently announced Intel® Nehalem-EX platform provide the new generation of platform instrumentation. As product developers, designers and architects, we should all find ways to increase the tools available to our customers so they can take advantage of these instrumentation capabilities. I look forward to announcing more of these new features and to helping provide development frameworks for developers, engineers and architects to build new products and services, ushering in the future of autonomic computing innovation… today.