According to Gartner, more than $1 trillion in IT spending will be directly or indirectly affected by the shift to cloud over the next five years. Many research firms point to hybrid cloud as the fastest-growing segment, including MarketsandMarkets, which predicts that demand will increase at a compound annual growth rate of 27 percent through 2019, outpacing the IT market overall.
There’s no question that cloud technologies have improved time to market, lowered operational and capital expenditures, and given organizations the ability to dynamically adjust provisioning to meet changing needs globally. And yet, as many businesses shift from on-premises, private clouds to public or hybrid models, myriad technical questions and business concerns come into play as compute, network, and storage resources are further virtualized.
Is the data earmarked for the public cloud proprietary, and are there legal requirements governing how it is stored? One Fortune 500 financial organization launched an initiative to move applications and data to the public cloud. However, it was later discovered that corporate policy prohibited placing personally identifiable information (PII) and other sensitive data beyond the internal network and firewall. Although public cloud providers support many security standards, the financial organization elected to keep its data on-premises because of that internal policy.
Another critical criterion is the organization’s tolerance for latency. If your applications and databases must respond within a defined time frame to meet end-user expectations, or they require very high availability or redundancy, they may be best suited to private or hybrid clouds. What a research or academic organization considers an acceptable trade-off — slightly reduced performance or limited customization in exchange for a smaller data center footprint — might not be appropriate for a globally distributed retail enterprise.
The paradox is that many businesses recognize the gains associated with moving to public or hybrid cloud models, but often do not fully appreciate the planning required to optimize performance in those environments. Fortunately, there are methods to help IT teams better understand how their cloud infrastructure is performing.
Cloud Infrastructure Tools provide IT staff with greater visibility and real-time insight into power usage, thermal consumption, server health, and utilization. The key benefits are better operational control, infrastructure optimization, and reduced costs, no matter the shape of an organization’s cloud.
So as the clouds part, let’s look at some of the ways Cloud Infrastructure Tools can help IT teams in their transition from private to public or hybrid clouds.
Going Public and Provisioning
Before moving data to the public cloud, an organization’s IT staff needs to understand how its systems perform internally. The needs of its applications, including memory, processing power, and operating systems, should inform what it provisions in the cloud.
Cloud Infrastructure Tools collect and normalize data to help teams understand their current on-premises implementation, empowering them to make more informed decisions about what a new cloud configuration requires.
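To make the idea of “normalizing” concrete, here is a minimal sketch — the field names and sample formats are illustrative assumptions, not any real tool’s API — of mapping utilization samples from two heterogeneous sources onto one common schema before analysis:

```python
def normalize_sample(raw: dict) -> dict:
    """Map a vendor-specific telemetry sample to a common schema.

    The source formats here are hypothetical: source A reports CPU as a
    0-100 percentage under "cpu_pct"; source B reports a 0-1 fraction
    under "cpu_util".
    """
    if "cpu_pct" in raw:
        cpu = raw["cpu_pct"] / 100.0          # source A: percentage
    else:
        cpu = raw["cpu_util"]                 # source B: fraction
    return {
        "host": raw.get("host", raw.get("hostname", "unknown")),
        "cpu_utilization": cpu,               # always a 0-1 fraction
        "power_watts": raw.get("watts", raw.get("power_w")),
    }

samples = [
    {"host": "db01", "cpu_pct": 42.0, "watts": 310},
    {"hostname": "web07", "cpu_util": 0.08, "power_w": 180},
]
normalized = [normalize_sample(s) for s in samples]
```

Once every sample shares one schema, comparisons across racks, vendors, and sites become straightforward, which is what makes the on-premises baseline useful when sizing a cloud configuration.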
Hybrid and Hardware
Underlying hardware remains a concern in hybrid models. As such, businesses need to proactively understand hardware errors: where they occur and how to respond to them.
Cloud Infrastructure Tools can analyze current hardware usage to help IT staff understand which servers are overtaxed and where resources sit idle. A McKinsey study estimates that as much as 30 percent of the servers in data centers are “dead” or underutilized, using less than 15 percent of their compute capacity while consuming 70 percent of their rated energy capacity.
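The consolidation analysis behind that figure can be sketched simply: given average CPU utilization per server, flag everything below the 15 percent threshold the study cites. The fleet data and hostnames below are made up for illustration.

```python
# Servers averaging below this utilization fraction are consolidation
# candidates (the 15% threshold comes from the McKinsey figure above).
UNDERUTILIZED_THRESHOLD = 0.15

def find_underutilized(avg_utilization: dict) -> list:
    """Return hostnames whose average CPU utilization (0-1 fraction)
    falls below the threshold."""
    return sorted(
        host for host, util in avg_utilization.items()
        if util < UNDERUTILIZED_THRESHOLD
    )

# Illustrative fleet: host -> average CPU utilization over some window.
fleet = {"db01": 0.42, "web07": 0.08, "batch03": 0.11, "cache02": 0.55}
candidates = find_underutilized(fleet)
# candidates -> ["batch03", "web07"]
```

In practice the utilization averages would come from the telemetry the tools collect, and the flagged hosts would be reviewed before any workload is migrated or decommissioned.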
While the cloud has transformed the ways companies do business, as with any IT solution, cloud computing doesn’t come without the risk of potentially costly outages. Knowing server power and thermal consumption in real-time can identify underlying hardware issues before they impact uptime.
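One simple way such real-time monitoring can surface a brewing hardware problem is to compare each server’s latest power reading against its own recent baseline. This sketch (readings, hostnames, and the 1.5x factor are all illustrative assumptions) flags hosts drawing sharply more power than usual:

```python
def power_anomalies(history: dict, latest: dict, factor: float = 1.5) -> list:
    """Flag hosts whose latest power draw (watts) exceeds `factor` times
    the mean of their recent readings."""
    flagged = []
    for host, readings in history.items():
        baseline = sum(readings) / len(readings)
        if latest.get(host, 0) > factor * baseline:
            flagged.append(host)
    return sorted(flagged)

# Illustrative recent readings and a fresh sample per host.
history = {"db01": [300, 310, 305], "web07": [180, 175, 185]}
latest = {"db01": 495, "web07": 182}
# power_anomalies(history, latest) -> ["db01"]
```

A production system would use longer windows and smarter statistics, but the principle is the same: an abnormal power or thermal signature often precedes a visible failure, giving staff time to act before an outage.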
The Golden Egg of ROI
In a hybrid model, businesses have more flexibility to move their data assets for improved performance and ROI. For example, non-essential data used infrequently can be moved off-site to free resources for data that impacts business performance day-to-day. It’s a puzzle, but it’s possible to align the pieces in such a way that businesses can run their systems at peak performance.
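The tiering decision described above — move infrequently accessed data off-site, keep hot data close — can be sketched with a last-access rule. The dataset names, dates, and the 90-day window are illustrative assumptions:

```python
from datetime import datetime, timedelta

# Datasets untouched for longer than this window are off-site candidates.
COLD_AFTER = timedelta(days=90)

def tier_datasets(last_access: dict, now: datetime) -> dict:
    """Split datasets into on-site (hot) and off-site (cold) tiers
    based on last access time."""
    tiers = {"on_site": [], "off_site": []}
    for name, accessed in last_access.items():
        tier = "off_site" if now - accessed > COLD_AFTER else "on_site"
        tiers[tier].append(name)
    return {k: sorted(v) for k, v in tiers.items()}

now = datetime(2017, 12, 1)
datasets = {
    "orders_current": datetime(2017, 11, 28),
    "audit_2015": datetime(2016, 1, 10),
    "logs_archive": datetime(2017, 5, 2),
}
tiers = tier_datasets(datasets, now)
# tiers -> {"on_site": ["orders_current"],
#           "off_site": ["audit_2015", "logs_archive"]}
```

Real policies would also weigh compliance constraints and retrieval cost, but even this simple rule frees on-site capacity for the data that drives day-to-day performance.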
Utilizing Cloud Infrastructure Tools, IT staff can identify how to best provision and refactor data for maximum value to the business. Especially as businesses adopt multiple container solutions, it will become critical for them to understand how each individual solution performs and how it impacts the health and agility of their hybrid model overall.
Gartner’s Ed Anderson, whose focus is the cloud services market, including market trends and forecasts, has characterized the multi-cloud environment “as a foundation for the next wave of applications.” If this is true, then Cloud Infrastructure Tools help IT staff navigate their organization’s course through the clouds and across the rising tide.
Jeff Klaus will be speaking at 3:45 pm on Dec. 5th Veronese 2401B at the Gartner IT Infrastructure, Operations Management & Data Center Conference, taking place on Dec. 4-7 in Las Vegas. Intel will be demoing data center management solutions at booth #131.
This article originally appeared in Network World.