In May 2015, I wrote the first in a series of blog posts exploring the journey to software-defined infrastructure. Each blog in the series dives down into different stages of a maturity model that leads from where we are today in the typical enterprise data center to where we will be tomorrow and in the years beyond. During that time, I also delved into the workloads that will run on the SDI platform.
As I noted in last month’s post, traditional analytics leads first to reactive and then to predictive analytics, with the ultimate destination being prescriptive analytics. This state is, in a sense, the nirvana of today’s big data analytics, and it’s the topic we will take up today.
Prescriptive analytics extends beyond the predictive stage by defining the actions necessary to achieve outcomes and the interrelationship of those outcomes with the effects of each decision. It incorporates both structured and unstructured data and uses a combination of advanced analytics techniques and other scientific disciplines to help organizations predict, prescribe, and adapt to changes that occur. Essentially, we’ve moved from “Why did this happen?” to “What will happen?” and we’re now moving to “How do we make this happen?” as an analytics methodology.
Prescriptive analytics allows an organization to extract even more value and insight from big data—well beyond what we are getting today. This highest level of analytics brings together varied data sources in real time and makes adjustments to the data and decisions on behalf of an organization. Prescriptive analytics is inherently real-time—it continually triggers these adjustments as new information arrives.
Let’s take a few simple examples to make this story more tangible.
- In the oil and gas industry, it can be used to enable natural gas price prediction and identify decision options—such as term locks and hedges against downside risk—based on an analysis of variables like supply, demand, weather, pipeline transmission, and gas production. It might also help decide when and where to harvest the energy, perhaps even spinning up and shutting down sources based on a variety of environmental and market conditions.
- In healthcare, it can increase the effectiveness of clinical care for providers and enhance patient satisfaction by modeling how changes to healthcare business processes affect outcomes across stakeholders. It could predict patient outcomes and help alleviate issues before medical professionals would normally even recognize them.
- In the travel industry, it can be used to sort through factors like demand curves and purchase timing to set seat prices that will optimize profits without deterring sales. Weather and market conditions could shape pricing to fill unused seats and rooms while relieving pressure in peak seasons.
- In the shipping industry, it can be used to analyze data streams from diverse sources to enable better routing decisions without the involvement of people. In practice, this could be as simple as a system that automatically reroutes packages from air to ground shipment when weather data indicates that severe storms are likely to close airports on the usual air route.
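To make the shipping example concrete, here is a minimal sketch of such an automated rerouting rule. Everything in it—the data fields, the 70 percent storm-probability threshold, and the function names—is a hypothetical illustration, not an actual shipping system’s logic.

```python
from dataclasses import dataclass

@dataclass
class WeatherForecast:
    airport: str
    storm_probability: float  # 0.0-1.0 chance of severe weather

# Hypothetical threshold: above this, assume the airport may close.
STORM_CLOSURE_THRESHOLD = 0.7

def choose_route(forecasts: list, air_route_airports: set) -> str:
    """Return 'air' or 'ground' based on storm risk along the air route."""
    for forecast in forecasts:
        if (forecast.airport in air_route_airports
                and forecast.storm_probability >= STORM_CLOSURE_THRESHOLD):
            # Prescribe the reroute before the disruption actually occurs.
            return "ground"
    return "air"

forecasts = [
    WeatherForecast("ORD", 0.85),  # severe storms likely at this hub
    WeatherForecast("DEN", 0.10),
]
print(choose_route(forecasts, {"ORD", "MEM"}))  # -> ground
```

A real prescriptive system would of course weigh many more signals—delivery deadlines, cost, capacity—but the essence is the same: incoming data triggers a decision automatically, without a person in the loop.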
I could go on and on with the examples, because every industry can capitalize on prescriptive analytics. The big takeaway here is that prescriptive analytics has the potential to turn enormous amounts of data into enormous business value—and do it all in real time.
With the impending rise of prescriptive analytics, we are entering the era in which machine learning, coupled with automation and advanced analytics, will allow computers to capture new insights from massive amounts of data in diverse datasets and use that data to make informed decisions on our behalf on an ongoing basis.
At Intel, we are quite excited about the potential of prescriptive analytics. That’s one of the reasons why we are a big backer of the open source Trusted Analytics Platform (TAP) initiative, which is designed to accelerate the creation of cloud-native applications driven by big data analytics.