Mapping Data Strategy to Business Needs

The world’s data is growing exponentially, with a 10-fold increase over existing volumes forecast by 2025.1 However, less than one percent of the data available today is analyzed and used.2 Organizations that learn to unlock this goldmine will be able to capitalize on a long-lasting competitive advantage.

In this quest for data-driven prowess, IT teams have been propelled from the sidelines of business strategy to center stage. They increasingly find themselves at the heart of big strategic discussions, tasked with explaining how to manage, analyze, and democratize data in a way that directly maps to business ambitions.

Summarizing the advice in Intel’s new white paper ‘Tame the Data Deluge’, this article outlines the key considerations IT leaders must address to build an intelligent data strategy.

First Things First, Sort Out the IT Infrastructure

A great building won’t last on a weak foundation, and legacy processes and models can quickly become a hindrance. Data silos, fragmented systems, and aging data storage models can impede digital transformation and slow progress toward becoming an insights-driven business.

To overcome this challenge, it’s important to identify the bottlenecks in the existing infrastructure and replace them with a modern data management architecture. Real-time data hub models, for example, enable near-real-time analytics to be performed directly on the data, eliminating the need to hold copies of the data in multiple formats. These hubs can be extended as needed to incorporate unstructured data, using data stores such as Hadoop*, and support the creation of an end-to-end analytics infrastructure on top of this foundation.
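
As a simple illustration of this pattern, the hypothetical PySpark sketch below runs an aggregation directly against data held in a Hadoop-backed hub, with no intermediate extract or copy. The HDFS path, column names, and filter are illustrative assumptions, not part of the white paper.

```python
# Minimal sketch: querying semi-structured data in place on a Hadoop-backed
# data hub with PySpark, rather than exporting copies into a separate system.
# The HDFS path and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("in-place-analytics").getOrCreate()

# Read event data directly from the data hub; no intermediate extract is created.
events = spark.read.json("hdfs:///data/hub/events/")

# Aggregation performed where the data lives.
summary = (
    events
    .filter(F.col("event_type") == "purchase")
    .groupBy("region")
    .agg(F.count("*").alias("purchases"), F.avg("amount").alias("avg_amount"))
)

summary.show()
```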

Next, Optimize Data Management

Not all data holds the same value, so it’s important to prioritize the types and sources of data that are most valuable to the business and build a data tiering strategy. Data tiering involves ranking data from ‘hot’ (data that is in constant use and business critical) all the way through to ‘frozen’ (data that may never be accessed again but must be kept for other reasons, such as legal requirements).

Once the data tiers are defined, the existing storage architecture must be assessed to see where it does, and does not, align with them. Keeping rarely used data too close to the processor incurs unnecessary storage costs; on the flip side, pushing important ‘hot’ data out to slower tiers means performance will suffer.
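
To make the tiering idea concrete, here is a minimal Python sketch of how datasets might be assigned to tiers and mapped to storage classes. The access thresholds, retention rule, and storage classes are illustrative assumptions, not recommendations from the white paper.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Illustrative tier-to-storage mapping; thresholds and storage classes
# are assumptions made for this sketch only.
TIER_TO_STORAGE = {
    "hot": "nvme_ssd",        # business critical, in constant use
    "warm": "sata_ssd",       # accessed regularly
    "cold": "object_store",   # accessed rarely
    "frozen": "tape_archive", # retained only for legal/compliance reasons
}

@dataclass
class Dataset:
    name: str
    last_accessed: datetime
    retention_required: bool

def assign_tier(ds: Dataset, now: datetime) -> str:
    """Assign a tier from access recency; old retained-only data is 'frozen'."""
    age = now - ds.last_accessed
    if age <= timedelta(days=7):
        return "hot"
    if age <= timedelta(days=90):
        return "warm"
    if not ds.retention_required:
        return "cold"
    return "frozen"

now = datetime.utcnow()
for ds in [Dataset("orders", now - timedelta(days=1), True),
           Dataset("audit_logs_2015", now - timedelta(days=2000), True)]:
    tier = assign_tier(ds, now)
    print(f"{ds.name}: tier={tier}, storage={TIER_TO_STORAGE[tier]}")
```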

In the past, a gap between fast in-memory technologies and slower storage forced organizations to make trade-offs between speed and cost efficiency. New solutions have since emerged to bridge this divide, such as Intel® Optane™ SSDs, which offer performance approaching that of in-memory technologies like DRAM at a lower price point, making it possible for organizations to place their data exactly where they need it, cost effectively.

Make Sure to Use Data Optimization Techniques

Building deep optimization across hardware, software, and the solution stack will enable the growth and performance needed for analytics and AI workloads. A variety of techniques are available to help with data optimization, so be sure to research which are best suited to your data environment.

Two examples of such techniques are erasure coding (EC) and the Intel® Intelligent Storage Acceleration Library (Intel® ISA-L). Erasure coding is a form of error-correcting code that encodes data in such a way that the original can still be recovered even if part of it is lost or corrupted. This offers a means of protecting data that consumes far less capacity than full replication, freeing up storage space. Intel® ISA-L is an open source library with a selection of tools for optimizing storage throughput, security, and resilience while minimizing disk space usage. These tools can help to significantly increase de-duplication speeds, as well as accelerating data access to allow more time for detailed analysis.
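
To illustrate the principle behind erasure coding, the Python sketch below uses the simplest possible scheme, a single XOR parity block, to recover one lost data block. Production systems, including those built on Intel® ISA-L, use Reed-Solomon codes that tolerate multiple simultaneous losses; this is a conceptual illustration only.

```python
# Conceptual sketch of erasure coding with one XOR parity block:
# any single lost block can be reconstructed from the survivors.

def split_blocks(data: bytes, k: int) -> list[bytes]:
    """Split data into k equal-sized blocks, zero-padding the tail."""
    block_len = -(-len(data) // k)  # ceiling division
    padded = data.ljust(block_len * k, b"\0")
    return [padded[i * block_len:(i + 1) * block_len] for i in range(k)]

def xor_blocks(blocks: list[bytes]) -> bytes:
    """XOR a list of equal-length blocks together."""
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            result[i] ^= byte
    return bytes(result)

data = b"tame the data deluge"
blocks = split_blocks(data, 4)
parity = xor_blocks(blocks)  # one extra block of redundancy

# Simulate losing block 2, then recover it from the survivors plus parity.
lost_index = 2
survivors = [b for i, b in enumerate(blocks) if i != lost_index]
recovered = xor_blocks(survivors + [parity])
assert recovered == blocks[lost_index]
print("recovered block:", recovered)
```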

Use Accelerators to Optimize Data Ingestion

It’s also worth taking advantage of accelerator technologies like field programmable gate arrays (FPGAs) to optimize data ingestion and handle spikes in data relatively cost-effectively. Intel® FPGAs attach directly to copper, fiber, or optical wires and can move any data in any format from wire to memory in nanoseconds without the need for a network interface card (NIC). Because they can be re-programmed to accommodate changing needs, they are also important for future-proofing infrastructure investments and ensuring new use cases can be adopted using existing hardware.

Enforce Stringent, Well-Defined Data Governance and Security Policies

In today’s digital enterprises, governance surrounding data collection and usage is no longer just the domain of the IT team. An effective and secure data-driven culture is built on well-defined and strictly enforced policies for storing, organizing, managing, analyzing, and sharing data across the entire organization. Effective governance of how data is preserved, protected, and shared is key to unlocking its value. When establishing data governance policies, several factors must be taken into consideration: where the data comes from, who has access to what types of data, whether it is properly tagged and labeled, and whether the governance strategy supports relevant regulatory requirements.
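
As a minimal illustration of one governance building block, the sketch below checks access by comparing a role’s clearances against the classification tags on a dataset. The roles, tags, and datasets are purely hypothetical and stand in for whatever policy model an organization adopts.

```python
# Minimal sketch of a tag-based access check: datasets carry classification
# tags, and a policy table maps roles to the tags they are cleared for.
# Roles, tags, and datasets are illustrative assumptions.

POLICY = {
    "analyst":      {"public", "internal"},
    "data_steward": {"public", "internal", "confidential"},
    "auditor":      {"public", "internal", "confidential", "regulated"},
}

DATASET_TAGS = {
    "web_clickstream": {"internal"},
    "customer_pii":    {"confidential", "regulated"},
}

def can_access(role: str, dataset: str) -> bool:
    """A role may read a dataset only if it is cleared for every tag on it."""
    allowed = POLICY.get(role, set())
    return DATASET_TAGS.get(dataset, set()) <= allowed

print(can_access("analyst", "web_clickstream"))  # True
print(can_access("analyst", "customer_pii"))     # False
print(can_access("auditor", "customer_pii"))     # True
```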

To find out more about how Intel can help you get started on this journey, read our new white paper ‘Tame the Data Deluge’.