Optimizing in-memory databases for advanced analytics

The perils of downtime

IT organizations today are under pressure to keep a large number of complicated systems available for both employees and customers, 24/7. This is especially true for systems that support business-critical applications and processes. For example, the SAP HANA* platform is commonly used across multiple industries to process data and transactions, and then run advanced analytics on them to deliver real-time insights. These workloads are increasingly data-heavy and complex, spanning predictive and prescriptive analytics and AI.

With so much riding on these systems operating reliably, any unforeseen downtime or admin mistakes (such as unplugging the wrong server during maintenance, or the power supply to the data center being interrupted) can have a big impact.

In these high-pressure situations, you need to get things back up and running as quickly as possible—for the good of the business and your own peace of mind. However, today's increasing data volumes mean that restarting a server or loading terabyte-scale databases into a traditional memory system can take hours (or even longer).

This is because a server's main memory is typically volatile, meaning data is lost if the power supply is interrupted. Data must then be reloaded into memory before redundancy can be restored, extending the downtime. This is not only expensive, but it pulls database admins' focus away from other important tasks.

SAP and Intel: A combined approach to optimizing advanced analytics workloads

Overcoming this challenge requires innovation in the way data is managed and stored, at both a database and a hardware level. Happily, recent developments by both SAP and Intel have aligned to enable a revolutionary leap forward in these capabilities.

Intel® Optane™ DC Persistent Memory represents an entirely new way of managing data for demanding workloads like the SAP HANA platform. It is non-volatile, meaning data does not need to be reloaded from persistent storage into memory after a shutdown. Meanwhile, it runs at near-DRAM speeds, keeping up with the performance needs and expectations of complex SAP HANA environments and their users.
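To make the programming model concrete: operating systems typically expose persistent memory to applications as memory-mapped files on a direct-access (DAX) filesystem, so data written through the mapping survives a restart. The sketch below illustrates that model using Python's standard mmap module, with an ordinary temporary file standing in for a persistent-memory region (the path and sizes are illustrative; on real hardware the file would live on a DAX mount such as /mnt/pmem, and libraries like Intel's PMDK would handle cache flushing for you).

```python
import mmap
import os
import tempfile

# Stand-in for a region on a DAX filesystem. On a real persistent-memory
# system the mapping would be backed directly by the memory modules
# rather than by the page cache.
path = os.path.join(tempfile.mkdtemp(), "pmem_region")

# Create and size the backing region (4 KiB).
with open(path, "wb") as f:
    f.write(b"\x00" * 4096)

# Map the region and store data directly through the mapping.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as m:
        m[0:5] = b"HELLO"   # write as if to ordinary memory
        m.flush()           # force the stores to the backing medium

# Simulate a restart: reopen the region and the data is still there,
# with no reload from a separate storage tier required.
with open(path, "r+b") as f:
    with mmap.mmap(f.fileno(), 4096) as m:
        print(m[0:5].decode())  # prints HELLO
```

The key point of the model is that the "load data back into memory" step disappears: after a restart, remapping the region makes the data addressable again immediately.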

At the same time, the latest version of the platform itself—SAP HANA* 2.0 SPS 03—contains innovations that are designed to enhance support for complex challenges like advanced analytics, development and data management. For example, it provides improved performance through parallel processing improvements for the training and scoring of predictive models, as well as high-availability and load balancing with TensorFlow* integration. When paired with powerful compute capabilities such as those provided by Intel® Xeon® Scalable processors, the platform can be further optimized.

Enterprises can now take advantage of these enhancements to rethink the way they approach data tiering and management. With more flexibility in how memory and storage are used, a number of compelling benefits for the IT team and the business emerge:

  • Downtime is significantly reduced, with testing showing the restart time for SAP HANA 2.0 dropping from 50 minutes using traditional DRAM to just four minutes using Intel® Optane™ DC Persistent Memory, a 12.5x improvement.1
  • Greater non-volatile memory capacity means more data can be analyzed at a time, helping speed time to results or decisions.
  • Total cost of ownership for memory for an SAP HANA environment can be reduced by replacing expensive DRAM modules with non-volatile persistent memory.
  • It’s now possible to place your entire storage area network (SAN)-based warm data tier in persistent memory modules that act like main memory, which helps boost performance for analytics and other complex workloads.

Explore how you can optimize your SAP HANA environment for advanced analytics using technologies like Intel® Optane™ Technology and Intel® Xeon® Scalable processors.

Follow me at @TimIntel for the latest news on Intel and SAP.


Intel® technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Performance varies depending on system configuration. No computer system can be absolutely secure. Check with your system manufacturer or retailer or learn more at intel.com.

Performance results are based on testing as of 30 May 2018 and may not reflect all publicly available security updates. See configuration disclosure for details. No product can be absolutely secure.

Results have been estimated or simulated using internal Intel analysis or architecture simulation or modeling, and provided to you for informational purposes. Any differences in your system hardware, software or configuration may affect your actual performance.

All information provided here is subject to change without notice. Contact your Intel representative to obtain the latest Intel® product specifications and roadmaps.

1. SAP HANA* simulated workload for SAP BW edition for SAP HANA Standard Application Benchmark Version 2 as of 30 May 2018. SAP and Intel engineers performed the testing. Baseline configuration with traditional DRAM: Lenovo ThinkSystem SR950* server with 8x Intel® Xeon® Platinum 8176M processors (28 cores, 165 watt, 2.1 GHz). Total memory consists of 48x 16GB TruDDR4* 2,666 MHz RDIMMs, and 5x ThinkSystem 2.5” PM1633a 3.84TB capacity SAS 12Gb hot swap SSDs for SAP HANA storage. The operating system is SUSE* Linux* Enterprise Server 12 SP3 with SAP HANA 2.0 SPS 03 and a 6TB dataset. Start time: 50 minutes.

New configuration with a combination of DRAM and Intel® Optane™ DC persistent memory: Lenovo ThinkSystem SR950* server with 8x Intel® Xeon® Platinum 8176M processors (28 cores, 165 watt, 2.1 GHz). Total memory consists of 48x 16GB TruDDR4* 2,666 MHz RDIMMs and 48x 128GB Intel Optane DC persistent memory modules (PMMs), and 5x ThinkSystem 2.5” PM1633a 3.84TB capacity SAS 12Gb hot swap SSDs for SAP HANA storage. The operating system is SUSE* Linux* Enterprise Server 12 SP3 with SAP HANA 2.0 SPS 03 and a 6TB dataset. Start time: 4 minutes.

Tim Allen

About Tim Allen

Tim is a strategic relationship manager for Intel driving enablement for enterprise software companies related to the cloud, big data, analytics, AEC, commercial VR, datacenter, and IoT. Tim has 20+ years of industry experience including work as a systems analyst, developer, system administrator, enterprise systems trainer, product marketing engineer, and marketing program manager. Prior to Intel, Tim worked at IBM, Tektronix, Intersolv, Sequent and Con-Way Logistics. Tim holds a BSEE in computer engineering from BYU and an MBA in finance from the University of Portland. Specialties include PMP, MCSE, CNA, HP-UX, AIX, Shell, Perl, and C++.