Smart Grid on the Edge – Addressing the Challenge

In the first post of this series, we looked at the challenges facing smart grid operators and the shift in data gravity from the center to the edge of their service areas. We highlighted the importance of edge-based data management when addressing rapidly escalating remote and distributed data challenges. In this installment, we will distill key criteria to consider when evaluating an edge data management platform.

Criterion 1: Service level. The ability to manage conflicting service levels and their associated workloads is the most important criterion to evaluate in an edge data management solution. Edge data management systems must be connected to operational systems for data ingestion, which invariably carries a critical service-level requirement. At the same time, evolving data-hungry application demands will bring a much broader connectivity surface to the edge ecosystem. Brokering this highly secure, reliable data capture (control plane) service alongside a relatively wide-open IT data broker (data plane) requirement is an enormous challenge in an edge platform footprint: the platform must be highly secure, reliable, and able to deliver data to any application at any service level on demand.
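To make that brokering idea concrete, here is a minimal, illustrative sketch in Python. The queue names and the in-memory store are assumptions for illustration only, not Exara's implementation; the point is simply that control-plane ingestion is always serviced before data-plane reads.

```python
# Minimal sketch of service-level brokering at an edge node (illustrative only).
# Control-plane ingestion is drained before any data-plane read is served, so
# sensor capture never waits on IT workloads.
import queue
import threading
import time

ingest_q = queue.Queue()   # control plane: sensor writes (must not be dropped)
read_q = queue.Queue()     # data plane: application read requests (best effort)
store = []                 # stand-in for the local time-series store

def broker(stop):
    while not stop.is_set():
        # Drain all pending sensor writes first: ingestion has strict priority.
        while True:
            try:
                sample = ingest_q.get_nowait()
            except queue.Empty:
                break
            store.append(sample)
        # Serve at most one application read per cycle, if any are waiting.
        try:
            respond = read_q.get_nowait()
            respond(list(store))          # hand back a snapshot of local history
        except queue.Empty:
            time.sleep(0.01)              # idle briefly when there is no work

stop = threading.Event()
threading.Thread(target=broker, args=(stop,), daemon=True).start()

# Simulated workloads: a sensor writing samples and an application reading them.
for i in range(5):
    ingest_q.put({"ts": time.time(), "value": i})
read_q.put(lambda rows: print(f"app received {len(rows)} samples"))
time.sleep(0.1)
stop.set()
```

In a real platform the two paths would also be isolated for security and resource guarantees; the sketch only shows the prioritization.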

Criterion 2: Extensibility. Edge environments are characterized by limited resources yet challenged by next-generation digital solutions with effectively unlimited demand. Edge data services must be portable across host devices (“nodes” at the edge), efficient enough for low-power rugged systems, and able to span multiple generations of hardware, as field-installed systems often have long deployment windows. This implies a separation between device and software that traditionally has not existed across industrial, embedded ecosystems.
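As a rough illustration of that separation, the sketch below writes the capture loop against a small hardware abstraction so the same service code can run on different node generations. The class and method names are assumptions made for this example, not a vendor API.

```python
# Illustrative sketch of decoupling edge software from host hardware.
# The data service depends only on a small interface, so the same code can
# run on older and newer generations of field hardware.
from abc import ABC, abstractmethod

class SensorBus(ABC):
    """Minimal hardware abstraction the edge data service is written against."""
    @abstractmethod
    def read_sample(self) -> dict: ...

class LegacySerialBus(SensorBus):
    def read_sample(self) -> dict:
        # A real implementation might poll an RS-485/Modbus link here.
        return {"source": "legacy-serial", "value": 42}

class ModernEthernetBus(SensorBus):
    def read_sample(self) -> dict:
        # Newer nodes might expose the same data over an IP-based protocol.
        return {"source": "ethernet", "value": 42}

def capture(bus: SensorBus, n: int) -> list[dict]:
    """The capture loop is identical regardless of the underlying hardware."""
    return [bus.read_sample() for _ in range(n)]

print(capture(LegacySerialBus(), 2))
print(capture(ModernEthernetBus(), 2))
```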

Additionally, data access and delivery protocols must conform to myriad user application requirements. This also implies over-the-air (OTA) delivery of updates and upgrades to ensure long-term compatibility with rapidly evolving end-use applications.

Finally, reliable, standards-based APIs should enable rapid development and integration between edge nodes and applications as diverse as mobile phones and IT-managed enterprise information systems.
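For illustration, a minimal read endpoint built only on standard HTTP and JSON might look like the following. The path and payload shape are assumptions for this sketch, not a published specification.

```python
# Minimal sketch of a standards-based read API on an edge node (illustrative).
# Any HTTP-capable client, from a phone to an enterprise system, can query it.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

SAMPLES = [{"ts": 1700000000 + i, "value": 0.5 * i} for i in range(10)]

class EdgeAPI(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path.startswith("/v1/samples"):
            body = json.dumps(SAMPLES).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), EdgeAPI).serve_forever()
```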

Criterion 3: Data fidelity. As with any information-based strategy, digital management of smart grid resources will require consistently high-quality data.

Local data capture systems must balance high-frequency data input (write) with highly variable end-use access (read). Failure to manage resources efficiently and guarantee bandwidth for sensor input can result in lost data and/or inaccurate timestamp values. This is a significant challenge given the environmental constraints placed on edge devices, as noted in Criterion 2.
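One common way to protect ingestion from variable read load, sketched below under assumed names, is to timestamp each sample at the moment of capture and keep any shared critical section as short as possible so reads cannot meaningfully delay writes.

```python
# Illustrative sketch of protecting high-frequency capture from variable read
# load. Samples are timestamped before any queuing, and readers copy the buffer
# under the lock so the critical section stays short.
import collections
import threading
import time

BUFFER = collections.deque(maxlen=100_000)  # bounded in-memory staging buffer
LOCK = threading.Lock()

def capture_sample(value: float) -> None:
    sample = (time.time(), value)       # timestamp assigned at the point of capture
    with LOCK:
        BUFFER.append(sample)

def read_window(seconds: float) -> list:
    cutoff = time.time() - seconds
    with LOCK:
        snapshot = list(BUFFER)         # copy under the lock, filter outside it
    return [s for s in snapshot if s[0] >= cutoff]

for i in range(1000):
    capture_sample(float(i))
print(f"{len(read_window(60))} samples in the last minute")
```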

Data capture should be lossless at the edge. Consider this question: “If machine data capture is limited to the current-best operating model, how will data be mined to develop next-generation operating models?” Legacy control systems were not designed for high-fidelity data capture and dynamic access. In fact, a case can be made that control plane access and management should be kept independent from broader data access demands (see Criterion 1 regarding service levels).

A practical rule: Demand for high-quality, empirical data capture cannot jeopardize service levels and security requirements for legacy operational technologies and control systems.

Finally, note that lossless data capture does not preclude filtering and summarization for data transfers between edge nodes and central management platforms. An edge-based data platform should be able to retain full-fidelity history locally for a meaningful window (months to years), providing a practical window of lossless data capture. This points to additional requirements for local data filtering, selection, compression, and encryption on the edge system prior to, or inline with, the transfer of targeted data to end-use applications.
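As a hypothetical example of such a pipeline, the sketch below keeps raw samples local, averages them into fixed windows, and compresses the summary before transfer. Encryption is assumed to be handled by the transport (for example, TLS) rather than shown here, and the function names are illustrative.

```python
# Illustrative sketch of preparing data for upstream transfer: full-fidelity
# history stays on the edge node, while a filtered, downsampled, compressed
# summary is shipped to the central platform.
import gzip
import json
import statistics
import time

def downsample(samples: list[tuple], window_s: int = 60) -> list[dict]:
    """Average raw (timestamp, value) samples into fixed time windows."""
    buckets: dict[int, list[float]] = {}
    for ts, value in samples:
        buckets.setdefault(int(ts) // window_s, []).append(value)
    return [{"ts": k * window_s, "avg": statistics.mean(v)}
            for k, v in sorted(buckets.items())]

# Full-fidelity local history (retained on the node for months to years).
raw = [(time.time() + i, float(i % 10)) for i in range(600)]

summary = downsample(raw)                               # filtering / summarization
payload = gzip.compress(json.dumps(summary).encode())   # compression before transfer
print(f"{len(raw)} raw samples -> {len(payload)} bytes to transfer")
```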

This post is part two of a three-part series from Exara. In the next, and final, installment we will discuss an approach to evaluating what types of data processing are useful at the edge, beyond the four walls of managed data center and cloud resources, with an eye toward short-term results and longer-term smart grid transformation.

About the author: Eric Kraemer is the CTO and cofounder of Exara, Inc., with more than 15 years’ experience working with high-performance information systems and software development, both in the US and internationally.

Exara is an Intel Partner that has developed the world’s first edge data services platform for the Industrial IoT. The Exara platform is delivered on Intel®-based edge servers that can be deployed in substations, electrical plants, and many other industrial environments. For more information, please visit Exara's website.