Machine Learning in Manufacturing: Collect, Contextualize, and Predict

Some manufacturers may find the journey to predictive analytics with machine learning daunting. The task of collecting, storing, and analyzing huge volumes of disparate data in a consistent, repeatable manner can overshadow the benefits of improved product quality, improved yield, and reduced maintenance costs. But with today’s advances in machine learning algorithms and compute performance, these models are manageable and deliver significant value.

At Intel, we use machine learning to help assess tool health, predict the quality of our wafers, and increase overall yield. We use a standard Industrial Internet of Things (IIoT) framework that separates data from logic. The predictive analytics are based on three primary building blocks:

  1. Connectivity. First, we identify the data available from our existing sensors and the data we can collect by integrating new sensors. Data structures vary across sensor types, so standardizing the message structures by source and type simplifies the data integration. A service-oriented architecture (SOA) provides a stable foundation to minimize the impact of future changes and allows quick, seamless updates to existing environments.
  2. Transforming the data. We simplify integration with standard message structures. We deliver standardized components for visualization and analytics using third-party and open source tools. These data structures, or messages, retain their unique origins, time stamps, and other identifying factors to ensure that we can trace the resulting insights back to the source (see the message sketch after this list).
  3. Building up the hierarchy. We started with one tool to demonstrate capabilities and identify how the machine was behaving. We then added tools of the same type to the framework to understand how they behaved in the context of their counterparts. Data mining allowed us to correlate the statistical patterns and establish relationships across tools, processes, and products. We are now using automated models based on the patterns observed in the initial data.
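
To make the idea of a standardized message concrete, here is a minimal sketch in Python. The field names and example values are assumptions for illustration, not Intel’s actual schema.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class SensorMessage:
    """Hypothetical standardized sensor message; all fields are illustrative."""
    source: str        # originating tool or sensor identifier
    sensor_type: str   # e.g., "pressure", "temperature"
    value: float       # the numerical reading
    units: str         # units of measure for the reading
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Every message keeps its origin and time stamp, so any downstream
# insight can be traced back to the source reading.
msg = SensorMessage(source="etch_tool_07", sensor_type="pressure",
                    value=101.3, units="kPa")
```

Because each message carries its own identifying context, new sensor types can be added without reworking the consumers downstream.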

Many machine learning models use unstructured data, but machine data is typically numerical, making it easier to evaluate. For example, a pressure measurement within an engine pipe is tied to an event, such as powering on the engine, and carries context with it: the time it takes water to travel the length of the pipe, the temperature, and the condition of the motor when the event occurs. Subject matter experts understand the acceptable ranges and control values for the specific processes and tools, and their analysis is crucial to determining when the model should take action.
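
A minimal sketch of such an expert-defined check might look like the following. The sensor names, limit values, and function are hypothetical illustrations, not a real Intel implementation.

```python
# Hypothetical control limits supplied by subject matter experts;
# the sensor names and ranges are illustrative only.
CONTROL_LIMITS = {
    "pressure": (95.0, 110.0),     # kPa, acceptable range for this step
    "temperature": (20.0, 24.0),   # degrees C
}

def out_of_range(sensor_type: str, value: float) -> bool:
    """Return True when a reading falls outside the SME-defined range."""
    low, high = CONTROL_LIMITS[sensor_type]
    return not (low <= value <= high)

print(out_of_range("pressure", 112.4))  # True: exceeds the upper limit
```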

Combining values from additional sources also strengthens the correlation between out-of-range values and other events in the process. The more data we collect, the better our insight into tool and process condition.
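
As a simple illustration of correlating two sources, the sketch below computes a Pearson correlation between two co-timed signals with NumPy. The data are synthetic and the signal names are assumptions.

```python
import numpy as np

# Synthetic, co-timed readings from two sources; in practice these
# would come from the standardized messages described earlier.
pipe_pressure = np.array([101.2, 101.5, 103.8, 107.9, 108.4, 102.1])
motor_temp    = np.array([21.0, 21.1, 22.4, 23.8, 24.0, 21.5])

# Pearson correlation between the signals; a value near 1 suggests the
# out-of-range pressure readings and temperature events move together.
r = np.corrcoef(pipe_pressure, motor_temp)[0, 1]
print(f"correlation: {r:.2f}")
```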

Iterative Process Development

We use an iterative process to refine the data and improve the insights. When we first started, we manually ran the models and observed patterns. The patterns became the foundation for building models to detect tool nonconformance, as well as to predict product yield and quality. Then we moved to third-party applications. Now, we are using machine-learning models. Repeating the process helps filter out unnecessary data, identify misaligned data, and find new data to enrich our understanding.
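
As an illustration of this kind of nonconformance detection, the sketch below trains an off-the-shelf anomaly detector (scikit-learn’s IsolationForest) on synthetic per-run features. This is a generic stand-in under assumed data, not Intel’s actual model.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Synthetic feature matrix: one row per process run, with columns such
# as mean pressure and mean temperature. Values are made up for
# illustration only.
rng = np.random.default_rng(0)
normal_runs = rng.normal(loc=[101.0, 21.5], scale=[1.0, 0.4], size=(200, 2))
drifting_run = np.array([[109.0, 24.5]])  # a run that looks out of family

# Fit on historical in-family runs, then flag runs that deviate.
model = IsolationForest(random_state=0).fit(normal_runs)
print(model.predict(drifting_run))  # [-1] marks the run as nonconforming
```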

With each iteration we discover new relationships, allowing us to further refine our data and models. The discoveries are limitless. We continue to deepen our insights and see even greater benefits. If you want to learn more about our journey, check out our paper, “Increasing Product Quality and Yield Using Machine Learning.”

About Karl Brennan

Karl Brennan is an Enterprise Architect based in Intel Ireland’s Technology Manufacturing Group (TMG) Innovation and Research organisation. Karl is the lead technical developer for Intel’s first IIoT application deployed across Intel’s wafer fabrication manufacturing sites worldwide. This system is deployed across a wide spectrum of cleanroom wafer fabrication equipment and is used to monitor and manage high-precision manufacturing equipment in an increasingly complex manufacturing environment. Prior to this assignment, Karl worked as a lead researcher in Intel’s European research organisation (ILE), where he led the Intel involvement in a number of European Union funded research projects, including KAP, PlantCockpit, IOLanes, and Timbus. Karl’s areas of interest include software-enabled advanced manufacturing research, visual data mining, distributed computing, and software architectures for high-volume environments. Karl is a 20+ year Intel veteran and has previously held a number of software-related roles within the Intel manufacturing environment. Prior to working for Intel, Karl worked for Digital Equipment Corporation (DEC) as a software developer. Karl graduated in 1985 from Trinity College Dublin with a B.Sc. in computer science and completed an M.B.A. at Glasgow University, Scotland, in 1991.