We have teamed up with three industrial partners: Seagate, a major hard-drive manufacturer, in whose factory we will design, develop and implement such a technology; GSK, a leading pharmaceutical manufacturer that has recognised the potential of real-time process analytics; and NVIDIA, the global GPU provider. The goal is to establish production robustness against major disruptions and market volatility that create uncertainty over workforce numbers and supply-chain continuity.
This vision paves the way for responsive manufacturing systems and digitally controlled factories, realised through technology that can seamlessly analyse sensor data and translate it into actionable information. Whilst companies capture large datasets, their ability to process them and react in real time is hindered by the complexity of the algorithms and the scale of the data. Indeed, if anything, the current pandemic has reinforced the need to enhance manufacturing capability to cope with sudden increases in demand, production repurposing, and possibly even entirely autonomous production.
Advances in processing capability now allow data to be analysed in real time at the edge, i.e. on the factory floor. This keeps the data secure and delivers more effective performance by reducing reliance on external communications and remote high-performance processing resources. We propose that this be done methodically and securely, with minimal dependence on external factors, which prompts us to investigate ways of performing real-time analytics in a practical, cost-effective and sustainable manner.
RAPID proposes a two-pronged approach: reducing computational dimensionality through novel 'data sketching' algorithms, and accelerating the resulting computations further through 'transprecision computing' on GPUs. In detailed interactions with Seagate and GSK, both based in the UK, we have identified manufacturing stages where real-time analytics can substantially transform processes and outcomes.
In particular, the proposed technology will be applied to a 'diagnostic analytics' case study involving optical imaging data for a critical metrology stage in disk manufacture. It will also be applied to two 'predictive analytics' examples: learning models to predict the health state of silicon wafers, and improving fault detection, feature extraction and monitoring of chemical products.
The data sketching method dramatically reduces the complexity of computations by randomly sampling a small number of the most informative data and model entries, leading to small-scale computations that can be performed very quickly at a slight cost in precision. Sketching trades precision for speed, and, done optimally, a two-order-of-magnitude speedup is feasible when sampling around 10% of the data.
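As a rough illustration of the idea, the minimal sketch below applies norm-based row sampling to an overdetermined least-squares problem, solving a reweighted subproblem built from roughly 10% of the rows. The sampling scheme, problem sizes and parameters are illustrative assumptions for exposition, not the project's algorithms.

```python
# Illustrative data sketching for least squares (an assumed scheme, not
# the project's method): sample rows with probability proportional to
# their squared norm, then solve the much smaller sketched problem.
import numpy as np

rng = np.random.default_rng(0)
n, d = 100_000, 50                                 # tall, skinny system
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.01 * rng.standard_normal(n)     # noisy observations

# Sampling probabilities: squared row norms, a cheap importance measure.
p = np.einsum('ij,ij->i', A, A)
p /= p.sum()

m = n // 10                                        # keep ~10% of the rows
idx = rng.choice(n, size=m, replace=True, p=p)
w = 1.0 / np.sqrt(m * p[idx])                      # reweight for unbiasedness
A_s, b_s = A[idx] * w[:, None], b[idx] * w

x_full, *_ = np.linalg.lstsq(A, b, rcond=None)
x_sketch, *_ = np.linalg.lstsq(A_s, b_s, rcond=None)
print("relative error of sketched solution:",
      np.linalg.norm(x_sketch - x_full) / np.linalg.norm(x_full))
```

The sketched solve touches only a tenth of the data, so its cost falls accordingly, while the printed relative error quantifies the precision given up in exchange.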
This advantage is compounded by implementing the sketched computations with transprecision computing, which departs from traditional fixed-precision computing to accelerate calculations further when high precision is not required. When computing with noisy data and learned statistical models in factory environments, a controllable reduction in precision is not only prudent for improving performance but also essential for noise robustness.
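The following sketch conveys the transprecision idea under simple assumptions (it is not the project's GPU implementation): the same matrix-vector product is evaluated at progressively lower floating-point precisions, and the rounding error is compared against the error already introduced by noise on the input.

```python
# Illustrative transprecision comparison (assumed setup): when the input
# carries measurement noise, reduced-precision arithmetic may add error
# that sits at or below the noise floor, making full precision wasteful.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
A = rng.standard_normal((n, n))
x = rng.standard_normal(n)
noise = 1e-3 * rng.standard_normal(n)              # sensor-style input noise

ref = A @ x                                        # float64 reference, clean input
noise_err = np.linalg.norm(A @ (x + noise) - ref) / np.linalg.norm(ref)
print(f"error from input noise alone: {noise_err:.2e}")

# Re-run the same product at reduced precision and compare the rounding
# error against the noise floor above.
for dtype in (np.float32, np.float16):
    y = (A.astype(dtype) @ x.astype(dtype)).astype(np.float64)
    err = np.linalg.norm(y - ref) / np.linalg.norm(ref)
    print(f"{np.dtype(dtype).name} rounding error: {err:.2e}")
```

Where the rounding error lands relative to the noise floor indicates how far precision can be dialled down before it, rather than the data, limits accuracy.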
Further Information:
EPSRC EP/V028618/1 (PI Nick Polydorides, Jan 2022-May 2025)