Visualizing IoT Disruptions

By Francisco Dagnino
Nov 08, 2023

Only 6.5% of Installed Sensors Were Usable and Providing Valuable Data

Customer Challenge

A copper mining operation in Chile was struggling to improve its heap leach process – a process involving multiple conveyor belts several kilometers long, massive bucket wheels, kilometers-long piles of crushed material, dangerous chemicals, and much more. After close to 3 years implementing an array of over 1,400 sensors across the line – a 50 million USD investment – the operations team was not seeing any of the benefits expected from the additional data. As a result, the heap leach process had become the bottleneck of the entire mining operation.

Solution

After a series of workshops with most stakeholders in the heap leach process – from bucket wheel operators to data teams to management – it was discovered that much of the data provided by the more than 1,400 sensors was not usable. In fact, only about 200 sensors were emitting data at all, and of those, a meager 92 were providing reliable readings backed by 3 years of historical data that could be analyzed.

This was forcing the data team to rely on bulky and unreadable equipment data logs to identify candidate locations for failure. After this, a team would be sent to a candidate location (often driving a few kilometers) for further inspection.  

A detailed analysis of maintenance logs revealed that in close to 70% of unplanned equipment downtime, the failure was as simple as a blown fuse – easy and quick to repair – yet finding the failure location had been taking 4 hours on average, resulting in millions lost in unplanned downtime every year.

The available historical data, fed from thermometers, vibrometers, accelerometers, pressure gauges, and similar instruments, did not allow for sophisticated algorithms: all of it was stored in aggregate, at a granularity unsuitable for any of the machine learning algorithms we studied. However, by leveraging the Nelson rules – an empirical, rule-based model for statistical process control – it was still possible to work around the data sparsity and provide accurate recommendations to quickly narrow down the search for a failure. A simple and intuitive visual tool was deployed to the ops data team, greatly reducing the time to identify a failure. Estimates put the downtime reduction at 13%, equivalent to approximately 10 million USD per year.
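To make the approach concrete, here is a minimal sketch of two of the eight Nelson rules applied to a series of sensor readings. The function names, thresholds, and sample values are illustrative assumptions, not the project's actual implementation; the real tool applied these checks per sensor against control limits derived from its historical baseline.

```python
from statistics import mean, stdev

def nelson_rule_1(values, center, sigma):
    """Rule 1: flag any point more than 3 standard deviations from the mean."""
    return [i for i, v in enumerate(values) if abs(v - center) > 3 * sigma]

def nelson_rule_2(values, center, run=9):
    """Rule 2: flag points ending a run of `run` or more consecutive
    readings on the same side of the mean (a drift signal)."""
    flagged = []
    streak, prev_side = 0, None
    for i, v in enumerate(values):
        side = None if v == center else v > center  # True = above, False = below
        streak = streak + 1 if side is not None and side == prev_side else (1 if side is not None else 0)
        prev_side = side
        if streak >= run:
            flagged.append(i)
    return flagged

# Hypothetical example: control limits estimated from a historical baseline,
# then applied to fresh readings.
baseline = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.9, 5.1]
center, sigma = mean(baseline), stdev(baseline)
readings = [5.0, 4.9, 9.5, 5.1]  # one obvious spike
spikes = nelson_rule_1(readings, center, sigma)
```

Because the rules only need a mean and standard deviation per sensor, they degrade gracefully on sparse, coarsely aggregated data where model-based methods fail – which is precisely why they fit this project.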

Major Insight

It surprised most parties involved to discover how far the sensor data had drifted from the plan implemented a few years earlier. It became evident that the IT unit, which should have been a key player in the project, was never involved; instead, it was the Innovation business unit that introduced the idea, secured the funding, and supervised implementation. The key missing piece: ownership. It is unclear exactly why, but as the project took shape, each team likely made assumptions about the other. Innovation assumed IT would naturally take over post-implementation, while IT expected a formal handover that never took place.

With so many companies and professionals focusing on Big Data and AI solutions, it can be tempting to assume that unless you have massive volumes of high-quality data, you can't really do much high-ROI analytics. But there is beauty in small data too. Using an old yet proven method for process control – dating back to the 1920s! – it was still possible to obtain highly valuable output from the data. Methods like this are often a practical solution in situations of data scarcity and sparsity.

Results & Value Generated

This initiative was part of a larger, global Digital Transformation program the mining company had initiated about 4 years earlier. By ensuring the relevant stakeholders were highly involved, we were able to greatly reduce the friction between corporate and operational teams – friction that often hinders change in conservative industries with high corporate inertia, like mining.

The tool not only helped the operations teams simplify a costly and frustrating aspect of their jobs; it also opened their minds to the value of intentionally collaborating across functions, beyond daily tasks and shift objectives. And yes, a non-trivial 10 million USD was added to the top line.

Reflection

In large transformational projects of any nature, it is tempting for leadership to communicate the vision by relying solely on carefully crafted messaging and slide decks delivered to the layers immediately below them on the corporate ladder. After all, you have chosen your leaders because you trust how they manage their teams and how they align with your leadership and vision. But in large organizations – this company has close to 100,000 employees across a dozen countries – grand visions can be watered down with shocking speed by the time they reach the most hands-on employees at the very beginning of the value chain. Some time after this project, the company appointed a Chief Transformation Officer with clear objectives, resources, and accountabilities. In less than 2 years, they were at the forefront of the industry in innovation, automation, and analytics.

On the technical side, it quickly became clear that the noisy nature of IoT data would present an important challenge. Through tenacity and creativity we came to the conclusion that this was a classic case of less is more. The available data was simply not good enough for the sophisticated algorithms we had been certain would succeed. Beyond that, we had been forgetting the context in which the project was being developed: black-box solutions had little chance of a successful implementation, even if highly accurate (random forests actually looked quite promising for a couple of weeks before becoming a dead end). At the end of the day, one tacit objective was to gain the trust of highly conservative operators who had done things their own way for decades – and successfully.

Finally, I'm very intrigued by how a project like this could have leveraged the booming AI ecosystem we live in today. Could an anomaly detection solution built on something like ChatGPT have proven accurate? And given the hype, could such a solution gain traction in a highly conservative operation?

