A copper mining operation in Chile was struggling to improve its heap leach process – a process involving multiple conveyor belts several kilometers long, massive bucket wheels, kilometers-long piles of crushed material, dangerous chemicals, and much more. After close to 3 years spent implementing an array of over 1,400 sensors across the line – a 50 million USD investment – the operations team was not seeing any of the benefits expected from the additional data. As a result, the heap leach process had become the bottleneck of the entire mining operation.
After a series of workshops with most stakeholders in the heap leach process – from bucket wheel operators to data teams to management – it was discovered that much of the data provided by the more than 1,400 sensors was not usable. In fact, only about 200 sensors were emitting data at all, and of those, a meager 92 were providing reliable readings backed by 3 years of historical data that could be analyzed.
This forced the data team to rely on bulky, unreadable equipment data logs to identify candidate failure locations. A team would then be dispatched to a candidate location (often driving a few kilometers) for further inspection.
A detailed analysis of maintenance logs revealed that in close to 70% of unplanned equipment downtime, the failure was as simple as a blown fuse – quick and easy to repair – yet finding the failure location had been taking 4 hours on average, resulting in millions lost to unplanned downtime every year.
The available historical data, fed from thermometers, vibrometers, accelerometers, pressure gauges, and similar instruments, did not allow for sophisticated algorithms, since all of it was stored in aggregate form, at a granularity unsuitable for any of the machine learning algorithms we evaluated. However, by leveraging the Nelson Rules (Nelson rules - Wikipedia), an empirical rule-based model for statistical process control, it was still possible to work around the data sparsity and provide accurate recommendations that quickly narrowed down the search for a failure. A simple, intuitive visual tool was deployed to the ops data team, greatly reducing the time needed to identify a failure. Estimates put the downtime reduction at 13%, equivalent to approximately 10 million USD/year.
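To make the approach concrete, here is a minimal sketch of how two of the eight Nelson Rules can be applied to a sensor series. The function names, example readings, and thresholds are illustrative assumptions, not the production tool deployed at the mine; the mean and standard deviation would come from the historical data.

```python
def nelson_rule_1(values, mean, sigma):
    """Rule 1: any point more than 3 standard deviations from the mean."""
    return [i for i, v in enumerate(values) if abs(v - mean) > 3 * sigma]

def nelson_rule_2(values, mean, run=9):
    """Rule 2: `run` (classically 9) consecutive points on one side of the mean."""
    flags, streak, side = [], 0, 0
    for i, v in enumerate(values):
        s = 1 if v > mean else (-1 if v < mean else 0)
        streak = streak + 1 if (s == side and s != 0) else (1 if s != 0 else 0)
        side = s
        if streak == run:
            flags.append(i - run + 1)  # start index of the qualifying run
    return flags

# Illustrative pressure readings: a slow drift above the mean, then a spike
readings = [10.1, 9.9, 10.0, 10.2] + [10.6] * 9 + [14.0]
mean, sigma = 10.0, 0.5  # assumed to come from 3 years of history

print(nelson_rule_1(readings, mean, sigma))  # flags the 14.0 spike at index 13
print(nelson_rule_2(readings, mean))         # flags the drift starting at index 3
```

Rules like these require no training data at all – only a historical mean and spread per sensor – which is precisely what made them viable where machine learning was not.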
It was surprising for most parties involved to discover that the sensor data deviated so far from the plan implemented a few years earlier. It became evident that the IT unit, which should have been a key player in the project, was never involved; it was instead the Innovation business unit that introduced the idea, secured the funding, and supervised the implementation. The key missing piece: ownership. It is unclear exactly why, but as the project took shape, each unit likely made assumptions about the other's role: Innovation assumed IT would naturally take over post-implementation, while IT expected a formal handover that never took place.
With so many companies and professionals focusing on Big Data and AI solutions, it can be tempting to assume that unless you have massive volumes of high-quality data, you can't really do much high-ROI analytics – but there is beauty in small data too. Using an old yet proven family of process-control methods – dating back to the 1920s! – it was still possible to obtain highly valuable output from the data. Methods like this are often a practical solution in situations of data scarcity and sparsity.
This initiative was part of a larger, global Digital Transformation program the mining company had initiated about 4 years earlier. By ensuring the relevant stakeholders were highly involved, we were able to greatly reduce the friction between corporate and operational teams – friction that often hinders change in conservative, high-inertia industries like mining.
The tool not only helped the operations teams simplify a costly and frustrating aspect of their jobs; it also opened their minds to the value of intentionally collaborating across functions, beyond daily tasks and shift objectives. And yes, a non-trivial 10 million USD added to the topline.
In large transformational projects of any nature, it is tempting for leadership to communicate the vision by relying solely on carefully crafted messaging and slide decks delivered to the most immediate layers down the corporate ladder. After all, you have chosen your leaders because you trust how they manage their teams and how they align with your leadership and vision. But in large organizations – this company has close to 100,000 employees across a dozen countries – grandiose visions can be watered down with shocking speed by the time they reach the most hands-on employees at the very beginning of the value chain. Some time after this project, the company appointed a Chief Transformation Officer with clear objectives, resources, and accountabilities. In less than 2 years, they were at the forefront of the industry in innovation, automation, and analytics.
On the technical side, it quickly became clear that the noisy nature of IoT data would present an important challenge. Through tenacity and creativity, we came to the conclusion that this was a classic case of less is more: the available data was simply not good enough for the sophisticated algorithms we had been certain would succeed. Beyond that, we had been forgetting the context in which this project was being developed: black-box solutions had little chance of successful adoption, even if highly accurate (random forests were actually quite promising for a couple of weeks before becoming a dead end). At the end of the day, one tacit objective was to gain the trust of highly conservative operators who had done things their own way for decades – and successfully.
Finally, I'm very intrigued by how a project like this could have leveraged the booming AI ecosystem we live in today. Could anomaly detection solutions built on tools like ChatGPT have proven accurate? And given the hype, could such a solution gain traction in a highly conservative operation?