Breaking Down Big Data
This article was originally published to the Australian Mining website.
Mining is an industry that straddles the technological divide.
Whilst it is leading the way in remote operations and automated systems that allow a miner to operate a mining truck from thousands of kilometres away, at the other end of the scale enormous processing and metallurgical operations are monitored, controlled, and planned using Excel sheets, even though mountains of precise technical data are already being collected.
With the rise of Big Data and the ability to monitor – often in real time – flow data from tanks and pipes, as well as metal and acid content, and to compare it with historical data, engineers are often overwhelmed with information.
Nearly every aspect of the mining industry, from minute processes through to massive haul truck payloads, warehousing, and maintenance activities, is now measured, tracked, and stored, and these machines and data sets can now be compared to create a predictive picture of future production in a way the industry never could before.
“We see a significant number of mines that have data locked away in individual systems but now want to federate that data together, instigate new processes, involving their people in new ways to achieve better outcomes. Mining generates Big Data because the number of sensors are growing rapidly and systems involved are becoming more intelligent, so the challenge ahead is to federate that data,” Cisco Systems engineer Michael Boland said.
Rio Tinto has embraced this innovation path, opening its ‘Big Data’ Analytics Excellence Centre early last year to help it deal with these reams of data from disparate sources.
Put simply, the humble – and easily amended, with no tracking oversight – Excel sheet will no longer cut it in the current environment.
This need for greater data control, oversight, and comprehension is compounded by the recent changes to the ASX governance rules. Publicly listed companies will now need to disclose their exposure to economic, environmental, and social sustainability risks for the first time. This means miners need to present accurate, relevant corporate data – such as their operating data and compliance frameworks – to a level that was previously not required.
With this growing need to use the full capabilities of Big Data analytics to comply with ASX rules and lift efficiency – along with clearer data and the capability for preventative maintenance – Metallurgical Systems has developed a program designed to tick these boxes, one that has already been road-tested at a number of copper and polymetallic operations globally.
The program, Metallurgical Intelligence, is whole-of-plant management software that uses thousands of data points to provide clean, accurate data, combined with automated, intuitive reporting that integrates with existing systems, software, and processes, and can be tailored to individual sites.
Metallurgical Systems managing director John Vagenas explained that the program was developed to fill a gap in the industry, as many mines were missing an opportunity to evolve their operations through Big Data analytics.
The need for such a system is becoming even more pressing as senior engineers approach retirement age and take with them not only their skills but also their knowledge of plant operations and of what are often proprietary data systems, leaving a large knowledge gap.
Metallurgical Systems began life as an offshoot of Elemental Engineering, a process simulation and process development company focused on mineral and metallurgical processing. Elemental is already well known for its work on OZ Minerals’ hydromet demonstration processing plant. It parlayed this knowledge into spinning out the new company, built around its plant information system, Metallurgical Intelligence.
Using Tableau, Metallurgical Systems has allowed for data integration and drill-down capabilities for engineers, operators, managers, and stakeholders in a user-friendly environment, providing a total overview of every aspect of plant operations without the need for lengthy training programs or a background in IT.
“Once the system has enough information gathered from all the monitoring devices throughout the process, it can run a dynamic simulation of the entire process system down to individual tank level, building it from each node – and keep in mind that a plant may have 2000 to 3000 nodes,” Vagenas told Australian Mining. “This is a system that can examine and monitor information minute by minute, and be used to conduct detailed investigations and resolve issues.”
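As a toy illustration of that node-by-node idea (emphatically not the actual simulator), each tank node can be treated as a simple mass balance stepped minute by minute; all tank names and figures below are invented:

```python
# A deliberately simplified, hypothetical tank mass balance: each node's
# inventory is advanced one minute at a time from its inflow and outflow.

def step_tank(inventory, inflow, outflow, dt_min=1.0):
    """Advance one tank's inventory (m3) by one time step of dt_min minutes."""
    return inventory + (inflow - outflow) * dt_min

# Two tanks in series: tank A feeds tank B.
inv_a, inv_b = 50.0, 30.0    # starting inventories, m3
feed, draw = 2.0, 1.5        # m3/min into A, m3/min out of B
transfer = 1.8               # m3/min from A to B

for _ in range(10):          # simulate ten one-minute steps
    inv_a = step_tank(inv_a, feed, transfer)
    inv_b = step_tank(inv_b, transfer, draw)

print(round(inv_a, 1), round(inv_b, 1))  # 52.0 33.0
```

A real plant chains thousands of such nodes, with chemistry as well as volume tracked at each one.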
The program can also combine this information with data gathered from historical sources and the lab to calculate plant chemistry and throughput, and combine this with data collected from the mining process, as well as power generation and distribution data, to give a never-before-seen level of interconnectedness and oversight over an operation.
“This program can query any places that data is being stored, gather it together, filter it for quality and then organise it in a common structure where you can use it effectively,” Vagenas said.
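The gather-and-filter step Vagenas describes can be sketched in a few lines of Python. This is a hypothetical illustration only – the source names, fields, and quality flags below are invented, not part of Metallurgical Intelligence:

```python
from dataclasses import dataclass

# Hypothetical sketch: federate readings from separate plant data stores
# into one common structure, dropping poor-quality records along the way.

@dataclass
class Reading:
    source: str
    tag: str       # instrument tag, e.g. "TK-101.level" (invented)
    value: float
    quality: str   # sensor quality flag: "good" or "bad"

def federate(*sources):
    """Merge readings from all stores, keeping only good-quality data."""
    merged = []
    for name, rows in sources:
        for tag, value, quality in rows:
            if quality == "good":          # filter for quality
                merged.append(Reading(name, tag, value, quality))
    return merged

historian = ("historian", [("TK-101.level", 4.2, "good"),
                           ("TK-101.acid", 18.0, "bad")])
lims = ("lab", [("TK-101.assay_cu", 2.3, "good")])

clean = federate(historian, lims)
print([r.tag for r in clean])  # the bad-quality acid reading is dropped
```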
The ability to drill right down to an almost ridiculously granular level makes the program a standout.
“This program can break down how different parts of the plant are performing, across any given shift or across a period of time, and how inventory is changing through the site,” he said.
It also performs rapid calculations.
“It validates what’s in the refinery at any given time, down to the equivalent item level, and what’s in each tank,” Vagenas said. “It helps you understand what’s in your plant, what’s changing, and how it is changing.”
“You can get the details on how much acid is being consumed by each element, how much material is leaching in each tank, and the tank profiles.”
With this understanding, greater efficiencies in ore blending, use of consumables, and power usage can be gained. This data can then easily be shared amongst the company.
Vagenas gave an example of how the program delineates information, drawing on one client’s experience of collecting and presenting data in a meaningful and accessible way.
This major miner powered its plant using a number of different electricity suppliers; it used the program to work out the percentage each supplier provided per hour, and at what cost, then used this information to renegotiate its contracts.
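The arithmetic behind that supplier analysis is simple to sketch; the supplier names, energy figures, and tariffs below are made up purely for illustration:

```python
# Illustrative only: given hourly metered energy per supplier and each
# supplier's tariff, work out the share of supply and hourly cost of each.

hourly_kwh = {"SupplierA": 6_000.0, "SupplierB": 2_500.0, "SupplierC": 1_500.0}
tariff_per_kwh = {"SupplierA": 0.11, "SupplierB": 0.14, "SupplierC": 0.19}

total = sum(hourly_kwh.values())
share = {s: kwh / total * 100 for s, kwh in hourly_kwh.items()}       # % of supply
cost = {s: kwh * tariff_per_kwh[s] for s, kwh in hourly_kwh.items()}  # $ per hour

print(round(share["SupplierA"], 1))  # 60.0
print(round(cost["SupplierC"], 2))   # 285.0
```

Broken down hour by hour, figures like these give a miner concrete leverage when contracts come up for renegotiation.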
The program also allows for operating prediction, as it can overlay information which can then be used to compare relationships between aspects such as throughput, plant chemistry, acid usage, and recovery over certain periods of time or different shifts, and then use that data to predict future performance.
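A minimal sketch of such a trend comparison, using invented shift data, might fit a straight line to historical throughput and acid usage and extrapolate to a planned throughput – the real product's models would be far richer:

```python
# Hypothetical trend fit: ordinary least squares over per-shift data,
# then a prediction at a planned operating point. All numbers invented.

def linear_fit(xs, ys):
    """Ordinary least-squares fit of y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

throughput = [100, 120, 140, 160]   # t/h, per shift
acid_usage = [210, 250, 290, 330]   # kg/h, per shift

a, b = linear_fit(throughput, acid_usage)
predicted = a + b * 180             # expected acid usage at 180 t/h
print(predicted)                    # 370.0
```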
It can also be used for maintenance purposes. Because it allows a drill-down to individual instruments and sensors, the program can be used to see which sensors are gathering data and where they reside in the plant; any node that is not reporting can then be investigated to find out why.
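A hypothetical version of that sensor health check might simply flag instruments whose latest reading is older than some threshold; the tags, timestamps, and threshold below are all illustrative:

```python
from datetime import datetime, timedelta

# Invented example: flag any instrument whose last reading is stale,
# so that node can be singled out for investigation.

now = datetime(2015, 6, 1, 12, 0)
last_reading = {
    "TK-101.level": now - timedelta(minutes=2),
    "TK-102.level": now - timedelta(hours=5),   # stale: sensor may be down
    "PP-07.flow": now - timedelta(minutes=1),
}

STALE = timedelta(minutes=15)
stale_sensors = sorted(tag for tag, ts in last_reading.items()
                       if now - ts > STALE)
print(stale_sensors)  # ['TK-102.level']
```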
The system is also fully transparent, Vagenas said, as it uses individual log-ins and tracking to show what changes were made, and who made them.
It brings companies into line with the new ASX codes, makes their data easily auditable by external parties, and gives them full access to their own data to make better, more incisive business decisions.
Vagenas demonstrated its ease of use, highlighting its simple drag-and-drop system and stating that using the Tableau interface for reporting makes the process far more intuitive, helping cut tasks that previously took hours down to minutes.
He added that Metallurgical Systems is also adding new user interfaces to the program in November, as “we think we can make this even better”.
“This program is breaking down Big Data, and letting engineers get back to their job of analysing information and actually running the plant.”