This is the second of two blog posts by Stephen Gay exploring mineralogical modelling (Part 1 was published 20th March 2014). Part 2 develops the theme by comparing the value of rule-of-thumb and probability-based particle modelling principles.
Mineralogical analysis is often approached from a pragmatic viewpoint (referred to here as the ‘rule of thumb’ approach); however, the power of mineralogical analysis is greatly increased when it is linked with probabilistic models. The stereology problem (Figure 2) is an example of a probability problem: there is some probability that a linear intercept will appear liberated, barren or composite. The subbranch of mathematics dealing with such problems is called geometric probability. In this case the probability that a section will appear composite is much larger because the grain size has decreased.
There is a vast amount of research in the subject area of geometric probability and stereology (Gay, 1994; Keith, 2000; Latti, 2006). Keith extended the stereological problem to deal with multimineral particles. Latti validated the various methods using serial sectioning, which means taking a series of sections at very small distances apart so that the composition of the actual particles can be resolved.
The Probabilistic Framework for simulating mineral processing plants
Mineral processing also has a probabilistic framework. For example, if a particle enters a separation unit (such as flotation) then there is a probability that the particle will go to the corresponding concentrate stream or tailings stream. In a processing context the word ‘probability’ can be used interchangeably with the word ‘proportion’: ‘probability’ is the probabilistic term, and ‘proportion’ is the final outcome. For example, if the probability a particle will go to the concentrate is 90%, then we expect 90% of such particles (the proportion) to indeed go to the concentrate. In mineral processing we are generally aware of ‘proportions’, but mineral processing is often not considered from a probabilistic viewpoint. Figure 4 shows a partition curve, which is often used for modelling separation by density (although partition curves are seldom used in base metals).
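The probability–proportion link can be demonstrated in a few lines: give each particle a fixed probability of reporting to the concentrate and count what actually happens. The particle count and probability here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(7)

# Each of 100,000 particles independently has a 90% chance of
# reporting to the concentrate (one Bernoulli trial per particle).
p = 0.9
to_concentrate = rng.random(100_000) < p
proportion = to_concentrate.mean()   # observed proportion, close to 0.9
```

The observed proportion converges on the underlying probability as the number of particles grows, which is why the two words can be used interchangeably in plant-scale work.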
The partition curve can be determined using the mineralogical distribution of the feed and either the concentrate or the tails. But suppose we had the mineralogical composition distribution of the feed only, with no mineralogical information for the concentrate and tails, yet we did have assays for various minerals in the concentrate and tailings. Although this is not enough information to determine the partition curve directly, we can still estimate it using a probabilistic method based on information theory. The approach is to start with a uniform distribution and then adjust it so that the partition curve is consistent with the known information (feed mineralogy and product assays); the more information we can include, the more accurate the estimate of the partition curve (Gay and Vianna, 2002). Thus there is a compromise: perform detailed mineralogical analysis throughout the whole plant, or make inferences wherever we can. These inferences are made using mathematics, so mathematics and mineralogical analysis work together to provide the best understanding that can be developed in a practical manner.
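The estimation step can be sketched as a small maximum-entropy calculation. The feed classes, grades and concentrate assay below are invented, and SciPy's general-purpose optimiser stands in for whatever information-theoretic solver is used in practice; starting from the uniform (maximally uncertain) partition and constraining it to reproduce the known assay is the essence of the approach.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical feed: 4 particle classes with mass fractions and mineral grades.
feed_mass = np.array([0.4, 0.3, 0.2, 0.1])   # mass fraction of each class
grade = np.array([0.05, 0.30, 0.70, 0.95])   # mineral fraction in each class
conc_mineral = 0.18                          # observed mineral reporting to concentrate (per unit feed)

def neg_entropy(p):
    # Mass-weighted sum of binary entropies (maximise entropy = minimise this).
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return np.sum(feed_mass * (p * np.log(p) + (1 - p) * np.log(1 - p)))

# Constraint: the partition probabilities must reproduce the observed assay.
cons = {"type": "eq",
        "fun": lambda p: np.sum(p * feed_mass * grade) - conc_mineral}

res = minimize(neg_entropy, x0=np.full(4, 0.5),
               bounds=[(0.0, 1.0)] * 4, constraints=cons)
partition = res.x   # estimated probability each class reports to concentrate
```

The maximum-entropy solution departs from the uniform starting point only as far as the data demand, so the estimated partition rises smoothly with grade.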
The author (Stephen Gay) has spent many years on the problem of inferring mineralogical information. The inference method can be used not only on a single plant audit but over several plant audits (over time). The method turns ore variability to advantage (rather than the status quo, in which ore variability makes estimation difficult). For example, if there are two copper-sulphide species (say chalcopyrite and chalcocite) it is very difficult to use a probability method on a single set of audit data to identify how the chalcocite might behave compared to the chalcopyrite.
However, if one performs a series of audits in which chalcopyrite is sometimes higher grade than chalcocite and vice versa, the different properties of the minerals can be identified: if, for example, chalcocite is more floatable than chalcopyrite, this is easily detected through improved recoveries. This method of linking mineralogical analysis, probability methods and plant audits is entirely new (patent pending).
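A toy version of the multi-audit idea: when the chalcopyrite/chalcocite balance varies between audits, the per-mineral recoveries can be separated by regression. All numbers are invented, and ordinary least squares stands in here for the author's probabilistic method.

```python
import numpy as np

# Copper carried by each mineral in the feed for three hypothetical audits (t/h)
cu_cpy  = np.array([10.0, 4.0, 7.0])    # chalcopyrite-hosted copper
cu_cc   = np.array([2.0, 8.0, 5.0])     # chalcocite-hosted copper
cu_conc = np.array([10.4, 11.0, 10.7])  # total copper reporting to concentrate

# Assume each mineral has a fixed recovery across the audits:
#   cu_conc = r_cpy*cu_cpy + r_cc*cu_cc,  solved by least squares.
A = np.column_stack([cu_cpy, cu_cc])
(r_cpy, r_cc), *_ = np.linalg.lstsq(A, cu_conc, rcond=None)
```

A single audit gives one equation in two unknown recoveries; the variability across audits is precisely what makes the system solvable, which is the sense in which ore variability is turned to advantage.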
For example, one common criticism of mineralogical analysis is that it is expensive. But, as explained, the cost of mineralogical analysis can be greatly reduced by using mass balance methods and probabilistic methods. What this means in practice is that we only need to perform mineralogical analysis on a few key (or major) streams. Using mass balance and probabilistic techniques we can infer the mineralogical data at ALL other streams. This means we can then run a simulation of the plant to determine optimum methods of plant operation.
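The simplest case of such inference is a two-product mass balance: measure the mineralogy of two streams around a unit and the third follows with no further analysis. The flows below are invented.

```python
import numpy as np

# Two-product unit: feed = concentrate + tailings, mineral by mineral.
# Mineral flows in t/h for [chalcopyrite, pyrite, gangue] -- invented numbers.
feed = np.array([12.0, 8.0, 80.0])   # measured
conc = np.array([10.8, 2.4, 4.0])    # measured
tail = feed - conc                   # inferred: ~[1.2, 5.6, 76.0]
```

Real plant balances also reconcile measurement error across many streams, but the principle is the same: every conservation law is one less stream that needs to be measured directly.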
Note: The probabilistic approach is based on information theory, which has origins in Boltzmann's theories of atoms and in Shannon's foundational work (Shannon and Weaver, 1949), originally called a theory of communication. Information theory was later popularised by Jaynes and is now commonly used in many areas of science, but it is only in its infancy in mineral processing.
The particle based model for simulation
If we consider the problem of multimineral particles, even constructing a partition curve appears daunting, because each mineral adds a new dimension to the graph. Although it appears difficult, it isn't. We simply model our ore as consisting of particle types, with each particle type having a multimineral composition, and assign to each particle type a probability of separation at each unit. We can calculate these probabilities using information theory. This approach was first developed by Keith (2000) and later applied by Gay (2004) to modelling liberation by comminution.
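A minimal sketch of such a particle-type model: each type carries a mass, a multimineral composition and a separation probability, and the expected product streams follow directly. All compositions and probabilities below are invented.

```python
import numpy as np

# Particle types: mass (t/h) and multimineral composition
# (chalcopyrite, pyrite, gangue mass fractions per type) -- invented numbers.
mass = np.array([50.0, 30.0, 20.0])
comp = np.array([[0.90, 0.05, 0.05],
                 [0.40, 0.30, 0.30],
                 [0.02, 0.08, 0.90]])
p_conc = np.array([0.95, 0.60, 0.05])   # probability each type reports to concentrate

# Expected mineral flows: each particle type splits between the products.
conc = (mass * p_conc)[:, None] * comp
tail = (mass * (1 - p_conc))[:, None] * comp
conc_assay = conc.sum(axis=0) / conc.sum()   # concentrate mineral fractions
```

No multidimensional partition surface is ever drawn: the per-type probabilities carry all the separation information, however many minerals are present.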
So what are the steps for using mineralogical data to improve plant performance?
This advanced methodology is entirely different to ‘rule of thumb’. Rule of thumb is of value for identifying gross changes that are required; the methodology proposed here is important for incremental improvement. Remember, a 3% financial benefit per year for a plant with revenue of a billion dollars a year is $30 million! This can be enough to turn a marginal operation into a viable one.
The life of Ludwig Boltzmann http://en.wikipedia.org/wiki/Ludwig_Boltzmann.
Gay S.L. 2004 A liberation model for comminution based on probability theory. Minerals Engineering, Vol. 17, No. 4: pp. 525-534.
Gay S.L. 1994 Liberation Modelling Using Particle Sections. PhD thesis. Julius Kruttschnitt Mineral Research Centre, The University of Queensland.
Gay S.L. 2014 MMPlantMonitor. A new software system for monitoring processing plants http://circlepad.com/MathsMet/MMPlantMonitor
Gay SL & Vianna S 2002 Mass Balancing – Considerations for reconciling mineralogical data. AusIMM Value Tracking Symposium, Brisbane pp. 131-140.
Jaynes E.T. 1995 Probability Theory: The Logic of Science. Available directly from the internet: http://shawnslayton.com/open/Probability%20book/book.pdf
Keith J.M. 2000 A Stereological Correction of Multimineral Particles. PhD thesis. Julius Kruttschnitt Mineral Research Centre, The University of Queensland.
Latti D. 2006 The Textural Effects of Multiphase Mineral Systems in Liberation Measurement. PhD thesis. Julius Kruttschnitt Mineral Research Centre, The University of Queensland.
Shannon C.E. and Weaver W. 1949 The Mathematical Theory of Communication. Univ of Illinois Press. ISBN 0-252-72548-4.
A full list of JKMRC PhDs is available at: https://www.jkmrc.uq.edu.au/Publications/PostgraduateTheses.aspx