Operational mineralogy: an overview of key practices in
sample analysis, sample preparation and statistics.
Christopher Brough, James Strongman, John Fletcher, Mariola Zając, Rachel Garside, Corinne Garner, Libby
Rose
Petrolab Ltd
Abstract. Automated mineralogy has been available to
mining companies since the early 2000s and was driven
by the development of QEMSCAN® and Mineral
Liberation Analyser (MLA) systems. Numerous case
studies have shown that these tools have the ability to
provide valuable mineralogical metrics relevant to the
characterisation and evaluation of an orebody. The
current challenge has been applying these tools in operational contexts, with their often long turnarounds, complex orebodies and single-point datasets of dynamic systems. Operational mineralogy is an exciting new branch of automated mineralogy that
has been made possible by the development of
ruggedized and versatile SEM systems, rapid sample
preparation techniques and data visualisation tools.
These advances enable mineralogy to be at the heart of
short term operational decision making. This paper
reviews some of these advances and current best
practice, particularly in sample analysis options, sample
preparation and particle statistics that are enabling rapid
turnarounds. All work has been developed on the ZEISS
Mineralogic platform, and the final presentation includes
case studies of the implementation of operational
mineralogy at different scales with examples from
monthly auditing through to daily on-site support.
1 Introduction
Mineral processing operations are undergoing a shift in
expectations of what can be achieved with process
monitoring data, with much being drawn from the
chemical, pharmaceutical and manufacturing industries.
The key difference with mining and mineral processing, and one reason for the lag in uptake, is that the ore feed material has significantly more variability than the inputs to those industries. Therefore, alongside advances in
automation and process control, there is a requirement to
have a clear understanding of the impact of feed
variability on the process.
The understanding of feed variability has typically
been undertaken through process mineralogy
assessments with a focus on representative sampling,
understanding geometallurgical domains and
characterizing likely process performance (e.g. Lotter et al. 2011; Baum 2014), with consequent improvements in project cash flow (Lotter et al. 2018a). As valuable as these assessments are, they usually represent single-point datasets of dynamic systems, and the current
opportunity is to use mineralogy assessments in a
continuous monitoring framework (Graham 2017,
Kalichini et al. 2017). Understanding the routine impact of
feed variability requires linking mineralogy, geochemical
assay data, and process performance together with a
focus on trending datasets. This linkage is called
“Operational Mineralogy” and ideally these trending
datasets operate on a routine basis from monthly or
weekly audits (on off-site or near-site automated
systems) to preferably a daily or shift-by-shift resolution
(through on-site automated systems). The
implementation of these operational mineralogy systems is relatively new and still rare. To date, published examples include Cerro Verde (Fennel et al. 2005) and the Kansanshi mine (Kalichini et al. 2017). This
paper briefly reviews the advances that have made
operational mineralogy possible together with key
practices that are necessary for its implementation.
2 Operational Mineralogy
Operational mineralogy has developed as a branch of
process mineralogy, which itself has been an integral part
of mineral processing for the last half-century. Its
development has been made possible by three specific
advances, namely:
i. Technological development of ruggedized, fit-for-purpose and versatile site-based automated SEMs.
ii. Developments in rapid sample preparation of
mineralogy blocks.
iii. Development of data visualisation tools to
summarise large datasets into relevant metrics.
These changes allow routine mineralogy to be at the
heart of operational decision making, and as such
operational mineralogy marks a shift from traditional project-focused mineralogy to dynamic mineralogy data
deeply integrated with daily, weekly or monthly decision
making. The first two of these advances are discussed
below, along with the key consideration of particle
statistics.
2.1 Ruggedized Versatile Automated SEMs
The first major step to automation in the process
mineralogy field was the development of quantitative
automated mineralogical systems, such as QEMSCAN®
(originally QEM*SEM; Miller et al. 1982) and MLA (Gu
and Napier-Munn 1997). These have driven
improvements in ore characterisation, deportment
studies and circuit surveys, becoming a critical part of
project development and assessment. In recent years
several new providers have continued the development
of automated mineralogy systems including TIMA, INCA
Mineral and ZEISS Mineralogic. Altogether these
systems have made significant improvements in both
throughput and repeatability. Critically, all these
automated systems provide quantitative measurements
of grain and particle size distributions, liberation and association data on a particle-by-particle basis. These
advances have allowed automated mineralogy to
become an integral component of process mineralogy.
At the core of all these automated mineralogy systems
is a scanning electron microscope (SEM), a highly sensitive analytical instrument very much at home in
a clean laboratory environment. Therefore, in the past, simply transporting, installing and then servicing these instruments on a mine site was not practical other than at a few very large operations with dedicated or
centralised labs. The development of mobile ruggedized
systems for military applications, then the oil and gas
industry, along with continued innovation of tabletop systems, means that the logistics of moving an instrument and installing it on site have been greatly simplified.
MinSCAN, developed by Zeiss and using Mineralogic,
is one such on-site ruggedized system and it benefits
from recent advances in functionality and versatility
(Graham et al. 2015; Graham 2017). This system offers
five analysis modes all of which are user configurable
(Figure 1 and see Graham et al. 2015). Choosing between the different analysis modes depends on the
mineralogical complexity of the sample, what level of
information needs to be obtained and how much time is
available. For mine-site applications the routines must
be consistent and robust.
Figure 1. The five analysis modes on Zeiss Mineralogic.
Pixel mapping mode is the longest but most common
analysis mode and produces the full suite of liberation
and grainsize data. Textural information and throughput
are determined by the pixel spacing, which is chiefly
dictated by the mineralogical complexity of the sample
(Figure 2). Line scan can provide rapid bulk mineralogy
datasets and will incorporate much larger particle
populations than pixel mapping. However, grainsize and liberation datasets are not available, and on finer fractions the chance of acquiring poor edge spectra is greatly increased. BSE only is extremely fast and works
very well for samples where the thresholds of target
phases are distinct from one another and the gangue. For
BSE only analysis all deportment information will be
based on stoichiometric assumptions. Spot Centroid
“MLA” mode and Feature Scan are also very fast and
reduce edge and boundary effects, although as with BSE
only they are heavily dependent on the distinct
thresholding of the phases.
Figure 2. Trade-off of speed against textural clarity at the same magnification but increasing pixel spacing. (a) 2 µm (~28 min) and (b) 4 µm (~7 min) pixel spacing retain the textures within the sample. (c) 8 µm (~3 min) pixel spacing retains the magnetite texture around the gold (right-hand side) but loses the texture elsewhere. (d) 16 µm (46 s) pixel spacing is not suitable for analysis of this sample: no textures are preserved and the gold is not detected. Phases shown include gold, electrum, magnetite, haematite (including Cr-rich haematite), ilmenite and biotite; scale bars 60 µm.
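Because every mapped pixel requires an EDS acquisition, mapping time scales roughly with the inverse square of the pixel spacing plus a fixed overhead for imaging and stage moves. The short Python sketch below illustrates this reasoning, calibrated to the indicative ~28 min at 2 µm timing in Figure 2; the overhead term is an illustrative assumption, not an instrument specification:

```python
def mapping_time_minutes(spacing_um, base_spacing_um=2.0,
                         base_time_min=28.0, overhead_min=0.5):
    """Rough pixel-mapping time estimate for one sample.

    Assumes acquisition time scales with pixel count (i.e. with
    1/spacing^2), calibrated to ~28 min at 2 um spacing (Figure 2).
    overhead_min is an assumed constant for BSE imaging and stage
    moves, which dominates at very coarse spacings.
    """
    scale = (base_spacing_um / spacing_um) ** 2
    return (base_time_min - overhead_min) * scale + overhead_min

for spacing in (2, 4, 8, 16):
    print(f"{spacing:>2} um -> ~{mapping_time_minutes(spacing):.1f} min")
# ~28.0, ~7.4, ~2.2, ~0.9 min: the same order as the measured
# timings in Figure 2 (28 min, 7 min, 3 min, 46 s)
```

The choice of pixel spacing is then a simple trade-off between this time budget and the finest texture that must be resolved.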
Along with the analysis modes, the image processing toolkits of modern automated systems, and particularly of the Zeiss Mineralogic system, are powerful, allowing versatile options for the segmentation and targeted analysis of particular phases. One such image processing routine is the bright phase search plus context (BPSC), which allows the analysis of bright phases (e.g. ore
phases), along with a dilation zone into the surrounding
particle (Figure 3 and see Brough et al. 2017). The image
processing recipe works by defining two thresholds, one
for the resin which is excluded and one for the target
phase which is included. A dilation field of pre-determined width is then applied to the target phase, spreading the analysis out in a border from the target phase into the
surrounding host. If the host is a mineral phase then the
border region is included in the analysis, but if the host is
resin (i.e. the target phase is liberated), then there is no
analysis of the border region. From this analysis the
grain-size distribution and partial perimeter liberation of
the target phase can be calculated, without the necessity
to analyse the whole of the hosting particle. It is
particularly useful in coarser size fractions where the
target grain may be fine-grained even if the host particle
is coarse-grained, and allows the analysis to keep a fine
step-size (i.e. fine resolution) to correctly resolve the ore
textures. Furthermore, this BPSC analysis, coupled with a linescan analysis, allows for the rapid assessment of bulk modal, deportment and basic liberation data, increasing the throughput capacity of the automated analysis.
Figure 3. Image of a targeted BPSC routine (cf. Brough et al. 2017).
Light green is the targeted phase. Dark green is the dilation zone of
interest into the mineral host, and the greys are the remaining, unanalysed host phases. Black is the resin.
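The BPSC logic can be illustrated in a few lines of image processing. The sketch below is a simplified stand-in for the Mineralogic recipe rather than its actual implementation; the grey levels, thresholds and border width are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage

def bpsc_mask(bse, resin_max, target_min, border_px=5):
    """Simplified bright phase search plus context (BPSC) selection.

    bse        : 2-D array of BSE grey levels
    resin_max  : grey levels <= this are resin (excluded)
    target_min : grey levels >= this are the bright target phase
    border_px  : width of the dilation zone into the host particle
    Returns a boolean mask of the pixels to analyse.
    """
    resin = bse <= resin_max
    target = bse >= target_min
    # Dilate the target phase outwards by border_px pixels...
    dilated = ndimage.binary_dilation(target, iterations=border_px)
    # ...but keep only border pixels that fall on mineral host; where
    # the target grain is liberated (bordered by resin), no context
    # is analysed.
    context = dilated & ~target & ~resin
    return target | context

# Synthetic example: resin (~20), host particle (~120), bright grain (~230)
bse = np.full((100, 100), 20, dtype=np.uint8)
bse[30:70, 30:70] = 120
bse[45:55, 45:55] = 230
mask = bpsc_mask(bse, resin_max=40, target_min=200)
print(mask.sum(), "of", bse.size, "pixels selected for analysis")
```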
2.2 Sample preparation
Any operational mineralogy program requires quality
samples on a rapid turnaround. Bad sampling (i.e. the
collection of “specimens” rather than “samples”) equals
bad data, and no amount of analysis can remove bias
from a sample (Lotter et al. 2018b, Gy 1979). The
sampling program should follow the assaying procedures
for the targeted stream and ideally the mineralogy sample
should be derived from the assay reject. This allows the elemental assays derived from the mineralogical data to be compared against the chemical assays. There needs to be a clear understanding
of the potential error limits of the sampling method and
analysis procedures. With operational mineralogy the
focus is on trends, but equally it would be impossible to
make an accurate mineralogical balance on a sample that
is not suitable for metallurgical accounting.
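This balance reduces to simple arithmetic: the modal mineralogy is multiplied through the stoichiometric element content of each mineral and the back-calculated grade is compared with the chemical assay. A minimal sketch with illustrative values (the mineral list, modal proportions and assay figure are assumptions, not data from any particular operation):

```python
# Back-calculate a Cu assay from modal mineralogy and compare it with
# the chemical assay of the same stream. Element contents are weight
# fractions from mineral stoichiometry (chalcopyrite, CuFeS2, is ~34.6% Cu).
modal_wt_pct = {"chalcopyrite": 2.1, "pyrite": 4.3, "quartz": 93.6}
cu_wt_fraction = {"chalcopyrite": 0.346, "pyrite": 0.0, "quartz": 0.0}

cu_from_mineralogy = sum(modal_wt_pct[m] * cu_wt_fraction[m] for m in modal_wt_pct)
cu_from_assay = 0.70  # wt% Cu, chemical assay (illustrative)

rel_diff = 100 * (cu_from_mineralogy - cu_from_assay) / cu_from_assay
print(f"Cu back-calculated: {cu_from_mineralogy:.2f} wt%")
print(f"Cu assayed:         {cu_from_assay:.2f} wt%")
print(f"Relative difference: {rel_diff:+.1f}%")
# A persistent bias in this comparison flags sampling or preparation
# problems (e.g. density settling) rather than genuine feed variability.
```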
Once collected, the samples must be prepared into polished blocks on a rapid turnaround. Whilst sample preparation techniques are usually subject to confidential in-house protocols, there are several key principles that can be summarised. Particulate matter is
generally denser than the resin it is being set in and
subject to internal density differences as well (i.e.
between ore and gangue phases). As such, the differential density settling of particles, and therefore the incorrect estimation of modal mineralogy, is a key potential problem
(e.g. Sutherland and Gottlieb 1991, Kwitko-Ribeiro 2011).
Sample preparation must therefore allow for the mixing of
a suitably viscous resin and particulate matter in a thick
paste, with enough liquid to coat and fix the particles, but
not so much that particles easily slide through the resin
to the polishing surface. Included in the particulate matter
will be a filler (e.g. graphite) to aid particle separation.
Bubbles may form during preparation either from mixing,
or from volatile release from the resin blend. These can
be problematic due to the collection of fine dust within
exposed bubbles on the polished surface and the
potential for charging during analysis. As such
preparation is undertaken under vacuum to remove
excessive bubbles. Longitudinal sections (Figure 4) are
an excellent tool for determining optimal preparation requirements and can be used in their own right if density settling is considered unavoidable (e.g. Coetzee et al.
2011).
Figure 4. Longitudinal sections to study the effects of density
settling under different conditions. From trial runs like this it is possible to determine optimal ratios of resin to particulate matter.
2.3 Particle statistics
All analysis techniques involve the collection of data from
a subset of the total particle population. The number of
particles to be analysed from each size fraction can be
calculated using some of the general theory applied to the
analysis of point-counting in petrographic studies but
taking account of some more recent statistical studies.
The published literature on the techniques of point-
counting thin sections or polished blocks evolved from
work in the 19th and early 20th centuries, with some papers showing a particular focus on ore minerals and ore
concentrates (e.g. Chayes 1946, 1949). General rules for
the estimation of error in point counting were provided in
Barringer (1953) and were further elucidated in 1965 with
the seminal work of van der Plas and Tobi. Since then, their estimator has been the benchmark for measuring point-count error and has been widely used by
earth scientists. However, recent statistical studies
(Howarth 1998 and references therein) have concluded
that some of the assumptions built into the 1965 paper
were inaccurate, particularly as the required confidence
interval around that estimation of error increases.
As such, using the Howarth (1998) paper as a
blueprint and, in particular, equation (11) from that paper
(itself derived from Blyth 1986), the measure of error can
be used to calculate the number of particles that need to
be analysed to reduce that error to an acceptable margin.
It is noted that similar proposals have been put forward
for a statistical approach based on the bootstrap method
(Evans and Napier-Munn, 2013; Mariano and Evans,
2015). For operational mineralogy purposes it is proposed
that the acceptable relative error for a target phase present at an abundance of 10% should itself be 10% (i.e. the measured value is 10% ± 1%). This is
equivalent to an absolute error of <0.5% for a target
phase present at an abundance of 1%. The formulae from Blyth (1986) give the upper and lower bounds for estimated proportions in the range

p(n) = 100[1/N]  to  p(n) = 100[(N - 2)/N]
Within this range the upper and lower bounds on the estimate are given by

p(n)_u = 100[BETA(1 - α; n + 1, N - n)]
p(n)_l = 100[1 - BETA(1 - α; N - n + 1, n)]

where N = number of grains analysed, n = number of target-phase counts, α = the confidence limit around the estimate, and BETA is the inverse cumulative beta distribution function.
Using these formulae it is possible to calculate a solution for any given error margin. For example, if the
acceptable error was deemed to be a 10% relative error
around a target phase present at proportions of 10%
(10% ± 1%) with 95% confidence, then the solution comes to approximately 2500 grains. This is equivalent to an absolute
error of <0.5% for a target phase present at an
abundance of 1%. 2500 grains would be sufficient for
most operational contexts, and it should be noted that
particle statistics can be improved for specific phases
through targeted analysis (e.g. BPSC; Figure 3).
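These bounds are straightforward to reproduce with a standard statistics library. The sketch below assumes BETA is the inverse cumulative beta distribution (the quantile function used in Clopper-Pearson limits, on which Blyth (1986) builds); the search routine simply increases N until both bounds fall inside the target margin:

```python
from scipy.stats import beta

def proportion_bounds_pct(n, N, alpha=0.05):
    """Bounds (in %) on a phase proportion estimated from n target
    grains out of N grains counted, in the beta-quantile form of
    Blyth (1986) / Howarth (1998)."""
    upper = 100 * beta.ppf(1 - alpha, n + 1, N - n)
    lower = 100 * (1 - beta.ppf(1 - alpha, N - n + 1, n))
    return lower, upper

def grains_needed(p, rel_error, alpha=0.05, n_max=100_000):
    """Smallest N (searched in steps of 50) for which both bounds on
    a phase at true proportion p lie within p*(1 +/- rel_error)."""
    for N in range(100, n_max, 50):
        n = round(p * N)
        lo, hi = proportion_bounds_pct(n, N, alpha)
        if 100 * p * (1 - rel_error) <= lo and hi <= 100 * p * (1 + rel_error):
            return N
    return None

lo, hi = proportion_bounds_pct(250, 2500)
print(f"10% phase, 2500 grains: {lo:.1f}% to {hi:.1f}%")  # ~9.0% to ~11.0%
print("grains needed for 10% +/- 1%:", grains_needed(0.10, 0.10))
```

With these settings the search returns a grain count of the same order as the ~2500 quoted above; the exact value depends on how the confidence limit is applied to each bound.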
3 Conclusions
Operational mineralogy brings the tools for mineralogical
characterisation into a continuous monitoring framework
in order to build on-going trends in process response on
a mine-site. Whilst optimal on a day-to-day or shift-to-shift basis, these operational mineralogy techniques can also be applied to weekly or monthly audits, with the aim of integrating mineralogical information on feed variability with on-site decision making. Operational mineralogy has
been made possible by key advances in ruggedized
automated SEMs, the development of more powerful and
versatile analytical software and best practice in rapid
sample preparation and optimal particle statistics. These
advances allow for rapid turnarounds and the
presentation of key metrics to on-site and off-site
stakeholders. Weekly and monthly audits represent a
form of reactive control and are vital for understanding
how the orebody and processing conditions are
changing. However, with day-to-day or shift-to-shift
implementation there is the opportunity to move to predictive control, adapting the processing operation to the incoming ore.
This potential for day-to-day automated mineralogy is a
key opportunity for mine-sites going forward.
References

Barringer AR (1953) The preparation of polished sections of ores and mill products using diamond abrasives, and their quantitative study by point counting methods. Bull Instn Min Metall 63:21-41.

Baum W (2014) Ore characterization, process mineralogy and lab automation - a roadmap for future mining. Minerals Engineering 60:69-73.

Blyth CR (1986) Approximate binomial confidence limits. Journal of the American Statistical Association 81:843-855.

Brough CP, Strongman J, Bowell R, Warrender R, Prestia A, Barnes A, Fletcher J (2017) Automated environmental mineralogy; the use of liberation analysis in humidity cell testwork. Minerals Engineering 107:112-122.

Chayes F (1946) Linear analysis of a medium-grained granite. American Mineralogist 31:261-275.

Chayes F (1949) A simple point counter for thin-section analysis. American Mineralogist 34:1-11.

Coetzee LL, Theron SJ, Martin GJ, Merwe JD, Stanek TA (2011) Modern gold deportments and its application to industry. Minerals Engineering 24:565-575.

Evans CL, Napier-Munn TJ (2013) Estimating error in measurements of mineral grain size distribution. Minerals Engineering 52:198-203.

Fennel M, Guevara J, Velarde G, Baum W, Gottlieb P (2005) QEMSCAN mineral analysis for ore characterization and plant support at Cerro Verde. 27th Mining Convention, Arequipa, Peru, Proceedings, 111.

Graham S (2017) SEMs, mines and mineralogy - on-site automated mineralogy delivering an operational mineralogy approach for mine management. SGA conference, Quebec, 14th Biennial Meeting, 1-4.

Graham S, Brough C, Cropp A (2015) An introduction to Zeiss Mineralogic and the correlation of light microscopy with automated mineralogy: a case study using BMS and PGM analysis of samples from a PGE-bearing chromitite prospect. Precious Metals Conference, 1-11.

Gu Y, Napier-Munn T (1997) JK/Philips mineral liberation analyzer - an introduction. Minerals Processing '97 Conf, Cape Town, SA, 2.

Gy PM (1979) Sampling of particulate materials - theory and practice. Elsevier, Amsterdam.

Howarth RJ (1998) Improved estimators of uncertainty in proportions, point-counting, and pass-fail test results. American Journal of Science 298:594-607.

Kalichini MS, Goodall WR, Paul EM, Prinsloo A, Chongo C (2017) Applied mineralogy at Kansanshi mine - proof of the concept of on-site routine process mineralogy for continuous improvement of plant operations. Process Mineralogy.

Kwitko-Ribeiro R (2011) New sample preparation developments to minimize mineral segregation in process mineralogy. 10th International Congress for Applied Mineralogy, 411-417.

Lotter NO, Kormos LJ, Oliveira J, Fragomeni D, Whiteman E (2011) Modern process mineralogy: two case studies. Minerals Engineering 24:638-650.

Lotter NO, Evans CL, Engstrom K (2018a) Sampling - a key tool in modern process mineralogy. Minerals Engineering 116:196-202.

Lotter NO, Baum W, Reeves S, Arrue C, Bradshaw D (2018b) The business value of best practice process mineralogy. Minerals Engineering 116:226-238.

Mariano RA, Evans CL (2015) Error analysis in ore particle composition distribution measurements. Minerals Engineering 82:36-44.

Miller PR, Reid AF, Zuiderwyk MA (1982) QEM*SEM image analysis in the determination of modal assays, mineral associations and mineral liberation. CIM - XIV International Mineral Processing Congress, Toronto, Canada.

Sutherland DN, Gottlieb P (1991) Application of automated quantitative mineralogy in mineral processing. Minerals Engineering 4(7-11):753-762.

Van der Plas L, Tobi AC (1965) A chart for judging the reliability of point counting results. American Journal of Science 263:87-90.