TOS forum

Published by IM Publications Open LLP
Online ISSN: 2053-9681
Publications
Article
The WCSB10 conference will cover the latest research and application of the Theory of Sampling (TOS) and Blending in many important technology and industry sectors: mining, exploration, minerals processing, metals refinement, cement, food & feed, agriculture & aquaculture, pharmaceutical production etc. WCSB10 specifically has a broader societal, industrial and environmental emphasis with a special focus on sustainable science, technology and industry. This article provides an update on WCSB10 with information on all aspects of the conference.
 
Article
The paper discusses the process design and mass balance highlights to illustrate the ISO 13909 compliance of a contractual payment-station sampling plant that has been in operation for four years. The plant is used to sample the -60 mm coal supply to a coal-fired power station belonging to the South African National Power Utility. Payment parameters are proximate analysis (calorific value, ash and sulphur content), physical properties (size and abrasion index) and moisture content. Over and above ergonomic physical samples and timeous moisture samples, the plant also delivers the crushed chemical sample in triplicate for buyer, seller and referee analysis. The lot period was defined as 24 hours of production, while system compliance was designed around mass-based sampling. Production varies over a wide range of throughput tonnages, from 800 to 3600 ton/h, as dictated by complex boiler and silo operational requirements. The plant required a unique operational philosophy to adapt the number of primary increments (and the corresponding time interval) per sublot to the varying production rates, which over a fixed lot period results in a varying lot size. Furthermore, a contractual-grade, calibrated, 6-idler belt scale is used to measure the instantaneous conveyor load, which in turn controls the cutter speed via a programmable logic controller (PLC) prompted variable speed drive (VSD) setpoint, so that a fixed increment mass is sampled to represent a fixed production mass interval. The primary belt-end cross-cut sampler features a unique articulated-joint sample chute designed to fit down the high sample tract, eliminating the delineation errors associated with the previous competitor design.
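The mass-based control loop described above can be paraphrased in a few lines. The sketch below is illustrative only and not the plant's PLC logic: the cutter aperture, target increment mass, speed limits and mass-per-increment values are assumed, and the relation m = Q * w / (3.6 * v) is the generic cross-stream cutter expression.

```python
# Illustrative sketch (assumed parameters, not the plant's PLC code):
# a belt scale reading Q [t/h] sets the cutter speed so that each cut
# extracts a fixed increment mass, and increments are spaced so that each
# one represents a fixed production mass interval (mass-based sampling).

def cutter_speed_setpoint(load_tph: float, aperture_m: float,
                          target_increment_kg: float,
                          v_min: float = 0.3, v_max: float = 4.0) -> float:
    """Cutter speed [m/s] giving a fixed increment mass at the measured
    instantaneous belt load; m = Q*w/(3.6*v) rearranged for v, clamped to
    the drive limits."""
    v = load_tph * aperture_m / (3.6 * target_increment_kg)
    return max(v_min, min(v, v_max))

def sampling_interval_s(load_tph: float, mass_per_increment_t: float) -> float:
    """Seconds between primary increments so that each increment represents
    a fixed production mass interval."""
    return 3600.0 * mass_per_increment_t / load_tph

# throughput swinging across the quoted 800-3600 t/h range
for q in (800, 1800, 3600):
    v = cutter_speed_setpoint(q, aperture_m=0.3, target_increment_kg=100.0)
    dt = sampling_interval_s(q, mass_per_increment_t=100.0)
    print(f"{q:>4} t/h -> cutter {v:.2f} m/s, one increment every {dt:.0f} s")
```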
 
Article
For many of us in the regionally distributed and interconnected mining industries, the pandemic impacts were earlier and broader than most. PDAC, the mining mega-convention that descends on Toronto each spring, started 2020 as per any other year, but by the end of the week the world had changed. Sanitiser bottles appeared on tables, elbow bumps replaced handshakes, and the airports on the trip home were a mix of caution and carnage; a sign of the new reality into which we had now entered. COVID-19 had rapidly spread beyond being an isolated “Wuhan” virus, and many projects still had field personnel undertaking commissioning and service activities. In the space of a week in March 2020, the focus shifted from urgently completing tasks to evacuating staff back to safety as expeditiously as practicable. Clients were generally supportive of such movements, with similar strategies playing out within their operations. Movements were quickly constrained by pandemic restrictions, and the plane tickets, hotel beds and shipping containers were invariably prioritised for essential operations. Despite high opinions of our indispensability, we ceded priority in most jurisdictions to the public health response. It was only once personnel were back safely in their home cities or in hotel quarantine, stakeholder meetings had been urgently convened across myriad not-yet-ubiquitous online platforms, and formal written correspondence had been exchanged flagging the start of the disruption, that the reality set in: how to continue and complete mining project installations on the opposite side of the continent or world, with operations and suppliers suspended or furloughed, and no certainty as to when personnel and equipment mobility might resume? With very few precedents to draw upon in any of our working careers, the well-intended responses to these disruptions were varied in success, but in any case will prove formative to how we act in future crises. Whilst we cannot predict with certainty when, where and how the next disaster will occur, it is incumbent on all to take the hard-learned lessons of COVID-19 and have disaster response and recovery plans that are updated and reflect our real, lived experiences.
 
Article
Charles Oliver Ingamells passed away in April 1994 at age 77. Ingamells received his BA at the University of Western Ontario and his MS at the University of Minnesota. During his later years in his retirement home in Florida, he was a faithful representative of a group of well-known world experts in Sampling Theory, such as Pierre M. Gy, Francis F. Pitard, Jan Visman, Paul Switzer at Stanford University and J.C. Engels at the US Geological Survey and the Linus Pauling Institute in Menlo Park, California. His association over several years with Francis F. Pitard at Amax Extractive Research & Development in Colorado added to a unique combination of different experiences in the field of geochemical analysis. His pioneering work in the field of geological sampling led to collaboration with the above experts.
 
Article
Tributes from colleagues, friends and family.
 
Article
The Theory of Sampling (TOS) has become firmly established in many process industries over the last few decades as the basis for precise and accurate material characterisation. Increasingly, considerations go into the design of sampling and sample preparation equipment to ensure sample representativeness. With the introduction of digitalisation under the buzzword Industry 4.0, many options have emerged to monitor these processes. But also in other fields, such as Process Analytical Technology (PAT) and applied sensor technology, which are not directly attributable to sampling, new applications arise where the Theory of Sampling is nevertheless a very useful addition. Here we present two case studies in which TOS has delivered decisive improvements in data acquisition.
 
Article
The paper discusses the process design and mass balance for the minimum and maximum design cases to illustrate the ISO 8685 compliance of a barge-loading payment-station sampling plant that has been in operation for two years. The plant samples -100 mm, export-quality bauxite from a barge-loading conveyor delivering 10 kton/h at 5.4 m/s. The greenfield operation did not know the coefficient of variation or the size range factor, both required inputs for calculating the number of primary increments and the minimum gross sample mass. Therefore, informed assumptions were made using performance data from a neighbouring sampling system that has been in operation for over a decade. Without the variation and size-factor data the ISO-compliant scheme design could not commence. Where this data is not available for greenfield projects, there is a risk that plant designs may not be compliant if actual variabilities exceed the assumed input parameters. The system is designed for various barge carrying capacities with lot size defined by mass. Operational quality assurance, however, requires samples more frequently and therefore sublots are 4-hourly, time based. ISO 8685 compliance is achieved with sample increments taken at maximum throughputs and barge sizes to determine the time-based interval. At reduced throughputs, the fixed time-interval regime results in the minimum ISO requirements being exceeded, which ties in well with the client's overall quality incentives. Primary sample increments from a tailored cross-belt sampler are crushed automatically in the sampling plant to 25 mm and then to 6 mm using two stages of double-roll crushers. The sample is then subdivided through secondary and tertiary sampling to produce a composite 4-hourly chemical sample. The 4-hourly sublot samples are collected in an ergonomic 4-way carousel, with each composite sample representing 1 hour of barge-loading production, allowing the client quality-assurance insights into their blending facility performance.
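To make the fixed time-interval argument concrete, here is a minimal sketch: the interval is sized so that an assumed minimum of 32 increments falls within a 4-hour sublot at the maximum loading rate; at lower rates the same interval delivers the same number of increments for less mass, so the per-mass minimum is exceeded. The value n_min = 32 and the rates are example assumptions, not ISO 8685 tabulated values.

```python
# Illustrative only: fixed time-based increment interval sized at the
# maximum (design) loading rate, evaluated at reduced rates.

def interval_from_max_rate(sublot_hours: float, n_min: int) -> float:
    """Minutes between increments when n_min increments must fit into one
    sublot at the maximum loading rate."""
    return sublot_hours * 60.0 / n_min

def increments_per_sublot(rate_ktph: float, sublot_hours: float,
                          interval_min: float) -> tuple[int, float]:
    """Increments actually taken in a sublot and the production mass [kt]
    each one represents at the given loading rate."""
    n = int(sublot_hours * 60.0 / interval_min)
    return n, rate_ktph * sublot_hours / n

interval = interval_from_max_rate(sublot_hours=4.0, n_min=32)   # sized at 10 kt/h
for rate in (10.0, 7.5, 5.0):                                   # kt/h
    n, mass = increments_per_sublot(rate, 4.0, interval)
    print(f"{rate:>4} kt/h -> {n} increments, {mass:.2f} kt represented per increment")
```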
 
Article
Industrial operations are often based on critical quality measures obtained for technical process control and/or to determine the value of raw materials and product streams. Process Analytical Technology (PAT) monitoring is applied to characterise, for example, raw materials, semi-finished as well as finished products. There is an active interest in approaches for “smart” online, real-time industrial sensor applications, especially where industrial operations involve high sample throughput and/or may involve hazardous substances demanding automation. State-of-the-art sample preparation procedures and equipment can deliver key performance indicators, often supplemented by sensor data used as proxy quality measures, which helps to ensure measurement representativity and optimal process/product control. We here illustrate this industrial front-line arena with an example in which PAT accelerometer data are used for real-time monitoring of the efficiency of the automated grinding step in sample preprocessing.
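As a rough illustration of how an accelerometer signal can act as a proxy for grinding progress, the sketch below computes a windowed RMS vibration level and declares the grind finished once that level drops and stays below an assumed fraction of its initial plateau. The sampling rate, window length, drop fraction and synthetic signal are all assumptions for illustration, not the monitoring scheme of the paper.

```python
# Minimal sketch: windowed RMS of a vibration signal as a grinding-progress proxy.
import numpy as np

def windowed_rms(signal: np.ndarray, fs_hz: float, window_s: float = 1.0) -> np.ndarray:
    """RMS of the signal in consecutive, non-overlapping windows."""
    n = int(fs_hz * window_s)
    usable = len(signal) // n * n
    return np.sqrt((signal[:usable].reshape(-1, n) ** 2).mean(axis=1))

def grinding_complete(rms: np.ndarray, rel_drop: float = 0.3, hold: int = 5) -> bool:
    """Finished once RMS has fallen by rel_drop relative to the early plateau
    and stayed there for `hold` consecutive windows."""
    if len(rms) < hold + 10:
        return False
    baseline = rms[:10].mean()
    return bool((rms[-hold:] < (1.0 - rel_drop) * baseline).all())

# synthetic example: strong vibration on coarse material, subsiding when fine
rng = np.random.default_rng(0)
fs = 100.0
sig = np.r_[rng.normal(0, 1.0, 30 * int(fs)), rng.normal(0, 0.5, 30 * int(fs))]
print("grind complete:", grinding_complete(windowed_rms(sig, fs)))
```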
 
Schematic illustration of variograms of four alternative mixing process variants in pharmaceutical formulation development. The process represented by the bottom variogram is optimal because it has the lowest sill level and the smallest deviations from a flat variogram. All variograms reveal one form or another of feeder periodicity inheritance, only sufficiently dampened in the bottom one. Note the regulator threshold criterion (horizontal line). Even though the optimal variogram is not flat, the fact that it falls exclusively below the regulator threshold allows the blending process to be declared fit-for-purpose.
Article
Process monitoring in technology and industry in general, and in pharmaceutical batch and continuous manufacturing in particular, is incomplete without full understanding of all sources of variation. Pharmaceutical mixture heterogeneity interacts with the particular sampling process involved (by physical extraction or by Process Analytical Technology (PAT) signal acquisition), potentially creating four Incorrect Sampling Errors (ISE) and two Correct Sampling Errors (CSE) in addition to the Total Analytical Error (TAE). In the highly regulated pharmaceutical production context it is essential to eliminate, or reduce maximally, all unnecessary Total Sampling Error (TSE) contributions to the Measurement Uncertainty (MUtotal) in order to be able to meet stringent regulatory blend and dose uniformity requirements. Current problems mainly stem from inadequate understanding of the challenges regarding sampling of powder blends. In this endeavour the Theory of Sampling (TOS) forms the only reliable scientific framework from which to seek resolution. We here present the variographic approach with the aim of conducting TSE error variance identification and showing how to develop fit-for-purpose acceptance levels in critical powder-blending process monitoring. The key issue is the nugget effect, which contains all non-optimised [ISE, CSE] plus TAE contributions to MUtotal. A large nugget effect relative to the sill is a warning that the measurement system is far from fit-for-purpose and must be improved. Regulatory guidances have hitherto called for physical sampling from within blenders, leading to significant ISE associated with the insertion of sample thieves (sampling spears). Instead of self-crippling spear sampling, we here call for a paradigm shift, very much from the TOS regimen, in the form of alternative on-line variographic characterisation of 1-D blender outflow streams. Practical illustrations and case histories are described in parallel contributions to WCSB7.
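For readers unfamiliar with the variographic vocabulary used above, the sketch below computes an experimental variogram for a synthetic 1-D series of blend measurements, back-extrapolates a crude nugget estimate, and checks the variogram against an acceptance threshold. The series, the extrapolation rule and the threshold are assumptions for illustration only.

```python
# Illustrative variogram sketch for a 1-D series of blend-uniformity data.
import numpy as np

def variogram(series: np.ndarray, max_lag: int) -> np.ndarray:
    """Semi-variance v(j) = 0.5 * mean((x[i+j] - x[i])^2) for lags 1..max_lag."""
    x = np.asarray(series, dtype=float)
    return np.array([0.5 * np.mean((x[j:] - x[:-j]) ** 2) for j in range(1, max_lag + 1)])

rng = np.random.default_rng(1)
api = 10.0 + 0.2 * np.sin(np.arange(200) / 6.0) + rng.normal(0, 0.1, 200)  # % w/w, synthetic

v = variogram(api, max_lag=20)
nugget = max(0.0, 2.0 * v[0] - v[1])        # crude back-extrapolation to lag 0
sill = float(np.var(api))
threshold = 0.25 ** 2                       # assumed acceptance level (variance units)
print(f"nugget {nugget:.4f} vs sill {sill:.4f}; "
      f"all lags below threshold: {bool((v < threshold).all())}")
```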
 
Article
Metal accounting is one of the main tools for financial and technical management in the metal production industry. It is based on measurements and has to manage the uncertainty inherent in the measurement process. The uncertainty in metal accounting generates financial risk. The accuracy of the metal accounting results is directly linked to the accuracy of the material balance and thus to the accuracy of the mass and content measurements. Estimating the overall measurement error, through its probability distribution or its first and second moments (mean and variance), can contribute to enterprise decision making. The overall measurement error can be calculated and analysed by establishing an uncertainty budget. While this approach was mainly introduced to calculate the analytical error (cf. ISO GUM), it has to take the sampling procedure into account. Even though it is not explicitly named an “uncertainty budget”, the same approach is proposed in Pierre Gy’s Theory of Sampling (TOS), where the various components of the overall error are well identified and described with their properties and relative weights. The present paper proposes a methodology to build such uncertainty budgets within the framework of implementing a metal accounting system. It can be applied to an existing measurement system, analysing the results in order to find ways of improving measurement accuracy. In addition, it can be used to define a new measurement procedure with a target accuracy. Various real examples illustrate both applications.
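A minimal numerical sketch of such an uncertainty budget, assuming four independent stages with invented relative standard deviations, shows how component variances add and how the dominant contributor is identified:

```python
# Illustrative additive uncertainty budget; the component RSDs are invented.
import math

budget = {
    "primary sampling":   0.012,   # relative standard deviation
    "secondary sampling": 0.008,
    "sample preparation": 0.006,
    "analysis (TAE)":     0.010,
}

total_var = sum(rsd ** 2 for rsd in budget.values())
for stage, rsd in budget.items():
    print(f"{stage:<20} RSD {rsd:.3f}   share of total variance {rsd**2 / total_var:5.1%}")
print(f"combined RSD {math.sqrt(total_var):.4f} "
      f"(expanded, k=2: {2 * math.sqrt(total_var):.4f})")
```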
 
Article
Following years of development and testing, in-situ chemical assay by Pulsed Fast and Thermal Neutron Activation (PFTNA) has been implemented in mining grade control at BHP Western Australian Iron Ore as a world first. Demonstrating the technical capability and aptness of a new methodology, however, is not sufficient to ensure the sustained quality of reported assay data. The success of moving from the testing stage to implementation in active mining grade control is chiefly dependent on the robustness of ongoing quality control and quality management. This paper shows the steps undertaken to achieve end-to-end monitoring of data acquired by Blasthole Assay Tools (BHAT) using PFTNA methods. The main challenge for in-situ chemical assay by the BHAT is to design a quality assurance/quality control (QA/QC) program without a physical sample being collected, and in consequence without the conventional separation into the focus areas of sample collection, sample preparation and laboratory analysis. In this context, the BHAT combines all in one instrument, and different ways to monitor data integrity, repeatability and accuracy need to be established, as outlined below. After the validity of a BHAT calibration has been verified and a tool is in operation, data are monitored on a daily basis to check that relevant operational parameters inside the tool are working within defined acceptance limits. Measurement error in the field is monitored with repeat logs in blastholes, and inter-instrument error by replicate logs of different BHAT units in the same blastholes. Accuracy and instrument drift over longer periods are monitored by repeated logs in Reverse Circulation (RC) drill holes. Operational parameters, such as neutron output and spectral resolution of the instrument detector, are monitored by scheduled logs in dedicated testing facilities. Also, duplicate manual sampling in blastholes is used to compare grade populations obtained by different sampling methods in mining pits to aid grade reconciliation from mining to production. By routine application of these QA/QC steps, in conjunction with close communication of results to mining teams, the new BHAT technology has been successfully embedded in day-to-day mining operations.
 
Four distributions with identical mean μ = 42% and standard deviation σ = 0.5%. The skew-normal distribution has parameters α = −8, ξ = 42.65%, ω = 0.82%. The bimodal (a) and (b) distributions have parameters a = 0.2, μ1 = 41.20%, σ1 = 0.20%, μ2 = 42.20%, σ2 = 0.32% and a = 0.17, μ1 = 41.00%, σ1 = 0.40%, μ2 = 42.20%, σ2 = 0.15%, respectively (see appendix for details).
Run-length (RL) distribution for the bimodal (b) sampling distribution, for α = 5% and n = 1. The upper figure is for positive shifts of the mean and the lower figure for negative shifts of the mean.
Full sampling distributions for the bimodal (b) case, with mean assay 41% (shift = −1%), 42% (no shift) and 43% (shift = +1%), from left to right. The vertical lines show the lower (LCL) and upper (UCL) control limits at the 95% confidence level.
Article
Determination of the complete sampling distribution (Lyman, 2014), as opposed to estimation of the sampling variance, represents a significant advance in sampling theory. This is one link that has been missing for sampling results to be used to their full potential. In particular, access to the complete sampling distribution provides opportunities to bring all the concepts and risk assessment tools from statistical process control (SPC) into the production and trading of mineral commodities, giving sampling investments and results their full added-value. The paper focuses on the way by which sampling theory, via the complete sampling distribution, interfaces with production and statistical process control theory and practice. The paper evaluates specifically the effect of using the full sampling distribution on the Operating Characteristic curve and control charts’ Run Length distributions, two SPC cornerstones that are essential for quality assurance and quality control analysis and decision-making. It is shown that departure from normality of the sampling distribution has a strong effect on SPC analyses. Analysis of the Operating Characteristic curve for example shows that assumption of normality may lead to erroneous risk assessment of the conformity of commercial lots. It is concluded that the actual sampling distribution should be used for quality control and quality assurance in order to derive the highest value from sampling.
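The kind of effect the paper quantifies can be reproduced in miniature by simulation. The sketch below uses the bimodal (a) mixture parameters from the figure caption above (mean 42%, standard deviation 0.5%) and estimates the average run length of a chart with normal-theory two-sigma limits; the choice of limits, the run-length framing and the shift are a simplified illustration, not the paper's analysis.

```python
# Hedged sketch: Monte-Carlo average run length (ARL) under a bimodal
# sampling distribution, using normal-theory +/- 2 sigma control limits.
import numpy as np

rng = np.random.default_rng(0)

def mixture(n: int, shift: float = 0.0) -> np.ndarray:
    """Bimodal (a) mixture: weight 0.2 of N(41.2, 0.2) and 0.8 of N(42.2, 0.32)."""
    comp = rng.random(n) < 0.2
    return np.where(comp, rng.normal(41.2, 0.20, n), rng.normal(42.2, 0.32, n)) + shift

def mean_run_length(sampler, lcl, ucl, trials=1000, horizon=2000) -> float:
    lengths = []
    for _ in range(trials):
        x = sampler(horizon)
        out = np.nonzero((x < lcl) | (x > ucl))[0]
        lengths.append(out[0] + 1 if out.size else horizon)
    return float(np.mean(lengths))

mu, sigma = 42.0, 0.5
lcl, ucl = mu - 2 * sigma, mu + 2 * sigma
print("in-control ARL          :", mean_run_length(lambda n: mixture(n), lcl, ucl))
print("ARL after +1 sigma shift:", mean_run_length(lambda n: mixture(n, sigma), lcl, ucl))
# for a truly normal distribution the in-control ARL at 2-sigma limits is ~22
```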
 
Article
For Anglo American Platinum (AAP) to reach their burning ambition goal of doubling Earnings Before Interest, Taxes, Depreciation and Amortization (EBITDA) by 2023, sites are required to adopt an alternative approach to improve the representativeness of metal accounting samples given the increase in grind and throughput demand. The success of optimization projects will rely heavily on metal accounting data being accurate, so that improvements declared are based on sound samples and assay measurements. The 60-litre mechanically agitated hopper (MAH) was initially developed and ratified in order to overcome particle segregation evident in the 20-litre conventional, compressed-air-agitated hoppers of Vezin-type sampling systems. A sustained plant accountability performance within the range of 95-105% was realized due to the correction of the previously overstated feed grade by means of a more representative sample. Pierre Gy’s rule of thumb of 30 increments per sampling campaign has not been proven and documented for the Platinum Group Metals (PGM) industry. The MAH, however, with additional volume capacity, allows the flexibility to increase the primary sampling increments per shift from ±32 to ±96 to cater for process variability (thereby reducing distributional heterogeneity) without increasing the overall resulting final sample mass. Additional technology and larger 110/220-litre capacity hoppers have been deployed. Enhancements include a wash water and drainage system, an improved trash screen design and high/low hopper level sensors. The MAH principle of operation has also been expanded to cater for a double (3-drive) stage sampling system as well as a triple (5-drive) stage sampling system. It is believed that the latest MAH design will satisfy the Theory of Sampling principles and a motivation for an industrial roll-out of the innovation within AAP is therefore underway.
 
Article
The Aloha Sampler is an innovative new sampling tool for effectively collecting and combining increments from dynamic, liquid, one-phase and two-phase systems. It is inexpensive and cost-effective to implement and produces more representative samples than conventional techniques. TOS forum has asked EnviroStat to present the Aloha Sampler for its readers.
 
Article
In order to minimise the sampling error and sampling bias associated with the sampling of metal-bearing ores it is essential that the heterogeneity characteristics of the ores be fully appreciated. Heterogeneity tests were carried out on the significantly different manganiferous ores produced at Wessels and Mamatwan mines near Hotazel, South Africa, for the purpose of establishing an optimal sampling protocol for the ores. The method referred to as Segregation Free Analysis (SFA) was used for the determination of the parameters K and alpha by construction of calibration curves. The method involves crushing a sufficient amount of ore so that, after passing it through a set of fifteen nested screens, there is sufficient material to be split into 32 samples of mass 2-5 kg using a riffle splitter, with each sample then analysed. Thus, for fifteen nested screens there are fifteen series each consisting of 32 samples, making a total of 480 samples for analysis. Of the eighteen elements that were analysed in each sample only %Mn3O4, %FeO, %K2O, %P and %SO2 were calibrated, the first two being the main paying elements and the last three being deleterious elements for the smelting processes in which the ores are used. Calibration curves indicate that for the coarse fraction, above 1 cm, manganese ores have alpha values close to 3, whereas those less than 1 cm in diameter have alpha values closer to 1. Reasons for this behaviour are uncertain but it could be related to the behaviour of the crystal structure in the very pure ores as the ores are progressively crushed and screened to finer size fractions. Separate nomograms were therefore prepared for the coarse and fine fractions. Net conclusions indicate that both Wessels and Mamatwan ores are relatively easy to sample and that simple two- or three-stage processing will suffice when preparing the final 2 g aliquot at 75 microns. Apart from minor modifications in the sample preparation protocols, there is no evidence to suggest that the Wessels and Mamatwan ores require different sample preparation protocols, or that they should be assayed differently. The calibration curves for manganese ore are compared with the calibration curves for gold-bearing ores, which generally have alpha values close to 1. The difference in alpha between the gold ores and bulk commodities is considered to be related to the primary distribution of the metals in nature, lognormal for gold and normal for manganese.
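The calibrated relation behind such nomograms can be written as a one-line function: the relative variance of the Fundamental Sampling Error is taken as s^2 = K * d^alpha / M, with d the nominal top size in cm and M the sample mass in g. The K values and the two protocol stages below are placeholders for illustration, not the Wessels/Mamatwan calibration results; only the alpha values (about 3 coarse, about 1 fine) follow the abstract.

```python
# Illustrative use of the K-and-alpha heterogeneity relation (placeholder K values).
def fse_rel_std(K: float, alpha: float, d_cm: float, mass_g: float) -> float:
    """Relative standard deviation of the Fundamental Sampling Error,
    from s^2 = K * d**alpha / M (d in cm, M in g)."""
    return (K * d_cm ** alpha / mass_g) ** 0.5

cases = [("coarse split, 1 cm top size, 2 kg", 50.0, 3.0, 1.0,    2000.0),
         ("assay aliquot, 75 um, 2 g",          0.5, 1.0, 0.0075,    2.0)]
for label, K, alpha, d, m in cases:
    print(f"{label}: FSE RSD = {fse_rel_std(K, alpha, d, m):.1%}")
```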
 
Article
Industrial and technological processes are very difficult to manage when the quality of the feed and product or discard is not measured with confidence. Effective control can occur when process analytical technologies are chosen that provide representative, precise and timely measurements. For the measurement technique to be representative it must comply with the Theory of Sampling (TOS) and give every component in the streaming material an equal chance of being included in the support for the measurement. This generally precludes technologies that measure only the surface of materials, or biased measurements stemming from a limited portion of the material only, particularly in the minerals processing and recycling sectors, which usually display high compositional variability. The location of the analytical technology should relate to the benefit being targeted and allow enough reaction time to respond to the quality in some way: diverting short increments based on composition and decision parameters based on process impact, blending with other quality materials, or feeding information backwards or forwards. Feed-forward options can include flow rate control, reagent control, operational process variables that impact recoveries, etc. Major benefits have been achieved in measuring coarse conveyed flows with high-specification Prompt Gamma Neutron Activation Analysis (PGNAA) over short increments (thirty seconds to two minutes) for most elements, or over five-to-ten-minute increments for trace elements such as gold. PGNAA applied to conveyed flows allows the full flow to be measured continuously and the composition averaged for each increment in real time. The use of penetrative and continuous moisture measurement using transmission microwaves has also proved effective for moisture monitoring and management. Precision between laboratory samples of the flow and analysis data from analysers can be sufficient to give high confidence in resulting process control decisions. This paper explains the benefits in more detail and includes case studies to highlight actual benefits derived from the application of such systems. It should be noted that sampling of the materials is still required for calibration and adjustment of the process analytical tools.
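The short-increment averaging and diversion logic mentioned above reduces, in essence, to the few lines sketched below; the one-reading-per-second rate, 30-s increment length and cut-off grade are assumed example values only.

```python
# Illustrative sketch: averaging analyser readings into short increments and
# flagging low-grade increments for diversion.
import numpy as np

def increment_means(readings: np.ndarray, per_increment: int) -> np.ndarray:
    usable = len(readings) // per_increment * per_increment
    return readings[:usable].reshape(-1, per_increment).mean(axis=1)

rng = np.random.default_rng(2)
raw = rng.normal(1.2, 0.15, 3600)            # synthetic: one reading per second for 1 h
grades = increment_means(raw, 30)            # 30-s increment averages
divert = grades < 1.0                        # assumed cut-off grade
print(f"{divert.sum()} of {len(grades)} increments flagged for diversion")
```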
 
Article
Even though the sampling technique results in potentially biased samples with poor precision of the metal grade, and the resulting samples are classified as specimens rather than samples, the manual sampling of rotary percussion blast hole chips is still widely performed in the industry for operational grade control purposes. The objectives of this investigation are to estimate the precision and “bias” of manual sampling by comparing the copper grade results of fifteen (15) diamond drill core samples versus fifteen (15) rotary percussion blast hole drilling chip samples. This also includes the determination of a practical manual sampling template with the highest precision, to provide an understanding of the distribution of the copper content within the cone of blast hole chips. The contour plots of the copper grades allow selection of the best fit-for-purpose template with regard to precision and operational resourcing requirements. The diamond drill core samples take into account the Increment Delimitation Error (IDE) and Increment Extraction Error (IEE) and can therefore be considered reference samples for the purpose of this review.
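The two statistics at the heart of such a comparison, an apparent relative bias and a pairwise precision, can be sketched on synthetic data as below; the grades are generated at random purely for illustration and are not the study's assays.

```python
# Hedged sketch of a paired core-vs-chips comparison on synthetic data.
import numpy as np

rng = np.random.default_rng(3)
core  = rng.lognormal(np.log(0.55), 0.25, 15)      # % Cu, 15 reference core samples (synthetic)
chips = core * rng.normal(1.05, 0.10, 15)          # chip samples: assumed +5 % bias, 10 % noise

rel_diff = (chips - core) / ((chips + core) / 2)   # relative difference per pair
print(f"apparent relative bias            : {rel_diff.mean():+.1%}")
print(f"RSD of paired relative differences: {rel_diff.std(ddof=1):.1%}")
```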
 
Article
Technically, sampling of food and feed is the process of selecting a small mass from a larger quantity of material for the purpose of performing a measurement, quantitative or qualitative, on the selected portion and making valid inferences with respect to the entire target mass (Decision Unit). It is too often simply assumed (without justification) that the representativeness and integrity of the sampled material is a given, and consequently also erroneously assumed that the measurement results obtained can be used to make reliable inferences about the target. This is a seriously mistaken assumption. Sampling of food and feed materials is performed for a number of reasons at various stages of an integrated food safety system, including but not limited to, premarketing risk assessment, process control in a food/feed manufacturing environment, first-responder investigations of foodborne disease outbreaks, and regulatory compliance (agencies/programs performing monitoring or surveillance of food or feed products in support of food safety regulations). While sampling situations are diverse, and for many the immediate thought is that specific sampling procedures probably should be tied in with the specific nature of the products or processes being sampled, a singular, unified approach can in fact address all situations and products, aiming for a fit-for-purpose (fit-for-decision) representative sampling process. The target audience includes food/feed protection personnel, e.g., field sampling operators, academic and industrial scientists, laboratory personnel, companies, organizations, regulatory bodies, and agencies that are responsible for sampling, as well as their project leaders, project managers, quality managers, supervisors, and directors who are responsible for business and other decisions of economic and societal importance. In the United States alone there are an estimated 45,000 federal, state, and local food/feed regulatory personnel, not including industry or laboratory personnel. With a conservative estimate of 50-75% of them involved in sampling activities, the target audience forms a very sizable body in the United States as well as worldwide. For the world at large, the relevant numbers are exorbitant. There is much to do… and there is here a powerful carry-over effect beyond food and feed sampling. The general principles presented apply to any-and-all materials (lots, DUs) with similar heterogeneity characteristics as those in the food, feed, and environmental sciences. Perhaps paradoxical at first view, sampling of heterogeneous materials is in a sense a matrix-independent endeavour, only the material heterogeneity counts.2-5 In this sense ref. 1: “Representative Sampling for Food and Feed Materials”, Special Guest Editor Section, Journal of AOAC International, vol. 98, no. 2 (2015), constitutes a general introductory mini-textbook for representative sampling.1
 
Article
It is an undeniable fact that Visman's and Ingamells's works provide valuable additions to the Theory of Sampling. This paper shows real cases where their approaches gave valuable information to better understand the complex heterogeneity of low-content constituents, leading to better sampling and subsampling protocols. These case studies are: cobalt assays in a lateritic ore led to the conclusion that some areas were very low in cobalt content; a closer look at the data using Ingamells's approach proved that conclusion completely wrong. The estimation of low iron content in high-purity ammonium paratungstate using 1-gram subsamples for the analytical method proved to be affected by a severe Poisson process, giving the illusion of a product being within specification when in fact it was a very bad product. It should be emphasized that there are probably thousands of similar cases in many industries, as the result of economists not communicating enough with knowledgeable technical staff.
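The second case is a textbook Poisson ("rare grain") situation, which a few lines make tangible: if the impurity sits in discrete grains, the probability that a 1 g subsample contains none of them at all can be substantial. The grain masses and content below are illustrative assumptions, not the paper's data.

```python
# Illustrative Poisson sketch for a low-content constituent in 1-g subsamples.
import math

def mean_grains(content_ppm: float, grain_mass_ug: float, subsample_g: float) -> float:
    """Average number of impurity grains per subsample if the impurity occurs
    as discrete grains of the given mass (1 ppm of 1 g = 1 ug)."""
    return content_ppm * subsample_g / grain_mass_ug

for grain_ug in (2.0, 0.1):
    lam = mean_grains(content_ppm=5.0, grain_mass_ug=grain_ug, subsample_g=1.0)
    print(f"grain mass {grain_ug} ug -> mean {lam:5.1f} grains/subsample, "
          f"P(no grain at all) = {math.exp(-lam):.1%}")
```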
 
Article
Industrial research experiments are conducted at various scales in the mining industry. Regardless of the experimental purpose, sampling and analysis are almost always part of the experimental process to collect the necessary data. However, in order to ensure that the experiment will enable valid conclusions, the understanding and minimisation of sampling variation is crucial. Two effective methods for evaluating sampling variability in any process sampling situation are the duplicate and replication experiments. The application of sampling experiments in the early phases of a demo- or pilot-scale experiment is an effective way to understand the total measurement system variability, as well as an opportunity to improve sampling methods if the sampling variability is deemed too high to enable representative results for experimental evaluation. LKAB is an iron ore mining company in the north of Sweden where experiments are conducted regularly in all parts of the process value chain. In the current state of the world, encountering more and more threats to our global climate and environment, a focus for LKAB has been to reduce the use of fossil fuel as well as to minimise waste and tailings. One of LKAB's current environmental initiatives is to investigate the feasibility of recovering and processing apatite from tailings of the LKAB beneficiation process. Further processing of recovered apatite will generate critical raw materials: phosphorus, rare-earth elements and fluorine. To increase the understanding of the process variability of various analytical parameters in a pilot-scale experiment within this project, both duplicate and replication sampling experiments were conducted during one of the pilot-scale campaigns. The sampling experiments were applied at three separate sampling locations where two different sampling methods were used. Results show that both the sampling method and the sampling experimental method can affect the results obtained. The case study showed that the sampling variability was higher for sampling locations where grab sampling was applied, in comparison to composite sampling, which generated lower sampling variability at one of the sampling locations in the pilot plant. This indicates that the composite sampling method can produce more representative results and should be favoured in future process experiments. The results also indicate that the duplicate sampling experiment is more robust to outliers in comparison to the replication experiment. The duplicate experiment is also able to quantify the process variability and evaluate the ratio of sampling to process variability, which can be an advantage in some cases.
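For readers unfamiliar with the two experiment types, the sketch below runs both on synthetic data: the replication experiment yields a relative standard deviation of repeated complete samplings, while the duplicate experiment estimates the measurement (sampling plus analysis) variance from paired differences and sets it against the total, process-inclusive variance. All numbers are invented for illustration.

```python
# Hedged sketch of replication and duplicate sampling experiments (synthetic data).
import numpy as np

rng = np.random.default_rng(4)

# replication experiment: repeated complete samplings of the "same" material
replicates = rng.normal(4.2, 0.12, 10)                 # synthetic analyte values
print(f"replication RSV: {replicates.std(ddof=1) / replicates.mean():.1%}")

# duplicate experiment: pairs (A, B) taken back-to-back along the running process
truth = rng.normal(4.2, 0.20, 30)                      # underlying process variation
a = truth + rng.normal(0, 0.12, 30)                    # duplicate A (sampling + analysis error)
b = truth + rng.normal(0, 0.12, 30)                    # duplicate B
s_meas = np.sqrt(np.mean((a - b) ** 2) / 2)            # measurement standard deviation
s_total = np.concatenate([a, b]).std(ddof=1)           # total sd, incl. process variation
print(f"duplicate experiment: measurement sd {s_meas:.3f}, total sd {s_total:.3f}, "
      f"process share of variance {1 - s_meas**2 / s_total**2:.0%}")
```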
 
Article
Sampling is necessary every time inferences are to be made to take informed, optimal decisions in science, technology, industry, trade and commerce. For reasons extensively addressed over the last two decades, some fields, normally those where good sampling practices are a source of economic gain such as the mining/minerals/metals industrial sectors, explicate the role of sampling more than others. This is not the case within the realm of food and feed safety assessment, where sampling continues, still today, to be perceived more as an economic burden and a technical necessity to be fulfilled because of regulatory demands, rather than a need to ensure reliable evidence to support management and regulatory decisions. This is true today and will become even more central in the future to address the challenges posed by the accelerating climate crisis, resource depletion and increasing food demand. Risk assessment and sampling are both probabilistic disciplines, the first devoted to estimating and minimising safety risks, the latter to estimating and mitigating sampling risks (the effects of sampling errors). Here we offer an exposé with the aim of positioning TOS as an essential discipline and practical tool needed to ensure the best possible estimation of risks in support of safety decision-making and risk management in all of food and feed science, technology, industry, trade, commerce and society at large. We demonstrate that sampling plays an integral, but often much overlooked, role in all these fields.
 
Article
Many commercial coal testing laboratories are accredited to ISO 17025 – General requirements for the competence of testing and calibration laboratories. There is an expected reliance in the coal mining industry that the laboratory adheres to all elements of this standard between successive accreditation audits conducted every 18 months by the independent accreditation body, the National Association of Testing Authorities (NATA). In the absence of proactive QAQC & QM practices monitoring the quality of the information reported by laboratories, potential issues impacting production decisions and reconciliation results are only detected reactively. In addition, for mining companies working with several internal and external laboratories across the supply chain, the management of the logistics, practices and information becomes very challenging and time-consuming, impacting the company's ability to track laboratory results as key inputs from a production and reconciliation perspective. The absence of proactive QAQC & QM practices results in a sub-optimal, reactive approach in the coal industry, increasing the risk of unrecognised short-term production gaps related to quality, increased time required for quality-breach investigations, the absence of a holistic approach to monitoring the value chain, and the financial impact on a business performing under sub-optimal conditions. This paper aims to show the journey towards the implementation of a new proactive QAQC and QM program, where the quality of many different laboratories across the supply chain can now be monitored and linked with global reconciliation results, as an improvement opportunity to complement the current industry-standard ISO 17025 accreditation and proficiency round-robin approach.
 
Article
Quality Assurance and Quality Control (QA/QC) is of critical interest in the mining industry. Over the years, Anglo American Platinum has adopted a sound strategy of Best Practice Principles (BPP) for mass measurement, sampling, sample preparation, analysis and metal accounting. Often, much effort is focused on the implementation and maintenance of quality control systems to provide quality assurance. Within the Anglo American Platinum business units, QA/QC data are deemed of significant value on a day-to-day basis and, at a higher level, also provide a means to prove or disprove evaluation and metal accounting disputes between various sites and/or opposing members of the Joint Evaluation Committee (JEC). Unfortunately, QA/QC data and associated QA/QC systems alone do not always provide the technical or tangible reasons to supplement explanations of anomalies in performance. It is sometimes necessary to go beyond monitoring and focus on interpreting the QA/QC data to comprehend the underlying issues. This paper aims to showcase a multitude of actual case studies pertaining to the troubleshooting of challenges encountered throughout the platinum processing pipeline (i.e. Concentrator to Smelter to Refinery). These challenges range from mass measurement to sampling, sample preparation and analysis, as well as plant performance. Observations and learnings from these instances indicated that, even though stringent QA/QC was adhered to, complying with the first principles of mass measurement and sampling theory, minimum sample mass and an ongoing understanding of individual material characteristics was crucial. It was also highlighted that re-assessment of designs, methods and protocols is necessary per material stream and that a standardization approach across all Anglo American Platinum business units, while perhaps sensible at one point in time, may not always be appropriate and/or relevant.
 
Article
Many styles of gold mineralisation pose challenges during sampling because of the presence of coarse gold and high natural heterogeneity (the “nugget effect”). The gold-bearing conglomerates of the Western Australian Pilbara present some of these challenges, and Novo Resources Corporation has addressed many of them over the last five years. Its Beatons Creek open pit operation is the first Pilbara conglomerate to go into production (January 2021), based on a total oxide Mineral Resource of 316,000 oz Au (5.2 Mt at 1.9 g/t Au at a 0.5 g/t Au cut-off). Mineralisation occurs within the Beatons Creek conglomerate member of the Hardey Sandstone formation, which constitutes part of the Fortescue Group. Gold is present within the matrix of multiple, narrow, stacked and unclassified ferruginous-conglomeratic reef horizons, which are interbedded with unmineralised conglomerate, sandstones and grits with minor intercalations of shale, mudstone, siltstone and tuffs. The gold occurs as free particles up to 5 mm across within the ferruginous matrix of the conglomerates. It is closely associated with detrital pyrite and authigenic nodules. Previous owners and Novo have employed several sampling techniques across the project, including diamond and RC drilling, trench channel sampling and bulk sampling. Assay methods included fire assay, LeachWELL and, more recently, PhotonAssay. As part of its 2018 evaluation programme, fifty-eight c. 1-4 t bulk samples were collected from accessible oxide mineralisation and processed via a pilot plant. This paper presents some of the issues and solutions applied by Novo, which have wider implications for and impact on the sampling of other heterogeneous orebodies.
 
Article
Despite “sample collecting”, with the objective of evaluating the quality of a material lot, being a very ancient activity, and despite many books and papers having been published with the purpose of “educating” the sampling community, we can say that the Theory of Sampling (TOS), developed by Pierre Gy, is the one that gives the best treatment of the potential, and most common, errors of this activity. The focus of TOS in the early days was primarily dry particulate material, perhaps because it presents the bigger challenges in terms of heterogeneity, but other areas have since discovered the use of this precious tool to solve the same issues for many other kinds of material; it is still not universally adopted, however. For instance, there is a belief that liquids are completely homogeneous, as if every liquid had the same behaviour as water, but this is not true, especially when we talk about mixed materials that are not mutually soluble and have different densities. An ore slurry (pulp), for instance, is a suspension formed by pulverised ore, flotation reagents and process water. This slurry may seem a homogeneous substance when viewed from the top of a flotation cell or at the discharge into a thickener, but any fluid flowing in a pipe develops a specific velocity profile which depends on the rheology, the pipe wall friction, etc. Other important variables to consider in the make-up of a mineral suspension include the concentration gradients through the flow profile and the specific gravity of the particles. This work evaluates the types of slurry samplers or static cutters to answer the question: “Which equipment do I really need?” The answer to this question will help the project owner make the correct decision for plant sampling, including longer-term viability.
 
Article
What is this? Sampling washed away; “samplewashing”? Greenwashing is a well-known term nowadays, but is there such a thing as “samplewashing” too? Yes and no: greenwashing is the process of conveying a false impression or providing misleading information about how a company's products are more environmentally sound… Greenwashing is a play on the term “whitewashing”, which means using misleading information to gloss over bad behaviour. And that is what this sampling paper is all about: presenting moisture results on samples of solid bulk materials where the Theory of Sampling was not applied… and therefore sampling errors are magnified, not only glossing over the representativeness of the process, but at the same time watering down the monetary profits of the trade for one party whilst condensing them for the other. It really is “samplewashing” when it comes to moisture determination!
 
Article
Automated, mechanical cross-belt (hammer) samplers remain popular because they are easy to retrofit into brownfield applications or greenfield projects when cross-stream samplers are not designed into the plant layout from initiation. Cross-belt samplers require less headroom and are easy to retrofit onto existing conveyors. Despite disputes about possible delimitation and extraction errors resulting from hammer samplers, they are not excluded from use in some ISO sampling standards (13909 - Coal and Coke and 8685 - Bauxite) but are excluded from others (3082 - Iron ore). Possible errors can be mitigated by applying “know-how” to the bespoke design of a hammer sampler for installation on a specific conveyor belt system. This paper discusses the design details of a primary-stage hammer sampler for a bauxite ship-loading sampling plant. The design requirements of a 10 000 metric ton per hour ship-loading rate, 100 mm particle top size and an 1800 mm wide conveyor travelling at 5.4 m/s result in a hammer sampler that takes up to a 260 kg increment with each cut. The application requires more torque, at faster response, than that delivered by 10 Bugatti Chirons combined and is (to my knowledge) the largest of its kind, requiring a unique high-torque-at-low-rotational-speed drive system where conventional geared motors could not achieve the necessary output performance. Even though the sampler is the primary stage of a complete operational sampling plant, the emphasis is on the power requirement calculations, the mechanical design, components, materials and features of this unit that make it not only mechanically operational but also aimed at the high sampling precision levels prescribed by the Theory of Sampling (TOS). Despite a small statistical bias detected for the sampling system (not the hammer sampler only), the sampling plant performed within the maximum tolerable bias specified for the commercial trade application and is fit for purpose.
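The increment mass quoted above follows directly from the belt loading; the back-of-envelope sketch below reproduces the order of magnitude, with the 0.5 m effective cutter aperture being an assumed value (back-solved from the quoted figures), not a design datum from the paper.

```python
# Back-of-envelope check of the cross-belt increment mass.
def belt_loading_kg_per_m(rate_tph: float, belt_speed_mps: float) -> float:
    """Mass of material per metre of belt at the given throughput and speed."""
    return rate_tph * 1000.0 / 3600.0 / belt_speed_mps

def increment_mass_kg(rate_tph: float, belt_speed_mps: float, aperture_m: float) -> float:
    """Increment mass ~ belt loading x length of belt swept by the cutter."""
    return belt_loading_kg_per_m(rate_tph, belt_speed_mps) * aperture_m

print(f"belt loading : {belt_loading_kg_per_m(10_000, 5.4):.0f} kg/m")
print(f"increment    : {increment_mass_kg(10_000, 5.4, 0.5):.0f} kg")   # ~260 kg quoted
```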
 