The WCSB10 conference will cover the latest research and application of the Theory of Sampling (TOS) and Blending in many important technology and industry sectors: mining, exploration, minerals processing, metals refinement, cement, food and feed, agri- and aquaculture, pharmaceutical production, etc. WCSB10 specifically has a broader societal, industrial and environmental emphasis, with a special focus on sustainable science, technology and industry. This article provides an update on WCSB10 with information on all aspects of the conference.
For many of us in the regionally distributed and interconnected mining industries, the pandemic impacts were earlier and broader than for most. PDAC, the mining mega-convention that descends on Toronto each spring, started 2020 as in any other year, but by the end of the week the world had changed. Sanitiser bottles appeared on tables, elbow bumps replaced handshakes, and the airports on the trip home were a mix of caution and carnage: a sign of the new reality we had now entered. COVID-19 had rapidly spread beyond being an isolated “Wuhan” virus, yet many projects still had field personnel undertaking commissioning and service activities. In the space of a week in March 2020, the focus shifted from urgently completing tasks to evacuating staff back to safety as expeditiously as practicable. Clients were generally supportive of such movements, with similar strategies playing out within their own operations. Movements were quickly constrained by pandemic restrictions, and plane tickets, hotel beds and shipping containers were invariably prioritised for essential operations. Despite high opinions of our own indispensability, we ceded priority in most jurisdictions to the public health response. It was only once personnel were back safely in their home cities or in hotel quarantine, stakeholder meetings had been urgently convened across a myriad of not-yet-ubiquitous online platforms, and formal written correspondence had been exchanged flagging the start of the disruption, that the reality set in: how to continue and complete mining project installations on the opposite side of the continent or world, with operations and suppliers suspended or furloughed, and no certainty as to when personnel and equipment mobility might resume? With very few precedents to draw upon in any of our working careers, the well-intended responses to these disruptions varied in success but will, in any case, prove formative to how we act in future crises.
Whilst we cannot predict with certainty when, where and how the next disaster will occur, it is incumbent on us all to take the hard-learned lessons of COVID-19 and maintain disaster response and recovery plans that are kept up to date and reflect our real, lived experiences.
Charles Oliver Ingamells passed away in April 1994 at age 77. Ingamells received his BA at the University of Western Ontario and his MS at the University of Minnesota. During his later years, in his retirement home in Florida, he was a faithful representative of a group of well-known world experts in Sampling Theory, including Pierre M. Gy, Francis F. Pitard, Jan Visman, Paul Switzer at Stanford University and J.C. Engels at the US Geological Survey and the Linus Pauling Institute in Menlo Park, California. His association over several years with Francis F. Pitard at Amax Extractive Research & Development in Colorado added to a unique combination of different experiences in the field of geochemical analysis. His pioneering work in the field of geological sampling led to collaboration with the above experts.
The Theory of Sampling (TOS) has become firmly established in many process industries over the last few decades as the basis for precise and accurate material characterisation. Increasingly, consideration goes into the design of sampling and sample-preparation equipment to ensure sample representativeness. With the introduction of digitalisation under the buzzword Industry 4.0, many options have emerged to monitor these processes. But also in other fields, such as Process Analytical Technology (PAT) and applied sensor technology, which are not directly attributable to sampling, new applications arise where the Theory of Sampling is nevertheless a very useful addition. Here we present two case studies in which TOS has delivered decisive improvements in data acquisition.
The paper discusses the process design and mass balance for the minimum and maximum design cases to illustrate the ISO 8685 compliance of a barge-loading payment-station sampling plant that has been running for two years. The plant samples -100 mm, export-quality bauxite from a barge-loading conveyor delivering 10 kton/h at 5.4 m/s. The greenfield operation did not know the Coefficient of Variation or the Size Range Factor required as inputs to calculate the Number of Primary Increments and the Minimum Gross Sample Mass. Therefore, informed assumptions were made based on performance data from a neighbouring sampling system that has been in operation for over a decade. Without the variation and size-factor data, the ISO-compliant scheme design could not commence. Where such data are not available for greenfield projects, there is a risk that plant designs may not be compliant should actual variabilities exceed the assumed input parameters. The system is designed for various barge carrying capacities, with lot size defined in mass. Operational quality assurance, however, requires samples more frequently, and sublot periods are therefore time-based at 4-hourly intervals. ISO 8685 compliance is achieved by taking sample increments at maximum throughput and barge size to determine the time-based interval. At reduced throughputs, the fixed time interval results in the minimum ISO requirements being exceeded, which ties in well with the client's overall quality incentives. Primary sample increments from a tailored cross-belt sampler are crushed automatically in the sampling plant to 25 mm and then to 6 mm using two stages of double-roll crushers. The sample is then subdivided through secondary and tertiary sampling to produce a composite 4-hourly chemical sample. The 4-hourly sublot samples are collected in an ergonomic 4-way carousel, with each composite sample representing 1 hour of barge-loading production, allowing the client quality-assurance insights into their blending facility performance.
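The fixed-interval logic described above can be sketched in a few lines of Python. The values below (sublot mass, minimum increment count) are illustrative assumptions, not the plant's actual design parameters:

```python
# Hypothetical sketch of the time-basis scheme described above: the cut
# interval is fixed from the *maximum* throughput case, so at reduced
# throughput the same interval yields more increments per tonne than the
# ISO minimum. All numbers are illustrative, not the plant's real design.

def time_interval_s(sublot_mass_t: float, throughput_tph: float,
                    n_increments_min: int) -> float:
    """Fixed time between cuts, sized for the maximum design case."""
    sublot_duration_s = sublot_mass_t / throughput_tph * 3600.0
    return sublot_duration_s / n_increments_min

# Maximum design case: 10 000 t/h, a 4-hourly sublot (40 000 t), and an
# assumed ISO minimum of 60 increments per sublot.
interval = time_interval_s(40_000, 10_000, 60)   # 240 s between cuts

# At reduced throughput the sublot still spans 4 h of loading, so the same
# 240 s interval delivers 4*3600/240 = 60 cuts over a *smaller* tonnage,
# i.e. the minimum increments-per-tonne requirement is exceeded.
```

The design choice to fix the interval at the worst (maximum) case is what makes compliance automatic at every lower throughput.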
Industrial operations are often based on critical quality measures obtained for technical process control and/or to determine the value of raw materials and product streams. Process Analytical Technology (PAT) monitoring is applied to characterise, for example, raw materials as well as semi-finished and finished products. There is active interest in approaches for “smart” online, real-time industrial sensor applications, especially where industrial operations involve high sample throughput and/or hazardous substances demanding automation. State-of-the-art sample preparation procedures and equipment can deliver key performance indicators, often supplemented by sensor data used as proxy quality measures, which helps to ensure measurement representativity and optimal process/product control. We here illustrate this industrial front-line arena with an example in which PAT accelerometer data are used for real-time monitoring of the efficiency of the automated grinding sample-preprocessing step.
Process monitoring in technology and industry in general, and in pharmaceutical batch and continuous manufacturing in particular, is incomplete without full understanding of all sources of variation. Pharmaceutical mixture heterogeneity interacts with the particular sampling process involved (by physical extraction or by Process Analytical Technology (PAT) signal acquisition), potentially creating four Incorrect Sampling Errors (ISE) and two Correct Sampling Errors (CSE) in addition to the Total Analytical Error (TAE). In the highly regulated pharmaceutical production context it is essential to eliminate, or maximally reduce, all unnecessary Total Sampling Error (TSE) contributions to the total Measurement Uncertainty (MUtotal) in order to be able to meet stringent regulatory blend and dose uniformity requirements. Current problems mainly stem from inadequate understanding of the challenges of sampling powder blends. In this endeavour the Theory of Sampling (TOS) forms the only reliable scientific framework from which to seek resolution. We here present the variographic approach with the aim of identifying TSE error variance contributions and showing how to develop fit-for-purpose acceptance levels in critical powder-blending process monitoring. The key issue is the nugget effect, which contains all non-optimised [ISE, CSE] plus TAE contributions to MUtotal. A large nugget effect with respect to the sill is a warning that the measurement system is far from fit-for-purpose and must be improved. Regulatory guidances have hitherto called for physical sampling from within blenders, leading to significant ISE associated with the insertion of sample thieves (sampling spears). Instead of self-crippling spear sampling, we here call for a paradigm shift, very much in the TOS regimen, in the form of alternative on-line variographic characterisation of 1-D blender outflow streams. Practical illustrations and case histories are described in parallel contributions to WCSB7.
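For readers new to variographics, a minimal sketch of the experimental variogram computation on a 1-D process series follows. The relative-heterogeneity form used here is one common convention (the nugget effect is then read off by extrapolating the variogram towards lag 0 and comparing it with the sill); the series itself is illustrative data, not a blend monitoring record:

```python
import numpy as np

def experimental_variogram(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Experimental variogram v(j), j = 1..max_lag, of a 1-D process
    series, in relative (dimensionless) heterogeneity form as commonly
    used in TOS process variography."""
    h = (x - x.mean()) / x.mean()       # relative heterogeneity contributions
    n = len(h)
    v = np.empty(max_lag)
    for j in range(1, max_lag + 1):
        d = h[j:] - h[:-j]              # pairs separated by lag j
        v[j - 1] = np.sum(d * d) / (2 * (n - j))
    return v

# Illustrative series (e.g. successive API concentration readings):
x = np.array([10.2, 10.5, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3])
v = experimental_variogram(x, 3)
```

A near-flat variogram at the level of the lag-1 value signals that the nugget effect dominates, i.e. that [ISE, CSE, TAE] contributions mask any process structure.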
Metal accounting is one of the main tools for financial and technical management of the metal production industry. It is based on measurements and has to manage the uncertainty inherent to the measurement process. The uncertainty in the metal accounting generates financial risk. The accuracy of the metal accounting results is directly linked to the accuracy of the material balance, and hence to the accuracy of the mass and content measurements. Estimating the overall measurement error, through its probability distribution or its first and second moments (mean and variance), can contribute to enterprise decision-making. The overall measurement error can be calculated and analysed by establishing the uncertainty budget. Although this approach was mainly introduced to calculate the analytical error (cf. the ISO GUM), it must also take the sampling procedure into account. Even though it is not explicitly named an “uncertainty budget”, the same approach is proposed in Pierre Gy's Theory of Sampling (TOS), where the various components of the overall error are well identified and described with their properties and their relative weights. The present paper proposes a methodology to build such uncertainty budgets in the framework of the implementation of a metal accounting system. It can be applied to an existing measurement system, analysing the results in order to find ways of improving the measurement accuracy. It can also be used to define a new measurement procedure with an accuracy objective. Various real examples illustrate both applications.
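As a hedged illustration of such an uncertainty budget, the sketch below combines independent relative standard uncertainties by summing variances, GUM-style; the stage names and component values are invented for the example:

```python
import math

# Hypothetical uncertainty budget for one content measurement: independent
# relative standard uncertainties per stage are combined by summing their
# variances (ISO GUM / TOS additivity of error variances). All values are
# illustrative, not taken from a real metal accounting system.
components = {
    "primary sampling":   0.020,   # relative standard uncertainty
    "secondary sampling": 0.010,
    "sample preparation": 0.008,
    "analysis (TAE)":     0.005,
}
combined = math.sqrt(sum(u * u for u in components.values()))
# 'combined' is the overall relative standard uncertainty; the budget also
# shows at a glance which stage dominates (here, primary sampling).
```

The point of the budget is diagnostic: improvement effort goes first to the largest variance component, not necessarily to the laboratory.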
Following years of development and testing, in-situ chemical assay by Pulsed Fast and Thermal Neutron Activation (PFTNA) has been implemented in mining grade control at BHP Western Australian Iron Ore as a world first. Demonstrating the technical capability and aptness of a new methodology, however, is not sufficient to ensure the sustained quality of reported assay data. The success of moving from the testing stage to implementation in active mining grade control is chiefly dependent on the robustness of ongoing quality control and quality management. This paper shows the steps undertaken to achieve end-to-end monitoring of data acquired by Blasthole Assay Tools (BHAT) using PFTNA methods. The main challenge for in-situ chemical assay by the BHAT is to design a quality assurance/quality control (QA/QC) program without a physical sample being collected and, in consequence, without the conventional separation into the focus areas of sample collection, sample preparation and laboratory analysis. In this context, the BHAT combines all in one instrument, and different ways to monitor data integrity, repeatability and accuracy need to be established, as outlined below. After the validity of a BHAT calibration has been verified and a tool is in operation, data are monitored on a daily basis to check that relevant operational parameters inside the tool are working within defined acceptance limits. Measurement error in the field is monitored with repeat logs in Blastholes, and inter-instrument error by replicate logs of different BHAT units in the same Blastholes. Accuracy and instrument drift over longer periods are monitored by repeated logs in Reverse Circulation (RC) drill holes. Operational parameters, such as neutron output and spectral resolution of the instrument detector, are monitored by scheduled logs in dedicated testing facilities.
Also, duplicate manual sampling in Blastholes is used to compare grade populations obtained by different sampling methods in mining pits to aid grade reconciliation from mining to production. By routine application of these QA/QC steps, in conjunction with close communication of results to mining teams, the new BHAT technology has been successfully embedded in day-to-day mining operations.
Determination of the complete sampling distribution (Lyman, 2014), as opposed to estimation of the sampling variance, represents a significant advance in sampling theory. This is one link that has been missing for sampling results to be used to their full potential. In particular, access to the complete sampling distribution provides opportunities to bring all the concepts and risk assessment tools from statistical process control (SPC) into the production and trading of mineral commodities, giving sampling investments and results their full added value. The paper focuses on the way in which sampling theory, via the complete sampling distribution, interfaces with production and statistical process control theory and practice. The paper specifically evaluates the effect of using the full sampling distribution on the Operating Characteristic curve and control charts' Run Length distributions, two SPC cornerstones that are essential for quality assurance and quality control analysis and decision-making. It is shown that departure from normality of the sampling distribution has a strong effect on SPC analyses. Analysis of the Operating Characteristic curve, for example, shows that the assumption of normality may lead to erroneous risk assessment of the conformity of commercial lots. It is concluded that the actual sampling distribution should be used for quality control and quality assurance in order to derive the highest value from sampling.
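The effect of non-normality can be illustrated with a toy Operating Characteristic calculation. The sketch below compares the probability of accepting a lot under a normal sampling distribution and under a lognormal one with the same mean and variance; grade, specification and standard deviation are illustrative values only:

```python
import math

def norm_cdf(z: float) -> float:
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def oc_normal(true_grade: float, spec_limit: float, s: float) -> float:
    """P(measured grade >= spec) under a normal sampling distribution."""
    return 1.0 - norm_cdf((spec_limit - true_grade) / s)

def oc_lognormal(true_grade: float, spec_limit: float, s: float) -> float:
    """Same mean and variance, but a (right-skewed) lognormal sampling
    distribution, moment-matched to (true_grade, s)."""
    sigma2 = math.log(1.0 + (s / true_grade) ** 2)
    mu = math.log(true_grade) - sigma2 / 2.0
    z = (math.log(spec_limit) - mu) / math.sqrt(sigma2)
    return 1.0 - norm_cdf(z)

# A lot whose true grade sits exactly on the specification (e.g. 60 % Fe,
# sampling standard deviation 1.5 % absolute):
p_norm = oc_normal(60.0, 60.0, 1.5)      # exactly 0.5 by symmetry
p_logn = oc_lognormal(60.0, 60.0, 1.5)   # below 0.5: skewness shifts the risk
```

Even this mild skew moves the acceptance probability away from the "obvious" 50 %, which is precisely the risk-assessment error the abstract warns about.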
For Anglo American Platinum (AAP) to reach its "burning ambition" goal of doubling Earnings Before Interest, Taxes, Depreciation, and Amortization (EBITDA) by 2023, sites are required to adopt an alternative approach to improving the representativeness of metal accounting samples, given the increased grind and throughput demand. The success of optimization projects will rely heavily on metal accounting data being accurate, so that declared improvements are based on sound samples and assay measurements. The 60-litre mechanically agitated hopper (MAH) was initially developed and ratified to overcome particle segregation evident in the 20-litre conventional, compressed-air-agitated hoppers of Vezin-type sampling systems. A sustained plant accountability performance within the range of 95-105% was realized through correction of the previously overstated feed grade by means of a more representative sample. Pierre Gy's rule of thumb of 30 increments per sampling campaign has not been proven and documented for the Platinum Group Metals (PGM) industry. The MAH, however, with its additional volume capacity, allows the flexibility to increase the primary sampling increments per shift from ± 32 to ± 96 to cater for process variability (thereby reducing distributional heterogeneity) without increasing the overall resulting final sample mass. Additional technology and larger 110/220-litre capacity hoppers have been deployed. Enhancements include a wash-water and drainage system, an improved trash screen design and high/low hopper level sensors. The MAH principle of operation has also been expanded to cater for a double (3-drive) stage sampling system as well as a triple (5-drive) stage sampling system. It is believed that the latest MAH design will satisfy Theory of Sampling principles; a motivation for an industrial roll-out of the innovation within AAP is therefore underway.
The Aloha Sampler is an innovative sampling tool for effectively collecting and combining increments from dynamic, liquid, one-phase and two-phase systems. It is inexpensive and very cost-effective to implement, and produces more representative samples than conventional techniques. TOS forum has asked EnviroStat to present the Aloha Sampler for its readers.
In order to minimise the sampling error and sampling bias associated with the sampling of metal-bearing ores, it is essential that the heterogeneity characteristics of the ores be fully appreciated. Heterogeneity tests were carried out on the significantly different manganiferous ores produced at the Wessels and Mamatwan mines near Hotazel, South Africa, for the purpose of establishing an optimal sampling protocol for the ores. The method referred to as Segregation Free Analysis (SFA) was used for the determination of the parameters K and alpha by construction of calibration curves. The method involves crushing a sufficient amount of ore so that, after passing it through a set of fifteen nested screens, there is sufficient material to be split into 32 samples of mass 2-5 kg using a riffle splitter, with each sample then analysed. Thus, for fifteen nested screens there are fifteen series each consisting of 32 samples, a total of 480 samples for analysis. Of the eighteen elements analysed in each sample, only %Mn3O4, %FeO, %K2O, %P and %SO2 were calibrated, the first two being the main paying elements and the last three being deleterious elements for the smelting processes in which the ores are used. Calibration curves indicate that for the coarse fraction, above 1 cm, the manganese ores have alpha values close to 3, whereas fragments less than 1 cm in diameter have alpha values closer to 1. The reasons for this behaviour are uncertain, but it could be related to the behaviour of the crystal structure in the very pure ores as they are progressively crushed and screened to finer size fractions. Separate nomograms were therefore prepared for the coarse and fine fractions. The net conclusions are that both Wessels and Mamatwan ores are relatively easy to sample and that simple two- or three-stage processing will suffice when preparing the final 2 g aliquot at 75 microns.
Apart from minor modifications in the sample preparation protocols, there is no evidence to suggest that the Wessels and Mamatwan ores require different sample preparation protocols, or that they should be assayed differently. The calibration curves for manganese ore are compared with the calibration curves for gold bearing ores which generally have alpha values close to 1. The difference in alpha between the gold ores and bulk commodities is considered to be related to the primary distribution of the metals in nature, lognormal for gold and normal for manganese.
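The calibration described (sampling variance against top particle size) is essentially a log-log fit. A minimal sketch follows, with synthetic data standing in for the Wessels/Mamatwan measurements; the relation assumed, relative variance times sample mass equal to K·d^alpha, is the usual Gy-type form:

```python
import math

# Illustrative fit of Gy-type calibration parameters K and alpha from
# per-size-fraction heterogeneity data, assuming the relation
#     s_rel^2 * m = K * d^alpha
# and fitting by ordinary least squares in log-log space. The data below
# are synthetic, not the Wessels/Mamatwan measurements.
def fit_k_alpha(d_cm, var_times_mass):
    xs = [math.log(d) for d in d_cm]
    ys = [math.log(v) for v in var_times_mass]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    alpha = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))   # slope = alpha
    k = math.exp(ybar - alpha * xbar)              # intercept gives K
    return k, alpha

# Perfect synthetic data generated with K = 50, alpha = 3 is recovered:
d = [1.0, 2.0, 4.0]
v = [50.0 * dd ** 3 for dd in d]
k, alpha = fit_k_alpha(d, v)
```

On real data the coarse and fine fractions would be fitted separately, exactly as the separate nomograms in the abstract suggest.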
Industrial and technological processes are very difficult to manage when the quality of the feed and of the product or discard is not measured with confidence. Effective control can occur when process analytical technologies are chosen that provide representative, precise and timely measurements. For the measurement technique to be representative it must comply with the Theory of Sampling (TOS) and give every component in the streaming material an equal chance of being included in the support for the measurement. This generally precludes technologies that measure only the surface of the material, or that derive biased measurements from a limited portion of the material only, particularly in the minerals processing and recycling sectors, which usually display high compositional variability. The location of the analytical technology should relate to the benefit being targeted and allow enough reaction time to respond to the quality in some way: diverting short increments based on composition and on decision parameters based on process impact, blending with other quality materials, or feeding information backwards or forwards. Feed-forward options can include flow-rate control, reagent control, operational process variables that impact recoveries, etc. Major benefits have been achieved in measuring coarse conveyed flows with high-specification Prompt Gamma Neutron Activation Analysis (PGNAA) over short increments (thirty seconds to two minutes) for most elements, or over five- to ten-minute increments for trace elements, such as gold. PGNAA applied to conveyed flows allows the full flow to be measured continuously and the composition averaged for each increment in real time. The use of penetrative and continuous moisture measurement using transmission microwaves has also proved effective for moisture monitoring and management.
The precision between laboratory analyses of samples of the flow and data from the analysers can be sufficient to give high confidence in the resulting process control decisions. This paper explains the benefits in more detail and includes case studies to highlight actual benefits derived from the application of such systems. It should be noted that sampling of the materials is still required for calibration and adjustment of the process analytical tools.
Technically, sampling of food and feed is the process of selecting a small mass from a larger quantity of material for the purpose of performing a measurement, quantitative or qualitative, on the selected portion and making valid inferences with respect to the entire target mass (Decision Unit). It is too often simply assumed (without justification) that the representativeness and integrity of the sampled material is a given, and consequently also erroneously assumed that the measurement results obtained can be used to make reliable inferences about the target. This is a seriously mistaken assumption. Sampling of food and feed materials is performed for a number of reasons at various stages of an integrated food safety system, including, but not limited to, premarketing risk assessment, process control in a food/feed manufacturing environment, first-responder investigations of foodborne disease outbreaks, and regulatory compliance (agencies/programs performing monitoring or surveillance of food or feed products in support of food safety regulations). While sampling situations are diverse, and for many the immediate thought is that specific sampling procedures probably should be tied to the specific nature of the products or processes being sampled, a singular, unified approach can in fact address all situations and products, aiming for a fit-for-purpose (fit-for-decision) representative sampling process. The target audience includes food/feed protection personnel, e.g. field sampling operators, academic and industrial scientists, laboratory personnel, companies, organizations, regulatory bodies, and agencies that are responsible for sampling, as well as their project leaders, project managers, quality managers, supervisors, and directors who are responsible for business and other decisions of economic and societal importance.
In the United States alone there are an estimated 45,000 federal, state, and local food/feed regulatory personnel, not including industry or laboratory personnel. With a conservative estimate of 50-75% of them involved in sampling activities, the target audience forms a very sizable body in the United States as well as worldwide. For the world at large, the relevant numbers are enormous. There is much to do… And there is here a powerful carry-over effect beyond food and feed sampling. The general principles presented apply to any and all materials (lots, DUs) with heterogeneity characteristics similar to those in the food, feed, and environmental sciences. Perhaps paradoxical at first view, sampling of heterogeneous materials is in a sense a matrix-independent endeavour: only the material heterogeneity counts.2-5 In this sense, ref. 1, “Representative Sampling for Food and Feed Materials”, Special Guest Editor Section, Journal of AOAC International, vol. 98, no. 2 (2015), constitutes a general introductory mini-textbook for representative sampling.1
Industrial research experiments are conducted at various scales in the mining industry. Regardless of the experimental purpose, sampling and analysis is almost always part of the experimental process of collecting the necessary data. However, in order to ensure that the experiment will enable valid conclusions, understanding and minimising sampling variation is crucial. Two effective methods for evaluating sampling variability in any process sampling situation are the duplicate and replication experiments. The application of sampling experiments in the early phases of a demo- or pilot-scale experiment is an effective way both to understand the total measurement-system variability and to improve sampling methods if the sampling variability is deemed too high to enable representative results for experimental evaluation. LKAB is an iron ore mining company in the north of Sweden, where experiments are conducted regularly in all parts of the process value chain. In the current state of the world, with more and more threats to our global climate and environment, a focus for LKAB has been to reduce the use of fossil fuel as well as to minimize waste and tailings. One of LKAB's current environmental initiatives is to investigate the feasibility of recovering and processing apatite from tailings of the LKAB beneficiation process. Further processing of the recovered apatite will generate critical raw materials: phosphorus, rare-earth elements, and fluorine. To increase understanding of the process variability of various analytical parameters in a pilot-scale experiment within this project, both duplicate and replication sampling experiments were conducted during one of the pilot-scale campaigns. The sampling experiments were applied at three separate sampling locations where two different sampling methods were used. Results show that both the sampling method and the sampling experimental method can affect the results obtained.
The case study showed that sampling variability was higher at sampling locations where grab sampling was applied, in comparison with composite sampling, which generated lower sampling variability at one of the sampling locations in the pilot plant. This indicates that the composite sampling method can produce more representative results and should be favoured in future process experiments. The results also indicate that the duplicate sampling experiment is more robust to outliers than the replication experiment. The duplicate experiment is also able to quantify process variability and to evaluate sampling variability relative to process variability, which can be an advantage in some cases.
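A minimal sketch of the duplicate-pair estimate of sampling variability is given below. The pairs are invented for illustration (not LKAB data), and the summary statistic, a pooled relative standard deviation from paired duplicates, is one common choice:

```python
import math

# Sketch of a duplicate experiment summary: each pair (a, b) is two samples
# taken to represent the same material. Half the squared pair difference
# estimates the sampling + analysis variance; normalising by the pair mean
# squared and pooling gives a relative standard deviation (RSD).
# The pairs below are illustrative, not real assay data.
def duplicate_rsd(pairs):
    """Pooled relative standard deviation from duplicate sample pairs."""
    rel_var = [((a - b) ** 2 / 2.0) / (((a + b) / 2.0) ** 2)
               for a, b in pairs]
    return math.sqrt(sum(rel_var) / len(rel_var))

pairs = [(4.1, 4.3), (3.9, 4.0), (4.4, 4.1)]   # e.g. % apatite, duplicates
rsd = duplicate_rsd(pairs)   # dimensionless; multiply by 100 for %
```

Comparing this pooled RSD across sampling locations is exactly the kind of evidence that separated the grab-sampled locations from the composite-sampled one in the case study.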
Sampling is necessary every time inferences are to be made in order to take informed, optimal decisions in science, technology, industry, trade and commerce. For reasons extensively addressed over the last two decades, some fields, normally those where good sampling practices are a source of economic gain, such as the mining/minerals/metals industrial sectors, explicate the role of sampling more than others. This is not the case within the realm of food and feed safety assessment, where sampling continues, still today, to be perceived more as an economic burden and a technical necessity to be fulfilled because of regulatory demands than as a need to ensure reliable evidence in support of management and regulatory decisions. This is true today and will become even more central in the future in addressing the challenges posed by the accelerating climate crisis, resource depletion and increasing food demand. Risk assessment and sampling are both probabilistic disciplines, the first devoted to estimating and minimising safety risks, the latter to estimating and mitigating sampling risks (the effects of sampling errors). Here we offer an exposé with the aim of positioning TOS as an essential discipline and practical tool needed to ensure the best possible estimation of risks in support of safety decision-making and risk management in all of food and feed science, technology, industry, trade, commerce, and society at large. We demonstrate that sampling plays an integral, but often much overlooked, role in all these fields.
Quality Assurance and Quality Control (QA/QC) is of critical interest in the mining industry. Over the years, Anglo American Platinum has adopted a sound strategy of Best Practice Principles (BPP) for mass measurement, sampling, sample preparation, analysis, and metal accounting. Often, much effort is focused on implementation and maintenance of quality control systems to provide quality assurance. Within the Anglo American Platinum business units, QA/QC data are deemed of significant value on a day-to-day basis and, at a higher level, also provide a means to prove or disprove evaluation and metal accounting disputes between various sites and/or opposing members of the Joint Evaluation Committee (JEC). Unfortunately, QA/QC data and associated QA/QC systems alone do not always provide the technical or tangible reasons to supplement explanations of anomalies in performance. It is sometimes necessary to go beyond monitoring and focus on interpreting the QA/QC data to comprehend the underlying issues. This paper showcases a multitude of actual case studies pertaining to the troubleshooting of challenges encountered throughout the Platinum processing pipeline (i.e., Concentrator to Smelter to Refinery). These challenges range from mass measurement to sampling, sample preparation and analysis, as well as plant performance. Observations and learnings from these instances indicated that, even where stringent QA/QC was adhered to, complying with the first principles of mass measurement and sampling theory, minimum sample mass and an ongoing understanding of individual material characteristics remained crucial. It was also highlighted that re-assessment of designs, methods and protocols is necessary per material stream, and that a standardization approach across all Anglo American Platinum business units, while perhaps sensible at one point in time, may not always remain appropriate and/or relevant.
Many commercial coal testing laboratories are accredited to ISO 17025, General requirements for the competence of testing and calibration laboratories. The coal mining industry relies on the expectation that a laboratory adheres to all elements of this standard between successive accreditation audits, conducted every 18 months by the independent accreditation body, the National Association of Testing Authorities (NATA). In the absence of proactive QAQC & QM practices monitoring the quality of the information reported by laboratories, potential issues impacting production decisions and reconciliation results are only detected reactively. In addition, for mining companies working with several internal and external laboratories across the supply chain, managing the logistics, practices and information becomes very challenging and time-consuming, impacting the company's ability to track laboratory results as key inputs from a production and reconciliation perspective. The absence of proactive QAQC & QM practices results in a sub-optimal, reactive approach in the coal industry, increasing the risk of short-term, unnoticed production gaps related to quality, the time required for quality-breach investigations, the absence of holistic monitoring of the value chain, and the financial impact of the business performing under sub-optimal conditions. This paper shows the journey towards the implementation of a new proactive QAQC and QM program, whereby the quality of many different laboratories across the supply chain can now be monitored and linked with global reconciliation results, as an improvement opportunity complementing the current industry-standard ISO 17025 accreditation and Proficiency Round Robin approach.
What is this? Sampling washed away; “samplewashing”? Greenwashing is a well-known term nowadays, but is there such a thing as “samplewashing” too? Yes and no: Greenwashing is the process of conveying a false impression or providing misleading information about how a company’s products are more environmentally sound. … Greenwashing is a play on the term “whitewashing”, which means using misleading information to gloss over bad behaviour. And that is what this sampling paper is all about: presenting moisture results on samples of solid bulk materials where the Theory of Sampling was not applied… and therefore sampling errors are magnified by not only glossing over the representativeness of the process, but at the same time watering down the monetary profits of the trade for one party whilst condensing them for the other. It really is “samplewashing” when it comes to moisture determination!
Automated mechanical cross-belt (hammer) samplers remain popular because they are easy to retrofit into brownfield applications, or into greenfield projects where cross-stream samplers were not designed into the plant layout from initiation. Cross-belt samplers require less headroom and are easy to retrofit onto existing conveyors. Despite disputes about possible delimitation and extraction errors resulting from hammer samplers, they are not excluded from use in some ISO sampling standards (ISO 13909 – Coal and Coke; ISO 8685 – Bauxite), though they are excluded from others (ISO 3082 – Iron ore). Possible errors can be mitigated by applying “know-how” to the bespoke design of a hammer sampler for installation on a specific conveyor belt system. This paper discusses the design details of a primary-stage hammer sampler for a bauxite ship-loading sampling plant. The design requirements of a 10 000 t/h ship-loading rate, 100 mm particle top size and an 1800 mm wide conveyor travelling at 5.4 m/s result in a hammer sampler that takes up to a 260 kg increment with each cut. The application requires more torque, delivered at faster response, than 10 Bugatti Chirons combined, and is (to my knowledge) the largest of its kind, requiring a unique high-torque-at-low-rotational-speed drive system where conventional geared motors could not achieve the necessary output performance. Even though the sampler is only the primary stage of a complete operational sampling plant, the emphasis is on the power requirement calculations, the mechanical design, and the components, materials and features of this unit that make it not only mechanically operational but also capable of the high sampling precision levels prescribed by the Theory of Sampling (TOS). Despite a small statistical bias detected for the sampling system (not the hammer sampler alone), the sampling plant performed within the maximum tolerable bias specified for the commercial trade application and is fit for purpose.
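The stated increment mass can be cross-checked against the belt parameters with simple arithmetic. The sketch below is illustrative only; the ~0.5 m effective cut length is our assumption, not a figure from the paper.

```python
# Illustrative check of increment mass from the stated belt parameters
# (10 000 t/h, 5.4 m/s). The 0.5 m effective cut length is assumed.

def belt_loading_kg_per_m(throughput_tph: float, belt_speed_mps: float) -> float:
    """Mass of material per metre of belt at the given throughput."""
    kg_per_s = throughput_tph * 1000 / 3600
    return kg_per_s / belt_speed_mps

loading = belt_loading_kg_per_m(10_000, 5.4)   # ~514 kg per metre of belt
increment = loading * 0.5                      # hypothetical 0.5 m cut
print(f"belt loading: {loading:.0f} kg/m, increment: {increment:.0f} kg")
```

With these assumptions the cut mass lands close to the 260 kg increment quoted in the abstract, which suggests the sampler extracts roughly half a metre of fully loaded belt per cut.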
Development of cost-effective and accurate methods is crucial for operational condition monitoring of wind turbine blade bearings. Based on the Theory of Sampling (TOS), a novel method for acquiring representative samples of lubricating grease from in-service wind turbine blade bearings has been developed over the last 10 years, in some respects similar to “on-line” PAT approaches used for continuous process sampling. The new method is compared with, and evaluated against, comprehensive, fully TOS-compliant reference sampling performed on dismantled bearings, which is a complete analogue to “stopped belt” reference sampling. Three case studies are reported to illustrate the merits of the new wind turbine bearing condition monitoring method, which is needed in the rapidly developing renewable CO2-free energy supply chain. Seen in the context of the currently much-accelerated need for a massive green transition, the market prospects for this wind turbine process sampling approach can hardly be overestimated.
Better sampling, preparation, and analysis (SPA) can improve the precision of results for resource grades and commodity trading. Customers often ask how they might quantify these improvements in terms of economic benefits. One way to do this is by applying the SPA precision of the results to the selling price of the resource. This paper takes a deeper look into the resource pricing methodology for iron ore and how improved precision in the quantification of the critical elements in this product can affect its selling/purchase price. Using a real-world example, the paper presents the results of a basic business case study investigating the return on investment (ROI) for a Sampling Improvement Project (SIP), including a well-designed sampling, sample transport, sample preparation and analysis facility. The investigation includes the estimated total cost of the SIP, from problem statement to implementation, together with an estimate of operational costs. This is then compared to the potential profit gains the SIP could provide. The primary focus is on the structures/methods used to determine commodity prices, how the measured concentrations of the critical elements link to pricing, and how variations between measured and actual concentrations affect the final price. Also considered are the economic benefits of faster, more reliable data collection, as well as improved moisture measurement quality.
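The link between measurement error and settlement value can be sketched with a toy pricing model. All numbers below (base price, Fe premium, shipment size) are invented for illustration and are not from the paper.

```python
# Toy iron ore settlement model: price per dry tonne adjusted by a premium
# per 1% Fe relative to a 62% Fe reference. All parameters are hypothetical.

def shipment_value(fe_measured: float, tonnes: float,
                   base_price: float = 100.0, ref_fe: float = 62.0,
                   premium_per_pct: float = 2.5) -> float:
    """USD value of a shipment settled on the measured Fe grade."""
    return tonnes * (base_price + premium_per_pct * (fe_measured - ref_fe))

tonnes = 170_000  # one hypothetical Capesize cargo
loss = shipment_value(62.0, tonnes) - shipment_value(61.9, tonnes)
print(f"cost of a 0.1% Fe understatement: ${loss:,.0f}")
```

Even a 0.1% Fe measurement shortfall on a single cargo translates into tens of thousands of dollars under these assumptions, which is the kind of sensitivity a SIP business case would weigh against project and operating costs.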
The most basic concept of sampling theory is that “a sample is part of a lot”, where the sample collected needs to be representative of the lot sampled. In sample stations, the lot to be sampled is the material transported by the conveyor belt, while the sample is collected and further sub-sampled via cutters down to the final sample collection point. Current practice for evaluating the operation of sample stations that support processing, metallurgical balance, reconciliation and final port shipments is typically based on visual inspections: material build-up on cutters, sample spillage, reflux while sampling, pegging on sizing screens and worn cutter lips are all observations that indicate issues. Being subjective, these observations do not allow quantification of the sample’s representativeness, nor of the risks to mining businesses from a positive or negative bias incorporated during the sample collection stage. Bias tests are described in several International Standards across commodities (e.g. ISO 3082 for iron ore, ISO 13909-8 for coal and ISO 12743 for copper, lead, zinc and nickel) to compare the sample obtained against the material it is supposed to represent at the control point. The current methodology and strategy used in industry requires interrupting the regular production process multiple times in a row, for extended periods, to manually extract material from the conveyor belt (with the attendant manual handling and safety considerations). For this reason bias tests are not very popular in industry (“we lose a lot of money and time having to interrupt our process many times”) and are therefore usually performed only very reluctantly, or not at all; it is simply assumed that the processes involved are not affected by bias, exposing mining companies to higher production and financial risks than necessary.
This paper presents a proactive approach to bias testing developed at the Hay Point coal port: a Rolling Bias concept that switches the current reactive, time-consuming, manual task to a more proactive and frequent methodology allowing trend analysis of the sample station. Quarterly planned maintenance stops are used to perform the bias test, in which a vacuum system developed and tested by ALS Laboratory and BHP Coal collects the material from the conveyor belt, drastically reducing the time required compared with the manual task and, more importantly, reducing people’s exposure to safety and manual handling risks. This approach gives Hay Point Port quarterly performance data for the sample station, converting the process into a more objective, proactive and sustainable one, with quarterly data monitored since 2019.
Diamond drill-hole grades are known to be of better quality than those of blast holes; but is this true? We present a formal study of a porphyry copper deposit in Chile in which the variogram of 3 m long drill-hole samples is compared with that of 15 m long blast-hole samples, and we show that the blast holes can be assumed to regularize the point information deduced from the drill holes, except for a nugget effect specific to the blast samples. Complementary analyses based on migrated data show that the drill holes also have errors of their own. After a brief description of the first steps in the blast sampling protocol, we show, using extension variance concepts, that the blast error is not due to the arbitrary removal of material from the sampling cone produced by drilling.
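The regularization mentioned here is the standard geostatistical relation γ_L(h) = γ̄(v, v_h) − γ̄(v, v): the variogram of samples averaged over a support of length L equals the mean point variogram between the two supports minus the within-support mean. A minimal numerical sketch, using an invented spherical point variogram (sill 1, range 50 m, no nugget) and the 15 m blast-hole support from the abstract:

```python
import numpy as np

# Sketch of variogram regularization over a support length L.
# The spherical model parameters (sill=1, range=50 m) are assumptions.

def sph(h, sill=1.0, a=50.0):
    """Spherical point variogram."""
    h = np.minimum(np.abs(h), a)
    return sill * (1.5 * h / a - 0.5 * (h / a) ** 3)

def gamma_bar(offset, L=15.0, n=200):
    """Mean point variogram between two length-L supports, centres offset apart."""
    x = np.linspace(0.0, L, n)
    d = np.abs(x[:, None] - (x[None, :] + offset))
    return sph(d).mean()

def gamma_reg(h, L=15.0):
    """Regularized variogram: gamma_bar(v, v_h) - gamma_bar(v, v)."""
    return gamma_bar(h, L) - gamma_bar(0.0, L)

print(gamma_reg(1000.0))  # sill reduced by the within-support variance
```

The regularized sill is lower than the point sill by the within-support variance, which is why blast-hole variograms appear smoother than drill-hole ones; any extra nugget effect observed on top of this, as the paper argues, must come from the blast sampling itself.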
We answer the question “Exactly how did the World Conference on Sampling and Blending originate?” With WCSB10 approaching, “three fellows” decided to do something about this. It turned out to be quite a detective story spanning 20+ years, three continents, many obsolete PC platforms and searches through several thousand old e-mails. The story, as told here by Messieurs Francois-Bongarcon, Vann and Esbensen, is a tour de force of the pre- and very early history of the WCSB institution.
A replication experiment was performed to validate a stream sampling method for a pharmaceutical powder blend. A 1.5 kg powder blend was prepared, and an in-house developed feeder was used to divide it into six sub-samples of approximately 250 g. Each 250 g sub-sample (1/6 of the total blender lot volume) was deposited along a rig of 3 m length. A validated near infrared (NIR) spectroscopic method was used to determine the drug concentration as the powder deposited in the rig moved at a linear velocity of 10 mm/s. The depth of penetration of the NIR radiation was 1.2 mm and the sample volume analysed was approximately 180 mg. The minimum practical error (MPE) obtained with the system was 0.04% w/w acetaminophen (APAP), which was considered excellent for the system. The replicate analysis of the powder depositions provided 390 measurements of drug concentration, with a mean APAP concentration of 14.93% (w/w) and a relative standard deviation (RSD) of 5.20%. Replicate measurements (n = 650) of the powder deposited along a single 3 m rig, scanned 10 times, gave an RSD of 2.23%, attributable to deposition (outflow) heterogeneity. Finally, static replicate analysis of the measurement error alone amounted to an RSD of 0.14%. The embedded replicate experiments elucidated all sources of variation in a sampling system for pharmaceutical powder blends, and proved reliable and highly sensitive in identifying areas of non-acceptable residual heterogeneity (dead zones).
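The reported RSDs can be read as variance components. Assuming (our assumption, not stated in the abstract) that the contributions add in quadrature, the residual blend heterogeneity left after removing deposition and measurement effects can be sketched as:

```python
import math

# Variance-component view of the reported RSDs, assuming independent
# contributions that add in quadrature (the decomposition is our assumption).
rsd_total = 5.20        # replicate analysis along the deposited powder (%)
rsd_deposition = 2.23   # deposition (outflow) heterogeneity (%)
rsd_measurement = 0.14  # static measurement error alone (%)

rsd_blend = math.sqrt(rsd_total**2 - rsd_deposition**2 - rsd_measurement**2)
print(f"residual blend heterogeneity RSD ~ {rsd_blend:.2f} %")
```

Under this decomposition, most of the observed 5.20% variation (about 4.7%) would reflect heterogeneity of the blend itself, with measurement error a negligible contributor.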
Waste printed circuit boards (WPCBs) represent a fast-growing waste stream that must be properly managed to limit its impact on the environment and human health. Due to their metal content, WPCBs can also be considered a resource, and their characterisation is a key point in evaluating different valorisation processes. Conventional methodologies for characterising wastes and/or metal resources are hardly applicable to such waste: it is highly heterogeneous, difficult to micronize, and its individual components (plastics, glass, ceramics and metals) are hard to liberate. Thus, in parallel with developing analytical tools that allow accurate characterisation, a sampling strategy suitable for WPCBs must also be established. In this study, an empirical approach was developed to estimate the uncertainty arising from sampling WPCBs. To do so, the duplicate method of uncertainty estimation was followed, comparing the metal content of different sub-samples and determining confidence intervals.
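The duplicate method rests on a two-stage nested design: each sampling target is sampled in duplicate and each sample analysed in duplicate, so the analytical variance can be estimated from the analytical pairs and subtracted from the between-sample variance. A minimal sketch with invented Cu-grade data (not from the study):

```python
import statistics as st

# Minimal duplicate-method sketch: 3 targets -> 2 samples -> 2 analyses each.
# Data (hypothetical Cu wt% in shredded WPCB sub-samples) are invented.
data = [
    [[18.2, 18.6], [21.1, 20.7]],
    [[15.4, 15.1], [14.2, 14.8]],
    [[19.9, 20.3], [17.5, 17.9]],
]

# Analytical variance from within-sample duplicate pairs: mean(d^2) / 2.
d2_anal = [(a1 - a2) ** 2 for target in data for (a1, a2) in target]
var_anal = sum(d2_anal) / (2 * len(d2_anal))

# Sampling variance from sample-mean pairs, corrected for the analytical
# variance carried by each mean (var_anal / 2 per duplicate-based mean).
d2_samp = [(st.mean(s1) - st.mean(s2)) ** 2 for (s1, s2) in data]
var_samp = sum(d2_samp) / (2 * len(d2_samp)) - var_anal / 2

print(f"s_sampling = {var_samp**0.5:.2f}, s_analysis = {var_anal**0.5:.2f}")
```

For heterogeneous material like WPCBs, s_sampling typically dominates s_analysis, which is exactly what this kind of split makes visible.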
Sampling of bulk mineral commodities for international trade, such as iron ore, coal and a wide variety of mineral concentrates, is generally carried out in accordance with National or more commonly International (ISO) Standards developed to provide representative samples for subsequent analysis and payment. Because commercial transactions are involved, clearly getting the sampling right is critically important, and poor sampling practices can potentially lead to substantial financial losses for one of the parties involved. The “golden rule” for correct sampling is that “all parts of the material being sampled must have an equal probability of being collected and becoming part of the final sample for analysis”. If this rule is not respected, then bias is easily introduced and samples are not representative. While on-site observations indicate that the adoption of good sampling practices is improving, ensuring that samples are representative continues to be an ongoing challenge. This is often due to cost-cutting measures where sampling facilities, equipment and operations are the first to suffer, or it may just simply be due to ignorance of the requirements for collecting representative samples despite the existence of National and International Standards as well as high level sampling courses presented by international experts. More often than not, the company focus is on maximizing production tonnage rather than product quality and its measurement. Areas where significant issues continue to occur include: primary cutter design for ever-increasing high-capacity streams; correct operation of cross-stream secondary cutters; crusher performance and ongoing maintenance, particularly in relation to product particle size; retained sample mass versus particle size; extraction and handling of moisture samples; and equipment maintenance. Timely ongoing maintenance of sample stations is critical and needs to be a high priority to ensure correct performance.
A “set and forget” strategy simply does not work. Sampling needs to be given the commitment it deserves by company management, particularly through correct sample plant design, timely equipment maintenance, and appropriate staff training and awareness.
Silicon is an element of significant importance, with applications including the production of electronic devices, solar panels and metallurgical alloys. Silicon materials producers must keep strict quality control of their raw materials and products. Modern, innovative silicon production lines require high-quality analytical information about chemical composition, for which instrumental methods are commonly used. The correctness of results obtained with these methods must be verified using materials of well-known composition, traceable to SI units and confirmed by a certificate. These materials are called certified reference materials (CRMs). In response to market needs, ELKEM (Norway), the world’s leading silicon producer, together with ŁUKASIEWICZ – Institute of Non-Ferrous Metals (Poland), an experienced CRM producer, started the SILREF project, which consists of the production of 8 CRMs for silicon materials: silicon metal, ferrosilicon and microsilica. Development of new CRMs consists of several stages: production of material with the planned composition; homogenisation via grinding, sieving and mixing; homogeneity and stability testing; characterisation; and certification. Sampling is part of almost every one of these steps, and the choice of the right sampling scheme has a significant impact on the final quality of the produced CRMs. The homogeneity of powder-type CRMs and its estimated uncertainty make an important contribution to the total uncertainty of the reference values. After homogenisation and division of the material into 100 mL jars, 200 to 700 single units were obtained, depending on the material. It is then important to determine the homogeneity both between the units and within them. This was performed in accordance with the rules of ISO Guide 35:2017, “Reference materials — Guidance for characterization and assessment of homogeneity and stability”. Ten random single units from every material were selected.
Then, three samples were taken from different places within each unit. All parameters planned for certification (element concentrations, loss on ignition, etc.) require analysis for homogeneity, stability and quantitative determination. The data obtained were used for a statistical evaluation of homogeneity based on ANOVA. The determined homogeneity was then carried into further calculations of the total uncertainty. The determination of homogeneity, stability and characteristics is accompanied by a large number of analytical tests and statistical calculations, which lead to the final certified composition values with their expanded uncertainties.
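The between-unit homogeneity evaluation described above reduces to a one-way ANOVA, with the between-unit standard deviation s_bb derived from the among- and within-unit mean squares (as in ISO Guide 35). A minimal sketch with invented assay data for the 10-units-by-3-replicates design:

```python
import statistics as st

# One-way ANOVA homogeneity sketch: 10 units, 3 sub-samples per unit.
# Assay values (e.g. % Si) are invented for illustration.
units = [
    [98.1, 98.3, 98.2], [98.0, 98.2, 98.1], [98.4, 98.3, 98.5],
    [98.2, 98.1, 98.2], [98.3, 98.4, 98.2], [98.1, 98.0, 98.2],
    [98.3, 98.2, 98.4], [98.2, 98.3, 98.1], [98.0, 98.1, 98.1],
    [98.4, 98.5, 98.3],
]
n = len(units[0])  # replicates per unit

ms_within = st.mean(st.variance(u) for u in units)         # within-unit MS
ms_among = n * st.variance(st.mean(u) for u in units)      # among-unit MS

# Between-unit standard deviation, clipped at 0 if MS_among < MS_within.
s_bb = max(0.0, (ms_among - ms_within) / n) ** 0.5
print(f"s_bb = {s_bb:.4f}")
```

The resulting s_bb is the between-unit homogeneity contribution that is then propagated into the expanded uncertainty of the certified values.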
During growth and post-harvest storage, fungi can infect grain and produce secondary metabolites known as “mycotoxins”. Some mycotoxins are regulated due to their potentially hazardous health effects; thus, analysis of bulk grain consignments for mycotoxins is common in the grain trade. The heterogeneity of bulk grain with respect to deoxynivalenol (DON) and ochratoxin A (OTA), two regulated mycotoxins, was investigated. Variation of concentrations amongst individual wheat kernels was assessed, along with the variation within sub-samples and test portions produced from 10 kg laboratory samples, and amongst 500 t increments sampled during the loading of bulk shipments (4,600 to 55,000 t). Concentrations in individual kernels ranged from < 0.02 to 583 mg/kg for OTA and < 0.3 to 414 mg/kg for DON. Analysis of the distribution of concentrations was limited by the difference between the sample sets available: one was naturally infected (DON) and the other was inoculated and incubated under laboratory conditions (OTA). Bulk shipments were sampled during loading using a Canadian Grain Commission-approved automated cross-stream diverter-type sampler and in-line divider. Increments were combined, and 10 kg laboratory samples were prepared from the resulting composite using a Boerner divider, comminuted using a rotor beater mill, and sub-sampled using rotary sample division to produce representative sub-samples and test portions. Concentrations of OTA in the 500 t increment samples varied from < 0.25 to 22.9 µg/kg; DON varied from < 0.05 to 0.67 mg/kg. Within shipments, OTA concentrations varied more amongst increments than did DON: the coefficients of variation for OTA ranged from 42 to 95%, 2-4× greater than for DON. The results illustrate the heterogeneity of bulk wheat relevant to international trade and regulated mycotoxins.
The differences observed between DON and OTA also reflect how biological differences in mycotoxin production contribute to the challenges faced in analysing bulk whole grain for mycotoxins.
Value created by mineral beneficiation processes relies on the physical separation (including flotation) of particles containing different grades of metals or minerals of economic interest. The value created is a direct consequence of exploiting heterogeneity as governed by the properties of the natural resource being processed, most often starting with well-controlled size reduction processes, e.g. blasting, crushing and grinding. These are usually applied in a stepwise fashion before the separation stage that exploits the liberated mineral(s) of economic interest. While material heterogeneity is usually a property that must be managed (reduced) by mixing and blending to stabilise sampling and optimise extraction processes, it can also be the foundation for creating value, by first increasing liberated heterogeneity to allow effective sorting to come into play. To calibrate value-adding physical separation processes it is necessary to be in complete control of material heterogeneity, as part of the characterisation of the original natural resource. Optimal physical separation is based on representative sampling according to the Theory of Sampling (TOS). However, this is currently done with labour-intensive processes conducted with the highest precision but, notably, most often using only small sample masses, often leaving accuracy by the wayside. This contribution shows that TOS, together with representative heterogeneity characterisation, is a prime factor in optimising mineral beneficiation processes.