October 2024
·
9 Reads
Journal of Building Engineering
August 2024
·
49 Reads
International Journal of Metrology and Quality Engineering
Type A uncertainty evaluation can significantly benefit from incorporating prior knowledge about the precision of an employed measurement device, which allows for reliable uncertainty assessments with limited observations. The Bayesian framework, employing Bayes' theorem and Markov Chain Monte Carlo (MCMC), is recommended to incorporate such prior knowledge in a statistically rigorous way. While MCMC is recommended, metrologists are usually more familiar with plain Monte Carlo sampling, and previous work demonstrated the integration of similar prior knowledge into an uncertainty evaluation framework following the plain Monte Carlo sampling of JCGM 101, Supplement 1 to the GUM. In this work, we explore the potential and limitations of such an approach, presenting classes of data distributions for informative Type A uncertainty evaluations. Our work justifies an informative extension of the JCGM 101 Type A uncertainty evaluation from a statistical perspective, providing theoretical insight and practical guidance. Explicit distributions are proposed for input quantities in Type A scenarios, aligning with Bayesian uncertainty evaluations. In addition, inherent limitations of the JCGM 101 Monte Carlo approach are discussed concerning general Bayesian inference. Metrological examples support the theoretical findings, significantly expanding the applicability of the JCGM 101 Monte Carlo technique from a Bayesian perspective.
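As an illustration of how prior knowledge about device precision can enter a plain Monte Carlo (JCGM 101-style) Type A evaluation, the sketch below assigns a scaled-and-shifted t-distribution to the quantity, combining a handful of observations with an inverse-gamma prior on the device variance. The prior parameters (alpha0, beta0), the data, and the conjugate construction are illustrative assumptions, not necessarily the distribution classes proposed in the paper.

```python
# Sketch: informative Type A evaluation by plain Monte Carlo sampling (JCGM 101 style).
# The inverse-gamma prior on the variance and the resulting scaled-and-shifted
# t-distribution are a standard conjugate construction, used here for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Few repeated indications of the measurand (limited observations).
x = np.array([10.03, 10.01, 10.05, 10.02])
n, xbar, s2 = len(x), x.mean(), x.var(ddof=1)

# Prior knowledge about the device precision, encoded as an inverse-gamma prior
# on the variance sigma^2 (alpha0 and beta0 are hypothetical values).
alpha0, beta0 = 4.0, 4.0 * 0.02**2

# With a flat prior on the mean, the marginal posterior of the mean is a
# scaled-and-shifted t-distribution that plain Monte Carlo can sample directly.
nu = 2.0 * alpha0 + n - 1.0                        # effective degrees of freedom
scale = np.sqrt((2.0 * beta0 + (n - 1) * s2) / (n * nu))

M = 10**6                                          # number of Monte Carlo draws
samples = stats.t.rvs(df=nu, loc=xbar, scale=scale, size=M, random_state=rng)

print(f"estimate        : {samples.mean():.5f}")
print(f"std. uncertainty: {samples.std(ddof=1):.5f}")
print(f"95% interval    : {np.percentile(samples, [2.5, 97.5])}")
```

With more observations the prior contribution fades and the distribution approaches the non-informative JCGM 101 Type A assignment.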
January 2024
·
8 Reads
October 2023
·
42 Reads
·
1 Citation
August 2023
·
151 Reads
·
1 Citation
Thermal management is a key issue for the downsizing of electronic components in order to optimise their performance. These devices incorporate more and more nanostructured materials, such as thin films or nanowires, requiring measurement techniques suited to characterising thermal properties at the nanoscale, such as Scanning Thermal Microscopy (SThM). In active mode, a hot thermoresistive probe scans the sample surface, and its electrical resistance R changes as a function of the heat transfer between the probe and the sample. This paper presents the measurement and calibration protocols developed to perform quantitative and traceable measurements of thermal conductivity k using the SThM technique, provided that the heat transfer conditions between calibration and measurement are identical, i.e., a diffusive thermal regime for this study. Calibration samples with a known k measured at the macroscale are used to establish the calibration curve linking the variation of R to k. A complete assessment of uncertainty (influencing factors and computational techniques) is detailed for both the calibration parameters and the estimated k value. Outcome analysis shows that quantitative measurements of thermal conductivity with SThM (with an uncertainty of 10%) are limited to materials with low thermal conductivity (k < 10 W·m⁻¹·K⁻¹).
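A minimal sketch of the kind of Monte Carlo uncertainty evaluation described above: the calibration data are perturbed within their stated uncertainties, an assumed empirical calibration curve is refitted each time, and the curve is inverted for the unknown sample. The straight-line relation between probe signal and log10(k), and all numerical values, are hypothetical stand-ins for the paper's actual calibration model and reference samples.

```python
# Sketch: Monte Carlo uncertainty evaluation for a SThM-like calibration curve.
# Model form and numbers are illustrative assumptions, not the paper's data.
import numpy as np

rng = np.random.default_rng(0)

# Reference samples: macroscale thermal conductivities k (W/(m K)) with standard
# uncertainties, and the corresponding probe signal (e.g. relative change of
# electrical resistance) with its standard uncertainty.
k_ref   = np.array([0.20, 1.0, 1.4, 5.0, 10.0])
u_k_ref = 0.03 * k_ref
y_ref   = np.array([0.95, 0.62, 0.55, 0.30, 0.17])
u_y_ref = np.full_like(y_ref, 0.01)

# Signal measured on the unknown sample.
y_meas, u_y_meas = 0.40, 0.01

M = 20_000
k_est = np.empty(M)
for i in range(M):
    # Perturb calibration data and the measured signal within their uncertainties.
    k_i = rng.normal(k_ref, u_k_ref)
    y_i = rng.normal(y_ref, u_y_ref)
    # Fit the assumed calibration curve y = a + b*log10(k).
    b, a = np.polyfit(np.log10(k_i), y_i, 1)
    # Invert the curve for the unknown sample.
    k_est[i] = 10.0 ** ((rng.normal(y_meas, u_y_meas) - a) / b)

print(f"k = {k_est.mean():.3f} W/(m K), u(k) = {k_est.std(ddof=1):.3f} W/(m K)")
```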
March 2023
·
21 Reads
·
2 Citations
To improve the confidence and quality of measurements produced by regional and international infrasound monitoring networks, this work investigates a methodology for propagating uncertainty associated with on-site measurement systems. We focus on the propagation of sensor calibration uncertainties. The proposed approach is applied to synthetic infrasound signals with known back azimuth and trace velocity, recorded at the array elements. Relevant input uncertainties are investigated for propagation targeting the incoming signals (noise), instrumentation (microbarometers, calibration system, wind noise reduction system), and the time-delay-of-arrival (TDOA) model (frequency band). Uncertainty propagation is performed using the Monte Carlo method to obtain the corresponding uncertainties of the relevant output quantities of interest, namely back azimuth and trace velocity. The results indicate that, at high frequencies, large sensor uncertainties are acceptable. However, at low frequencies (<0.1 Hz), even a 2° sensor phase uncertainty can lead to errors in the back azimuth of up to 5° and errors in the trace velocity of 20 m/s.
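The following sketch illustrates the Monte Carlo propagation step for a plane-wave TDOA model: a sensor phase uncertainty at a given analysis frequency is converted into a timing uncertainty, the time delays are perturbed accordingly, and back azimuth and trace velocity are re-estimated by least squares. The array geometry, frequency, and uncertainty values are assumptions for illustration only.

```python
# Sketch: Monte Carlo propagation of sensor phase uncertainty to back azimuth and
# trace velocity via a plane-wave TDOA model. Geometry and values are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Planar array geometry (x east, y north, in metres) and the true wavefront.
xy = np.array([[0.0, 0.0], [1000.0, 200.0], [-300.0, 900.0], [400.0, -800.0]])
baz_true, v_true = 60.0, 340.0     # back azimuth (deg from north), trace velocity (m/s)
f = 0.08                           # analysis frequency (Hz), below 0.1 Hz

# Slowness vector pointing in the propagation direction (towards the array).
az = np.deg2rad(baz_true)
p_true = -np.array([np.sin(az), np.cos(az)]) / v_true
tdoa_true = xy @ p_true            # arrival times relative to the array origin

u_phase_deg = 2.0                  # standard phase uncertainty of each sensor
u_t = u_phase_deg / (360.0 * f)    # corresponding timing uncertainty (s)

M = 20_000
baz = np.empty(M)
vtr = np.empty(M)
for i in range(M):
    tdoa = tdoa_true + rng.normal(0.0, u_t, size=len(xy))
    # Least-squares plane-wave fit: solve xy @ p = tdoa for the slowness vector p.
    p, *_ = np.linalg.lstsq(xy, tdoa, rcond=None)
    vtr[i] = 1.0 / np.linalg.norm(p)
    baz[i] = np.rad2deg(np.arctan2(-p[0], -p[1])) % 360.0

print(f"u(back azimuth)   = {baz.std(ddof=1):.2f} deg")
print(f"u(trace velocity) = {vtr.std(ddof=1):.1f} m/s")
```

Repeating the same run at a higher analysis frequency shrinks u_t and hence the output uncertainties, which is the frequency dependence the abstract describes.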
May 2022
·
85 Reads
·
1 Citation
Journal of Radioanalytical and Nuclear Chemistry
INSIDER (Improved Nuclear SIte characterization for waste minimization in D&D operations under constrained EnviRonment) was a European project funded under the H2020-EURATOM programme and launched in June 2017. The project was coordinated by the French Commissariat à l’énergie atomique et aux énergies alternatives (CEA); it had a total duration of 4 years and a budget of 4 M€. INSIDER’s work was carried out by 5 technical working groups (WG), which brought together 18 institutions from 10 countries and a total of 68 participating researchers. The objective of the project was to optimise the radiological characterisation of nuclear installations in constrained environments, in order to obtain an accurate estimate of the content of contaminated materials as well as to optimise the quantity of contaminated materials to be treated as waste. The focus of this paper is on the statistical analysis of an interteam comparison of measurement results (dose rate, total gamma measurement, and gamma spectrometry) made in situ at the BR3 reactor in Belgium.
May 2022
·
26 Reads
Talanta
In order to further improve the management of contaminated materials in nuclear facilities subject to a decommissioning programme, as well as during post-accidental site remediation and clearance, it is necessary to define and select the most appropriate intervention scenarios, producing well-characterized radioactive waste for which storage and disposal routes are clearly identified. As a step towards this goal, we propose a methodology for the organization and analysis of coordinated interlaboratory comparisons (ILC) for the performance assessment and the uncertainty evaluation of available measurement techniques (methods and tools) for radioactive materials. This methodology is new for this type of comparison and is demonstrated on the BR3 (Belgian Reactor 3, Belgian Nuclear Research Centre, Mol) case study from the H2020 INSIDER project (2017–2021), in which barium-133, cobalt-60 and europium-152 are analysed by gamma spectroscopy in ILC, based either on irradiated concrete from the BR3 bioshield or on spiked concrete certified reference material (CRM). On the one hand, we show the advantage of organizing ILC on CRM for a more reliable uncertainty evaluation that takes bias into account following ISO 21748:2017. However, CRMs may be unavailable because of their scarcity, or too costly for routine performance assessment, which limits their use in ILC in practice. On the other hand, we show that for performance evaluation and monitoring, ILC can alternatively be performed on reference materials, provided that the laboratories’ uncertainties are reported and the most appropriate analysis of the data is performed, using dark uncertainty (excess variance) in the presence of inconsistent data.
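As a sketch of the dark-uncertainty treatment mentioned above, the example below computes a consensus value for inconsistent interlaboratory data using the DerSimonian-Laird excess-variance estimator; the reported activities and uncertainties are invented for illustration and are not the BR3/INSIDER results.

```python
# Sketch: consensus value for an interlaboratory comparison with "dark uncertainty"
# (excess variance), via the DerSimonian-Laird estimator. Values are illustrative.
import numpy as np

# Reported activities (e.g. Bq/g) and standard uncertainties, one per laboratory.
x = np.array([12.1, 11.6, 12.9, 12.4, 13.5])
u = np.array([0.30, 0.25, 0.40, 0.35, 0.30])

n = len(x)
w = 1.0 / u**2
xw = np.sum(w * x) / np.sum(w)
Q = np.sum(w * (x - xw) ** 2)          # Cochran's heterogeneity statistic

# Excess (dark) variance tau^2: inconsistency beyond the reported uncertainties.
tau2 = max(0.0, (Q - (n - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Re-weight each laboratory with its reported variance plus the excess variance.
w_star = 1.0 / (u**2 + tau2)
consensus = np.sum(w_star * x) / np.sum(w_star)
u_consensus = np.sqrt(1.0 / np.sum(w_star))

print(f"dark uncertainty tau = {np.sqrt(tau2):.3f}")
print(f"consensus = {consensus:.3f} +/- {u_consensus:.3f}")
```

When the data are mutually consistent, Q stays near n - 1, tau^2 collapses to zero, and the result reduces to the ordinary weighted mean.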
July 2021
·
605 Reads
·
3 Citations
In this document, the examples illustrate various aspects of uncertainty evaluation and the use of uncertainty statements in conformity assessment. These aspects include, but are not limited to: choice of the mechanism for propagating measurement uncertainty, reporting measurement results and measurement uncertainty, conformity assessment, and evaluating covariances between input quantities.
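A minimal sketch touching two of these aspects: Monte Carlo propagation with a covariance between input quantities, followed by a simple conformance probability against a tolerance limit. The measurement model, numerical values, and limit are assumptions, not one of the document's worked examples.

```python
# Sketch: Monte Carlo propagation with correlated input quantities and a basic
# conformity assessment. Model and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Two correlated input quantities with a joint normal state-of-knowledge distribution.
mean = np.array([2.0, 3.0])
cov = np.array([[0.040, 0.018],
                [0.018, 0.090]])       # off-diagonal term is the covariance u(x1, x2)

M = 10**6
x1, x2 = rng.multivariate_normal(mean, cov, size=M).T
y = x1 * x2                            # hypothetical measurement model Y = X1 * X2

T_upper = 6.8                          # hypothetical upper tolerance limit
print(f"y = {y.mean():.3f}, u(y) = {y.std(ddof=1):.3f}")
print(f"conformance probability P(Y <= T) = {np.mean(y <= T_upper):.3f}")
```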
July 2021
·
151 Reads
This set of examples addresses measurement in healthcare. The examples show improved and alternative treatments of the evaluation of measurement uncertainty, building on current practice in these areas. A diversity of topics is addressed, including uncertainty arising in image reconstruction, the determination of nanoparticle size distribution in waste water, the quantification of small volumes and flows for accurate dose delivery to patients, and the determination of haemoglobin concentration in blood.
... Because of the large amount of data and the high rate of data acquisition in the process of dynamic calibration, the 24-bit high-precision ADS1271 chip is selected to perform the signal acquisition. Because of the high rate at which the ADS1271 collects data, the data cannot be transmitted to the computer in real time through an ordinary serial port, so an SDRAM data memory based on the HY57V641620 chip is designed [16][17][18][19][20]. Figure 8. Block diagram of the overall structure of the control system. ...
March 2023
... This sets out a specific set of validation procedures, together with criteria for acceptable performance for a wide range of analytes. A recent example of such a validation study, for an in-house modification of a standard method [6], illustrated some of the problems of achieving reliable results across different soil matrices, even with very precise methods (Figure 1); clearly, some matrices can present individual challenges (LGC6145 for nickel in Figure 1). These pose practical difficulties for achieving performance and for reporting results and uncertainty. ...
July 2021
... To avoid the ill-posed nature of the problem, different strategies can be considered. One can use as much a priori information as possible, such as the wall thickness, sensitivity analysis methods to limit the model updating to the most significant model parameters [38][39][40][41], and regularization techniques such as Tikhonov regularization [36][42][43][44] or a Bayesian framework [7,27,36,45,46]. According to the numerical study in [36], only the temperature on the internal surface of the studied wall (noted T_SI) is considered in the inversion process. ...
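For readers unfamiliar with the regularization strategy named in the excerpt above, the sketch below applies Tikhonov regularization to a generic ill-posed linear inverse problem. The forward operator, data, and regularization parameters are synthetic stand-ins, not the wall thermal model of the cited study.

```python
# Sketch: Tikhonov regularization of an ill-posed linear inverse problem.
# The smoothing-kernel forward model and the true profile are synthetic.
import numpy as np

rng = np.random.default_rng(0)

n = 50
s = np.linspace(0.0, 1.0, n)
# Severely ill-conditioned forward operator (Gaussian smoothing kernel).
A = np.exp(-((s[:, None] - s[None, :]) ** 2) / (2.0 * 0.1**2))
x_true = np.sin(2.0 * np.pi * s)                   # parameters to recover
y = A @ x_true + rng.normal(0.0, 1e-3, n)          # noisy observations

# Minimise ||A x - y||^2 + lam * ||x||^2  ->  (A^T A + lam I) x = A^T y
for lam in [1e-6, 1e-3, 1e-1]:
    x_reg = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)
    err = np.linalg.norm(x_reg - x_true) / np.linalg.norm(x_true)
    print(f"lambda = {lam:.0e}: relative reconstruction error = {err:.3f}")
```

The loop over lambda values exposes the usual trade-off: too little regularization amplifies the measurement noise, too much over-smooths the solution.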
June 2021
Energy and Buildings
... Kuya et al. [2011] and Xiong et al. [2013] adapt this to perform a sequential design on the lowest level, and then take a subset from this for the design to be run at higher levels. Stroh et al. [2022] aim to select new points based on maximising the ratio between the expected reduction of uncertainty and the cost of running the computer code. Our criterion optimises for exploration and exploitation simultaneously, as well as offering a non-nested design approach. ...
May 2021
Technometrics
... The Monte Carlo statistical method uses random sampling to obtain numerical results. It is widely used in various fields, such as physics and engineering [34][35][36][37]. The process is based on the law of large numbers, which states that the average of the results obtained from many trials is close to the expected value. ...
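A minimal sketch of the law-of-large-numbers behaviour described in the excerpt above, assuming an arbitrary nonlinear function of a normally distributed input: as the number of Monte Carlo trials grows, the sample average stabilises around the expected value.

```python
# Sketch: convergence of a plain Monte Carlo estimate with increasing sample size.
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    # Arbitrary quantity whose expectation we estimate (illustrative only).
    return np.sin(x) ** 2

for n in [10**2, 10**4, 10**6]:
    samples = model(rng.normal(0.0, 1.0, n))
    print(f"n = {n:>7}: estimate = {samples.mean():.5f} "
          f"(std. error ~ {samples.std(ddof=1) / np.sqrt(n):.5f})")
```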
May 2021
Measurement
... On the other hand, the assumption is not severely restrictive, since models for the measurand are usually of a simple form, or already given by an inverse problem involving the observation model function. A general class of partially invertible models for the measurand and their Bayesian treatment is presented in [23]. ...
December 2020
... For the numerical experiments of this article (Sections 4.3 and 4.4) we will take a simpler route, assuming that the variance λ depends only on the fidelity level δ, which is approximately true in the two examples we shall consider. In this setting, as long as the number of fidelity levels of interest is not too large, the value of the variance at these levels can simply be estimated jointly with the other hyper-parameters of the model; a general-purpose log-normal prior for the vector of variances is proposed by Stroh et al. (2016, 2017b). ...
December 2018
... The exact definition of the value of information depends on one's goals. For example, one could be interested in optimizing an objective [40,31,17,30,39,16,44,4,28,36,42,34,10], learning an accurate representation of the physical response [41,33,59,5,20,61] or estimating the probability of a rare event [45,46]. ...
July 2017
... These data are often used to validate, verify, and calibrate models. Recent advances in data-driven algorithms, such as data assimilation [12,13,14,15], statistical representations of spread and heat flux [16,17,18], and Bayesian methods [19,20], have improved model-data integration. However, with the artificial intelligence revolution, there are new and relevant techniques to combine models and data. ...
April 2017
Fire Safety Journal
... CSEs formulated with qualitative explanatory variables can also be performed, provided a Rasch transformation is applied [22]. Earlier, we tested how a CSE could be obtained with qualitative explanatory variables for a measure of patient experiences of participating in care and rehabilitation [23], which in part corresponds to the method presented by Adroher and Tennant [15]. ...
May 2015