Yield stress is not enough: Recent innovations in
micromechanics for nonlinear analysis
Dr. L. Margetts, Mr. S. Hewitt CEng
(University of Manchester, UK);
Dr. A. Shterenlikht
(University of Bristol, UK);
Dr. L.M. Evans
(University of Swansea, UK);
Dr. J.D. Arregui-Mena
(Oak Ridge National Laboratory, USA);
Dr. F. Levrero
(University of Oxford, UK);
Prof. Dr. P. Pankaj
(University of Edinburgh, UK)
Abstract
Material nonlinearity is implemented in finite element programs using
mathematical models of material behaviour. These are typically defined by a
set of parameters whose values are determined from experimental data. These
constitutive models are phenomenological; that is, they describe nonlinear
material behaviour but do not explicitly model the physical mechanisms that
lead to plasticity. In this paper, the authors discuss how three different
modelling strategies may improve realism: (i) Stochastic Monte Carlo
Simulation, (ii) Image-based Modelling and (iii) Cellular Automata coupled
with Finite Elements. Each of these techniques is linked to the underlying
micromechanics in a different way. Case studies are presented for a range of
materials (namely graphite, bone and polycrystalline iron) to give an overview
of how each of these techniques might be used. This paper will be of interest to
engineers who might not have previously considered (a) stochastic modelling
being relevant to their work or (b) using X-ray tomography data and image-
based modelling as an alternative method for calibrating a constitutive model.
Furthermore, the three techniques could be used together by firms wanting to
custom design novel materials for extreme engineering applications.
Keywords: Stochastic, Cellular Automata, Tomography, Finite Element
1. Introduction
Models of material behaviour for nonlinear analysis are nowadays universally
implemented in finite element software. The simplest models for material
nonlinearity (plasticity) are the von Mises, Mohr-Coulomb and Tresca models.
There are models for every common type of engineering material. What they
have in common is that they are calibrated using experimental data. A tensile
test may be used to derive a load-displacement curve that has a linear elastic
region, yield and then material hardening and/or softening behaviour. Some
authors describe this as a phenomenological approach. There is no information
in this type of model about the actual physical mechanisms that lead to
plasticity, such as the presence of grain-scale defects in metals or porosity in
composites.
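To make the point concrete, consider the von Mises model. The whole of the microstructure is collapsed into a single calibrated parameter, the yield stress $\sigma_y$, with yielding predicted when

f(\boldsymbol{\sigma}) = \sqrt{3 J_2} - \sigma_y = 0, \qquad J_2 = \tfrac{1}{2}\,\mathbf{s} : \mathbf{s}, \qquad \mathbf{s} = \boldsymbol{\sigma} - \tfrac{1}{3}\,\mathrm{tr}(\boldsymbol{\sigma})\,\mathbf{I}

where $\mathbf{s}$ is the deviatoric stress tensor. Nothing in this expression refers to grains, voids or defects; that information enters only indirectly, through the experimentally fitted value of $\sigma_y$.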
The authors are investigating three different techniques that take into account
the mechanisms that lead to plasticity. The first uses random fields to impose
tiny spatial fluctuations in the properties of a material throughout the
computational domain. The application area is nuclear graphite. Detailed
mechanical testing shows that whilst the manufacturing process is designed to
produce a very uniform isotropic material, there are localised variations in
density and stiffness. Microscopy and X-ray imaging confirm that this is due
to the nanoscale texture of the material and porosity. On the basis of the test
data, it is possible to calibrate random fields and give each element in a mesh
its own localised property. The resulting stochastic analysis incorporates
variations in microstructure.
The second technique involves image-based modelling and homogenisation.
The case study is osteoporotic bone and the motivation is to develop patient-specific
material models. A scan could be taken of a patient before surgery.
This would be used to develop a stress-strain law for their own diseased bone,
so that finite element analysis could be carried out to customise the implant and
surgical procedure. The process involves numerical homogenisation, using an
image-based model of the bone microstructure. The output is a yield surface
that is then used in a macroscale (continuum) finite element model. Currently,
this requires supercomputing due to the large element counts and the use of a
large strain nonlinear material model for the microstructural analysis.
The final technique couples continuum finite elements at the macroscale with
cellular automata at the mesoscale. The cellular automata are used to simulate
the development of micro-cracks in a metal, which is modelled as a
polycrystalline solid. Many thousands of cellular automata cells represent each
grain in the metal. The simulation starts with a finite element analysis. The
stress tensor at each integration point is mapped onto a region of the cellular
automata. The cellular automata analysis determines whether micro-cracks
form in the material. Currently, this mechanism-based solution is then
simplified into a damage parameter that is returned to the finite element
analysis. The element stiffness matrix is updated and the simulation proceeds
to the next increment.
Whilst these strategies are currently computationally intensive, computers are
becoming more powerful. Smart phones have the same performance today as
the world's fastest supercomputer had 20 years ago. Over that time, performance
has increased by a factor of 1 million. Furthermore, computational cost is no
longer the barrier it once was. Cloud Computing allows engineers to access a
supercomputer for a very modest cost to carry out more realistic simulations.
2. Random Fields
This section concerns the random assignment of material properties to
individual elements in a finite element mesh; an approach that probably
originated to deal with uncertainty in geotechnical engineering (Fenton and
Griffiths, 2008). Values for soil parameters will vary considerably due to
factors such as natural variation (depositional environment), periods of
overburden (burial and consolidation) and later erosion (relief of ground
pressure). The information available to the engineer is sparse, collected from
borehole logs, in situ tests or laboratory work. Arguably, because of the size of
the domain (hundreds of metres to kilometres) it is not practical to collect a sufficient number
of samples to determine a reliable mean value. Furthermore, in a finite element
model of an excavation or embankment, it is intuitive for the analyst to feel
uncomfortable assigning all the elements the same mean property value, as it is
more physically realistic to assume that the properties vary with depth and
lateral extent.
Geotechnical engineers have participated in the development of a technique
called the random finite element method. The sparse knowledge of the
variability of the soil properties is converted into statistics that are used to
calibrate a random field. The input data to the random field generator includes
the mean value, the standard deviation and a parameter that describes how the
value varies in space. A large number of analyses are then carried out, each one
with its own unique, randomly generated distribution of material properties.
This is referred to as a stochastic Monte Carlo simulation (Arregui-Mena et al.,
2016a). Instead of a single answer, there is a range of solutions, and the
results are typically interpreted in terms of a
probability of failure.
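To illustrate the workflow, the sketch below generates spatially correlated element properties and wraps them in a Monte Carlo loop. It is a minimal sketch only, assuming a one-dimensional strip of elements and an exponential covariance model; all numerical values are illustrative placeholders rather than data from the studies cited here.

```python
# Minimal sketch of the random finite element method workflow: a correlated
# random field of element properties, sampled repeatedly in a Monte Carlo
# loop. Mesh, statistics and correlation length are placeholders.
import numpy as np

rng = np.random.default_rng(seed=0)

n = 200                                  # number of elements
x = np.linspace(0.0, 1.0, n)             # element centroid coordinates (m)

mean_E = 10.0e9                          # mean Young's modulus (Pa)
std_E = 0.5e9                            # standard deviation (Pa)
theta = 0.05                             # scale of fluctuation (m)

# Exponential covariance between element centroids
dist = np.abs(x[:, None] - x[None, :])
cov = std_E**2 * np.exp(-2.0 * dist / theta)

# Factorise once; each realisation is then a cheap matrix-vector product
L = np.linalg.cholesky(cov + 1e-8 * np.eye(n))

def realisation():
    """One random field: spatially correlated property values per element."""
    return mean_E + L @ rng.standard_normal(n)

# Stochastic Monte Carlo simulation: the analysis is repeated many times,
# each run with its own randomly generated distribution of properties, and
# the suite of results is interpreted as a probability of failure.
for i in range(1000):
    E_per_element = realisation()
    # ... assign E_per_element to the mesh, run the finite element
    #     analysis and record whether this realisation fails ...
```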
The stochastic Monte Carlo solution technique is very rarely used in
engineering problems where the values for the material parameters are
considered to be well known. In this paper, the authors present a new way of
looking at how the random finite element method might be used. The example
used is nuclear graphite, as shown in Figure 1. This safety-critical material has
been carefully designed to be isotropic and undergoes strict protocols in its
manufacture. The mean values for its physical properties are therefore
considered to be consistent through a large volume of manufactured material.
In a finite element model, an analyst would assign all elements in the graphite
part the same mean value. The consequence is that, in practice, when finite
element analysis is used to predict cracking, the predicted location and path of
the cracks often differ from what is observed in a real reactor.
Figure 1: RFEM steps applied to AGR graphite brick (Arregui-Mena et al., 2015)
Nuclear graphite is a composite material with a matrix and filler particles. It is
known that mesoscale features, such as small random variations in the size,
shape, spacing and orientation of filler particles or the presence of tiny voids
can affect the location of micro-fractures and influence the crack path. By
using a mean value for the material property in all elements of a finite element
model, these microstructural features are averaged out of existence. Could the
random finite element method be used as an approximate method of accounting
for the macroscale effect of micromechanics?
Arregui-Mena et al. (2015) used software originally developed for geotechnical
engineering (Fenton and Griffiths, 2008; Smith et al., 2014) to investigate
conceptually whether tiny spatial variations in material properties could lead to
pre-service stress concentrations in virgin nuclear graphite. Figure 2 compares
analyses whereby a change in temperature leads to thermal expansion and
thermally induced stresses. All plots use the same contour intervals. Figures 2b,
2c and 2d show von Mises stress values for three temperature profiles when a
mean value for the material properties is used. Figures 2f, 2g and 2h show
stress values for the same temperature profiles, using randomly assigned
properties. The results highlight that stress concentrations could occur in
unexpected locations, such as the (macroscopically) featureless bore, as well
as at sharp corners (where they are usually expected).
Figure 2: Comparison of deterministic and stochastic analyses (Arregui-Mena et al., 2015)
For this initial proof-of-concept study, the spatial variation in the material
property values was estimated. As the results justified further experimental
work, a billet of nuclear graphite was obtained and cut into around 500 cubes,
and the bulk material properties of each cube were determined (Arregui-Mena
et al., 2016b). The experimental data showed that the material properties did
indeed have a tiny spatial variation in values. The data have since been
processed to calibrate a random field (Arregui-Mena et al., 2018) and the
analyses are being rerun.
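As a rough indication of how such measurements can be turned into random field statistics, the sketch below computes an empirical semivariogram, whose range estimates the scale of fluctuation; this is in the spirit of the variography described by Arregui-Mena et al. (2018). The sample positions and values are synthetic placeholders, not the Gilsocarbon data.

```python
# Hedged sketch: empirical semivariogram from spatially sampled property
# data. The lag at which the semivariance flattens to its sill indicates
# the correlation length used to calibrate a random field.
import numpy as np

rng = np.random.default_rng(5)
pos = rng.random((500, 3)) * 0.2         # cube centre coordinates (m), placeholder
val = rng.normal(11.0, 0.4, 500)         # measured property, e.g. E in GPa, placeholder

# Semivariance gamma(h): mean of 0.5*(v_i - v_j)^2 over pairs at lag ~h
d = np.linalg.norm(pos[:, None] - pos[None], axis=-1)
sq = 0.5 * (val[:, None] - val[None]) ** 2
bins = np.linspace(0.01, 0.2, 10)
gamma = [sq[(d > a) & (d <= b)].mean() for a, b in zip(bins[:-1], bins[1:])]
```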
This technique could be applied to other areas where it is traditionally assumed
that the material properties of engineering materials do not vary, such as metals
and composites. Typically, experimental tests used to determine material
properties provide a mean and a standard deviation. The standard deviation is
often dismissed as representing experimental error, but here it is demonstrated
that it may be reasonable to assume that it, in fact, quantifies, at the
macroscale, the natural variations in material properties that occur at the
microscale. Rather than ignore the variability in the experimental data, random
finite element analysis could be considered a cheap way of including the
effects of micromechanics in a simulation. The variation would influence the
progression of nonlinear effects such as plasticity, damage and crack paths.
Stochastic simulations would need to be carried out, but the benefits would be
improved predictions of the reliability of a component or structure.
3. Image-based Modelling
Recent years have seen a marked growth in the use of image-based modelling,
as highlighted in the NAFEMS Benchmark magazine. In a nutshell, this
involves converting an image of a real material, part or structure into a
computer model. Any imaging modality can be used, including MRI, X-ray
tomography, LIDAR (for surfaces) or even sketches. Furthermore, there are
imaging devices that can be used at each length scale, from nanometres to
kilometres. The technique is typically used for reverse engineering, for
carrying out analyses on naturally occurring materials where it is intractable
to create the geometry using computer-aided design tools, or for studying the
effect of defects such as porosity in manufactured products (Evans et al., 2015).
In the context of this paper, it can also be used to simulate micromechanics,
calibrating constitutive models for materials with complex architectures
without the need to carry out real physical testing in a laboratory (Levrero-
Florencio et al., 2016a, 2016b, 2017). An example presented herein is the use of
image-based modelling for the development of material models of bone. Figure
3 shows voxel-based finite element models of porous and dense bone. The
porous bone could represent diseased or osteoporotic bone, and the dense bone
healthy bone. A representative elementary volume of each type of bone is
subjected to a series of virtual mechanical tests using the finite element
method, loading until the volume is assumed to undergo yielding, as
determined by plotting the load-displacement response.
Figure 3: FE meshes of (a) porous and (b) dense bone (Levrero-Florencio et al., 2016)
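The meshing step itself is conceptually simple: every solid voxel in the segmented image becomes one eight-node hexahedral element. The sketch below shows one way this conversion might be coded, assuming the tomography data has already been thresholded into a boolean array; the volume and voxel size are illustrative placeholders.

```python
# Minimal sketch of voxel-based mesh generation from a segmented 3D image.
# Assumes a boolean array (True = solid); values are illustrative.
import numpy as np

vol = np.random.default_rng(1).random((64, 64, 64)) < 0.3  # placeholder volume
voxel_size = 0.02                                          # mm per voxel, placeholder

# Each solid voxel becomes one 8-node hexahedral element
solid = np.argwhere(vol)                       # (n_elem, 3) voxel indices
corner_offsets = np.array([[i, j, k] for k in (0, 1)
                           for j in (0, 1) for i in (0, 1)])

# Build a unique node numbering on the lattice of voxel corners
corners = (solid[:, None, :] + corner_offsets[None, :, :]).reshape(-1, 3)
nodes, connectivity = np.unique(corners, axis=0, return_inverse=True)
connectivity = connectivity.reshape(-1, 8)     # element -> 8 node ids
coords = nodes * voxel_size                    # node coordinates in mm

print(f"{len(connectivity)} hexahedral elements, {len(coords)} nodes")
```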
The results of these tests provide data points on a yield surface as shown in
Figure 4. The method is essentially an extension of homogenisation, whereby
the bulk effective Young’s modulus of a material with complex architecture is
determined from a set of elastic micromechanical analyses (Rawson et al.,
2015). Here, the material of the bone microstructure is given properties that are
nonlinear, both in terms of plasticity and geometric nonlinearity; the latter to
accurately simulate large deformations of the microstructure. Once a series of
data points are obtained, suitable yield surfaces can be proposed and calibrated.
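As an indication of how the calibration step might look, the sketch below fits a general quadric surface $f(\boldsymbol{\varepsilon}) = \boldsymbol{\varepsilon}^{T}\mathbf{A}\,\boldsymbol{\varepsilon} + \mathbf{b}^{T}\boldsymbol{\varepsilon} - 1 = 0$ to a set of homogenised yield points by least squares. This is one plausible choice of surface rather than a definitive formulation, and the yield points below are synthetic placeholders, not data from the studies cited here.

```python
# Hedged sketch: least-squares fit of a quadric yield surface in strain
# space to homogenised macroscopic yield points.
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical macroscopic yield strains (n points x 6 strain components)
yield_points = rng.normal(scale=0.008, size=(160, 6))

def quadric_features(e):
    """Monomials e_i*e_j (i <= j) and e_i for each yield point."""
    iu = np.triu_indices(e.shape[1])
    quad = (e[:, :, None] * e[:, None, :])[:, iu[0], iu[1]]
    return np.hstack([quad, e])

# Each yield point should lie on f = 0, i.e. X w = 1 in a least-squares sense
X = quadric_features(yield_points)
w, *_ = np.linalg.lstsq(X, np.ones(len(X)), rcond=None)

def yield_function(e):
    """f < 0 inside the fitted surface, f = 0 on it (approximately)."""
    return quadric_features(np.atleast_2d(e)) @ w - 1.0
```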
These models have large element counts. Coupled with nonlinearity, this
means that currently, these simulations need to be carried out using a
supercomputer and specialist software (Smith and Margetts, 2003). That said,
this activity only needs to be carried out once and the resulting model of
(macroscopic or bulk) material behaviour can be used in a standard finite
element package.
Figure 4: Macroscopic yield points (Levrero-Florencio et al., 2016)
From an industry perspective, it is possible to subcontract this type of work
to specialist companies or universities. Access to imaging facilities and
supercomputers is becoming more widespread, both available on a pay-per-use basis.
Potential engineering applications might include calibrating models of material
behaviour for different levels of corrosion in steel or degradation in concrete.
Classic constitutive models typically assume newly manufactured materials,
whereas engineers are often tasked with evaluating the safety of ageing
structures.
Another application of image-based modelling is the in-silico design and
evaluation of new materials and components (Figure 5). Evans et al. (2019)
used X-ray tomography data to investigate the performance of a graphite foam
in a heat exchange component for a fusion reactor without needing to
manufacture the component (or the reactor).
Figure 5: Image-based modelling of a graphite foam (Evans et al., 2019)
4. Cellular Automata
In this section, a third technology is considered, namely cellular automata
coupled with finite elements: the CAFE technique. In this approach, the
authors are investigating whether cellular automata can be effectively used to
simulate the mechanisms that lead to fracture at the grain scale. The cellular
automata method is a grid based simulation technique that employs very simple
rules to describe how a cell interacts with its neighbours. An example of a cell
and its neighbourhood is shown in Figure 6a. In Figure 6b, a cellular
automata grid simulates solidification. Cells marked “0” represent the melt,
whilst those marked “1” and “24” are individual crystals. Depending on the
simple rules and local neighbourhood, a cell on the boundary between the melt
and a crystal will either remain as melt or change into a solid.
Figure 6: Cellular automata grid showing a solidification process (Shterenlikht 2015)
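The update rule in this solidification example is simple enough to sketch in a few lines. The following is a toy two-dimensional version, assuming a von Neumann neighbourhood and a probabilistic growth rule; the grid size, seed count and probability are illustrative choices rather than values from the authors' codes.

```python
# Toy 2D version of the solidification automaton: 0 = melt, positive
# integers = crystal identities (the "1" and "24" of Figure 6b).
import numpy as np

rng = np.random.default_rng(3)
n = 50
grid = np.zeros((n, n), dtype=int)

# Seed a few crystals at random locations
for crystal_id in range(1, 25):
    i, j = rng.integers(0, n, size=2)
    grid[i, j] = crystal_id

def step(grid):
    """One update: each melt cell inspects its von Neumann neighbourhood."""
    new = grid.copy()
    for i, j in np.argwhere(grid == 0):
        neighbours = [grid[(i + di) % n, (j + dj) % n]
                      for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        solids = [c for c in neighbours if c > 0]
        # Local rule: a melt cell on a crystal boundary either remains melt
        # or solidifies onto a randomly chosen neighbouring crystal
        if solids and rng.random() < 0.5:
            new[i, j] = rng.choice(solids)
    return new

while (grid == 0).any():        # iterate until the melt has solidified
    grid = step(grid)
```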
The rules can be extended to simulate the formation of fractures across grain
boundaries or along cleavage planes. Figure 7a shows a cellular automata grid
representing grains of polycrystalline iron. Figure 7b shows cells that define
grain boundaries. The white planes in Figure 7c are microcracks that have
developed along cleavage planes. These coalesce to form a macrocrack in
Figure 7d.
Shterenlikht and Margetts (2015) have developed a multiscale framework in
which finite elements representing the engineering length scale are coupled
with cellular automata representing the grain scale. The simulation starts by
initialising both grids. A patch of the cellular automata grid will occupy the
same physical space as one finite element. To give an indication of scale, the
finite element model may comprise thousands of elements, whereas there will
be many billions of automata. Many thousands of automata may coincide with
an individual finite element. In the first load increment, stress is computed in
the normal way using the finite elements. The stress tensor for each element is
then mapped onto its associated patch of automata. The cellular automata
computation runs. If the stress provides sufficient energy, microcracks may
form. A homogenisation step returns this information back to the finite element
as a damage parameter. This reduces the stiffness of the finite element and the
coupled simulation proceeds to the next step.
Figure 7: Cellular automata used to simulate fracture at the grain scale
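To show the information flow of the coupling scheme, the sketch below reduces it to a deliberately crude toy: a one-dimensional bar whose elements each carry a patch of automata cells with randomly assigned local strengths. Everything here, from the cracking rule to the numbers, is an illustrative stand-in and not the authors' ParaFEM/CGPACK implementation.

```python
# Toy sketch of the CAFE loop: finite element stress is mapped onto patches
# of automata cells, cells crack where their local strength is exceeded,
# and the cracked fraction is homogenised back as a damage parameter.
import numpy as np

rng = np.random.default_rng(4)
n_elem, cells_per_patch = 10, 1000
E0 = 200e9                                    # undamaged stiffness (Pa), placeholder
# Each automaton cell has its own random strength (heterogeneous grains)
strength = rng.normal(400e6, 60e6, size=(n_elem, cells_per_patch))
cracked = np.zeros_like(strength, dtype=bool)

for inc, load in enumerate(np.linspace(50e6, 450e6, 20)):
    # 1. "Finite element" step: elements in series all carry the same stress
    stress = np.full(n_elem, load)
    # 2. Map each element's stress tensor onto its patch of automata; cells
    #    crack if the stress exceeds their local strength (a crude stand-in
    #    for the energy-based cleavage rule)
    cracked |= stress[:, None] > strength
    # 3. Homogenise: the cracked cell fraction becomes a damage parameter
    #    that would reduce the element stiffness in the next FE increment
    damage = cracked.mean(axis=1)
    stiffness = E0 * (1.0 - damage)
    print(f"increment {inc:2d}: max damage {damage.max():.2f}")
```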
The multiscale framework relies on simulating the real physical mechanisms
that occur at the grain scale to predict the evolution of damage in the material.
The macroscale response of the structure is seen through the deformation of the
finite element model. The work is novel in that there is no constitutive model
for the nonlinear material behaviour. Currently, the strategy is computationally
intensive and requires the use of supercomputers (Shterenlikht et al., 2018).
5. Conclusions
This paper has outlined three novel techniques which could be used to enrich a
nonlinear engineering simulation with micromechanics, bringing the results
closer to reality than achievable using phenomenological constitutive models.
The first technique simply enhances a finite element model by incorporating
the variability of experimentally derived property data through the use of
random fields and stochastic methods. The variation in properties, whilst
narrow in range for most typical engineering materials, captures
micromechanical information through what is effectively experimental (rather
than numerical) homogenisation. Spatial variations in properties assigned to
elements in a mesh could lead to more reliable predictions of the probability of
failure. The novelty here is the suggestion that stochastic techniques be applied
where engineers typically assume that material properties are not variable; i.e.
standard materials (metals, composites and so on) supplied with product data
sheets published by the manufacturer.
The second technique uses micromechanical models of materials based on X-
ray tomography images. Whilst this may not be novel, the authors show for the
first time that supercomputers are now powerful enough to routinely carry out
nonlinear numerical homogenisation using microstructurally faithful models.
Even including both material and geometric nonlinearity at the microscale, it
has proven possible to run a suite of simulations in a reasonable time that can
be used to calibrate new material models. These can be used in standard finite
element programs (at the macroscale). It is not suggested that engineers do this
routinely, but the strategy opens the door to the calibration of a new suite of
material models that can be made available in commercial software for
practising engineers. This could potentially transform the work of engineers
concerned with predicting the residual life of components and structures, as
models could be calibrated for materials that have undergone various forms of
ageing.
The final method proposed has the lowest technology readiness level of the
three strategies considered. Cellular automata are used at the mesoscale to
simulate the mechanisms that lead to nonlinear stress-strain responses.
Arrangements of grains can be randomly grown using algorithms or defined
using X-ray tomography data. Whilst the cellular automata rules are simple, the
emergent behaviour is not.
All three strategies require access to computing power not yet available on the
desktop. For engineering firms, if there is a business case, this is not a barrier
as computer cycles are available for purchase from supercomputing centres or
Cloud Computing vendors. Furthermore, smart phones have the same power
today as the fastest supercomputers had twenty years ago. The processing
capability of the current crop of supercomputers will be available on
workstations in ten years' time.
6. Acknowledgements
The authors acknowledge the support of EPSRC through grants EP/M507969/1
and EP/N026136/1, as well as ARCHER projects e347 and e515.
7. References
Arregui-Mena JD, Margetts L, Griffiths DV, Lever L, Hall G, Mummery PM
(2015). Spatial variability in the coefficient of thermal expansion induces pre-
service stresses in computer models of virgin Gilsocarbon bricks: Journal of
Nuclear Materials, Volume 465, Pages 793-804.
Arregui-Mena JD, Margetts L & Mummery PM (2016a). Practical application
of the stochastic finite element method: Archives of Computational Methods in
Engineering, Volume 23(1), Pages 171-190.
Arregui-Mena JD, Bodel W, Worth RN, Margetts L, Mummery PM (2016b).
Spatial variability in the mechanical properties of Gilsocarbon: Carbon,
Volume 110, Pages 497-517.
Arregui-Mena JD, Edmondson PD, Margetts L, Griffiths DV, Windes WE,
Carroll M and Mummery PM (2018). Characterisation of the spatial
variability of material properties of Gilsocarbon and NBG-18 using random
fields: Journal of Nuclear Materials, Volume 511, Pages 91-108.
Evans LM, Margetts L, Casalegno V et al. (2015). Transient thermal finite
element analysis of CFC-Cu ITER monoblock using X-ray tomography data:
Fusion Engineering and Design, Volume 100, Pages 100-111.
Evans LM, Margetts L, Lee PD et al. (2019). Image based in silico
characterisation of the effective thermal properties of a graphite foam: Carbon,
Volume 143, Pages 542-558.
Fenton GA & Griffiths DV (2008). Risk Assessment in Geotechnical
Engineering: Wiley.
Levrero-Florencio F, Manda K, Margetts L and Pankaj P (2017). Effect of
including damage at the tissue level in the nonlinear homogenisation of
trabecular bone: Biomechanics and Modeling in Mechanobiology, Volume 16,
Issue 11.
Levrero-Florencio F, Margetts L, Sales E et al. (2016a). Evaluating the
macroscopic yield behaviour of trabecular bone using a nonlinear
homogenisation approach: Journal of the Mechanical Behavior of Biomedical
Materials, Volume 61.
Levrero-Florencio F, Manda K, Margetts L and Pankaj P (2016b). Nonlinear
homogenisation of trabecular bone: Effect of solid phase constitutive model:
Proceedings of the Institution of Mechanical Engineers, Part H: Journal of
Engineering in Medicine, Volume 231, Issue 5, Pages 405-414.
Rawson SD, Margetts L, Wong JKF and Cartmell SH (2015). Sutured tendon
repair; a multi-scale finite element model: Biomechanics and Modeling in
Mechanobiology, Volume 14, Issue 1, Pages 123-133.
Shterenlikht A, Margetts L and Cebamanos L (2018). Modelling fracture in
heterogeneous materials on HPC systems using a hybrid MPI/Fortran coarray
multi-scale CAFE framework: Advances in Engineering Software, Volume
125, Pages 155-166.
Shterenlikht A and Margetts L (2015). Three-dimensional cellular automata
modelling of cleavage propagation across crystal boundaries in polycrystalline
microstructures: Proceedings of the Royal Society A: Mathematical, Physical
and Engineering Sciences, Volume 471, Issue 2177.
Smith IM and Margetts L (2003). Portable parallel processing for nonlinear
problems: VII International Conference on Computational Plasticity
COMPLAS 2003, E. Oñate and D. R. J. Owen (Eds), CIMNE, Barcelona.
Smith IM, Griffiths DV and Margetts L (2014). Programming the Finite
Element Method, 5th Edition, Wiley.