Mode of Action Frameworks
in Toxicity Testing and Chemical Risk
Mary Elizabeth Meek
Research included in this thesis was conducted as part of the author’s
responsibilities at Health Canada.
Meek, Mary Elizabeth 2009
Mode of Action Frameworks in Toxicity Testing and Chemical Risk
-with a summary in Dutch-
PhD thesis, Institute for Risk Assessment Sciences (IRAS), Utrecht University, the Netherlands
Cover lay-out: Marije Brouwer, Division Multimedia of the Faculty of Veterinary
Medicine, Utrecht University, the Netherlands
Lay-out: Harry Otter, Division Multimedia of the Faculty of Veterinary
Medicine, Utrecht University, the Netherlands
Printed by: Ridderprint, Ridderkerk, the Netherlands
Mode of Action Frameworks
in Toxicity Testing and Chemical Risk
Het gebruik van analytische kaders
gebaseerd op de werking van stoffen bij
het testen van toxiciteit en de
(with a summary in Dutch)
to obtain the degree of Doctor at Utrecht University,
by authority of the Rector Magnificus, Prof. dr. J.C. Stoof,
pursuant to the decision of the Board for Conferral of Doctoral Degrees,
to be defended in public
on Tuesday 1 October 2009 at 12.00 noon
Mary Elizabeth Meek
born on 4 November 1954
in Kingston, Ontario, Canada
Promotoren: Prof. dr. W. Slob
Prof. dr. M. van den Berg
Table of contents
Historical Regulatory Context
Evolution of the Paradigm for Consideration of Hazard in
Chemical Risk Assessment
The Need for Analytical Frameworks for Human Relevance of
Mode of Action Consideration in Hazard Characterization and
Overview of this Thesis
Meek, M.E. and Armstrong, V.C. (2007) The assessment and
management of industrial chemicals in Canada, Risk Assessment
of Chemicals, Van Leeuwen, K. and Vermeire, T. (Eds.), Kluwer
Academic Publishers, Dordrecht, the Netherlands.
Meek, M.E., Patterson, J., Strawson, J. and Liteplo, R. (2007)
Engaging expert peers in the development of risk assessments.
Risk Analysis 27(6):1609-1621 and Erratum to “Engaging Expert
Peers in the Development of Risk Assessments” Risk Analysis,
Vol. 28(1): 249 (2008).
Meek, M.E. (2008) Recent developments in frameworks to
consider human relevance of hypothesized modes of action for
tumours in animals. Environmental and Molecular Mutagenesis
Meek, M. E., Bucher, J. R., Cohen, S. M., Dellarco, V., Hill, R. N.,
Lehman-McKeeman, L. D., Longfellow, D. G., Pastoor, T., Seed,
J., Patton, D. E. (2003) A framework for human relevance
analysis of information on carcinogenic modes of action. Critical
Reviews in Toxicology 33(6): 591-653.
Meek, M.E. (2001) Categorical default uncertainty factors -
Interspecies variation and adequacy of database. Human &
Ecological Risk Assessment 7: 157-163.
Meek, M.E. (2004) Toxicological highlight - Biologically motivated
computational modelling: Contribution to risk assessment.
Toxicological Sciences 82(1): 1-2.
Meek, M.E. (2005) Chemical-Specific Adjustment Factors (CSAF).
Encyclopedia of Toxicology, Elsevier.
Meek, B. and Renwick, A. (2006) Guidance for the development
of chemical specific adjustment factors - Integration with mode
of action frameworks. Toxicokinetics and Risk Assessment, Ed.
Lipscomb, J.C. and Ohanian, E.V., Informa Healthcare, New York.
Meek, M.E., Beauchamp, R., Long, G., Moir, D., Turner, L. and
Walker, W. (2002) Chloroform: exposure estimation, hazard
characterization, and exposure-response analysis, J. Toxicol.
Environ. Health B 5(3): 283-334.
Liteplo, R.G. and Meek, M.E. (2003) Inhaled formaldehyde:
Exposure estimation, hazard characterization and exposure-
response analysis. J. Toxicol. Environ. Health, B6(1), 85-114.
Evolution of Analytical Frameworks for Hazard Characterization
and Dose-Response Analysis
Expansion of the Frameworks: Addressing Combined Exposures
and Physiologically Based Pharmacokinetic Modelling
Implications for Testing:
Next Steps (Conclusions and Recommendations)
Recently, legislative mandates worldwide have begun to require systematic consideration
of much larger numbers of chemicals. This necessitates more efficient and effective
toxicity testing, as a basis for more predictive risk assessment. This in
turn requires much more emphasis early in the design of test strategies on both
potential exposure and mechanism or modes of toxicity and a resulting shift in focus
from hazard identification to hazard characterization. This enables grouping of
substances and development of predictive computational tools.
It also requires a much better common understanding in the regulatory risk
assessment community of the nature of appropriate data to inform consideration of
mode of action and resulting implications for dose-response analysis and ultimately,
risk characterization. This requires a shift in focus from the previously principally
qualitative considerations of toxicological science to the necessarily more predictive and
quantitative focus of risk assessment. It also has implications for appropriate
communication and training of risk assessors.
Analytical frameworks such as those for human relevance of hypothesized
mode(s) of action (MOA/HR) and chemical specific adjustment factors (CSAF) are
important components in this evolution. They serve to better coordinate and integrate
input of both the research and regulatory communities in the translation of relevant
mechanistic data into quantitative characterization of risk. They also present essential
pragmatic tools in interim strategies to advance common understanding in these
diverse communities of appropriate application of data from evolving technologies.
The background to the critical role of these frameworks is introduced.
Research related to their development is described in the body of the thesis. The
critical role of the analysis which they promote in the evolution of more focussed,
efficient and effective, public health protective approaches is addressed in the
discussion. Recommendations for relevant next steps are also presented.
Historical Regulatory Context:
Modern chemicals legislation was introduced in Europe and North America in the
1970s. In the intervening years, its focus has expanded with respect to the media of
interest (air, water, products), environmental as well as human health effects and
incrementally greater numbers of substances. The need to adopt a life cycle approach
to effectively manage the harmful effects of chemicals has also been increasingly
recognized.

Introduction and/or renewal of early chemicals legislation focussed principally
on information requirements for New Chemicals. Resulting systematic consideration of
New Chemicals prior to their introduction into commerce encouraged the chemical
industry worldwide to minimize intrinsic hazard in the development of their products.
On the other hand, Existing Chemicals, i.e., those which were in commerce at the time
of introduction of relevant legislation, were “grandfathered”. That is, systematic
consideration of all as a basis to identify priorities for risk management was not
required, though a limited number were identified early for assessment.
This resulted in detailed assessments being conducted for several hundred
identified “Priority” chemicals within Canada and Europe over the past few decades. In
Canada, for example, it involved in-depth evaluation of 69 substances (including
complex mixtures and groups) identified as priorities under the original and first
renewal of the Canadian Environmental Protection Act (CEPA 1988 and CEPA 1999).
This transpired in two mandated five year timeframes between 1989 and 2000 [Meek,
2001; Meek, 1997; Meek, 1996; Meek and Hughes,1997; Meek and Hughes, 1995;
Meek et al.,1994]. These assessments were followed by the implementation of risk
management measures for a significant proportion that were deemed to present a risk
to the environment or human health.
More recently, mandates have required systematic consideration of priorities
for risk management from amongst all of the hundreds of thousands of existing
chemicals used worldwide. This is legitimately based on the likelihood that
unconsidered Existing Chemicals present potentially greater risk to health and the
environment than those introduced as New Chemicals following the advent of modern
chemicals legislation.

For example, under the Canadian Environmental Protection Act (CEPA),
precedent-setting provisions were introduced to systematically identify priorities for
assessment and management from amongst the approximately 23,000 substances
used commercially. This work was to be completed within a mandated seven-year time
frame between 1999 and 2006. This necessitated the development of innovative
methodology including evolution of the previously linear or sequential steps of risk
assessment and risk management to a more iterative approach where the need for,
and focus of, potential control options are identified at as early a stage as possible. It
has also required development of assessment products that efficiently dedicate
resources, investing no more effort than is necessary to set aside a substance as a
non-priority or to provide necessary information to permit risk management.
More recently, in Europe, the Regulation on the Registration, Evaluation,
Authorisation and Restriction of Chemicals (REACH) entered into force on 1
June 2007. Its objective is to effect greater parity between the consideration of New
and Existing Substances. This much broader consideration of Existing Chemicals, as
required under legislation in both Canada and Europe, necessarily has implications for
the efficiency of both testing and assessment.
Evolution of the Paradigm for Consideration of Hazard in Chemical Risk Assessment
Risk assessment, i.e., the characterization of the potential adverse effects of human
exposures, is the requisite basis for the development and implementation of control
measures that are protective of public health (i.e., risk management). Traditionally, it
has been considered to be composed of four different elements: hazard identification
(i.e., the intrinsic capability of a chemical to do harm), dose-response assessment,
exposure estimation and risk characterization. The latter is a synthesis of relevant data
from all of the component steps with a clear delineation of uncertainties and their
implications for risk management.
This paradigm, proposed initially by the U.S. National Research Council in the
now famous “Red Book” (NRC, 1983), is more than 25 years old. It is, perhaps, in
need of revisiting, given the essential shift in focus in chemicals risk assessment
necessitated by evolution in regulatory mandates. Specifically, the emphasis on hazard
identification must necessarily shift to hazard characterization. The latter involves a
comprehensive, integrated judgment of all relevant information supporting conclusions
regarding a toxicological effect including human relevance, but most importantly,
taking into account mechanistic information. This shift in focus from hazard
identification to hazard characterization is essential as a basis to avoid labor-intensive
testing strategies which provide no or minimal data on mechanistic underpinnings of
observed toxicological effects. Continued reliance on studies designed to identify
hazard in animals at high doses without accompanying relevant mechanistic data
necessarily limits capability to predict risks to the public from exposure to chemicals
and the adequacy of resulting measures to protect public health.
Currently, toxicological studies focus on specific systems or types of effects.
Indeed, considerable effort and resources are currently invested in conducting and considering
the adequacy of toxicological studies on individual endpoints in experimental animals
(e.g., cancer, reproductive and developmental effects, etc.) as a basis to identify
hazard. There is, however, very little consideration at an early stage in their design of
relevance to the prediction of risk. Rather, standardization has been emphasized as a
basis to ensure comparability of outcome. This has often led to a focus in assessment
on features of standardized study design, based on criteria for technical adequacy
in test guidelines established by organizations such as the US Environmental
Protection Agency (USEPA) or the Organisation for Economic Co-operation and
Development (OECD). This includes, for example, whether the study adhered to
the principles of good laboratory practice, rather than its relevance to risk assessment.
This is likely a function of the need for simplicity.
Since studies focus on particular systems or types of effects, weight of
evidence determinations in hazard identification relate to particular effects rather than
being integrated across systems. For example, consistency is considered in the context
of whether similar effects (e.g., cancer or reproductive) have been observed in other
studies or species. The types, specific site, incidence and severity of these effects and
the nature of the exposure- or dose-response relationship are also taken into account
in assessing weight of evidence for the observed effect. In assessing potential to
induce tumors, for example, aspects that add to the weight of evidence include
observation of uncommon tumor types, occurrence at multiple sites by more than one
route of administration in multiple strains, sexes and species, progression of lesions
from preneoplastic to benign to malignant (including metastases), and comparatively
short latency periods. Consideration of the contribution of changes in other
systems in the same animals might, however, permit more informative and predictive
integration across biological systems, providing greater mechanistic insight.
Traditionally, also, weight of evidence descriptors for cancer and other effects
(principally mutagens and developmental/reproductive toxicants) such as “carcinogenic to
humans”, “probably carcinogenic to humans” etc., have been developed by a number
of international organizations such as the International Agency for Research on Cancer
(IARC) and various national regulatory agencies including the US EPA and Health
Canada. These are delineated both as a basis for distinguishing approaches to dose-
response analysis in subsequent risk characterization and also as a basis to
communicate hazard. These characterizations represent, then, weight of evidence
determinations in hazard identification (i.e., intrinsic capability to cause harm) for
particular types of effects but are often misinterpreted to be risk-based (where
exposure, relevance and dose-response have been taken into consideration). In
recognition of this shortcoming, there is a trend towards providing more narrative and accurate
descriptors, which include reference to the conditions under which the effect is
observed, as a basis to avoid misinterpretation.
Undue emphasis on hazard identification as described above not only leads to
potential misinterpretation in the context of risk, but necessarily limits investment of
resources in more relevant and predictive components of risk assessment, such as
hazard and risk characterization. Given the need in future to be much more efficient
(and, as a result, more predictive in the context of human health risk), it seems essential to
focus early in testing and assessment on assimilation of information that informs in the
context of hazard characterization.
As indicated above, hazard characterization takes into account not only results
of traditional test guideline studies designed to identify hazard for individual endpoints
but additionally, mechanistic data, which are considered in the context of “mode” of
induction of toxic effects. In fact, an increasingly common understanding of the
concept of “mode of action” and its contrast with “mechanism of action” has been a
major area of advance in risk assessment. “Mode of action” is essentially a description
of the critical metabolic, cytological, genetic and biochemical events that lead to
induction of the relevant end-point of toxicity for which the weight of evidence supports
plausibility. “Mechanism of action”, on the other hand, implies a more detailed
molecular description of causality.
A postulated mode of action (MOA), then, is a biologically plausible sequence
of “key events” leading to an observed effect supported by robust experimental
observations and mechanistic data. Identification of “key” events – i.e., those that are
both measurable and necessary to the observed effect is fundamental to the concept
and the quintessential element of mode of action analysis. Delineation of the key
events in an hypothesized mode of action forces early interdisciplinary collaboration in
consideration and development of data. It is also a unifying theme in the various
components of risk assessment, imposing more explicit delineation of relevant
considerations for human relevance and subsequent dose-response analysis.
Mode of action as considered in this thesis (and the relevant frameworks
addressed herein) comprises both toxicokinetics (absorption, distribution,
metabolism and excretion) and toxicodynamics (interaction with target sites and the
subsequent reactions leading to adverse effects). This contrasts with previous
specification, for example, by the US EPA (2005), wherein it is stated that processes
that lead to formation or distribution of the active agent to the target tissue are
considered in estimating dose but are not part of the mode of action. Toxicokinetics is
included here as part of mode of action, given that often, the critical (and sometimes
rate limiting) early key event (i.e., that which is driving the process) involves metabolic
activation to a relevant toxic entity. Toxicokinetics is also included as a basis to
integrate rate limiting (key) steps in subsequent dose-response analysis, which is often
delivery of the parent compound to the target tissue and/or metabolism to the active
agent. It is of interest in this context that while US EPA (2005) indicates that
toxicokinetics are excluded, mode of action statements in their assessments reference
both toxicokinetic and toxicodynamic aspects.
Information on mode of action is critically important to prediction of risk - in
determining relevance of observed effects in animals to humans, transitions in effect at
various doses and potentially susceptible subgroups. It is also critical as a basis to
address whether or not there is likely to be site concordance of effects between animals
and humans. While there is indication that, for example, growth control mechanisms at
the level of the cell are homologous among mammals, there is no evidence for, nor
reason to believe, that mechanisms for effects such as cancer induction are site
concordant. Rather, information on likely variations between animals and humans in
toxicokinetics and toxicodynamics, based on understanding of mode of induction will
inform in this context. This information is essential also to integrate the results of
studies in animals and humans. For example, it is critical as a basis to interpret
(particularly) the significance of negative epidemiological data, taking into account the
sensitivity of the study to detect effects at most likely tumour sites in humans (i.e.,
which are not necessarily those observed in animals). It is also critical in the
development of relevant biomarkers in epidemiological studies, to increase their utility
as a basis for consideration of the risks of exposure to chemicals in both the
occupational and general environments.
In hazard characterization, then, the weight of evidence of hazard integrating
information on mode of action for a spectrum of (often interrelated) end-points is
assessed critically but separately in order to define appropriate end-points for and
approaches to characterization of dose/concentration–response.
Dose- or exposure-response (dose/exposure-response) assessment involves
quantitation of the probability that an exposure may result in a health deficit in a
population. This is necessarily based on characterization of hazards that are
considered critical and relevant to humans (i.e., those that are biologically relevant at
lowest doses). Advances in common understanding of the contrast of “mode of action”
(a less detailed description with emphasis on critical key events) with “mechanism of
action” (the molecular basis) and the pivotal role of “key events” in this context are
instrumental in encouraging more predictive testing and assessment as a basis for
better informed dose-response assessment. This includes taking into account the
shapes of the dose-response curves for the various key events (not just the adverse
effect, itself) and considering on the basis of the mode of action analysis, which of
these key events is likely to be rate limiting at various doses.
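A minimal numerical sketch of this last point might look as follows. The key-event names and Hill parameters are entirely invented (loosely patterned on a cytotoxicity-driven mode of action); the premise is that for a chain of necessary key events, the apical response at a given dose cannot exceed that of its least-responsive, i.e. rate-limiting, event.

```python
# Illustrative sketch only: key-event names and Hill parameters are invented.
# For a sequence of necessary key events, the least-responsive event at a
# given dose is taken as rate limiting for the apical effect.

def hill(dose, ed50, n):
    """Hill-type dose-response: fraction of maximal response at this dose."""
    return dose ** n / (ed50 ** n + dose ** n)

# Hypothetical key events for a cytotoxicity-mediated mode of action.
KEY_EVENTS = {
    "metabolic_activation": lambda d: hill(d, ed50=2.0, n=2.0),
    "cytotoxicity": lambda d: hill(d, ed50=20.0, n=4.0),
    "regenerative_proliferation": lambda d: hill(d, ed50=25.0, n=2.0),
}

def rate_limiting_event(dose):
    """Return the key event with the smallest response at this dose."""
    return min(KEY_EVENTS, key=lambda name: KEY_EVENTS[name](dose))

print(rate_limiting_event(5.0))    # cytotoxicity
print(rate_limiting_event(100.0))  # regenerative_proliferation
```

Because the steep cytotoxicity curve overtakes the shallower proliferation curve as dose increases, the rate-limiting event shifts with dose; the point is only that the shapes of the key-event curves, not just that of the apical effect, determine behaviour at different doses.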
The toxicokinetic and toxicodynamic aspects considered in a mode of action
analysis are also potentially informative in quantitating interspecies differences and
variability within humans. Indeed, there is a continuum of increasingly data-informed
(i.e., mode-of-action-informed) approaches to account for interspecies differences and human
variability, which range from default (“presumed protective”) to more biologically based
(“predictive”). The least data-informed option is the use of straight default
values, which incorporate no chemical- or species-specific considerations. The basis for
such defaults is largely unknown. Cited support remains nebulous, though they are
sometimes justified, taking into account uncertain retrospective analyses of available
data on empirical relationships (Dourson and Parker, 2007).
Where data permit, categorical defaults, which allow more refinement
through delineation of categories based on, for example, characteristics of the
compounds themselves or of the species in which the critical effect has been
determined, can be developed. The latter include allometric (i.e., surface area to body
weight) scaling for different species or the approaches to development of reference
concentrations for inhalation for various types of gases/particles adopted by the U.S.
EPA (Jarabek, 1994). Additional data permit replacement of kinetic or dynamic
components of interspecies or interindividual variation with chemical specific
adjustments, based on comparative kinetic and dynamic parameters between animals
and humans or within humans. More quantitative toxicokinetic data may permit
development of a physiologically based pharmacokinetic (PBPK) model, which estimates
a "biologically effective dose” based on the mode of action and quantitative
physiological scaling, taking into account relevant chemical-specific physicochemical
properties and biological constants. Though rarely the case, where there is fuller
quantitative characterization of toxicokinetic and toxicodynamic aspects, a case specific
or biologically-based dose-response model can be adopted.
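The continuum just described can be sketched numerically. The subdivision of the two conventional 10-fold factors into kinetic and dynamic components (interspecies 4.0 × 2.5; intraspecies 3.16 × 3.16) follows the published IPCS guidance on chemical-specific adjustment factors; the chemical-specific kinetic ratio of 1.5 used below is purely hypothetical.

```python
# Sketch of moving along the continuum from straight defaults to
# chemical-specific adjustment factors (CSAF). Subfactor values follow
# the IPCS subdivision of the 10-fold defaults; the 1.5 is invented.

DEFAULT_SUBFACTORS = {
    "interspecies_tk": 4.0,   # animal-to-human toxicokinetics
    "interspecies_td": 2.5,   # animal-to-human toxicodynamics
    "intraspecies_tk": 3.16,  # human variability, toxicokinetics
    "intraspecies_td": 3.16,  # human variability, toxicodynamics
}

def composite_factor(chemical_specific=None):
    """Multiply the four subfactors, replacing any default with a
    chemical-specific value where comparative data permit."""
    factors = {**DEFAULT_SUBFACTORS, **(chemical_specific or {})}
    total = 1.0
    for value in factors.values():
        total *= value
    return total

# Straight default: approximately the conventional 100-fold factor.
print(composite_factor())
# Hypothetical kinetic data (e.g., comparative clearance) replace the 4.0.
print(composite_factor({"interspecies_tk": 1.5}))
```

The point of the sketch is only that each subfactor is an independent slot: as kinetic or dynamic data accumulate, defaults are replaced one at a time rather than all at once.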
The approach along this continuum adopted for any single substance is
necessarily determined principally by the availability of relevant data. Availability of
relevant data has often been limited in the past, owing to the (perhaps unwarranted)
focus on repeating high-dose studies designed to identify hazard.
Increasingly, for a limited number of Existing Chemicals identified as priorities
for assessment under early chemicals legislation, mode of action data have been
developed as a basis to reduce uncertainties in the areas of greatest inference in risk
assessment: namely, extrapolations across and within species (as a basis to identify
susceptible subgroups) and doses. For a limited number, biologically motivated case-
specific models or fully biologically based dose-response models that integrate
significant amounts of data have been developed. These advances and data have been
informative not only in the context of the individual chemicals, themselves, but also in
identifying patterns of effects associated with particular modes of action and their
implications for both human relevance and dose-response analysis.
In the vast majority of cases, however, even for substances for which there
are significant amounts of data on mode of action, default approaches to extrapolation
of dose-response and consideration of interspecies differences and human variability
are adopted in regulatory risk assessment. These approaches are described here.
Traditionally, then, default approaches to dose-response
extrapolation are distinguished for effects for which it is believed that there may be a
probability of harm at all levels of exposure (e.g., interaction with DNA leading to
cancer) versus those for which it is believed that there is a level of exposure below
which effects will not be observed. The former approach is justified on the theoretical
basis that a single molecule could be sufficient to induce harm, if it interacts with the
DNA of a single cell.

These different assumptions generally lead to two distinct default approaches,
the first of which can result in a (highly uncertain) estimate of risk at various levels of
exposure (i.e., low dose risk estimates) and the second which results in development of
a “safe” dose (acceptable, reference or tolerable intakes). For the latter, this is a level
to which it is believed that a population can be exposed over a lifetime without adverse
effect. Both approaches to dose-response assessment are generally based on only two
to three data points in the experimental range, that is, in groups of animals exposed to
doses which exceed considerably those associated with most human exposures. The
relatively high exposures in toxicological studies designed to identify hazard have
traditionally been justified on the basis that only small numbers of animals per group
are surrogates for a much larger human population.
In the development of acceptable, reference or tolerable daily intakes (ADIs,
RfDs, TDIs), the experimental data are assessed to determine a level without adverse
effects (the no-observed-adverse-effect level or NOAEL). Alternatively, a curve that best
fits the central estimates of the dose-response relationship defined by these experimental
data points is fitted, and confidence intervals are calculated. Reference or tolerable intakes are
commonly adopted for organ-specific, neurological, immunological, and reproductive-
developmental effects and carcinogenesis not induced by direct interaction with genetic
material. Without information on mode of action, however, there is no reason to
believe that this or the alternative (i.e., linear extrapolation) is more appropriate.
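The arithmetic of the two default approaches can be sketched as follows; the doses, response incidence and uncertainty factor below are hypothetical and purely illustrative.

```python
# Illustrative contrast of the two default approaches; all numbers invented.

def linear_low_dose_risk(dose, point_of_departure, risk_at_pod):
    """Linear-from-origin extrapolation: risk assumed proportional to dose
    below the observed range (no threshold assumed)."""
    return risk_at_pod * dose / point_of_departure

def tolerable_intake(noael, uncertainty_factor=100.0):
    """Threshold approach: a 'safe' dose derived by dividing a NOAEL by a
    composite uncertainty factor."""
    return noael / uncertainty_factor

# Hypothetical study: 10% tumour incidence at 50 mg/kg bw/day.
print(linear_low_dose_risk(0.05, 50.0, 0.10))  # about 1e-4 estimated risk
# Hypothetical NOAEL of 5 mg/kg bw/day with the conventional 100-fold factor.
print(tolerable_intake(5.0))                   # 0.05 mg/kg bw/day
```

The two functions embody the competing assumptions described above: the first yields a (highly uncertain) risk estimate at any dose, the second a single dose presumed to be without appreciable risk.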
Development of a reference, acceptable or tolerable intake is traditionally
based, then, on an approximation of the threshold, through division of a no- or lowest-
observed-(adverse)-effect-level [NO(A)EL or LO(A)EL] by uncertainty factors (Dourson