ABSTRACT: The European Food Safety Authority (EFSA) and the World Health Organization (WHO), with the support of the International Life Sciences Institute, European Branch (ILSI Europe), organized an international conference on 16-18 November 2005 to discuss how regulatory and advisory bodies evaluate the potential risks of the presence in food of substances that are both genotoxic and carcinogenic. The objectives of the conference were to discuss possible approaches for the risk assessment of such substances, how those approaches may be interpreted, and whether they meet the needs of risk managers. ALARA (as low as reasonably achievable) provides advice based solely on hazard identification and takes into account neither potency nor human exposure. The use of quantitative low-dose extrapolation of dose-response data from an animal bioassay raises numerous scientific uncertainties related to the selection of mathematical models and extrapolation down to levels of human exposure. There was consensus that the margin of exposure (MOE) was the preferred approach because it is based on the available animal dose-response data, without extrapolation, and on human exposures. The MOE can be used to prioritise risk management actions, but the conference recognised that the MOE is difficult to interpret in terms of health risk.
Food and Chemical Toxicology 11/2006; 44(10):1636-50. DOI:10.1016/j.fct.2006.06.020
ABSTRACT: The present paper examines the particular difficulties presented by low levels of food-borne DNA-reactive genotoxic carcinogens, some of which may be difficult to eliminate completely from the diet, and proposes a structured approach for the evaluation of such compounds. While the ALARA approach is widely applicable to all substances in food that are both carcinogenic and genotoxic, it does not take carcinogenic potency into account and therefore does not permit prioritisation based on potential risk or concern. In the absence of carcinogenicity dose-response data, an assessment based on comparison with an appropriate threshold of toxicological concern may be possible. When carcinogenicity data from animal bioassays are available, a useful analysis is achieved by the calculation of margins of exposure (MOEs), which can be used to compare animal potency data with human exposure scenarios. Two reference points on the dose-response relationship that can be used for MOE calculation were examined: the T25 value, which is derived from linear extrapolation, and the BMDL10, which is derived from mathematical modelling of the dose-response data. These approaches were applied to selected food-borne genotoxic carcinogens. The proposed approach is applicable to all substances in food that are DNA-reactive genotoxic carcinogens and enables the formulation of appropriate semi-quantitative advice to risk managers.
Food and Chemical Toxicology 11/2006; 44(10):1613-35. DOI:10.1016/j.fct.2006.07.004
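The MOE calculation described in the abstract above is arithmetically simple: a reference point taken from the animal dose-response data (a BMDL10 or T25) is divided by the estimated human exposure. A minimal sketch, with purely hypothetical numbers not taken from the paper:

```python
# Illustrative margin-of-exposure (MOE) calculation.
# All numeric values below are hypothetical examples.

def margin_of_exposure(reference_point_mg_kg_day: float,
                       human_exposure_mg_kg_day: float) -> float:
    """MOE = reference point on the animal dose-response curve
    (e.g. a BMDL10 or T25) divided by estimated human exposure,
    both expressed in the same units (mg/kg bw/day)."""
    return reference_point_mg_kg_day / human_exposure_mg_kg_day

bmdl10 = 0.3        # hypothetical BMDL10, mg/kg bw/day
exposure = 1e-4     # hypothetical mean dietary exposure, mg/kg bw/day
moe = margin_of_exposure(bmdl10, exposure)
print(round(moe))   # 3000
```

Because no low-dose extrapolation is involved, the resulting number is useful for ranking substances and prioritising risk management actions rather than for estimating an absolute health risk.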
ABSTRACT: This review provides a framework contributing to the risk assessment of acrylamide in food. It is based on the outcome of the ILSI Europe FOSIE process, a risk assessment framework for chemicals in foods, and adds to the overall framework by focusing especially on exposure assessment and internal dose assessment of acrylamide in food. Since the finding that acrylamide is formed in food during heat processing and preparation, much effort has been (and still is being) put into understanding its mechanism of formation, into developing analytical methods and determining levels in food, and into evaluating its toxicity and potential human health consequences. Although several exposure estimations have been proposed, a systematic review of key information relevant to exposure assessment is currently lacking. The European and North American branches of the International Life Sciences Institute, ILSI, discussed critical aspects of exposure assessment and parameters influencing its outcome, and summarised data relevant to the acrylamide exposure assessment to aid the risk characterisation process. This paper reviews the data on acrylamide levels in food, including its formation and analytical methods; the determination of human consumption patterns; dietary intake of the general population; estimation of maximum intake levels; and identification of groups with potentially high intakes. Possible options and consequences of mitigation efforts to reduce exposure are discussed. Furthermore, the association of intake levels with biomarkers of exposure and internal dose, considering aspects of bioavailability, is reviewed, and a physiologically-based toxicokinetic (PBTK) model is described that provides a good description of the kinetics of acrylamide in the rat. Each of the sections concludes with a summary of remaining gaps and uncertainties.
Food and Chemical Toxicology 04/2005; 43(3):365-410. DOI:10.1016/j.fct.2004.11.004
ABSTRACT: Traditionally, different approaches have been used to determine the recommended dietary allowances for micronutrients, above which there is a low risk of deficiency, and safe upper levels, below which there is a negligible risk of toxicity. The advice given to risk managers has been in the form of point estimates, such as the recommended dietary allowance (RDA) and the tolerable upper level (UL). In future, the gap between the two intake-response curves may become narrower, as more sensitive indicators of deficiency and toxicity are used, and as health benefits above the RDA are taken into account. This paper reviews the traditional approaches and proposes a novel approach to compare beneficial and adverse effects across intake levels. This model can provide advice for risk managers in a form that will allow the risk of deficiency, or the risk of not experiencing the benefit, to be weighed against the risk of toxicity. The model extends the approach used to estimate recommended dietary allowances to make it applicable to both beneficial and adverse effects, and extends the intake-incidence data to provide a range of estimates that can be considered by the risk manager. The data requirements of the model are the incidence of a response at one or more levels of intake, and a suitable coefficient of variation to represent person-to-person variation within the human population. A coefficient of variation of 10% or 15% has been used for established recommended dietary allowances, and a value of 15% is proposed as the default for considerations of benefit. A coefficient of variation of 45% is proposed as the default for considerations of toxicity, based on analyses of human variability in the fate and effects of therapeutic drugs.
Using this approach, risk managers, working closely with risk assessors, will be able to define ranges of intake based on a balance between the risks of deficiency (or lack of benefit) and toxicity.
Food and Chemical Toxicology 01/2005; 42(12):1903-22. DOI:10.1016/j.fct.2004.07.013
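The intake-incidence model in the abstract above can be sketched numerically. The sketch below is a hedged illustration, not the paper's actual model: it assumes the population's requirement (deficiency side) and toxicity threshold are each normally distributed, using the default coefficients of variation the abstract proposes (15% for benefit/deficiency, 45% for toxicity); the mean requirement and mean toxic dose are hypothetical.

```python
# Hedged sketch of an intake-incidence model: risk of deficiency falls
# and risk of toxicity rises as intake increases. Assumes normal
# distributions; the means used below are hypothetical examples.
from statistics import NormalDist

def risk_of_deficiency(intake: float, mean_requirement: float,
                       cv: float = 0.15) -> float:
    """Fraction of the population whose requirement exceeds the intake."""
    dist = NormalDist(mean_requirement, cv * mean_requirement)
    return 1.0 - dist.cdf(intake)

def risk_of_toxicity(intake: float, mean_toxic_dose: float,
                     cv: float = 0.45) -> float:
    """Fraction of the population whose toxicity threshold lies below the intake."""
    dist = NormalDist(mean_toxic_dose, cv * mean_toxic_dose)
    return dist.cdf(intake)

# Hypothetical micronutrient: mean requirement 10 mg/day, mean toxic dose 80 mg/day.
for intake in (8, 13, 40, 80):
    print(intake,
          round(risk_of_deficiency(intake, 10), 3),
          round(risk_of_toxicity(intake, 80), 3))
```

Printing both risks across a range of intakes makes the trade-off the abstract describes explicit: a risk manager can look for the intake window where both curves are acceptably low, rather than relying on two separate point estimates (RDA and UL).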
ABSTRACT: This report presents a review of risk characterisation, the final step in risk assessment of exposures to food chemicals. The report is the second publication of the project "Food Safety in Europe: Risk Assessment of Chemicals in the Food and Diet (FOSIE)". The science underpinning the hazard identification, hazard characterisation and exposure assessment steps has been published in a previous report (Food Safety in Europe, 2002). Risk characterisation is the stage of risk assessment that integrates information from exposure assessment and hazard characterisation into advice suitable for use in decision-making. The focus of this review is primarily on risk characterisation of low molecular weight chemicals, but consideration is also given to micronutrients and nutritional supplements, macronutrients and whole foods. Problem formulation, as discussed here, is a preliminary step in risk assessment that considers whether an assessment is needed, who should be involved in the process and in subsequent risk management, and how the information will provide the necessary support for risk management. In this step an evaluation is made of whether data are available, what level of resources is needed, and the timeline for completing the assessment. The report describes good evaluation practice as an organisational process and the necessary conditions under which risk assessment of chemicals should be planned, performed, scrutinised and reported. The outcome of risk characterisation may be quantitative estimates of risks, if any, associated with different levels of exposure, or advice on particular levels of exposure that would be without appreciable risk to health, e.g. a guidance value such as an acceptable daily intake (ADI). It should be recognised that risk characterisation is often an iterative and evolving process.
Historically, different approaches have been adopted for the risk characterisation of threshold and non-threshold effects. The hazard characterisation for threshold effects involves the derivation of a level of exposure at or below which there would be no appreciable risk to health if the chemical were to be consumed daily throughout life. A guidance value, such as the ADI, is derived from the no-observed-adverse-effect level (NOAEL) or another starting point, such as the benchmark dose (BMD), by the use of an uncertainty or adjustment factor. In contrast, for non-threshold effects a quantitative hazard estimate can be calculated by extrapolation, usually in a linear fashion, from an observed incidence within the experimental dose-response range to a given low incidence at a low dose. This traditional approach is based on the assumption that there may not be a threshold dose for effects involving genotoxicity. Alternatively, for compounds that are genotoxic, advice may be given that exposure should be reduced to as low as reasonably achievable (ALARA) or practicable (ALARP). When a NOAEL can be derived from a study in humans, it would be utilised in the derivation of guidance values or advice. However, there may be uncertainties related to the possible role of confounders and the precision of both the incidence and exposure data. Individuals may be at increased risk because of their greater exposure or their greater sensitivity. Risk characterisation should include information not only on the general population, but also on any subpopulation considered to be potentially susceptible. Risk characterisation considers both individuals with average exposures and those with high exposures. High exposure may be related to life stage, cultural practices and/or qualitative and/or quantitative food preferences. Inter-individual differences in toxicokinetics are an important source of variability in response.
Such differences may arise from genetic constitution or from environmental influences, including diet, nutritional status, physiological status such as pregnancy, and patho-physiological states. Studies undertaken for hazard identification and characterisation investigate a substance in isolation, not in combination with other substances to which humans may be exposed at the same time. It is recognised that food represents an extremely complex mixture of substances. In general, the available data indicate that interactions between chemicals in food are unlikely to be a significant problem for health. However, attention needs to be focused during risk characterisation on substances that share a common mode of action. The patterns of human exposure to chemicals in food may be chronic (usually low-level), short-term (often at higher levels) or chronic low-level with occasional high intakes. This may necessitate the development of guidance values for acute exposures (the acute reference dose, ARfD) based on shorter-term studies, in addition to an ADI value usually based on chronic studies. The possibility of increased risks of chronic adverse effects associated with long-term low-level exposure combined with occasional peak exposures has generally been handled by averaging such exposures. The significance of intakes above the ADI is difficult to assess. Consideration has to be given to the nature of the effect and to the magnitude and duration of the excessive intake, in relation to the half-life of the compound in the body and the associated body burden. An intake above the ADI may not necessarily be associated with significant adverse health outcomes, since the ADI is usually based on chronic intake and incorporates a safety margin. However, an intake above the ADI would erode the safety margin by the ratio of the ADI to the predicted excess intake.
Alternative approaches to assessing the significance of intakes above the guidance value are provided by categorical regression analysis and probabilistic methods. For non-threshold effects, such as some cancers, that have undergone risk characterisation by quantitative low-dose hazard extrapolation, any increase in risk with increased exposure can be readily interpreted using the same mathematical model. The narrative that accompanies the risk characterisation should explain the strengths and limitations of the data. When risk characterisation is based on animal data, the validity of such data needs to be affirmed and reported. Uncertainties associated with the extrapolation of data from studies in animals to predict human risk should also be presented. Uncertainty can be expressed numerically when intake assessment and hazard characterisation are based on mathematical calculations and/or empirical distributions. Such numerical analyses can also be subjected to sensitivity analyses, to test the contribution of different aspects of the database to overall uncertainty. Knowledge regarding the influence of human genetic polymorphisms on toxic responses is advancing rapidly. This has led to increasing concern that the default uncertainty factor may not provide adequate protection in the case of certain polymorphisms. The methods used for risk characterisation of low molecular weight chemicals are applicable in many respects to micronutrients. However, there are some unique aspects, the most obvious being that some intake is essential for life and that the margins between essential and toxic intakes may, for a number of micronutrients, be small. Since both deficiency and excess of a micronutrient can cause health problems, two guidance values may be expressed for a micronutrient.
The setting of a tolerable upper intake level (UL) includes consideration of what does not cause physiological perturbations, as well as of the probability of an adverse effect occurring at some specified level of exposure. Macronutrients, such as dietary lipids, proteins and carbohydrates, may be present in the food/diet in substantial amounts. In hazard characterisation of macronutrients, consideration needs to be given to tolerance and to toxicological and nutritional impact. Hazard characterisation using animal studies may not be possible because the addition of bulky macroingredients to experimental diets, in amounts that are exaggerated relative to the human diet, may render such diets unpalatable and/or cause nutritional imbalance. Because of this, human trials and observational studies are widely viewed as particularly important for macronutrients, addressing toxicokinetics, nutritional issues and tolerance. Observational epidemiological studies may also help to identify adverse effects. As for micronutrients, there may need to be more than one guidance value for a macronutrient. In certain instances, a margin of safety approach may be preferable, when it is not possible in animal studies to exaggerate the dosage sufficiently to accommodate the usual uncertainty factors. This project also addresses hazard and risk characterisation of whole foods, except those based on GM technology, which are dealt with in another European Union (EU) project, ENTRANSFOOD. Whole foods may be foods currently on the market or novel foods for which approval is being sought. There is as yet no worldwide consensus on the most appropriate approaches to hazard and risk characterisation of whole foods, other than to recommend that case-by-case consideration and evaluation are needed.
The initial approach to novel foods requires consideration of the extent to which the novel food differs from any traditional counterparts, or other related products, and hence whether it can be considered as safe as traditional counterparts/related products (the principle of substantial equivalence). As for macronutrients, epidemiological data identifying adverse effects, including allergic reactions, may also exist. Human trials on whole foods, including novel foods, will only be performed when no serious adverse effects are expected.
Food and Chemical Toxicology 10/2003; 41(9):1211-71. DOI:10.1016/S0278-6915(03)00064-4
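The ADI derivation and safety-margin erosion discussed in the abstract above can be made concrete with a short sketch. The NOAEL and intakes below are hypothetical; 100 is the conventional default uncertainty factor (10 for interspecies differences times 10 for inter-individual differences), as used in standard guidance-value derivation.

```python
# Sketch of ADI derivation from a NOAEL, and of how an intake above
# the ADI erodes the safety margin. All numbers are hypothetical.

def adi(noael_mg_kg_day: float, uncertainty_factor: float = 100.0) -> float:
    """Acceptable daily intake = NOAEL / uncertainty factor."""
    return noael_mg_kg_day / uncertainty_factor

def residual_safety_margin(noael_mg_kg_day: float,
                           intake_mg_kg_day: float) -> float:
    """Margin between the NOAEL and an actual intake. An intake above
    the ADI shrinks this below the original uncertainty factor by the
    ratio of the ADI to the intake."""
    return noael_mg_kg_day / intake_mg_kg_day

noael = 5.0                                # hypothetical NOAEL, mg/kg bw/day
print(adi(noael))                          # 0.05
print(residual_safety_margin(noael, 0.1))  # margin eroded from 100 to 50
```

An intake of 0.1 mg/kg bw/day (twice the hypothetical ADI of 0.05) halves the margin from 100 to 50; whether that matters depends, as the abstract notes, on the nature of the effect, the duration of the excess, and the compound's half-life and body burden.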