D.J.G. Müller

Publications (6) · 14.48 Total impact

  • Source
    ABSTRACT: The European Food Safety Authority (EFSA) and the World Health Organization (WHO), with the support of the International Life Sciences Institute, European Branch (ILSI Europe), organized an international conference on 16-18 November 2005 to discuss how regulatory and advisory bodies evaluate the potential risks of the presence in food of substances that are both genotoxic and carcinogenic. The objectives of the conference were to discuss the possible approaches for risk assessment of such substances, how the approaches may be interpreted and whether they meet the needs of risk managers. ALARA (as low as reasonably achievable) provides advice based solely on hazard identification and does not take into account either potency or human exposure. The use of quantitative low-dose extrapolation of dose-response data from an animal bioassay raises numerous scientific uncertainties related to the selection of mathematical models and extrapolation down to levels of human exposure. There was consensus that the margin of exposure (MOE) was the preferred approach because it is based on the available animal dose-response data, without extrapolation, and on human exposures. The MOE can be used for prioritisation of risk management actions but the conference recognised that it is difficult to interpret it in terms of health risk.
    Full-text · Article · Nov 2006 · Food and Chemical Toxicology
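
The margin of exposure described in the abstract above is, at its core, the ratio of a reference point taken from the animal dose-response data to the estimated human exposure. The short sketch below illustrates that arithmetic; the BMDL10 and intake values are hypothetical placeholders, not figures from the conference report.

```python
# Margin of exposure (MOE): reference point from an animal bioassay divided by
# estimated human dietary exposure. All numbers below are hypothetical.

bmdl10 = 0.3          # mg/kg bw/day; hypothetical benchmark dose lower bound (10% response)
human_intake = 1e-4   # mg/kg bw/day; hypothetical mean dietary exposure

moe = bmdl10 / human_intake
print(f"MOE = {moe:,.0f}")  # a larger MOE implies lower relative concern
```

As the abstract notes, such a number supports prioritisation of risk management actions rather than a direct statement of health risk.
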
  • Source
    ABSTRACT: The present paper examines the particular difficulties presented by low levels of food-borne DNA-reactive genotoxic carcinogens, some of which may be difficult to eliminate completely from the diet, and proposes a structured approach for the evaluation of such compounds. While the ALARA approach is widely applicable to all substances in food that are both carcinogenic and genotoxic, it does not take carcinogenic potency into account and, therefore, does not permit prioritisation based on potential risk or concern. In the absence of carcinogenicity dose-response data, an assessment based on comparison with an appropriate threshold of toxicological concern may be possible. When carcinogenicity data from animal bioassays are available, a useful analysis is achieved by the calculation of margins of exposure (MOEs), which can be used to compare animal potency data with human exposure scenarios. Two reference points on the dose-response relationship that can be used for MOE calculation were examined: the T25 value, which is derived from linear extrapolation, and the BMDL10, which is derived from mathematical modelling of the dose-response data. The above approaches were applied to selected food-borne genotoxic carcinogens. The proposed approach is applicable to all substances in food that are DNA-reactive genotoxic carcinogens and enables the formulation of appropriate semi-quantitative advice to risk managers.
    Full-text · Article · Nov 2006 · Food and Chemical Toxicology
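
As a rough illustration of the two reference points compared in the paper above, the sketch below derives a T25 by linear extrapolation from a single (hypothetical) bioassay result and contrasts the resulting MOEs; the BMDL10 shown is a placeholder, since in practice it comes from fitting dose-response models rather than from a closed-form expression.

```python
# Sketch of the T25 and BMDL10 reference points used for MOE calculation.
# Bioassay incidences, doses and the BMDL10 value are hypothetical.

def t25(dose: float, incidence_treated: float, incidence_control: float) -> float:
    """Dose estimated by linear extrapolation to give tumours in 25% of animals,
    after correcting the treated incidence for background (one common formulation)."""
    net_incidence = (incidence_treated - incidence_control) / (1.0 - incidence_control)
    return dose * 0.25 / net_incidence

t25_value = t25(dose=2.0, incidence_treated=0.40, incidence_control=0.05)  # mg/kg bw/day
bmdl10 = 0.8       # mg/kg bw/day; placeholder for a modelled benchmark dose lower bound
exposure = 1e-3    # mg/kg bw/day; hypothetical human intake

print(f"T25    = {t25_value:.2f} mg/kg bw/day -> MOE = {t25_value / exposure:,.0f}")
print(f"BMDL10 = {bmdl10:.2f} mg/kg bw/day -> MOE = {bmdl10 / exposure:,.0f}")
```
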
  •
    ABSTRACT: The Process for the Assessment of Scientific Support for Claims on Foods (PASSCLAIM) had the following principal objectives: • to evaluate existing schemes which assess scientific substantiation; • to produce a generic tool for assessing the scientific support for health claims for foods; • to establish criteria for markers which can be used to explore the links between diet and health. It has involved more than 160 experts from academia, industry, public interest groups and the regulatory environment. It has been supported by the Fifth European Community Framework Programme for Research and Technological Development and was co-ordinated by ILSI Europe. Through an iterative process of discussion in expert groups and workshops, a set of criteria which define requirements for assessing the quality of scientific data reporting the impact of foods and food components on health and well-being has been proposed and progressively refined. As a basis for the development of the criteria, seven comprehensive reviews were produced covering examples of areas of diet, health and performance in which health claims are likely to be made. An eighth paper reviewed existing processes and regulations. The criteria: • emphasise the need for direct evidence of benefit to humans in circumstances consistent with the likely use of the food in order for a case to be made; • recognise the usefulness of markers of intermediate effects when ideal endpoints are not accessible to measurement; • stress the importance of using only those markers which are of proven validity; and • highlight the necessity of ensuring that the magnitude and character of effects on which claims are based are statistically and biologically meaningful. The criteria are presented in summary form, with an outline of the context within which the detailed assessment of the scientific evidence is to be undertaken. The criteria and the context within which they are to be assessed are further discussed and explained in depth in the present document. Whereas requirements relating to safety and other aspects of legislation are part of the context in which foods carrying claims are presented, and must be complied with, they are not part of the PASSCLAIM process and are excluded from the scope of the criteria. The context within which a claim and the case made in its support should be assessed involves considering existing legislation and dietary guidelines; the need for review in the light of evolving science; and the comprehensibility of the claim to consumers. These aspects are not thought to be part of the scientific criteria reviewed by PASSCLAIM. They nevertheless provide the background against which the scientific validity of claims should be justified.
    Criteria for the scientific substantiation of claims:
    1. The food or food component to which the claimed effect is attributed should be characterised.
    2. Substantiation of a claim should be based on human data, primarily from intervention studies, the design of which should include the following considerations:
       (a) study groups that are representative of the target group;
       (b) appropriate controls;
       (c) an adequate duration of exposure and follow-up to demonstrate the intended effect;
       (d) characterisation of the study groups' background diet and other relevant aspects of lifestyle;
       (e) an amount of the food or food component consistent with its intended pattern of consumption;
       (f) the influence of the food matrix and dietary context on the functional effect of the component;
       (g) monitoring of subjects' compliance concerning intake of the food or food component under test;
       (h) the statistical power to test the hypothesis.
    3. When the true endpoint of a claimed benefit cannot be measured directly, studies should use markers.
    4. Markers should be: biologically valid, in that they have a known relationship to the final outcome and their variability within the target population is known; and methodologically valid with respect to their analytical characteristics.
    5. Within a study the target variable should change in a statistically significant way, and the change should be biologically meaningful for the target group, consistent with the claim to be supported.
    6. A claim should be scientifically substantiated by taking into account the totality of the available data and by weighing of the evidence.
    This document presents a consensus view of criteria which, if met, provide a reasonable assurance that scientific data underpinning health claims made for foods are adequate for the purpose and that the claims can be considered valid. It also discusses the relative strengths and limitations of types of scientific approaches and data that are relevant to different health and disease states. The discussion provides guidance on the interpretation of the criteria. The criteria describe the standards by which the quality and relevance of the scientific evidence, including new data, should be judged, and thus the extent to which claims based on them can be said to be scientifically valid. As the view of a broad-based partnership of scientific and other experts, the criteria provide a basis for harmonising the requirements for, and the assessment of, scientific data supporting health claims made on foods, which has a potential for positive impact across a spectrum of stakeholder activities, including those of interest groups within (consumers, health professionals and industry) and across (national and international regulatory agencies) geographic regions. By raising the level of awareness of the essential attributes of the scientific data supporting health claims, the criteria have the potential to increase public confidence in the role of diet in maintaining and improving health and well-being. By defining the quality and type of scientific data required to substantiate health claims, the criteria will assist industry, including small and medium-sized enterprises, to identify the scope for new products offering health benefits to consumers. Where there is a lack of specific expertise or resource to undertake development projects, the need for sound evidence bases, as illustrated by these criteria, could be seen as a stimulus for industry and government to encourage and support co-operative initiatives. Thus a harmonised regulatory approach to health claims for foods, operating within an EU single market in an ethos of increased consumer awareness of nutrition, along with confidence in the validity of claims, will provide a driver for innovative production of healthier foods appropriate for modern and changing lifestyles and needs. Collectively these factors should benefit public health and increase the competitiveness of the European agri-food industry in the global market. Abbreviations: SME, small or medium-sized enterprise; WHO, World Health Organization.
    No preview · Article · Jun 2005
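
Criterion 2(h) above calls for adequate statistical power. As a minimal, hedged illustration of what that entails, the sketch below uses the standard normal-approximation sample-size formula for a two-arm parallel study; the effect size, significance level and target power are hypothetical and are not taken from the PASSCLAIM document.

```python
# Illustrative sample-size calculation for a two-arm intervention study
# (criterion 2(h): statistical power). Values are hypothetical.
from scipy.stats import norm

def n_per_group(effect_size: float, alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate subjects per arm needed to detect a standardised effect size."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
    z_beta = norm.ppf(power)
    n = 2 * ((z_alpha + z_beta) / effect_size) ** 2
    return int(n) + 1  # round up

# e.g. a biomarker shift of 0.4 standard deviations at 5% alpha and 80% power
print(n_per_group(0.4))  # roughly 100 subjects per arm
```
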
  • Source
    ABSTRACT: This review provides a framework contributing to the risk assessment of acrylamide in food. It is based on the outcome of the ILSI Europe FOSIE process, a risk assessment framework for chemicals in foods, and adds to the overall framework by focusing especially on exposure assessment and internal dose assessment of acrylamide in food. Since the finding that acrylamide is formed during the heat processing and preparation of food, much effort has been (and still is being) put into understanding its mechanism of formation, developing analytical methods and determining levels in food, and evaluating its toxicity and potential human health consequences. Although several exposure estimations have been proposed, a systematic review of key information relevant to exposure assessment is currently lacking. The European and North American branches of the International Life Sciences Institute, ILSI, discussed critical aspects of exposure assessment, parameters influencing the outcome of exposure assessment and summarised data relevant to the acrylamide exposure assessment to aid the risk characterisation process. This paper reviews the data on acrylamide levels in food, including its formation and analytical methods, the determination of human consumption patterns, dietary intake of the general population, estimation of maximum intake levels and identification of groups with potentially high intakes. Possible options and consequences of mitigation efforts to reduce exposure are discussed. Furthermore, the association of intake levels with biomarkers of exposure and internal dose, considering aspects of bioavailability, is reviewed, and a physiologically-based toxicokinetic (PBTK) model is described that provides a good description of the kinetics of acrylamide in the rat. Each of the sections concludes with a summary of remaining gaps and uncertainties.
    Full-text · Article · Apr 2005 · Food and Chemical Toxicology
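
The abstract above refers to a physiologically-based toxicokinetic (PBTK) model for acrylamide in the rat. A full PBTK model is well beyond a short example, so the sketch below uses a much simpler one-compartment model with first-order absorption and elimination, purely to show how an intake can be translated into an internal-dose metric such as the area under the concentration-time curve; every parameter value is hypothetical.

```python
# Simplified one-compartment toxicokinetic sketch (NOT the PBTK model described in
# the paper): first-order oral absorption and elimination after a single dose.
# All parameter values are hypothetical.
import numpy as np

F = 0.9         # oral bioavailability (fraction absorbed)
dose = 0.05     # mg, single oral dose
V = 5.0         # L, apparent volume of distribution
ka = 1.5        # 1/h, absorption rate constant
ke = 0.3        # 1/h, elimination rate constant

t = np.linspace(0.0, 24.0, 241)  # hours
conc = (F * dose * ka) / (V * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

# Internal-dose metric: area under the curve by the trapezoidal rule.
auc = float(np.sum((conc[1:] + conc[:-1]) / 2.0 * np.diff(t)))
print(f"Cmax ~ {conc.max():.4f} mg/L, AUC(0-24 h) ~ {auc:.3f} mg*h/L")
```
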
  • Source
    ABSTRACT: Traditionally, different approaches have been used to determine the recommended dietary allowances for micronutrients, above which there is a low risk of deficiency, and safe upper levels, below which there is a negligible risk of toxicity. The advice given to risk managers has been in the form of point estimates, such as the recommended dietary allowance (RDA) and the tolerable upper level (UL). In future, the gap between the two intake-response curves may become narrower, as more sensitive indicators of deficiency and toxicity are used, and as health benefits above the recommended dietary allowance are taken into account. This paper reviews the traditional approaches and proposes a novel approach to compare beneficial and adverse effects across intake levels. This model can provide advice for risk managers in a form that will allow the risk of deficiency, or the risk of not experiencing the benefit, to be weighed against the risk of toxicity. The model extends the approach used to estimate recommended dietary allowances to make it applicable to both beneficial and adverse effects, and extends the intake-incidence data to provide a range of estimates that can be considered by the risk manager. The data requirements of the model are the incidence of a response at one or more levels of intake, and a suitable coefficient of variation to represent the person-to-person variations within the human population. A coefficient of variation of 10% or 15% has been used for established recommended dietary allowances, and a value of 15% is proposed as the default for considerations of benefit. A coefficient of variation of 45% is proposed as the default for considerations of toxicity, based on analyses of human variability in the fate and effects of therapeutic drugs. Using this approach, risk managers, working closely with risk assessors, will be able to define ranges of intake based on a balance between the risks of deficiency (or lack of benefit) and toxicity.
    Full-text · Article · Jan 2005 · Food and Chemical Toxicology
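
The model proposed in the abstract above expresses, at each intake level, both the fraction of the population at risk of deficiency (or of not experiencing the benefit) and the fraction at risk of toxicity, using coefficients of variation of about 15% and 45% respectively. The sketch below is a minimal reading of that idea, assuming normally distributed individual thresholds; the median requirement and median toxic threshold are hypothetical placeholders.

```python
# Sketch of the benefit/toxicity intake-response comparison. The CVs follow the
# defaults proposed in the paper; the median intake values are hypothetical, and
# the normal-distribution assumption is a simplification.
from scipy.stats import norm

median_requirement = 10.0      # mg/day; hypothetical median intake needed for benefit
cv_benefit = 0.15              # default CV proposed for considerations of benefit
median_toxic_threshold = 60.0  # mg/day; hypothetical median intake causing toxicity
cv_toxicity = 0.45             # default CV proposed for considerations of toxicity

def risk_of_deficiency(intake: float) -> float:
    """Fraction of the population whose individual requirement exceeds the intake."""
    return norm.sf(intake, loc=median_requirement, scale=cv_benefit * median_requirement)

def risk_of_toxicity(intake: float) -> float:
    """Fraction of the population whose individual toxicity threshold is below the intake."""
    return norm.cdf(intake, loc=median_toxic_threshold, scale=cv_toxicity * median_toxic_threshold)

for intake in (10, 15, 25, 40, 60):
    print(f"{intake:>3} mg/day: risk of deficiency {risk_of_deficiency(intake):.1%}, "
          f"risk of toxicity {risk_of_toxicity(intake):.1%}")
```

Plotting both curves against intake gives the risk manager the range over which the two risks can be weighed against each other, as the abstract describes.
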
  • Source
    ABSTRACT: This report presents a review of risk characterisation, the final step in risk assessment of exposures to food chemicals. The report is the second publication of the project "Food Safety in Europe: Risk Assessment of Chemicals in the Food and Diet (FOSIE)". The science underpinning the hazard identification, hazard characterisation and exposure assessment steps has been published in a previous report (Food Safety in Europe, 2002). Risk characterisation is the stage of risk assessment that integrates information from exposure assessment and hazard characterisation into advice suitable for use in decision-making. The focus of this review is primarily on risk characterisation of low molecular weight chemicals, but consideration is also given to micronutrients and nutritional supplements, macronutrients and whole foods. Problem formulation, as discussed here, is a preliminary step in risk assessment that considers whether an assessment is needed, who should be involved in the process and in further risk management, and how the information will provide the necessary support for risk management. In this step, an evaluation is made of whether data are available and what level of resources is needed, as well as the timeline for completing the assessment. The report describes good evaluation practice as an organisational process and the necessary conditions under which risk assessment of chemicals should be planned, performed, scrutinised and reported. The outcome of risk characterisation may be quantitative estimates of risks, if any, associated with different levels of exposure, or advice on particular levels of exposure that would be without appreciable risk to health, e.g. a guidance value such as an acceptable daily intake (ADI). It should be recognised that risk characterisation is often an iterative and evolving process. Historically, different approaches have been adopted for the risk characterisation of threshold and non-threshold effects. The hazard characterisation for threshold effects involves the derivation of a level of exposure at or below which there would be no appreciable risk to health if the chemical were to be consumed daily throughout life. A guidance value, such as the ADI, is derived from the no-observed-adverse-effect level (NOAEL) or other starting point, such as the benchmark dose (BMD), by the use of an uncertainty or adjustment factor. In contrast, for non-threshold effects a quantitative hazard estimate can be calculated by extrapolation, usually in a linear fashion, from an observed incidence within the experimental dose-response range to a given low incidence at a low dose. This traditional approach is based on the assumption that there may not be a threshold dose for effects involving genotoxicity. Alternatively, for compounds that are genotoxic, advice may be given that the exposure should be reduced to as low as reasonably achievable (ALARA) or practicable (ALARP). When a NOAEL can be derived from a study in humans, this would be utilised in the derivation of guidance values or advice. However, there may be uncertainties related to the possible role of confounders and the precision of both the incidence and exposure data. Individuals may be at an increased risk because of their greater exposure or their greater sensitivity. Risk characterisation should include information not only on the general population, but also on any subpopulation considered to be potentially susceptible.
Risk characterisation considers both individuals with average exposures and those with high exposures. High exposure may be related to life stage, cultural practices and/or qualitative and/or quantitative food preferences. Inter-individual differences in toxicokinetics are an important source of variability in response. This may arise from differences in genetic constitution or environmental influences, including diet, nutritional status, physiological status such as pregnancy, as well as patho-physiological states. Studies undertaken for hazard identification and characterisation investigate a substance in isolation, and not in combination with other substances to which humans may be exposed at the same time. It is recognised that food represents an extremely complex mixture of substances. In general, the available data indicate that interactions between chemicals in food are unlikely to be a significant problem for health. However, attention needs to be focused during risk characterisation on substances that share a common mode of action. The patterns of human exposure to chemicals in food may be chronic (usually low-level), short-term (often at higher levels) or chronic low-level with occasional high intakes. This may necessitate the development of guidance values for acute exposures (the acute reference dose, ARfD) based on shorter-term studies, in addition to an ADI value usually based on chronic studies. The possibility of increased risks of chronic adverse effects associated with long-term low-level exposure, combined with occasional peak exposures, has generally been handled by averaging such exposures. The significance of intakes above the ADI is difficult to assess. Consideration in this respect has to be given to the nature of the effect, the magnitude of the excessive intake and its duration, in relation to the half-life of the compound in the body and the associated body burden. An intake above the ADI may not necessarily be associated with significant adverse health outcomes, since the ADI is usually based on chronic intake and incorporates a safety margin. However, an intake above the ADI would have the effect of eroding the safety margin by the ratio of the ADI to the predicted excess intake. Alternative approaches to assessment of the significance of intakes above the guidance value are provided by categorical regression analysis and probabilistic methods. For non-threshold effects, such as for some cancers, that have undergone risk characterisation by the use of quantitative low-dose hazard extrapolation, any increase in risk with increased exposure can be readily interpreted using the same mathematical model. The narrative that accompanies the risk characterisation should explain the strengths and limitations of the data. When risk characterisation is based on animal data, the validity of such data needs to be affirmed and reported. Also, uncertainties associated with the extrapolation of data from studies in animals to predict human risk should be presented. Uncertainty can be expressed numerically when intake assessment and hazard characterisation are based on mathematical calculations and/or empirical distributions. Such numerical analyses can also be subject to sensitivity analyses, to test the contribution of different aspects of the database to overall uncertainty. Knowledge regarding the influence of human genetic polymorphisms on toxic responses is advancing rapidly.
This has led to increasing concern that the default uncertainty factor may not provide adequate protection in the case of certain polymorphisms. The methods used for risk characterisation of low molecular weight chemicals are applicable in many respects to micronutrients. However, there are some unique aspects, the most obvious ones being that some intake is essential for life and the margins between essential intakes and toxic intakes may, for a number of micronutrients, be small. Since both deficiency and excess of a micronutrient can cause health problems, two guidance values for a micronutrient may be expressed. The setting of a tolerable upper intake level (UL) includes consideration of what does not cause physiological perturbations as well as consideration of the probability of an adverse effect occurring at some specified level of exposure. Macronutrients, such as dietary lipids, proteins and carbohydrates, may be present in the food/diet in substantial amounts. In the hazard characterisation of macronutrients, consideration needs to be given to tolerance and to toxicological and nutritional impact. Hazard characterisation using animal studies may not be possible because the addition of bulky macroingredients to experimental diets, in amounts that are exaggerated relative to the human diet, may render such diets unpalatable and/or cause nutritional imbalance. Because of this, the role of human trials and observational studies is widely viewed as particularly important for macronutrients, addressing toxicokinetics, nutritional issues and tolerance. Observational epidemiological studies may also help to identify adverse effects. As for micronutrients, there may need to be more than one guidance value for a macronutrient. In certain instances, a margin of safety approach may be preferable, when it is not possible in animal studies to exaggerate the dosage sufficiently to accommodate the usual uncertainty factors. This project also addresses hazard and risk characterisation related to whole foods, except those based on GM technology, which is dealt with in another European Union (EU) project, ENTRANSFOOD. Whole foods may be foods currently on the market or novel foods for which approval is being sought. There is as yet no world-wide consensus on the most appropriate approaches to hazard and risk characterisation of whole foods, other than to recommend that a case-by-case consideration and evaluation is needed. The initial approach to novel foods requires consideration of the extent to which the novel food differs from any traditional counterparts, or other related products, and hence whether it can be considered as safe as traditional counterparts/related products (the principle of substantial equivalence). As for macronutrients, epidemiological data identifying adverse effects, including allergic reactions, may also exist. Human trials on whole foods, including novel foods, will only be performed when no serious adverse effects are expected.
    Full-text · Article · Oct 2003 · Food and Chemical Toxicology
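
Two pieces of arithmetic in the abstract above lend themselves to a small worked example: deriving a guidance value such as the ADI from a NOAEL with an uncertainty factor, and seeing how an intake above the ADI erodes the safety margin by the ratio of the ADI to that intake. The sketch below also shows the traditional linear low-dose extrapolation mentioned for non-threshold effects; all numeric values are hypothetical.

```python
# Threshold effects: ADI from a NOAEL and the default 100-fold uncertainty factor.
noael = 1.0          # mg/kg bw/day; hypothetical NOAEL from a chronic animal study
uf = 10 * 10         # inter-species x inter-individual uncertainty factors
adi = noael / uf
print(f"ADI = {adi} mg/kg bw/day")

# An intake above the ADI erodes the margin between the NOAEL and human intake.
intake = 0.03        # mg/kg bw/day; hypothetical intake exceeding the ADI
print(f"Remaining margin to the NOAEL: {noael / intake:.0f}x (vs. {uf}x at the ADI)")

# Non-threshold effects: traditional linear extrapolation from an observed incidence
# in the experimental dose range down to a low dose (a simplifying assumption).
observed_risk, observed_dose = 0.10, 5.0   # hypothetical: 10% incidence at 5 mg/kg bw/day
extra_risk = (observed_risk / observed_dose) * intake
print(f"Estimated extra risk at {intake} mg/kg bw/day: {extra_risk:.1e}")
```
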