Chapter

Translation of New Technologies in Biomedicines: Shaping the Road from Basic Research to Drug Development and Clinical Application - and Back Again

... At one extreme of a spectrum of methods will be inexpensive, high-throughput systems applicable to large numbers of chemicals at the drug discovery stage, while at the other extreme will be more-sophisticated and expensive systems appropriate for use at much later stages of product development, and even with individual patients. FRAME-associated scientists are involved in these developments, for example, by editing a book on New Technologies for Toxicity Testing, due to be published in July 2011,5 and by writing chapters for Pharmaceutical Biotechnology - A Comprehensive Handbook6 and Implant Dentistry Research Guide: Basic, Translational and Experimental Clinical Research.7 The FRAME Alternatives Laboratory (FAL), in the University of Nottingham's School of Biomedical Sciences, is currently actively involved in the development of DILI testing methods based on the use of patient-derived hepatocytes. ...
... It does not include developments in relation to in silico technology, high content screening, the omics approaches, physiologically-based pharmacokinetic (PBPK) models, virtual tissue models, virtual patient populations, biomarkers and clinical imaging, which are also of great significance, especially when used in combination with, and with the benefit of, bioinformatics and systems biology approaches.6 1. Cell fractions: Short-term studies can be conducted with cell fractions and isolated cell components, such as nuclei, membranes, mitochondria and receptors. This can be the basis for high-throughput screening to answer relatively specific and narrow questions. ...
... There is a vast and rapidly-expanding literature on this subject and all we can do here is to give a few examples, including some recent and comprehensive reviews. [33][34][35] 1. The use of existing knowledge. ...
Article
There is increasing concern that insurmountable differences between humans and laboratory animals limit the relevance and reliability for hazard identification and risk assessment purposes of animal data produced by traditional toxicity test procedures. A way forward is offered by the emerging new technologies, which can be directly applied to human material or even to human beings themselves. This promises to revolutionise the evaluation of the safety of chemicals and chemical products of various kinds and, in particular, pharmaceuticals. The available and developing technologies are summarised and it is emphasised that they will need to be used selectively, in integrated and intelligent testing strategies, which, in addition to being scientifically sound, must be manageable and affordable. Examples are given of proposed testing strategies for general chemicals, cosmetic ingredients, candidate pharmaceuticals, inhaled substances, nanoparticles and neurotoxicity.
Article
Full-text available
With support from the Institutes and Centers forming the NIH Blueprint for Neuroscience Research, we have designed and implemented a new initiative for integrating access to and use of Web-based neuroscience resources: the Neuroscience Information Framework. The Framework arises from the expressed need of the neuroscience community for neuroinformatic tools and resources to aid scientific inquiry, builds upon prior development of neuroinformatics by the Human Brain Project and others, and directly derives from the Society for Neuroscience’s Neuroscience Database Gateway. Partnered with the Society, its Neuroinformatics Committee, and volunteer consultant-collaborators, our multi-site consortium has developed: (1) a comprehensive, dynamic, inventory of Web-accessible neuroscience resources, (2) an extended and integrated terminology describing resources and contents, and (3) a framework accepting and aiding concept-based queries. Evolving instantiations of the Framework may be viewed at http://nif.nih.gov, http://neurogateway.org, and other sites as they come on line.
Article
Full-text available
We have developed individual, integrated testing strategies (ITS) for predicting the toxicity of general chemicals, cosmetics, pharmaceuticals, inhaled chemicals, and nanoparticles. These ITS are based on published schemes developed previously for the risk assessment of chemicals to fulfil the requirements of REACH, which have been updated to take account of the latest developments in advanced in chemico modelling and in vitro technologies. In addition, we propose an ITS for neurotoxicity, based on the same principles, for incorporation in the other ITS. The technologies are deployed in a step-wise manner, as a basis for decision-tree approaches, incorporating weight-of-evidence stages. This means that testing can be stopped at the point where a risk assessment and/or classification can be performed, with labelling in accordance with the requirements of the regulatory authority concerned, rather than following a checklist approach to hazard identification. In addition, the strategies are intelligent, in that they are based on the fundamental premise that there is no hazard in the absence of exposure - which is why pharmacokinetic modelling plays a key role in each ITS. The new technologies include the use of complex, three-dimensional human cell tissue culture systems with in vivo-like structural, physiological and biochemical features, as well as dosing conditions. In this way, problems of inter-species extrapolation and in vitro/in vivo extrapolation are minimised. This is reflected in the ITS placing more emphasis on the use of volunteers at the whole organism testing stage, rather than on existing animal testing, which is the current situation.
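As a rough illustration of the step-wise, weight-of-evidence logic described in this abstract, the sketch below runs hypothetical testing stages in order and stops as soon as a confident hazard call can be made. The stage names, confidence threshold and placeholder test functions are invented for illustration and are not the published ITS schemes.

```python
# Minimal sketch of a step-wise integrated testing strategy (ITS).
# Stage names, order, tests and the stopping rule are illustrative only;
# the published ITS schemes are far more detailed.

def run_its(compound, stages):
    """Run stages in order; stop as soon as a confident call can be made."""
    evidence = []
    for name, test in stages:
        result = test(compound)            # e.g. an in silico or in vitro call
        evidence.append((name, result))
        if result["confidence"] >= 0.9:    # weight-of-evidence stopping point
            return {"call": result["hazard"], "evidence": evidence}
    return {"call": "inconclusive", "evidence": evidence}

# Hypothetical stages: exposure modelling first ("no hazard without exposure"),
# then an in silico screen, then a human-cell in vitro assay panel.
stages = [
    ("pbpk_exposure", lambda c: {"hazard": "none",
                                 "confidence": 0.95 if c["exposure_uM"] < 0.01 else 0.0}),
    ("in_silico_qsar", lambda c: {"hazard": c.get("qsar_flag", "none"), "confidence": 0.6}),
    ("in_vitro_panel", lambda c: {"hazard": c.get("assay_flag", "none"), "confidence": 0.92}),
]

compound = {"exposure_uM": 5.0, "qsar_flag": "reactive", "assay_flag": "hepatotoxic"}
print(run_its(compound, stages))
```

The design point mirrored here is early stopping: later, more expensive stages are only reached when the cheaper ones cannot support a risk assessment or classification.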
Article
Full-text available
Chronic wounds require prolonged healthcare and adversely affect the quality of life of patients. They are particularly prominent in patients with diabetes, and their relative numbers are set to increase with the rise of diabetes within our population. Research is still needed to understand the factors leading to such wounds, to understand why they persist for such long periods of time, and also to develop new and efficacious treatment strategies. One problem facing this research is a lack of adequate animal models, as the current models do not truly reflect the human condition and often lead to much animal suffering. Hence, over the past four years, our group has been trying to develop a human-based in vitro diabetic wound model, which could be used as a high-throughput screening system to pre-screen potential chronic diabetic wound healing agents and to reduce unnecessary animal pain and suffering. To this end, we have isolated healthy and diseased skin fibroblasts from patient tissue biopsies. Crucially, to create a cell reporter system that can be widely used in the future, the cells were immortalised in order to escape senescence. By using microarray analysis, gene expression pattern differences have been identified between healthy and diseased cells, and disease-specific 'reporter' genes have been selected for further studies. The promoters of these reporter genes have been coupled to fluorescent reporter constructs and inserted back into the diseased fibroblasts, so that we now have proof-of-concept for a real-time diabetic reporter system for future exploitation.
Article
Full-text available
Damage and degeneration of articular joints is a major healthcare concern, due to the association of joint disease with ageing, the current strong demographic changes in the proportion of elderly in the population, and the increased incidence of trauma in a sports-active younger population. These joints are biomechanical organs that transmit load between bones in our skeleton, and the articular cartilage forms a load-bearing surface that covers the bone within the joints. All the forces across the joints are thus transmitted through the cartilage, and it therefore makes an important biomechanical contribution to joint function. The cartilage is particularly prone to damage, and has limited capacity for natural repair. Although joint replacement is successful, it is less so in younger patients. For these patients, there is currently great interest in developing cell-based treatments for the biological repair of articular cartilage.
Article
Full-text available
The pharmaceutical industry is failing in its primary function, with increasing expenditure and decreased output in terms of new medicines brought to market. It cannot carry on as it is, without sliding into a terminal decline. It must, therefore, take some positive steps toward addressing its problems. We do not have to look far to see one very obvious problem, namely, the industry's continuing reliance on nonhuman biology as the basis of its evaluation of potential safety and efficacy. The time has come to focus on the relevant, and to realise that more human-based testing is essential, if the industry is to survive as a source of innovation in drug therapy. This can incorporate earlier clinical testing, in the form of microdosing, and promotion of the development of more-powerful computational approaches based on human information. Fortunately, headway is being made in both approaches. However, a problem remains in the lack of functional evaluation of human tissues, where the lack of commitment, and the inadequacy of the tissue resource itself, are hampering any serious developments. An outline of a collaborative scheme is proposed, that will address this issue, central to which is improved access to research tissues from heart-beating organ donors.
Article
Full-text available
The unmet needs of biomedical and clinical research are highlighted by reference to drug-induced liver injury (DILI), non-alcoholic fatty liver disease (NAFLD) and its severe form, non-alcoholic steatohepatitis (NASH). Examples in these areas highlight the major limitations of animal models with respect to predicting, examining and managing these clinically significant forms of liver injury. The ways in which these knowledge gaps are being bridged by studies involving the use of human tissues and primary cells are described.
Article
Full-text available
The advent of early Absorption, Distribution, Metabolism, Excretion, and Toxicity (ADMET) screening has increased the elimination rate of weak drug candidates early in the drug-discovery process, and decreased the proportion of compounds failing in clinical trials for ADMET reasons. This paper reviews the history of ADMET screening and why it has become so important in drug discovery and development. Assays that have been developed in response to specific needs, and improvements in technology that result in higher throughput and greater accuracy of prediction of human mechanisms of toxicity, are discussed. The paper concludes with the authors' forecast of new models that will better predict human efficacy and toxicity.
Article
Full-text available
The decreasing cost-efficiency of drug development threatens to result in a severe shortage of innovative drugs, which may seriously compromise patient healthcare. This risk underlines the urgency to change the paradigm in clinical research. Here, we examine a novel concept of conducting virtual clinical trials for efficiently screening drug candidates, and for evaluating their prospects of being brought to the market successfully. The virtual clinical trials are carried out by using virtual patients (denoted Optimata Virtual Patients -- OVPs). The OVP, a set of mathematical algorithms that describe the main pathological and physiological dynamic processes affected by the administered drug, has been shown to accurately predict docetaxel efficacy and safety in individual breast cancer patients. We report a test case in which virtual clinical trials have been conducted by using OVP populations for rescuing a discontinued oncology compound, ISIS-5132 (ISIS Pharmaceuticals Inc.). Our in silico study suggested that ISIS-5132 may be efficacious in combination with another drug, sunitinib malate (Sutent, Pfizer Inc.), for the treatment of prostate cancer. The recommended combined treatment is predicted to result in a higher five-year Progression-Free Survival than monotherapy with either drug alone.
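The OVP algorithms themselves are proprietary and are not described here; the toy sketch below only illustrates the general idea of simulating a treatment schedule in a "virtual patient" with an ordinary differential equation. The logistic growth and kill terms, the dosing schedule and all parameter values are placeholders.

```python
# Toy "virtual patient": tumour burden under a repeating dosing schedule,
# integrated with scipy. The growth and kill terms and all parameter values
# are placeholders, unrelated to the proprietary OVP models described above.
from scipy.integrate import solve_ivp

def tumour(t, y, growth, kill, dose_fn):
    n = y[0]
    return [growth * n * (1.0 - n / 1e12) - kill * dose_fn(t) * n]

dose = lambda t: 1.0 if (t % 21) < 1 else 0.0   # hypothetical 1-day exposure every 21 days
sol = solve_ivp(tumour, (0, 365), [1e9], args=(0.05, 0.8, dose), max_step=0.5)
print(f"simulated tumour burden after one year: {sol.y[0, -1]:.3e} cells")
```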
Article
Full-text available
While the duration and size of human clinical trials may be difficult to reduce, there are several parameters in pre-clinical vaccine development that may be possible to further optimise. By increasing the accuracy of the models used for pre-clinical vaccine testing, it should be possible to increase the probability that any particular vaccine candidate will be successful in human trials. In addition, an improved model will allow the collection of increasingly more-informative data in pre-clinical tests, thus aiding the rational design and formulation of candidates entered into clinical evaluation. An acceleration and increase in sophistication of pre-clinical vaccine development will thus require the advent of more physiologically-accurate models of the human immune system, coupled with substantial advances in the mechanistic understanding of vaccine efficacy, achieved by using this model. We believe the best viable option available is to use human cells and/or tissues in a functional in vitro model of human physiology. Not only will this more accurately model human diseases, it will also eliminate any ethical, moral and scientific issues involved with use of live humans and animals. An in vitro model, termed "MIMIC" (Modular IMmune In vitro Construct), was designed and developed to reflect the human immune system in a well-based format. The MIMIC System is a laboratory-based methodology that replicates the human immune system response. It is highly automated, and can be used to simulate a clinical trial for a diverse population, without putting human subjects at risk. The MIMIC System uses the circulating immune cells of individual donors to recapitulate each individual human immune response by maintaining the autonomy of the donor. Thus, an in vitro test system has been created that is functionally equivalent to the donor's own immune system and is designed to respond in a similar manner to the in vivo response.
Article
Full-text available
Prostate cancer (PC) is the second most common cause of cancer-related death in men. A number of key limitations with prostate-specific antigen (PSA), currently the standard detection test, have justified evaluation of new biomarkers. We have assessed the diagnostic potential of Engrailed-2 (EN2) protein, a homeodomain-containing transcription factor expressed in PC cell lines and secreted into the urine by PC in men. EN2 expression in PC cell lines and prostate cancer tissue was determined by semi-quantitative RT-PCR and immunohistochemistry. First pass urine [without prior digital rectal examination (DRE)] was collected from men presenting with urinary symptoms (referred to exclude/confirm the presence of prostate cancer) and from controls. EN2 protein was measured by ELISA in urine from men with PC (n = 82) and controls (n = 102). EN2 was expressed and secreted by PC cell lines and PC tissue but not by normal prostate tissue or stroma. The presence of EN2 in urine was highly predictive of PC, with a sensitivity of 66% and a specificity of 88.2%, without requirement for DRE. There was no correlation with PSA levels. These results were confirmed independently by a second academic center. Urinary EN2 is a highly specific and sensitive candidate biomarker of prostate cancer. A larger multicenter study to further evaluate the diagnostic potential of EN2 is justified.
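As a worked example of how the reported sensitivity and specificity relate to the underlying counts, the snippet below back-calculates an approximate 2x2 confusion matrix from the stated cohort sizes (82 cases, 102 controls) and percentages; the exact published counts may differ slightly.

```python
# Back-calculated (approximate) 2x2 table from the reported cohort sizes
# and percentages: 82 cases, 102 controls, sensitivity ~66%, specificity ~88.2%.
tp, fn = 54, 28        # urinary EN2 positive / negative among the 82 PC cases
tn, fp = 90, 12        # urinary EN2 negative / positive among the 102 controls

sensitivity = tp / (tp + fn)      # 54/82  ~ 0.66
specificity = tn / (tn + fp)      # 90/102 ~ 0.882
ppv = tp / (tp + fp)              # positive predictive value in this cohort
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}, PPV={ppv:.1%}")
```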
Article
Full-text available
Genevar (GENe Expression VARiation) is a database and Java tool designed to integrate multiple datasets, and provides analysis and visualization of associations between sequence variation and gene expression. Genevar allows researchers to investigate expression quantitative trait loci (eQTL) associations within a gene locus of interest in real time. The database and application can be installed on a standard computer in database mode and, in addition, on a server to share discoveries among affiliations or the broader community over the Internet via web services protocols. Availability: http://www.sanger.ac.uk/resources/software/genevar Contact: emmanouil.dermitzakis@unige.ch
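Genevar itself is a Java tool and database, but the eQTL association it visualises is essentially a regression of transcript abundance on genotype dosage. The sketch below shows that idea on simulated data using scipy; it is not Genevar code.

```python
# Simulated single-SNP eQTL test: regress expression on genotype dosage
# (0/1/2 minor alleles). Illustrative only; this is not Genevar code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
genotype = rng.integers(0, 3, size=200)                   # 0/1/2 copies of the minor allele
expression = 0.4 * genotype + rng.normal(0.0, 1.0, 200)   # simulated additive effect

slope, intercept, r, p, se = stats.linregress(genotype, expression)
print(f"effect size per allele = {slope:.2f}, p = {p:.2e}")
```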
Article
Full-text available
High-content screening (HCS) was introduced in 1997, based on light microscope imaging technologies, to address the need for an automated platform that could analyze large numbers of individual cells with subcellular resolution using standard microplates. Molecular specificity based on fluorescence was a central element of the platform, taking advantage of the growing list of reagent classes and the ability to multiplex. In addition, image analysis coupled to data management, data mining, and data visualization created a tool that focused on biological information and knowledge to begin exploring the functions of genes identified in the genomics revolution. This overview looks at the development of HCS, the evolution of the technologies, and the market up to the present day. In addition, options for adopting uniform definitions are suggested, along with a perspective on what advances are needed to continue building the value of HCS in biomedical research, drug discovery and development, and diagnostics.
Article
Full-text available
In this paper, we illustrate how advanced computational modelling and simulation can be used to investigate drug-induced effects on cardiac electrophysiology and on specific biomarkers of pro-arrhythmic risk. To do so, we first perform a thorough literature review of proposed arrhythmic risk biomarkers from the ionic to the electrocardiogram levels. The review highlights the variety of proposed biomarkers, the complexity of the mechanisms of drug-induced pro-arrhythmia and the existence of significant animal species differences in drug-induced effects on cardiac electrophysiology. Predicting drug-induced pro-arrhythmic risk solely using experiments is challenging both preclinically and clinically, as attested by the rise in the cost of releasing new compounds to the market. Computational modelling and simulation has significantly contributed to the understanding of cardiac electrophysiology and arrhythmias over the last 40 years. In the second part of this paper, we illustrate how state-of-the-art open source computational modelling and simulation tools can be used to simulate multi-scale effects of drug-induced ion channel block in ventricular electrophysiology at the cellular, tissue and whole ventricular levels for different animal species. We believe that the use of computational modelling and simulation in combination with experimental techniques could be a powerful tool for the assessment of drug safety pharmacology.
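A common simplification used in this kind of simulation study is to represent drug-induced ion channel block as a concentration-dependent scaling of the affected conductance with a Hill-type pore-block factor. The sketch below illustrates that scaling; the conductance, IC50 and Hill coefficient values are invented, not taken from the paper.

```python
# Hill-type pore-block factor: fraction of channels left unblocked at a
# given drug concentration, used to scale the affected ionic conductance.
# The conductance, IC50 and Hill coefficient below are invented values.
def block_factor(conc, ic50, hill=1.0):
    return 1.0 / (1.0 + (conc / ic50) ** hill)

g_Kr_control = 0.153                       # hypothetical IKr conductance (mS/uF)
for conc_uM in (0.1, 1.0, 10.0):
    g = g_Kr_control * block_factor(conc_uM, ic50=1.0)
    print(f"{conc_uM:5.1f} uM drug -> IKr conductance scaled to {g:.4f}")
```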
Article
Full-text available
The first formal qualification of safety biomarkers for regulatory decision making marks a milestone in the application of biomarkers to drug development. Following submission of drug toxicity studies and analyses of biomarker performance to the Food and Drug Administration (FDA) and European Medicines Agency (EMEA) by the Predictive Safety Testing Consortium's (PSTC) Nephrotoxicity Working Group, seven renal safety biomarkers have been qualified for limited use in nonclinical and clinical drug development to help guide safety assessments. This was a pilot process, and the experience gained will both facilitate better understanding of how the qualification process will probably evolve and clarify the minimal requirements necessary to evaluate the performance of biomarkers of organ injury within specific contexts.
Article
Full-text available
Pharmacogenomics has employed candidate gene studies and, more recently, genome-wide association studies (GWAS) in efforts to identify loci associated with drug response and/or toxicity. The advantage of GWAS is the simultaneous, unbiased testing of millions of SNPs; the challenge is that functional information is absent for the vast majority of loci that are implicated. In the present study, we systematically evaluated SNPs associated with chemotherapeutic agent-induced cytotoxicity for six different anticancer agents and evaluated whether these SNPs were disproportionately likely to be within a functional class such as coding (consisting of missense, nonsense, or frameshift polymorphisms), noncoding (such as 3'UTRs or splice sites), or expression quantitative trait loci (eQTLs; indicating that a SNP genotype is associated with the transcript abundance level of a gene). We found that the chemotherapeutic drug susceptibility-associated SNPs are more likely to be eQTLs, and, in fact, more likely to be associated with the transcriptional expression level of multiple genes (n ≥ 10) as potential master regulators, than a random set of SNPs in the genome, conditional on minor allele frequency. Furthermore, we observed that this enrichment compared with random expectation is not present for other traditionally important coding and noncoding SNP functional categories. This research therefore has significant implications as a general approach for the identification of genetic predictors of drug response and provides important insights into the likely function of SNPs identified in GWAS analysis of pharmacologic studies.
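The enrichment question in this abstract (are the associated SNPs eQTLs more often than minor-allele-frequency-matched random SNPs?) can be illustrated with a small permutation test. The sketch below uses simulated data, so no enrichment is expected; it only shows the machinery of MAF-binned matching and an empirical p-value, and is not the authors' analysis pipeline.

```python
# Permutation sketch of the MAF-matched eQTL enrichment test. All data are
# simulated (so no enrichment is expected); only the machinery is shown.
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
n_snps = 50_000
maf = rng.uniform(0.05, 0.5, n_snps)
is_eqtl = rng.random(n_snps) < 0.05                 # 5% background eQTL rate

hits = rng.choice(n_snps, 300, replace=False)       # "drug-response-associated" SNPs
observed = is_eqtl[hits].mean()

# Null distribution: random SNP sets matched to the hits on MAF decile
edges = np.quantile(maf, np.linspace(0, 1, 11)[1:-1])
maf_bin = np.digitize(maf, edges)
bin_members = {b: np.flatnonzero(maf_bin == b) for b in np.unique(maf_bin)}
hits_per_bin = Counter(maf_bin[hits])

null = []
for _ in range(1000):
    n_eqtl = sum(is_eqtl[rng.choice(bin_members[b], size=k, replace=False)].sum()
                 for b, k in hits_per_bin.items())
    null.append(n_eqtl / len(hits))
p_emp = (np.array(null) >= observed).mean()
print(f"observed eQTL fraction {observed:.3f}, empirical p = {p_emp:.3f}")
```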
Article
Full-text available
The explosive growth in our knowledge of genomes, proteomes, and metabolomes is driving ever-increasing fundamental understanding of the biochemistry of life, enabling qualitatively new studies of complex biological systems and their evolution. This knowledge also drives modern biotechnologies, such as molecular engineering and synthetic biology, which have enormous potential to address urgent problems, including developing potent new drugs and providing environmentally friendly energy. Many of these studies, however, are ultimately limited by their need for even-higher-throughput measurements of biochemical reactions. We present a general ultrahigh-throughput screening platform using drop-based microfluidics that overcomes these limitations and revolutionizes both the scale and speed of screening. We use aqueous drops dispersed in oil as picoliter-volume reaction vessels and screen them at rates of thousands per second. To demonstrate its power, we apply the system to directed evolution, identifying new mutants of the enzyme horseradish peroxidase exhibiting catalytic rates more than 10 times faster than their parent, which is already a very efficient enzyme. We exploit the ultrahigh throughput to use an initial purifying selection that removes inactive mutants; we identify approximately 100 variants comparable in activity to the parent from an initial population of approximately 10⁷. After a second generation of mutagenesis and high-stringency screening, we identify several significantly improved mutants, some approaching diffusion-limited efficiency. In total, we screen approximately 10⁸ individual enzyme reactions in only 10 h, using < 150 µL of total reagent volume; compared to state-of-the-art robotic screening systems, we perform the entire assay with a 1,000-fold increase in speed and a 1-million-fold reduction in cost.
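A quick back-of-envelope check of the throughput figures quoted above (roughly 10⁸ picolitre-scale reactions in 10 h using under 150 µL of reagent), assuming about 1 pL per drop, which is an order-of-magnitude guess rather than a value taken from the paper:

```python
# Back-of-envelope check of the quoted throughput figures, assuming ~1 pL
# per drop (an order-of-magnitude guess, not a value from the paper).
reactions   = 1e8       # individual enzyme reactions screened
hours       = 10
drop_vol_L  = 1e-12     # ~1 picolitre per drop

rate_per_s   = reactions / (hours * 3600)
total_vol_uL = reactions * drop_vol_L * 1e6
print(f"screening rate ~ {rate_per_s:,.0f} drops per second")   # thousands per second
print(f"aqueous volume ~ {total_vol_uL:.0f} uL total")          # consistent with < 150 uL
```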
Article
Full-text available
A number of toxic effects are brought about by the covalent interaction between the toxicant and biological macromolecules. In chemico assays are available that attempt to identify reactive compounds. These approaches have been developed independently for pharmaceuticals and for other nonpharmaceutical compounds. The assays vary widely in terms of the macromolecule (typically a peptide) and the analytical technique utilised. For both sets of methods, there are great opportunities to capture in chemico information by using in silico methods to provide computational tools for screening purposes. In order to use these in chemico and in silico methods, integrated testing strategies are required for individual toxicity endpoints. The potential for the use of these approaches is described, and a number of recommendations to improve this extremely useful technique, in terms of implementing the Three Rs in toxicity testing, are presented.
Article
Full-text available
The global pharmaceutical industry is estimated to use close to 20 million animals annually, in in vivo studies which apply the results of fundamental biomedical research to the discovery and development of novel pharmaceuticals, or to the application of existing pharmaceuticals to novel therapeutic indications. These applications of in vivo experimentation include: a) the use of animals as disease models against which the efficacy of therapeutics can be tested; b) the study of the toxicity of those therapeutics, before they are administered to humans for the first time; and c) the study of their pharmacokinetics - i.e. their distribution throughout, and elimination from, the body. In vivo pharmacokinetic (PK) studies are estimated to use several hundred thousand animals annually. The success of pharmaceutical research currently relies heavily on the ability to extrapolate from data obtained in such in vivo studies to predict therapeutic behaviour in humans. Physiologically-based modelling has the potential to reduce the number of in vivo animal studies that are performed by the pharmaceutical industry. In particular, the technique of physiologically-based pharmacokinetic (PBPK) modelling is sufficiently developed to serve as a replacement for many in vivo PK studies in animals during drug discovery. Extension of the technique to incorporate the prediction of in vivo therapeutic effects and/or toxicity is less well-developed, but has potential in the longer-term to effect a significant reduction in animal use, and also to lead to improvements in drug discovery via the increased rationalisation of lead optimisation.
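To make the idea of PBPK modelling concrete, the sketch below integrates a minimal flow-limited model with plasma, liver and a lumped "rest of body" compartment using scipy. All volumes, flows, partition coefficients and the clearance term are invented placeholder values; real PBPK models used in drug discovery include many more organs and mechanistic absorption and clearance terms.

```python
# Minimal flow-limited PBPK sketch: plasma, liver and a lumped "rest of body"
# compartment after an intravenous bolus. All parameter values are invented.
from scipy.integrate import solve_ivp

V_p, V_li, V_re = 3.0, 1.8, 40.0        # compartment volumes (L)
Q_li, Q_re = 90.0, 200.0                # blood flows (L/h)
Kp_li, Kp_re = 10.0, 2.0                # tissue:plasma partition coefficients
CL_int = 30.0                           # hepatic intrinsic clearance (L/h)

def pbpk(t, y):
    C_p, C_li, C_re = y
    dC_p  = (Q_li * (C_li / Kp_li - C_p) + Q_re * (C_re / Kp_re - C_p)) / V_p
    dC_li = (Q_li * (C_p - C_li / Kp_li) - CL_int * C_li / Kp_li) / V_li
    dC_re =  Q_re * (C_p - C_re / Kp_re) / V_re
    return [dC_p, dC_li, dC_re]

sol = solve_ivp(pbpk, (0, 24), [10.0, 0.0, 0.0])   # 10 mg/L initial plasma concentration
print(f"plasma concentration at 24 h: {sol.y[0, -1]:.3f} mg/L")
```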
Article
Full-text available
The testing of substances for adverse effects on humans and the environment needs a radical overhaul if we are to meet the challenges of ensuring health and safety.
Article
Full-text available
Cardiotoxicity is among the leading reasons for drug attrition and is therefore a core subject in non-clinical and clinical safety testing of new drugs. In March 2008, the European Centre for the Validation of Alternative Methods (ECVAM) held a workshop on "Alternative Methods for Drug-Induced Cardiotoxicity", in order to promote acceptance of alternative methods reducing, refining or replacing the use of laboratory animals in this field. This review reports the outcome of the workshop. The participants identified the major clinical manifestations, which are sensitive to conventional drugs, to be arrhythmias, contractility toxicity, ischaemia toxicity, secondary cardiotoxicity and valve toxicity. They gave an overview of the current use of alternative tests in cardiac safety assessments. Moreover, they elaborated on new cardiotoxicological endpoints for which alternative tests can have an impact, and provided recommendations on how to cover them.
Article
Full-text available
Drug-induced liver injury (DILI) is an important cause of serious liver disease. The antimicrobial agent flucloxacillin is a common cause of DILI, but the genetic basis for susceptibility remains unclear. We conducted a genome-wide association (GWA) study using 866,399 markers in 51 cases of flucloxacillin DILI and 282 controls matched for sex and ancestry. The GWA showed an association peak in the major histocompatibility complex (MHC) region with the strongest association (P = 8.7 × 10⁻³³) seen for rs2395029[G], a marker in complete linkage disequilibrium (LD) with HLA-B*5701. Further MHC genotyping, which included 64 flucloxacillin-tolerant controls, confirmed the association with HLA-B*5701 (OR = 80.6, P = 9.0 × 10⁻¹⁹). The association was replicated in a second cohort of 23 cases. In HLA-B*5701 carrier cases, rs10937275 in ST6GAL1 on chromosome 3 also showed genome-wide significance (OR = 4.1, P = 1.4 × 10⁻⁸). These findings provide new insights into the mechanism of flucloxacillin DILI and have the potential to substantially improve diagnosis of this serious disease.
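For readers unfamiliar with how a carrier odds ratio such as the one reported above is obtained, the snippet below computes an OR and a Fisher exact p-value from a 2x2 carrier/non-carrier table. The counts are invented for illustration; the study's actual genotype counts are not reproduced here, so the output will not match the published figures.

```python
# Carrier odds ratio and Fisher exact p-value from a 2x2 table. The counts
# below are invented for illustration only.
from scipy.stats import fisher_exact

#            HLA-B*5701 carrier   non-carrier
table = [[43, 8],     # flucloxacillin DILI cases (hypothetical counts)
         [9, 55]]     # flucloxacillin-tolerant controls (hypothetical counts)

odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, Fisher exact p = {p_value:.2e}")
```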
Article
Over the past two or three decades, there has been mounting pressure to reduce reliance on laboratory animal testing for predicting hazards and assessing risks in relation to industrial chemicals and chemical products of various kinds including pharmaceuticals and agrochemicals, household and person...
Book
Current Developments in Cell Culture Technology.- Embryonic Stem Cells in Safety Pharmacology and Toxicology.- Trends in Cell Culture Technology.- Tissue Engineering in the Development of Replacement Technologies.- Toxicity Testing of Nanomaterials.- Physiologically-Based Pharmacokinetic (PBPK) Models in Toxicity Testing and Risk Assessment.- In Silico Methods for Toxicity Prediction.- Luminescent Quantum Dots for Molecular Toxicology.- Engineering Quasi-Vivo® In-Vitro Organ Models.- ECVAM and New Technologies for Toxicity Testing.- Medium to High Throughput Screening: Microfabrication and Chip-Based Technology.- The Use of Genomics in Model In Vitro Systems.- The Use of Integrated and Intelligent Testing Strategies in the Prediction of Toxic Hazard and in Risk Assessment.
Article
The ideal features of a cell culture system for in vitro investigation depend on what questions the system is to address. However, in general, highly valuable systems will replicate the characteristics and, more specifically, the responses of normal human tissues. Systems that can faithfully replicate different tissue types provide tremendous potential value for in vitro research and have been the subject of much research effort in this area over many years. Furthermore, a range of such systems that could mimic key genetic variations or diseases would have special value for toxicology and drug discovery. In the pursuit of such model systems, there are a number of significant practical issues to consider for their application, which include the ability to deliver, with ease, the required quantities of cells at the time they are needed. In addition, any cell culture assay will need to be robust and reliable and provide readily interpreted and quantified endpoints. Other general criteria for cell culture systems include scalability to provide the very large cell numbers that may be required for high throughput systems, with a high degree of reliability and reproducibility. The amenability of the cell culture for down-scaling may also be important, to permit the use of very small test samples (e.g., in 96-well arrays), even down to the level of single cell analysis. This chapter explores the range of new cell culture systems for scaling up cell cultures that will be needed for high throughput toxicology and drug discovery assays. It also reviews the increasing range of novel systems that enable high content analysis from small cell numbers or even single cells. The hopes and challenges for the use of human stem cell lines are also investigated in comparison with the range of eukaryotic cell types currently in use in toxicology.
Article
The electrical activity of cardiac and uterine tissues has been reconstructed by detailed computer models in the form of virtual tissues. Virtual tissues are biophysically and anatomically detailed, and represent quantitatively predictive models of the physiological and pathophysiological behaviours of tissue within an isolated organ. The cell excitation properties are quantitatively reproduced by equations that describe the kinetics of a few dozen proteins. These equations are derived from experimental measurements of membrane potentials, ionic currents, fluxes, and concentrations. Some of the measurements were taken from human cells and human ion channel proteins expressed in non-human cells, but they were mostly taken from cells of other animal species. Data on tissue geometry and architecture are obtained from the diffusion tensor magnetic resonance imaging of ex vivo or post mortem tissue, and are used to compute the spread of current in the tissue. Cardiac virtual tissues are well established and reproduce normal and pathological patterns of cardiac excitation within the atria or ventricles of the human heart. They have been applied to increase the understanding of normal cardiac electrophysiology, to evaluate the candidate mechanisms for re-entrant arrhythmias that lead to sudden cardiac death, and to predict the tissue level effects of mutant or pharmacologically-modified ion channels. The human full-term virtual uterus is still in development. This virtual tissue reproduces the in vitro behaviour of uterine tissue biopsies, and provides possible mechanisms for premature labour.
Article
While the events leading to breast cancer development are not fully understood, a pre-invasive lesion, ductal carcinoma in situ (DCIS), is recognised as the main precursor of invasive disease. Understanding how pre-invasive lesions develop into invasive breast cancer is critical, since currently there is no way of predicting which tumours are likely to progress, leading to unnecessary surgical intervention or chemotherapy. With a lack of good animal models able to mimic DCIS progression in a laboratory setting, there has been a shift toward developing in vitro human models which more accurately represent human disease. By manipulating individual cell populations in these models, we can recapitulate the complex cellular interactions involved in disease progression, an essential step in understanding breast cancer behaviour.
Article
As a branch of pharmacogenomics aimed at predicting drug safety concerns, toxicogenomics drew much excitement with the emergence of technologies such as gene expression microarrays. A few years down the line, the evidence is scant that current approaches to toxicogenomics are really making an impact in areas such as preclinical toxicology. It has been argued that there needs to be a re-focus of application toward high-throughput approaches which combine the best of tissue and genomic modelling. This commentary gives a brief introduction to in vitro toxicogenomics, drawn from the perspectives of the specialist toxicogenomics company, SimuGen.
Article
The detailed investigation of the metabolism of drugs is one of the key issues in drug development. Several in vitro metabolism assays have been developed over the last two decades, to replace time-consuming and expensive animal studies. These have the potential to speed up drug development and increase drug safety, as they can be used to improve the prediction of the effects of drugs on humans. The key factors to be identified in metabolism are: a) the enzymes involved, and b) the metabolites produced by these enzymes. Cytochromes P450 (CYP-450s) are the key enzymes in drug metabolism. Cloning the genes encoding the CYP-450s, and the genetic engineering of suitable cells for heterologous expression, have provided new cell lines for studies on drug metabolism in vitro, under highly defined conditions. The V79 cell line, derived from Chinese hamster lung fibroblasts, was found to be suitable for heterologous expression, as these cells themselves do not express CYP-450s, thus providing a clean background for genetically engineering for the stable expression of any cloned CYP-450. In this way, V79 cell lines were created which specifically express CYP-450s from human, mouse, rat, and fish. These recombinant V79 cells have been applied in several drug metabolism and toxicity studies.
Article
Accurate prediction of the human response to potential pharmaceuticals is difficult, often unreliable, and invariably expensive. Traditional in vitro cell culture assays are of limited value, because they do not accurately mimic the complex environment to which a drug candidate is subjected within the human body. While in vivo animal studies can account for the complex inter-cellular and inter-tissue effects not observable from in vitro assays, animal studies are expensive, labour intensive, time consuming, and unpopular. In addition, there is considerable concern as to whether animal studies can predict human risk sufficiently precisely, because, first, there is no known mechanistic basis for extrapolation from high to low doses, and second, cross-species extrapolation has frequently been found to be problematic with respect to toxicity and pharmacokinetic characteristics. To address these limitations, an interactive, cell-based microfluidic biochip called a Hurel was developed. The Hurel system consists of living cells segregated into interconnected "tissue" or "organ" compartments. The organ compartments are connected by a re-circulating culture medium that acts as a "blood surrogate". The fluidics are designed so that the primary elements of the circulatory system, and more importantly, the interactions of the organ systems, are accurately mimicked. Drug candidates are exposed to a more-realistic animal or human physiological environment, thus providing a higher and more accurate informational content than can the traditional in vitro assays. By affording dynamic assessment of potential toxicity, metabolism, and bioavailability, the device's capabilities hold the potential to markedly improve the prioritisation of drug leads prior to animal studies.
Article
Drug-induced liver injury is a common reason for drug attrition in late clinical phases, and even for post-launch withdrawals. As a consequence, there is a broad consensus in the pharmaceutical industry, and within regulatory authorities, that a significant improvement of the current in vitro test methodologies for accurate assessment and prediction of such adverse effects is needed. For this purpose, appropriate in vivo-like hepatic in vitro models are necessary, in addition to novel sources of human hepatocytes. In this report, we describe recent and ongoing research toward the use of human embryonic stem cell (hESC)-derived hepatic cells, in conjunction with new and improved test methods, for evaluating drug metabolism and hepatotoxicity. Recent progress on the directed differentiation of human embryonic stem cells to the functional hepatic phenotype is reported, as well as the development and adaptation of bioreactors and toxicity assay technologies for the testing of hepatic cells. The aim of achieving a testing platform for metabolism and hepatotoxicity assessment, based on hESC-derived hepatic cells, has advanced markedly in the last 2-3 years. However, great challenges still remain, before such new test systems could be routinely used by the industry. In particular, we give an overview of results from the Vitrocellomics project (EU Framework 6) and discuss these in relation to the current state-of-the-art and the remaining difficulties, with suggestions on how to proceed before such in vitro systems can be implemented in industrial discovery and development settings and in regulatory acceptance.
Article
An important molecular initiating event for genotoxicity is the ability of a compound to bind covalently with DNA. However, not all compounds that can undergo covalent binding mechanisms will result in genotoxicity. One approach to solving this problem, when in silico prediction techniques are being used, is to develop tools that allow chemicals to be grouped into categories based on their ability to bind covalently to DNA. For this analysis to take place, compounds need to be placed within categories where the trend in toxicity can be explained by simple descriptors, such as hydrophobicity. However, this can occur only when the compounds within a category are structurally and mechanistically similar. Chemistry-based profilers have the ability to screen compounds and highlight those with similar structures to a target compound, and are thus likely to act via a similar mechanism of action. Here, examples are reported to highlight how structure-based profilers can be used to form categories and hence fill data gaps. The importance of developing a well-defined and robust category is discussed in terms of both mechanisms of action and structural similarity.
Article
Drug-induced liver injury (DILI) is a leading cause of drugs failing during clinical trials and being withdrawn from the market. Comparative analysis of drugs based on their DILI potential is an effective approach to discover key DILI mechanisms and risk factors. However, assessing the DILI potential of a drug is a challenge with no existing consensus methods. We proposed a systematic classification scheme using FDA-approved drug labeling to assess the DILI potential of drugs, which yielded a benchmark dataset with 287 drugs representing a wide range of therapeutic categories and daily dosage amounts. The method is transparent and reproducible with a potential to serve as a common practice to study the DILI of marketed drugs for supporting drug discovery and biomarker development.
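A labelling-based classification of this kind can be sketched as a simple mapping from the label section in which hepatotoxicity is described to a DILI-concern class. The mapping and example entries below are illustrative assumptions, not the published scheme or benchmark dataset.

```python
# Illustrative labelling-based DILI classification: map the FDA label section
# in which hepatotoxicity is described to a DILI-concern class. The mapping
# and example drugs are assumptions, not the published scheme or dataset.
LABEL_SECTION_TO_CLASS = {
    "boxed_warning":            "most-DILI-concern",
    "warnings_and_precautions": "less-DILI-concern",
    "adverse_reactions":        "ambiguous-DILI-concern",
    None:                       "no-DILI-concern",
}

def classify_dili(hepatotox_label_section):
    return LABEL_SECTION_TO_CLASS[hepatotox_label_section]

for name, section in [("drug_A", "boxed_warning"), ("drug_B", None)]:
    print(name, "->", classify_dili(section))
```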
Article
Members of the Hes and Hey families of basic helix-loop-helix transcription factors are regarded as Notch target genes that generally inhibit neuronal differentiation of neural progenitor cells. We found that HeyL, contrary to the classic function of Hes and Hey factors, promotes neuronal differentiation of neural progenitor cells both in culture and in the embryonic brain in vivo. Furthermore, null mutation of HeyL decreased the rate of neuronal differentiation of cultured neural progenitor cells. HeyL binds to and activates the promoter of the proneural gene neurogenin2, which is inhibited by other Hes and Hey family members, and HeyL is a weak inhibitor of the Hes1 promoter. HeyL is able to bind other Hes and Hey family members, but it cannot bind the Groucho/Tle1 transcriptional corepressor, which mediates the inhibitory effects of the Hes family of factors. Furthermore, although HeyL expression is only weakly augmented by Notch signaling, we found that bone morphogenic protein signaling increases HeyL expression by neural progenitor cells. These observations suggest that HeyL promotes neuronal differentiation of neural progenitor cells by activating proneural genes and by inhibiting the actions of other Hes and Hey family members.
Article
A broad range of technologies have been developed to enable three dimensional (3D) cell culture. Few, if any, however, are adaptable for routine everyday use in a straightforward and cost-effective manner. Alvetex® is a rigid, highly porous polystyrene scaffold designed specifically to enable routine 3D cell culture. The scaffold is engineered into thin membranes that fit into conventional cell culture plasticware. The material is inert and offers a polystyrene substrate familiar to cell biologists worldwide. The 3D geometry of the scaffold provides the environment in which cells grow, differentiate, and function to form close relationships with adjacent cells, thus creating the equivalent of a thin tissue layer in vitro. This chapter introduces the features required by a technology that enables routine 3D cell culture. Using Alvetex® as a product that satisfies these requirements, its application is demonstrated for the growth of a recognised cell line. Procedures detailing the use of Alvetex® for 3D cell culture are provided. This is followed by a series of detailed methods describing ways to analyse such cultures, including histological techniques, immunocytochemistry, and scanning electron microscopy. Examples of data generated from these methods are shown in the corresponding figures. Additional notes are also included where further information about certain procedures is required. The use of Alvetex® in combination with these methods will enable investigators to routinely produce complex 3D cultures to research the growth, differentiation, and function of cells in new ways.
Article
With the release of the landmark report Toxicity Testing in the 21st Century: A Vision and a Strategy, the U.S. National Academy of Sciences, in 2007, precipitated a major change in the way toxicity testing is conducted. It envisions increased efficiency in toxicity testing and decreased animal usage by transitioning from current expensive and lengthy in vivo testing with qualitative endpoints to in vitro toxicity pathway assays on human cells or cell lines using robotic high-throughput screening with mechanistic quantitative parameters. Risk assessment in the exposed human population would focus on avoiding significant perturbations in these toxicity pathways. Computational systems biology models would be implemented to determine the dose-response models of perturbations of pathway function. Extrapolation of in vitro results to in vivo human blood and tissue concentrations would be based on pharmacokinetic models for the given exposure condition. This practice would enhance human relevance of test results, and would cover several test agents, compared to traditional toxicological testing strategies. As all the tools that are necessary to implement the vision are currently available or in an advanced stage of development, the key prerequisites to achieving this paradigm shift are a commitment to change in the scientific community, which could be facilitated by a broad discussion of the vision, and obtaining necessary resources to enhance current knowledge of pathway perturbations and pathway assays in humans and to implement computational systems biology models. Implementation of these strategies would result in a new toxicity testing paradigm firmly based on human biology.
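One concrete piece of the envisioned paradigm is fitting quantitative dose-response models to in vitro pathway-assay data and deriving a point of departure. The sketch below fits a Hill curve to simulated concentration-response data with scipy and reads off a benchmark concentration; the data, parameters and the choice of a BMC10 are illustrative only.

```python
# Fit a Hill model to simulated in vitro concentration-response data and
# derive a benchmark concentration (BMC10). Data and parameters are invented.
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, bottom, top, ec50, n):
    return bottom + (top - bottom) / (1.0 + (ec50 / conc) ** n)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0, 30.0])   # uM
resp = hill(conc, 0, 100, 1.2, 1.5) + np.random.default_rng(2).normal(0, 3, conc.size)

(bottom, top, ec50, n), _ = curve_fit(hill, conc, resp, p0=[0, 100, 1.0, 1.0])
bmc10 = ec50 * (10.0 / 90.0) ** (1.0 / n)   # concentration giving 10% of maximal response
print(f"EC50 ~ {ec50:.2f} uM, BMC10 ~ {bmc10:.2f} uM")
```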
Article
Recent years have witnessed impressive advances in the use of magnetic resonance imaging (MRI) for the assessment of patients with multiple sclerosis (MS). Complementary to the clinical evaluation, conventional MRI provides crucial pieces of information for the diagnosis of MS. However, the correlation between the burden of lesions observed on conventional MRI scans and the clinical manifestations of the disease remains weak. The discrepancy between clinical and conventional MRI findings in MS is explained, at least partially, by the limited ability of conventional MRI to characterize and quantify the heterogeneous features of MS pathology. Other quantitative MR-based techniques, however, have the potential to overcome such a limitation of conventional MRI. Indeed, magnetization transfer MRI, diffusion tensor MRI, proton MR spectroscopy, and functional MRI are contributing to elucidate the mechanisms that underlie injury, repair, and functional adaptation in patients with MS. Such techniques are likely to benefit from the use of high-field MR systems and thus allow in the near future providing additional insight into all these aspects of the disease. This review summarizes how MRI is dramatically changing our understanding of the factors associated with the accumulation of irreversible disability in MS and highlights the reasons why they should be used more extensively in studies of disease evolution and clinical trials.
Article
The advent of induced pluripotent stem cells (iPSCs) can overcome some of the current limitations of hESC-based cell therapy. For instance, using genetic engineering, the Yamanaka group manipulated murine somatic cells through the expression of four transcription factors (TFs) Oct4, Klf4, Sox2, and c-Myc, and reprogrammed the cells to a pluripotent state. (2) Similar genetic manipulations subsequently resulted in the generation of human iPSCs. (3) Efforts in many labs are now concentrated on applying the iPSC technology to tissue-replacement therapies as well as on modeling diseases in vitro. (4) Other advances include generating iPSCs from patients with different diseases in an effort to cause the in vitro differentiation of these iPSCs into the cell types affected by the disease. (5) The use of iPSCs also helps in gaining significant insights into understanding the mechanisms underlying pluripotency and differentiation. However, before iPSCs can be clinically relevant, several limitations, including the use of viral vectors and the slow kinetics and low efficiency of induction, need to be addressed. (6) One of the most critical issues is the presence of transgenes in the iPSCs. Typically, iPSCs are generated by transducing somatic cells with transgenes, which are integrated within the cell's genome, by using retroviruses or lentiviruses. The integrated transgenes are silenced, and the endogenous genes encoding the TFs are activated. However, transgene reactivation (especially c-Myc) poses a significant threat as it can lead to tumorigenesis. (7) Furthermore, the erroneous expression of these
Article
Animal testing is still compulsory worldwide, for the approval of drugs and chemicals produced in large quantities. Computer-assisted (in silico) technologies are considered to be efficient alternatives to in vivo experiments, and are therefore endorsed by many regulatory agencies, e.g. for use in the European REACH initiative. Advantages of in silico methods include: the possible study of hypothetical compounds; their low cost; and the fact that such virtual experiments are typically based on human data, thus making the question of interspecies transferability obsolete. Since the mid-1990s, computer-based technologies have become an indispensable tool in drug discovery - used primarily to identify small molecules displaying a stereospecific and selective binding to a regulatory macromolecule. Since toxic effects are still responsible for some 20% of the late-stage failures, there is a continuing need for in silico concepts which can be used to estimate a compound's ADMET (absorption, distribution, metabolism, elimination, toxicity) properties - in particular, toxicity. The aim of this paper is to provide an insight into computational technologies that allow for the prediction of toxic effects triggered by pharmaceuticals. As most adverse and toxic effects are mediated by unwanted interactions with macromolecules involved in biological regulatory systems, we have focused on methodologies that are based on three-dimensional models of small molecules binding to such entities, and discuss the results at the molecular level.
Article
With the ever increasing volume of data available to scientists in drug discovery and development, the opportunity to leverage an increasing amount of these data in the assessment of drug safety is clear. The challenge in an environment of increasing data volume is in the structuring and the analysis of these data, such that decisions can be made without excluding information or overstating their meaning. Informatics and modelling play a crucial role in addressing this challenge in two basic ways: a) the data are structured and analysed in a transparent and objective way; and b) new experiments are designed with the model as part of the design process, much like modern experimental physics. Enhancing the use and impact of informatics and modelling on drug discovery is not simply a matter of increasing processor speed and memory capacity. The transformation of raw data to usable, and useful, information is a scientific, technical and, perhaps most importantly, cultural challenge within drug discovery. This review will highlight some of the history, current approaches and promising future directions in this rapidly expanding area.
Article
Proton magnetic resonance spectroscopy is a powerful tool for in vivo biochemical characterization of normal and abnormal tissues. The initial application in the abdomen was the measurement of fat concentration in the liver using chemical shift imaging. The success of chemical shift imaging in providing a semiquantitative measure of liver fat concentration led to the application of the more quantitative single-voxel volume-selective spectroscopy of the liver. This single-voxel volume-selective spectroscopic technique is able to characterize the different lipids and metabolites present in the liver and the pancreas, providing information about the ratio of unsaturated and saturated lipids. The purposes of this article were to review the spectroscopic techniques and to discuss some of the clinical applications of these techniques in the abdomen.
Article
In the past decade, high-content screening has become a highly developed approach to obtaining richly descriptive quantitative phenotypic data using automated microscopy. From early use in drug screening, the technique has evolved to embrace a diverse range of applications in both academic and industrial sectors and is now widely recognized as providing an efficient and effective approach to large-scale programs investigating cell biology in situ and in context. (Journal of Biomolecular Screening 2010: 1-9)
Article
With the advent of induced pluripotent stem cell (iPSC) technology, it is now feasible to generate iPSCs with a defined genotype or disease state. When coupled with direct differentiation to a defined lineage, such as hepatic endoderm (HE), iPSCs would revolutionize the way we study human liver biology and generate efficient "off the shelf" models of human liver disease. Here, we show the "proof of concept" that iPSC lines representing both male and female sexes and two ethnic origins can be differentiated to HE at efficiencies of between 70%-90%, using a method mimicking physiologically relevant conditions. The iPSC-derived HE exhibited hepatic morphology and expressed the hepatic markers albumin and E-cadherin, as assessed by immunohistochemistry. They also expressed alpha-fetoprotein, hepatocyte nuclear factor-4a, and a metabolic marker, cytochrome P450 7A1 (Cyp7A1), demonstrating a definitive endodermal lineage differentiation. Furthermore, iPSC-derived hepatocytes produced and secreted the plasma proteins, fibrinogen, fibronectin, transthyretin, and alpha-fetoprotein, an essential feature for functional HE. Additionally, iPSC-derived HE supported both CYP1A2 and CYP3A4 metabolism, which is essential for drug and toxicology testing. Conclusion: This work is first to demonstrate the efficient generation of hepatic endodermal lineage from human iPSCs that exhibits key attributes of hepatocytes, and the potential application of iPSC-derived HE in studying human liver biology. In particular, iPSCs from individuals representing highly polymorphic variants in metabolic genes and different ethnic groups will provide pharmaceutical development and toxicology studies a unique opportunity to revolutionize predictive drug toxicology assays and allow the creation of in vitro hepatic disease models.
Article
The application of the Integrated Discrete Multiple Organ Co-culture (IdMOC) system in the evaluation of organ-specific toxicity is reviewed. In vitro approaches to predict in vivo toxicity have met with limited success, mainly because of the complexity of in vivo toxic responses. In vivo properties that are not well-represented in vitro include organ-specific responses, multiple organ metabolism, and multiple organ interactions. The IdMOC system has been developed to address these deficiencies. The system uses a 'wells-within-a-well' concept for the co-culturing of cells or tissue slices from different organs as physically separated (discrete) entities in the small inner wells. These inner wells are nevertheless interconnected (integrated) by overlying culture medium in the large outer containing well. The IdMOC system thereby models the in vivo situation, in which multiple organs are physically separated but interconnected by the systemic circulation, permitting multiple organ interactions. The IdMOC system, with either cells or tissue slices from multiple organs, can be used to evaluate cell type-specific or organ-specific toxicity.
Article
In this work, we review and comment upon the challenges and the 'quo vadis' in Alzheimer's disease drug discovery at the beginning of the new millennium. We emphasize recent approaches that, moving on from a target-centric approach, have produced innovative molecular probes or drug candidates. In particular, the discovery of endosome-targeted BACE1 inhibitors and mitochondria-targeted antioxidants represents a significant advance in Alzheimer's research and therapy. The case study of the development of rasagiline provides an excellent example to support the validity of the multitarget-designed ligand approach to the search for effective medicines for combating Alzheimer's disease.
Article
Spontaneous bursting activity is present in vivo during CNS development and in vitro in neocortex slices. A prerequisite for understanding the cooperative behavior in neuronal ensembles is large-scale simultaneous extracellular electrophysiology, by using either "tetrodes" (4-wire electrodes) in awake animals or multi-electrode arrays (MEA) in long-term cultured networks, as we did here. We show that from a single low-noise MEA electrode it is possible to identify up to 3-4 types of waveforms whose time stamps show excitatory and inhibitory short-latency (2-4 ms) cross-correlations, indicative of monosynaptic connections. Moreover, the autocorrelograms (AC) of the MEA units were found to have behaviors similar to those demonstrated in vivo by using tetrodes or shanks. Principal component analysis of the ACs followed by K-means classification returned 3-4 different clusters whose firing- and burst-related properties were typical of assemblies of putative excitatory and inhibitory neurons. By manipulating the networks with a GABA(A) antagonist (gabazine), we could detect cell groups selectively responding to blockade of GABA transmission, with IC50s of 82 ± 2 and 770 ± 70 nM. These methods, expanded to organotypic co-cultures of CNS regions, may be useful to better understand their connecting properties in studies of regenerative medicine.
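The classification step described above (principal component analysis of unit autocorrelograms followed by K-means clustering) can be sketched with scikit-learn as below. The autocorrelograms are simulated here; in practice they would be computed from the spike time stamps recorded on the MEA.

```python
# PCA of unit autocorrelograms followed by K-means clustering, on simulated
# autocorrelograms (one "bursty" and one "regular" firing group).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
lags = np.arange(1, 101)                                        # 1-100 ms lags
bursty  = np.exp(-lags / 10.0)[None, :] * rng.uniform(0.8, 1.2, (30, 1))
regular = (1 - np.exp(-lags / 30.0))[None, :] * rng.uniform(0.8, 1.2, (30, 1))
acgs = np.vstack([bursty, regular]) + rng.normal(0.0, 0.02, (60, 100))

scores = PCA(n_components=3).fit_transform(acgs)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
print("cluster sizes:", np.bincount(labels))                    # expect roughly 30/30
```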