Chapter

Paving the way for personalized medicine: FDA's role in a new era of medical product development

... Single-arm designs with non-inferiority and superiority analyses are optimal for proof-of-concept and de-escalation studies in oncology. KEYWORDS: time-to-event, non-inferiority, single-arm, phase II, clinical trial, superiority. 1 Introduction. Molecular diagnostics and biomarkers have enabled many cancers to be divided into clinical and biological subtypes, some of which have a low risk of relapse or death (1)(2)(3)(4). In patients with low-risk breast cancer, de-escalation trials are increasingly being conducted to evaluate therapies that aim to improve quality of life by avoiding overtreatment (1,5,6). ...
... Single-arm studies with a time-to-event primary endpoint usually include a superiority analysis that aims to show that the survival outcome (e.g., median PFS [mPFS]) with a certain treatment is greater than that estimated for an active control arm (mPFS0) in a previous trial (34). Conversely, the risk of progression or death with the treatment, represented by a hazard rate λ equal to the natural logarithm of 2 divided by the median, λ = ln(2)/mPFS, is expected to be lower than the risk of progression or death in the active control arm (λ0) (34). In contrast to such superiority analyses, a non-inferiority analysis aims to show that the effect of a test drug in terms of survival is not inferior to that of the historical comparator by more than a specified amount called the non-inferiority margin (NIM) (29). ...
... Continuing with this example, suppose that by the end of the study 66 patients have been accrued, 54 PFS events have occurred, and the final mPFS is 12 months, giving an observed hazard rate λ_obs = ln(2)/12 ≈ 0.058. Based on an NIM of 1.2, the non-inferiority null hypothesis H0 corresponds to an mPFS of mPFS0/NIM = 12/1.2 = 10 months, which is equivalent to a non-inferiority hazard rate λ_NI = ln(2)/10 ≈ 0.069. Final statistical analyses for the superiority and non-inferiority objectives are performed using the maximum likelihood method for exponential distributions as follows (34) ...
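The arithmetic in the excerpt above can be reproduced in a few lines. Below is a minimal Python sketch, assuming one-sided Wald tests on the log-hazard scale with Var(log λ̂) ≈ 1/d for d observed events; this is a stand-in for the paper's exact maximum-likelihood formulas, and the inputs are the excerpt's worked numbers, not real trial data.

import math

# Worked numbers from the excerpt above.
d = 54                       # observed PFS events
mpfs_obs = 12.0              # observed median PFS (months)
mpfs_hist = 12.0             # historical control median PFS (mPFS0)
nim = 1.2                    # non-inferiority margin on the hazard scale

# Exponential model: hazard rate = ln(2) / median survival time.
lam_obs = math.log(2) / mpfs_obs           # ~0.058 per month
lam_hist = math.log(2) / mpfs_hist         # control hazard (lambda0)
lam_ni = math.log(2) / (mpfs_hist / nim)   # NI boundary, ln(2)/10 ~ 0.069

def one_sided_p(z):
    # Upper-tail p-value of a standard normal, stdlib only.
    return 0.5 * math.erfc(z / math.sqrt(2))

# One-sided Wald tests on the log-hazard scale, assuming
# Var(log lambda_hat) ~ 1/d (an assumption standing in for the
# paper's exact maximum-likelihood analysis).
z_ni = math.sqrt(d) * math.log(lam_ni / lam_obs)     # H0: lambda >= lambda_NI
z_sup = math.sqrt(d) * math.log(lam_hist / lam_obs)  # H0: lambda >= lambda0

print(f"lambda_obs = {lam_obs:.3f}, lambda_NI = {lam_ni:.3f}")
print(f"non-inferiority: z = {z_ni:.2f}, one-sided p = {one_sided_p(z_ni):.3f}")
print(f"superiority:     z = {z_sup:.2f}, one-sided p = {one_sided_p(z_sup):.3f}")

With these numbers the superiority statistic is exactly zero and the non-inferiority statistic is about 1.34 (one-sided p ≈ 0.09), so under this simplified test neither objective would be met at a one-sided 2.5% level; the point is only to show how the margin separates the two hypotheses.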
Article
Full-text available
De-escalation trials in oncology evaluate therapies that aim to improve the quality of life of patients with low-risk cancer by avoiding overtreatment. Non-inferiority randomized trials are commonly used to investigate de-intensified regimens with similar efficacy to that of standard regimens but with fewer adverse effects (ESMO evidence tier A). In cases where it is not feasible to recruit the number of patients needed for a randomized trial, single-arm prospective studies with a hypothesis of non-inferiority can be conducted as an alternative. Single-arm studies are also commonly used to evaluate novel treatment strategies (ESMO evidence tier B). A single-arm design that includes both non-inferiority and superiority primary objectives will enable the ranking of clinical activity and other parameters such as safety, pharmacokinetics, and pharmacodynamics data. Here, we describe the statistical principles and procedures to support such a strategy. The non-inferiority margin is calculated using the fixed margin method. Sample size and statistical analyses are based on the maximum likelihood method for exponential distributions. We present example analyses in metastatic and adjuvant settings to illustrate the usefulness of our methodology. We also explain its implementation with nonparametric methods. Single-arm designs with non-inferiority and superiority analyses are optimal for proof-of-concept and de-escalation studies in oncology.
... In addition, it seems that there is no consensus about the definition of PM [6][7][8], as two main definitions can be distinguished in the literature. These are partially overlapping and describe PM either as a patient-centered approach (i.e., adjusting treatment to the specificities of an individual patient) [9][10][11][12] or as a genomics-based treatment strategy [6,11,13]. ...
... For those most familiar with the medical field, the prominent presence of genetic aspects in the definitions of both PM and TT could be related to a technical definition of PM [6,8]. On the other hand, the definitions given by the less medically educated groups ignored genetics-related elements, focusing on the humanistic aspects of PM [10][11][12]. Thus, in line with the literature, we hypothesized that the particularly positive attitudes toward TT among the most familiar students derived from the "actionability" of medical concepts [32]. ...
Article
Full-text available
Personalized medicine (PM) is increasingly becoming a topic of discussion in public health policies and media. However, there is no consensus in the scientific literature on the definition of PM or the terms used to designate it, with some definitions emphasizing patient-centered aspects and others emphasizing biomedical aspects. Furthermore, terms used to refer to PM (e.g., “pharmacogenomics” or, more often, “targeted therapies”) are diverse and used differently. To our knowledge, no study has examined how definitions of and attitudes toward personalized medicine and targeted therapies differ according to level of familiarity with the medical field. Our cohort included 349 French students from three different academic fields, which modulated their level of familiarity with the medical field. They were asked to associate words with either “personalized medicine” or “targeted therapies”. Then, they were asked to give an emotional valence to their associations. Results showed that nonfamiliar students perceived PM as more positive than targeted therapies (TT), whereas familiar students showed no difference. Only familiar students defined PM and TT with technical aspects such as genetics or immunology. Further studies are needed in the field to determine which other factors could influence the definitions of PM and TT and how these definitions could have an impact in a clinical setting.
... [2] Continuous manufacturing can save expensive active pharmaceutical ingredients (APIs) for the design of experiments (DOE) during development to support a quality-by-design filing. [3] Continuous manufacturing also generally reduces the production time because there is less downtime as compared with the traditional batch processes. [4] Tablets are the most commonly used solid dosage forms in the pharmaceutical industry. ...
Article
Full-text available
The mixing and drying behavior in a continuous fluidized bed dryer were investigated experimentally by characterizing the residence time distribution (RTD) and incorporating a micromixing model together with the drying kinetics obtained from batch drying. The RTD of the dryer was modeled using a tank-in-series model. It was found that a high initial material loading and a low material flow rate resulted in a reduced peak height and a broadened peak width of the RTD curve. To predict the continuous dryer effluent moisture content, we combined: (a) the drying kinetics as determined in a batch fluidized bed dryer, (b) the RTD model, and (c) micromixing models, namely the segregation and maximum mixedness models. It was found that the segregation model overpredicted the effluent moisture content by up to 5% for the cases we studied, while the maximum mixedness model gave a good prediction of the effluent moisture content.
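To make the RTD-plus-micromixing approach concrete, here is a minimal Python sketch of the segregation-model calculation described above: a tank-in-series RTD is combined with batch drying kinetics by integrating X_batch(t)·E(t) over time. The tank count, residence time, and first-order drying constants are hypothetical placeholders, not the study's fitted values.

import numpy as np
from math import factorial

def rtd(t, n, tau):
    # Tank-in-series residence time distribution: n equal tanks,
    # total mean residence time tau (each tank has mean tau/n).
    ti = tau / n
    return t ** (n - 1) * np.exp(-t / ti) / (factorial(n - 1) * ti ** n)

def batch_moisture(t, x0=0.25, x_eq=0.02, k=0.15):
    # Hypothetical first-order batch drying curve, standing in for the
    # kinetics measured in the batch fluidized bed dryer.
    return x_eq + (x0 - x_eq) * np.exp(-k * t)

def trapz(y, x):
    # Simple trapezoidal integration, independent of numpy version.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

t = np.linspace(0.0, 300.0, 6001)   # time grid, minutes
e = rtd(t, n=4, tau=30.0)           # assumed RTD fit
# Segregation model: effluent moisture is the RTD-weighted average of
# fluid elements that each dried as an independent batch for time t.
x_out = trapz(batch_moisture(t) * e, t)
print(f"RTD area = {trapz(e, t):.3f} (should be ~1)")
print(f"segregation-model effluent moisture = {x_out:.4f} kg water/kg solid")

The segregation and maximum mixedness models bound the true micromixing behavior; per the abstract, the maximum mixedness variant predicted the effluent moisture better for the cases studied.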
... While the potential benefits of such biomarker-guided therapy are seemingly substantial, discovery of prognostic and predictive markers from signal in the presence of many noise covariates remains a challenge (Food and Drug Administration, 2013). Retrospective reviews in oncology are difficult because treatment assignments often depend on prognostic characteristics of the treated patients. ...
Article
Full-text available
The evolution of “informatics” technologies has the potential to generate massive databases, but the extent to which personalized medicine may be effectuated depends on the extent to which these rich databases may be utilized to advance understanding of the disease molecular profiles and ultimately integrated for treatment selection, necessitating robust methodology for dimension reduction. Yet, statistical methods proposed to address challenges arising with the high‐dimensionality of omics‐type data predominately rely on linear models and emphasize associations deriving from prognostic biomarkers. Existing methods are often limited for discovering predictive biomarkers that interact with treatment and fail to elucidate the predictive power of their resultant selection rules. In this article, we present a Bayesian predictive method for personalized treatment selection that is devised to integrate both the treatment predictive and disease prognostic characteristics of a particular patient's disease. The method appropriately characterizes the structural constraints inherent to prognostic and predictive biomarkers, and hence properly utilizes these complementary sources of information for treatment selection. The methodology is illustrated through a case study of lower grade glioma. Theoretical considerations are explored to demonstrate the manner in which treatment selection is impacted by prognostic features. Additionally, simulations based on an actual leukemia study are provided to ascertain the method's performance with respect to selection rules derived from competing methods.
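As a rough illustration of the idea of integrating prognostic and predictive information, the Python sketch below fits a conjugate Bayesian linear model with prognostic main effects and treatment-interaction (predictive) terms on invented simulated data, then selects the arm with the higher posterior predictive mean for a new patient. This is a simplified stand-in, not the authors' method, which imposes additional structural constraints on the two kinds of biomarkers.

import numpy as np

rng = np.random.default_rng(0)

# Invented data: x = biomarker values, t = treatment arm (0/1), y = outcome.
n, p = 200, 3
x = rng.normal(size=(n, p))
t = rng.integers(0, 2, size=n)
# Simulation truth: biomarker 0 is prognostic (affects outcome on both arms),
# biomarker 1 is predictive (interacts with treatment).
y = 1.0 * x[:, 0] + t * (0.8 * x[:, 1] - 0.2) + rng.normal(scale=0.5, size=n)

# Design matrix with prognostic main effects and treatment-interaction
# (predictive) terms, mirroring the prognostic/predictive split.
design = np.column_stack([x, t[:, None] * x, t])

# Conjugate Bayesian linear regression (known noise variance, N(0, tau2) prior).
sigma2, tau2 = 0.25, 10.0
precision = design.T @ design / sigma2 + np.eye(design.shape[1]) / tau2
cov = np.linalg.inv(precision)
mean = cov @ design.T @ y / sigma2

# Treatment selection for a new patient: choose the arm with the higher
# posterior predictive mean outcome.
x_new = np.array([0.5, 1.2, -0.3])

def posterior_mean_outcome(trt):
    row = np.concatenate([x_new, trt * x_new, [trt]])
    return float(row @ mean)

print("E[y | t=0] =", round(posterior_mean_outcome(0), 3))
print("E[y | t=1] =", round(posterior_mean_outcome(1), 3))
print("selected arm:", int(posterior_mean_outcome(1) > posterior_mean_outcome(0)))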
... Despite considerable efforts to develop and prescribe effective therapies, participants in the development of modern medicine have been unable to eliminate a major irony: that many people end up taking medications that do not help them or that help them very little. 1 One reason for this incongruity is that most clinical trials are primarily designed to assess average rather than individual responses to therapies. 1 Thus, research methods that focus on individuals rather than averages are attracting increasing attention. [1][2][3][4] Garnering special interest are data-analysis methods that are appropriate for investigating individual benefits of medical or behavioral treatments (MBTs) for chronic diseases. [5][6][7][8][9][10][11][12] In line with this, we propose a new approach below. ...
Article
Full-text available
There is a need for statistical methods appropriate for the analysis of clinical trials from a personalized-medicine viewpoint as opposed to the common statistical practice that simply examines average treatment effects. This article proposes an approach to quantifying, reporting and analyzing individual benefits of medical or behavioral treatments to severely ill patients with chronic conditions, using data from clinical trials. The approach is a new development of a published framework for measuring the severity of a chronic disease and the benefits treatments provide to individuals, which utilizes regression models with random coefficients. Here, a patient is considered to be severely ill if the patient’s basal severity is close to one. This allows the derivation of a very flexible family of probability distributions of individual benefits that depend on treatment duration and the covariates included in the regression model. Our approach may enrich the statistical analysis of clinical trials of severely ill patients because it allows investigating the probability distribution of individual benefits in the patient population and the variables that influence it, and we can also measure the benefits achieved in specific patients including new patients. We illustrate our approach using data from a clinical trial of the anti-depressant imipramine.
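A hedged sketch of the random-coefficient machinery this framework builds on: using statsmodels' MixedLM on invented longitudinal severity data, a random intercept and slope are estimated per patient, and an individual benefit over a treatment duration T is read off from the patient-specific slope. The severity scale, duration, and threshold here are hypothetical; the published framework defines benefit through a severity measure with basal severity near one, which this toy setup only mimics.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Invented longitudinal data: 40 severely ill patients, 6 visits each;
# severity starts near 1 and declines at a patient-specific rate.
n_pat, n_vis = 40, 6
pid = np.repeat(np.arange(n_pat), n_vis)
time = np.tile(np.arange(n_vis, dtype=float), n_pat)
true_slope = rng.normal(-0.08, 0.04, size=n_pat)
sev = 0.9 + true_slope[pid] * time + rng.normal(0, 0.05, size=pid.size)

# Regression with random coefficients: random intercept and slope per patient.
exog = sm.add_constant(time)
fit = sm.MixedLM(sev, exog, groups=pid, exog_re=exog).fit()

# Individual benefit over duration T: predicted drop in severity,
# combining the fixed slope with each patient's random slope.
T = 5.0
fixed_slope = np.asarray(fit.fe_params)[1]
benefits = np.array([-(fixed_slope + np.asarray(re)[1]) * T
                     for re in fit.random_effects.values()])
print("mean individual benefit:", round(float(benefits.mean()), 3))
print("share of patients with benefit > 0.2:",
      round(float((benefits > 0.2).mean()), 2))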
... For example, a biomarker could be described as "prognostic" if it identifies a set of patients with acute myocardial infarction (AMI) at a greater risk of death, regardless of whether the death is due to the AMI (so-called "cardiac-related death") or not. "Predictive" resembles prognostic but is in fact synonymous with "drug-responsiveness" (or device-responsiveness) and is a term used by groups such as the Food and Drug Administration (FDA) to specifically imply an approach to divide patients based on their likelihood of responding to a drug or device [11]. Thus, one marker may be prognostic, in that it predicts the likelihood of having the outcome of interest (e.g., death). ...
Article
Full-text available
All of medicine aspires to be precise, where a greater understanding of individual data will lead to personalized treatment and improved outcomes. Prompted by specific examples in oncology, the field of critical care may be tempted to envision that complex, acute syndromes could bend to a similar reductionist philosophy—where single mutations could identify and target our critically ill patients for treatment. However, precision medicine faces many challenges in critical care. These include confusion about terminology, uncertainty about how to divide patients into discrete groups, the challenges of multi-morbidity, scale, and the need for timely interventions. This review addresses these challenges and provides a translational roadmap spanning preclinical work to identify putative treatment targets, novel designs for clinical trials, and the integration of the electronic health record to implement precision critical care for all.
... • Zenith Fenestrated AAA Endovascular Graft, indicated for the endovascular treatment of patients with abdominal aortic or aortoiliac aneurysms having morphology suitable for endovascular repair [2]. ...
... Everybody is different: research in N-of-1 interventions [57][24], personal (precision) medicine [22], and small data [19] demonstrates that every individual is unique based on their age, culture, childhood development, and life context. Therefore, personalized treatments or interventions catered to individual uniqueness should outperform one-size-fits-all approaches. ...
Chapter
Full-text available
Personal data acquisition using smartphones has become robust and achievable in recent times: improvements in user interfaces have made manual inputting more straightforward and intuitive, while advances in sensing technology have made tracking more accurate and less obtrusive. Moreover, algorithmic advances in data mining and machine learning have led to better interpretation and determination of factors indicative of health conditions and outcomes. However, these indicators are still under-utilized when providing feedback to the user or a health worker. Mobile health systems that can exploit such indicators could potentially deliver precision feedback personalized to the user’s condition, leading to increased adherence and improved efficacy. In this book chapter, we provide an overview of the state of the art in mobile health feedback systems and then discuss MyBehavior, an example of a feedback system that utilizes individual data streams and indicators. MyBehavior is the first personalized system that provides health-beneficial recommendations based on physical activity and dietary data acquired using smartphones. The system learns common healthy and unhealthy behaviors from activity and dietary logs, and then prioritizes and suggests actions similar to existing behaviors. Such prioritization is done to promote a sense of familiarity with the suggestions and increase the likelihood of adoption. We also formulate a basic framework for future systems similar to MyBehavior and discuss challenges with regard to transference and adaptation.
... A kind of "golden rule" for the final clinical validation of a CDx assay is to use only one version of the assay, which must be the final analytically validated version, and, furthermore, to use only one testing site in order to reduce possible interlaboratory variability [10,37,38]. In the drug-diagnostic codevelopment model, the pivotal phase II and III studies aim not only to demonstrate the safety and efficacy of the drug but also to clinically validate the CDx assay. In these studies, it must be demonstrated that the assay, with a relatively high probability, is able to select the responding patients. ...
Chapter
A companion diagnostic (CDx) assay is an in vitro diagnostic device that provides information that is essential for the safe and effective use of a corresponding drug. This type of assay is developed in close conjunction with the corresponding drug using the drug–diagnostic codevelopment model. CDx assays hold the promise of improving the predictability of the drug-development process and being an important tool for the clinical use of the drugs after regulatory approval. For some oncology drugs, the CDx assays have taken up a central role in the development process, and the success of this type of targeted drugs largely depends on the performance of these assays. CDx assays have the individual patient as a point of reference, and they will be decisive for the move toward a more precise and individualized pharmacotherapy.
... The amount of information that academic researchers and pharmaceutical practitioners have obtained through the use of genetic sequencing has ballooned over the past decade because the cost of these technologies has been declining constantly [5]. However, there is not yet a proper infrastructure for using the obtained information efficiently. ...
Thesis
The productivity of research and development in the bio-pharmaceutical industry has been declining constantly since the early 2000s. One possible reason is that biomedical projects are risky, take a long time, and require significant investment. Hence, substantial capital has shifted away from the bio-pharmaceutical industry to other industries that are perceived as less risky, creating a funding gap for early-stage pharmaceutical R&D. Here, we investigate and improve upon a novel financing technique that has been proposed to facilitate R&D funding in the bio-pharmaceutical industry. This new financing method is a clear example of rapidly evolving innovation in the financial industry, from which the bio-pharmaceutical industry can benefit tremendously. Apart from funding challenges, pharmaceutical companies have to clear regulatory hurdles before they can commercialize their treatments. These drug-regulatory standards require a specific balance of benefits vs. risks for a therapy to be approved, and do not currently take into account the severity of the disease that the therapy is targeting. In the second part of this thesis, we propose an objective and quantitative Bayesian decision analysis framework to incorporate patients' feedback into the drug-approval process, and propose adjustments to the approval standards based on disease severity. When launching a drug, pharmaceutical companies set its price such that expected revenues offset the costs of all the projects, failed or successful, that were pursued on the way to this successful treatment, resulting in costly therapies. Recently, some highly curative therapies with high price tags have emerged for diseases with large prevalence, such as hepatitis C. These high prices, coupled with the large size of the patient population, have created an unsupportable financial burden for insurance companies seeking to cover the broadest patient population that could benefit from these drugs. Despite delivering breakthrough discoveries, the pharmaceutical companies producing these drugs have experienced a public backlash over drug prices. In the last part of this dissertation, we introduce a new financing paradigm to address the issue of high aggregate costs for these highly curative therapies.
... Biomarkers have been an important tool in this transition, and are often presented as a revolutionary new technology used for patient assessment to help determine predispositions to particular types of cancer, to screen and diagnose cancer types and stages, to estimate the disease prognosis, to predict the most effective course of treatment, and to monitor cancer recurrence [1]. There is a noticeable techno-optimism regarding cancer biomarkers, especially in policy reports where they are anticipated to facilitate a higher-quality, safer and more efficient treatment of cancer while decreasing health care costs [2][3][4]. ...
Article
Full-text available
Cancer biomarkers represent a revolutionary advance toward personalised cancer treatment, promising therapies that are tailored to subgroups of patients sharing similar generic traits. Notwithstanding the optimism driving this development, biomarkers also present an array of social and ethical questions, as witnessed in sporadic debates across different literatures. This review article seeks to consolidate these debates in a mapping of the complex terrain of ethical and social aspects of cancer biomarker research. This mapping was undertaken from the vantage point offered by a working cancer biomarker research centre, the Centre for Cancer Biomarkers (CCBIO) in Norway, according to a dialectic move between the literature and discussions with researchers and practitioners in the laboratory. Starting in the lab, we found that, with the exception of some classical bioethical dilemmas, researchers regarded many issues as relative to the ethos of the biomarker community: how the complexity and uncertainty characterising biomarker research influence their scientific norms of quality. Such challenges to the ethos of cancer research remain largely implicit, outside the scope of formal bioethical enquiry, yet form the basis for other social and ethical issues. Looking out from the lab, we see how questions of complexity, uncertainty and quality contribute to debates around social and global justice: undermining policies for the prioritisation of care, contributing to the stratification of those patients worthy of treatment, and limiting global access to this highly sophisticated research. We go on to discuss biomarker research within the culturally-constructed 'war on cancer' and highlight an important tension between the expectations of 'magic bullets' and the complexity and uncertainty faced in the lab. We conclude by arguing, with researchers in the CCBIO, for greater reflexivity and humility in cancer biomarker research and policy.
... A well-known example is HercepTest, the companion diagnostic for HER2-positive breast cancer and gastric cancer, which identifies patients eligible for trastuzumab treatment [10]. Other examples of Food and Drug Administration (FDA)-approved drugs with companion diagnostics include cetuximab, imatinib, and vemurafenib, which are used to treat metastatic colorectal cancer, gastrointestinal stromal tumor, and late-stage melanoma, respectively [11,12]. With the emergence of new molecular technologies identifying tumor aberrations that can be treated with targeted agents, the number of companion diagnostic tests used in oncology will significantly increase in the future. Companion diagnostics has the potential to enable the selection of the correct drug dose at the appropriate time in a patient's treatment course, thereby reducing overall therapy cost. ...
Article
Full-text available
Background: In vitro diagnostic (IVD) investigations are indispensable for routine patient management. Appropriate testing allows early-stage interventions, reducing late-stage healthcare expenditure (HCE). Aim: To investigate HCE on IVDs in two developed markets and to assess the perceived value of IVDs in clinical decision-making. Physician-perceived HCE on IVD was evaluated, as well as desired features of new diagnostic markers. Methods: Past and current HCE on IVD was calculated for the US and Germany. A total of 79 US/German oncologists and cardiologists were interviewed to assess the number of cases where: physicians ask for IVDs; IVDs are used for initial diagnosis, treatment monitoring, or post-treatment; and decision-making is based on an IVD test result. A sample of 201 US and German oncologists and cardiologists was questioned regarding the proportion of HCE they believed to be attributable to IVD testing. After disclosing the actual IVD HCE, the physicians' perception of the appropriateness of the amount was captured. Finally, the association between physician-rated impact of IVD on decision-making and perceived contribution of IVD expenditure to overall HCE was assessed. Results: IVD costs account for 2.3% and 1.4% of total HCE in the US and Germany, respectively. Most physicians (81%) believed that the actual HCE on IVDs was >5%; 19% rated the spending correctly (0-4%, p<0.001). When informed of the actual amount, 64% of physicians rated it as appropriate (p<0.0001); 66% of decision-making was based on IVD. Significantly more physicians asked for either additional clinical or combined clinical/health economic data than for the product (test/platform) alone (p<0.0001). Conclusions: Our results indicate poor awareness of actual HCE on IVD, but a high attributable value of diagnostic procedures for patient management. New markers should deliver actionable and medically relevant information to guide decision-making and foster improved patient outcomes.
... SCBIs use cells, which are not cleared from the body the way drugs are, and the interventions used are often patient-specific. But the FDA has a long history of adapting to new technologies and has already been developing pathways for regulating personalized medicine interventions, including drugs for specific disease-causing mutations (such as Kalydeco) and autologous vaccines (such as Provenge) [89]. Any regulatory approaches developed to address SC tourism must be clear and transparent [90]. ...
Article
Full-text available
Background In 2004, patient advocate groups were major players in helping pass and implement significant public policy and funding initiatives in stem cells and regenerative medicine. In the following years, advocates were also actively engaged in Washington DC, encouraging policy makers to broaden embryonic stem cell research funding, which was ultimately passed after President Barack Obama came into office. Many advocates did this because they were told stem cell research would lead to cures. After waiting more than 10 years, many of these same patients are now approaching clinics around the world offering experimental stem cell-based interventions instead of waiting for scientists in the US to complete clinical trials. How did the same groups who were once (and often still are) the strongest supporters of stem cell research become stem cell tourists? And how can scientists, clinicians, and regulators work to bring stem cell patients back home to the US and into the clinical trial process? Discussion In this paper, we argue that the continued marketing and use of experimental stem cell-based interventions is problematic and unsustainable. Central problems include the lack of patient protection, US liability standards, regulation of clinical sites, and clinician licensing. These interventions have insufficient evidence of safety and efficacy; patients may be wasting money and time, and they may be forgoing other opportunities for an intervention that has not been shown to be safe and effective. Current practices do not contribute to scientific progress because the data from the procedures are unsuitable for follow-up research to measure outcomes. In addition, there is no assurance for patients that they are receiving the interventions promised or of what dosage they are receiving. Furthermore, there is inconsistent or non-existent follow-up care. Public policy should be developed to correct the current situation. Conclusion The current landscape of stem cell tourism should prompt a re-evaluation of current approaches to study cell-based interventions with respect to the design, initiation, and conduct of US clinical trials. Stakeholders, including scientists, clinicians, regulators and patient advocates, need to work together to find a compromise to keep patients in the US and within the clinical trial process. Using HIV/AIDS and breast cancer advocate cases as examples, we identify key priorities and goals for this policy effort.
... The "personalization" of medicine is an area that is receiving ever-growing attention from medical practitioners, statisticians, and computer scientists, to name but some of the interested parties. The U.S. Food and Drug Administration refers to this personalization as "tailoring medical treatment to […] a patient's genetic, anatomical, and physiological characteristics", approaches which are "allowing patients to be treated and monitored more precisely and effectively and in ways that better meet their individual needs" [1]. Within the statistics literature, there have been a great variety of methods proposed for estimating optimal personalized treatment strategies, also known as dynamic treatment regimens (DTRs), from regression-based approaches (Schulte et al. [2] provide an excellent review) to classification-based algorithms (e.g. ...
Article
Full-text available
Individualized medicine is an area that is growing, both in clinical and statistical settings, where in the latter, personalized treatment strategies are often referred to as dynamic treatment regimens. Estimation of the optimal dynamic treatment regime has focused primarily on semi-parametric approaches, some of which are said to be doubly robust in that they give rise to consistent estimators provided at least one of two models is correctly specified. In particular, the locally efficient doubly robust g-estimation is robust to misspecification of the treatment-free outcome model so long as the propensity model is specified correctly, at the cost of an increase in variability. In this paper, we propose data-adaptive weighting schemes that serve to decrease the impact of influential points and thus stabilize the estimator. In doing so, we provide a doubly robust g-estimator that is also robust in the sense of Hampel (15).
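For readers unfamiliar with g-estimation, the following Python sketch shows the single-stage, constant-effect special case on simulated data: a propensity model and a treatment-free outcome model are fit, and the blip parameter ψ is solved in closed form. The estimate is consistent if either working model is correct (double robustness). The data, models, and effect size are invented, and the paper's data-adaptive weighting against influential points is deliberately omitted.

import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(2)

# Simulated single-stage data: confounder x, binary treatment a, outcome y.
n = 2000
x = rng.normal(size=n)
a = rng.binomial(1, 1 / (1 + np.exp(-0.5 * x)))  # confounded treatment
psi_true = 1.5                                   # true blip (treatment effect)
y = 2.0 + x + psi_true * a + rng.normal(size=n)

# Working models: propensity e(x) and treatment-free outcome m(x),
# the latter fit on untreated subjects only.
e_hat = LogisticRegression().fit(x[:, None], a).predict_proba(x[:, None])[:, 1]
m_hat = LinearRegression().fit(x[a == 0, None], y[a == 0]).predict(x[:, None])

# G-estimation solves sum_i (a_i - e_i) * (y_i - psi * a_i - m_i) = 0,
# which has a closed form for a constant blip psi * a. Consistency holds
# if either e(x) or m(x) is correctly specified (double robustness).
resid_a = a - e_hat
psi_hat = np.sum(resid_a * (y - m_hat)) / np.sum(resid_a * a)
print(f"g-estimate of psi: {psi_hat:.3f} (true value {psi_true})")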
... Similarly, a false-negative test result could withhold or delay a potentially beneficial treatment and thereby also put the patient at risk [20]. In oncology, an early and correct diagnosis and intervention are two elements of key importance in the treatment of cancer patients. In the case of a wrong treatment decision, the disease may become disseminated, with no or very low chances of cure. ...
Article
Companion diagnostics (CDx) is a positive step toward improving the drug development process, especially in the field of oncology with the advent of newer targeted therapies. It helps the oncologist decide on the choice of treatment for the individual patient. The role of CDx assays has attracted the attention of regulators; the US Food and Drug Administration in particular developed regulatory strategies for CDx and the drug-diagnostic codevelopment project. For an increasing number of cancer patients, treatment selection will depend on the result generated by a CDx assay, and consequently this type of assay has become critical for the care and safety of patients. In addition to the assay-based approach, molecular imaging, with its ability to image at the genetic and receptor level, has made forays into the field of drug development and personalized medicine. We review these aspects of CDx, with special focus on molecular imaging and the emerging concept of theranostics.
... Enrichment designs have been widely discussed as a way to establish treatment benefit in a selected (enriched) subpopulation [1-7] and are closely related to FDA's initiative on personalized medicine [8]. One purpose of selecting such an enriched population is better treatment response potential, for example the trastuzumab benefit in HER2+ breast cancer patients [9-12]. ...
Chapter
Additive manufacturing (AM) is a technology that has revolutionized how objects are fabricated. In recent years, the pharmaceutical industry has directed its attention to AM technologies, aiming to manufacture medications and devices with advantages such as customization, on-demand manufacturing, reduction of waste, and development of new medications and treatments in shorter times. This chapter presents the materials used for each AM technology, whose form and properties should be in good accordance with the process working principle. This defines the material selections used as the basis for developing pre-formulations for tablets or for fabricating drug delivery devices. Furthermore, an overview of the main applications and future perspectives of pharmaceutical manufacturing and personalized drug delivery is given. Keywords: Materials, Thermoplastic polymers, Hydrogels, Extrusion, Thermosets, Filaments, Pellets
Article
Full-text available
To maximize the potential of 3D printing, several technological issues must be overcome, especially in the case of semisolid extrusion (SSE), where the 3D model is built up through the well-defined extrusion of polymeric ink. Focusing on natural polymers, their physicochemical properties should be investigated in depth to evaluate 3D printability. This study aims to produce tablets that are easy to swallow, exploiting as 3D printing ink a crosslinked alginate hydrogel with added sorbitol, on the hypothesis that the plasticising action of sorbitol should improve the tablets' swallowability. Three different amounts of 70% sorbitol solution (5%, 15%, 25% v/v) are evaluated to identify the best composition in terms of both printing and technological properties. The data obtained from the chemical, rheological, and mechanical evaluation of each ink are related to the printing performance, to reach the best manufacturing reproducibility. A linear regression fitting study (r² = 0.9982) confirms the predictable relationship between the residual mass of the final platforms and the sorbitol amount loaded. To characterize the final products, all dried batches are subjected to technological evaluation, highlighting differences in matrix softability as well as in the swelling/erosion properties after interaction with biological fluids.
Article
Full-text available
Epigenomics is currently an area of wide discussion. It studies how external factors can promote genetic alterations with hereditary potential, and the pathogenicity of these changes. The objective of this work is to identify the role of tumor suppressor genes as biomarkers of breast cancer and to relate epigenetic mechanisms of breast cancer to precision medicine strategies. This is a literature review covering the period from 2010 to 2018, of articles indexed in the MEDLINE database and of theses and dissertations available on the Periódicos CAPES portal, in addition to informational materials from the Brazilian Ministry of Health, the National Cancer Institute, and the World Health Organization. The BRCA1 and BRCA2 genes play a crucial role in the suppression of breast tumors through DNA repair. Epigenetic alterations that lead to their silencing indicate genomic instability and suggest that alternative repair pathways are sought, which may favor carcinogenesis. Molecular evaluations have already made it possible to identify that certain alterations, especially in methylation patterns, are expressed differently across patients and their prognoses. The personalized approach uses this information to evaluate and treat patients more effectively. However, these studies are understood to be preliminary. New studies are needed to attest to the safety and applicability of these tools.
Chapter
Traditionally understood as individually tailored health care, in contrast to the "one size fits all" approach, personalised medicine over the last few decades has slowly been transforming into "genetically based" health care and pharmacogenetics, thereby attracting the spotlight of bioethics.
Article
Full-text available
Background Genomic medicine has paved the way for identifying biomarkers and therapeutically actionable targets for complex diseases, but is complicated by the involvement of thousands of variably expressed genes across multiple cell types. Single-cell RNA sequencing (scRNA-seq) allows the characterization of such complex changes in whole organs. Methods The study is based on applying network tools to organize and analyze scRNA-seq data from a mouse model of arthritis and human rheumatoid arthritis, in order to find diagnostic biomarkers and therapeutic targets. Diagnostic validation studies were performed using expression profiling data and potential protein biomarkers from prospective clinical studies of 13 diseases. A candidate drug was examined by a treatment study of a mouse model of arthritis, using phenotypic, immunohistochemical, and cellular analyses as read-outs. Results We performed the first systematic analysis of pathways, potential biomarkers, and drug targets in scRNA-seq data from a complex disease, starting with inflamed joints and lymph nodes from a mouse model of arthritis. We found the involvement of hundreds of pathways, biomarkers, and drug targets that differed greatly between cell types. Analyses of scRNA-seq and GWAS data from human rheumatoid arthritis (RA) supported a similar dispersion of pathogenic mechanisms in different cell types. Thus, systems-level approaches to prioritize biomarkers and drugs are needed. Here, we present a prioritization strategy that is based on constructing network models of disease-associated cell types and interactions using scRNA-seq data from our mouse model of arthritis, as well as human RA, which we term multicellular disease models (MCDMs). We find that the network centrality of MCDM cell types correlates with the enrichment of genes harboring genetic variants associated with RA and thus could potentially be used to prioritize cell types and genes for diagnostics and therapeutics. We validated this hypothesis in a large-scale study of patients with 13 different autoimmune, allergic, infectious, malignant, endocrine, metabolic, and cardiovascular diseases, as well as a therapeutic study of the mouse arthritis model. Conclusions Overall, our results support that our strategy has the potential to help prioritize diagnostic and therapeutic targets in human disease.
Article
Dissolution testing is a highly significant tool that adds in vivo relevance to in vitro analytical data, thus providing a realistic in vitro/in vivo correlation. Dissolution profiling serves as a predictor of biological performance, since the rate-limiting step in the absorption of any drug is the rate of release from its pharmaceutical formulation. As dissolution testing is a routine procedure in quality control laboratories for the initial approval process and scaling-up, the development of ever more eco-friendly methods for dissolution monitoring is a worldwide goal. In the present contribution, a comparison is drawn between the two analytical techniques of utmost importance in the acquisition of dissolution profiles, UV-spectrophotometry and HPLC–PDA, focusing on the greenness of each for further consolidation of the biowaiver concept. Both techniques were applied to the recently FDA-approved combination of naproxen sodium (NAPR) and diphenhydramine hydrochloride (DIPH) formulated as Aleve pm® tablets. For the first time, this binary mixture was analyzed by three UV-spectrophotometric methods. NAPR was directly determined by zero-order spectrophotometry at 330 nm, where the spectrum of DIPH shows zero contribution, whereas DIPH was determined by three simple methods exploiting the ratio spectra calculated using the spectrum of 20 µg/ml NAPR as a divisor: ratio difference (RD), ratio subtraction, and derivative ratio. Being the simplest method, RD was the spectrophotometric method of choice and was applied in monitoring the dissolution of DIPH from Aleve pm® tablets. Although the RD method has been widely described in publications dealing with pharmaceutical analysis, this is the first article to open a new horizon for the RD method as a real-life application rather than only a method for determination. The second technique was a previously published HPLC–PDA method, in which dissolution was monitored by calculating the peak area of each component drug over time. Method validation was performed in accordance with the ICH guidelines. The advantages and challenges of each technique are discussed in a side-by-side comparison, giving a key to recognizing which one can positively influence environmental well-being.
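To illustrate the ratio difference (RD) principle described in this abstract, here is a small Python sketch on simulated spectra: the mixture spectrum is divided by a fixed-concentration divisor spectrum of the interfering drug, and the difference between the ratio spectrum's amplitudes at two wavelengths cancels the divisor drug's contribution (a constant in the ratio spectrum), leaving a signal linear in the analyte. Band shapes, wavelengths, and concentrations are illustrative assumptions, not the paper's values.

import numpy as np

wl = np.arange(200.0, 350.0, 1.0)   # wavelength grid, nm

def band(center, width, height):
    # Gaussian absorption band (illustrative spectral shape).
    return height * np.exp(-0.5 * ((wl - center) / width) ** 2)

eps_napr = band(230, 15, 1.0) + band(330, 20, 0.3)   # NAPR absorptivity
eps_diph = band(218, 12, 1.2)                        # DIPH absorptivity

divisor = 20.0 * eps_napr   # spectrum of the 20 ug/ml NAPR divisor

def rd_amplitude(c_napr, c_diph, wl1=215.0, wl2=240.0):
    # Ratio spectrum = mixture / divisor; the RD reading is the difference
    # of its amplitudes at two wavelengths, which cancels the constant
    # NAPR term and is linear in the DIPH concentration.
    mix = c_napr * eps_napr + c_diph * eps_diph
    ratio = mix / divisor
    i1 = np.argmin(np.abs(wl - wl1))
    i2 = np.argmin(np.abs(wl - wl2))
    return ratio[i1] - ratio[i2]

# Calibration check: RD amplitude scales with DIPH regardless of NAPR level.
for c_diph in (5.0, 10.0, 20.0):
    print(c_diph, round(rd_amplitude(c_napr=15.0, c_diph=c_diph), 4))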
Article
The biophysical cues of endogenous origin, i.e., shear stress and electric field, are known to significantly modulate cell functionality in vitro. While this has been relatively well investigated in conventional petri dish culture, it is important to validate such important phenomena in a physiologically simulated cellular microenvironment. From this perspective, this review critically discusses the importance of lab-on-a-chip (LOC)-based microfluidic devices for probing this aspect, to develop insight toward applications in regenerative medicine. While reviewing several literature reports, an emphasis has been placed on unraveling the intriguing aspects of shear- and electric-field-modulated differentiation of stem cells in biomicrofluidic devices. Potential applications focusing on stem cell culture are emphasized in this article, as stem cells are the foundation of tissue regeneration. Several challenges in tissue regeneration and the introduction of personalized medicine could be addressed through microfluidic technology. The culturing of organ-specific multiple cell types within a lab-on-a-chip and the biophysical-stimulation-mediated activation of intracellular signal transduction under gradient shear/electric fields are highlighted in this review. Lay Summary: Conceptually, regenerative medicine is considered an emerging approach for treating traumatized or malfunctioning anatomical parts of patients with stem cells, to establish normal functionality of the tissue. The regenerated tissue should preferably be the patient's autologous tissue, grown under an artificially created in vivo-like physiological environment. Biomicrofluidic lab-on-a-chip technology enables in vitro cell/tissue engineering under endogenous cues, like shear and electric field. Therefore, this review discusses two aspects of regenerative medicine: autologous transplantation of cells/tissues to advance personalized regenerative medicine, and the recreation of organ-specific tissue under the influence of biophysical stimulation in an attempt to improve physiological functionality. Graphical abstract: Biomicrofluidic lab-on-a-chip devices have advanced tissue-regenerative approaches in the direction of personalised medicine.
Article
Full-text available
The diverse range of opportunities offered by ISEs has been broadly exploited in a number of pharmaceutical applications, with topics ranging from bioanalysis of drugs and metabolites to green chemical analysis, impurity profiling, and drug dissolution in biorelevant media. Attractively, solid-contact ISEs (SC-ISEs) have been implemented in many approaches due to their ability to offer stable and miniaturizable sensors. In this contribution, gold SC-ISEs were used in bare and thiol-doped forms for the analysis of two co-formulated drugs, namely naproxen sodium and diphenhydramine hydrochloride, in tablets. The unique ability of thiol to form a hydrophobic self-assembled monolayer on the gold surface resulted in more sensitive, stable, and drift-free potentiometric signals. The special advantages offered by the gold-thiol electrodes empowered us to utilize them as real-time analyzers to investigate the protein-binding behavior of the studied drugs by continuous in-situ monitoring of each drug in the presence of serum albumin. The proposed sensors could offer a prediction of the pharmacokinetics of each component drug, giving healthcare professionals the opportunity to provide personalized medicine for patients according to their different protein-binding characteristics.
Chapter
This chapter situates contemporary debates over regenerative medicine governance within a broader framework, taking intersections with economic, political, and other kinds of technological zones into account. With the inherent complexities of regenerative medicine products, the advent of techniques such as gene editing and tissue organoids, and pragmatic problems of scaling-up cell manufacturing, conventional ways of thinking about and producing evidence are challenged. At the same time, the push to speed product approvals endures, but now in political and economic environments that include differing attitudes toward risk and patients’ roles in decision-making. The chapter highlights how crossing technological and political zones, data-driven approaches plus a return to observational data in particular are being incorporated into US regulatory law and product review.
Chapter
Three dimensional printing or additive manufacturing is a group of technologies that allow creation of three dimensional objects by adding layers of material using a “printer” under computer control. These technologies have long been used to create prototypes of devices for manufacturing, but many proposed applications are now emerging that use the technology to create medical devices, and eventually artificial organs, for patient care. This chapter reviews some proposed and actual biomedical applications of 3-D printing. Three-dimensional printing of medical devices is still largely in what one firm (Gartner) terms the “hype” stage of innovation, which is characterized by high expectations but as yet unproven success. However, a few 3-D printed medical devices have achieved considerable success even at this early stage of the technology's evolution. This chapter calls for an ethical technology assessment of 3-D printing. While the products themselves vary greatly, the technology in general provides unprecedented flexibility in the design and creation of medical devices, but there is an essential tension between this flexibility and the rigid controls that society has evolved to ensure the safety and effectiveness of medical devices and treatments. Specific issues include the need to reconcile flexibility of design and production with safety of products, the flexible boundaries between research and medical practice, the likely development of new vested interests related to the technology, and issues related to printing of body parts for nonmedical uses.
Article
Full-text available
Aristotelian cause-effect principles in medicine – an attempt to contribute to “conceptual integration”. The assessment of cause and effect is key to all medical and scientific activities. Aristotle differentiated between four types of causes: material, formal, efficient, and final. An analysis of modern Western medical models of causality reveals a dominance of efficient and material causes. Traditional and spiritual approaches to healing are, on the other hand, closely related to formal and final causes. Conceptually, this paper suggests a scientifically sound integration of all four cause-effect principles for the benefit of patients.
Chapter
The primary goal of the pharmaceutical industry is to develop safe and effective medications. As the industry matures and the existing arsenal of marketed therapeutics grows, novel drugs must exhibit greater efficacy and safety to achieve registration and favorable reimbursement. Furthermore, gaining market share has become extremely competitive, in terms of both meaningful clinical effects and tolerated safety profiles. As a result, the pharmaceutical industry has experienced a steady decline in productivity in recent decades. However, the achievement of regulatory approvals for targeted therapeutics may reverse this drop in productivity. The convergence of high-throughput genetic analysis technologies and the exponentially expanding biological and genomic knowledge base has provided many clear examples that genetic variation can affect both disease risk and drug response. Therefore, evaluation of genetic variation in clinical trial populations should be considered essential and routine from the earliest phases of drug development. Pharmacogenetics (PGx) in particular has gained considerable attention from drug developers, regulators, and payers over the past decade as a means of achieving safer, more efficacious, and more cost-effective drugs. While PGx science has great potential to positively impact the success of developing a new medicine, the integration of PGx into the decision-making processes of the drug development pipeline has been difficult. The goal of this chapter is to describe the principles and requirements of an efficient and valuable PGx strategy that makes use of every opportunity during the course of developing innovative medicines. This strategy combines a proven methodology with rigorous genetic science to create a “Pipeline Pharmacogenetic Program”.
Article
Paediatrics and geriatrics both represent highly heterogeneous populations and require special consideration when developing appropriate dosage forms. This paper discusses similarities, differences, and considerations with respect to the development of appropriate medicine formulations for paediatrics and geriatrics. Arguably the most significant compliance challenge in older people is polypharmacy, whereas for children the largest barrier is taste. Pharmaceutical technology has progressed rapidly, and technologies including FDCs, multi-particulates, and orodispersible dosage forms provide unprecedented opportunities to develop novel and appropriate formulations for both old and new drugs. However, it is important for formulation scientists to work closely with patients, carers, and clinicians to develop such formulations for both the paediatric and geriatric populations.
Article
Full-text available
Purpose Medication adherence is a major challenge in HIV treatment. New mobile technologies such as smartphones facilitate the delivery of brief tailored messages to promote adherence. However, the best approach for tailoring messages is unknown. Persons living with HIV (PLWH) might be more receptive to some messages than others based on their current psychological state. Methods We recruited 37 PLWH from a parent study of motivational states and adherence. Participants completed smartphone-based surveys at a random time every day for 2 weeks, then immediately received intervention or control tailored messages, depending on random assignment. After 2 weeks in the initial condition, participants received the other condition in a crossover design. Intervention messages were tailored to match PLWH’s current psychological state based on five variables – control beliefs, mood, stress, coping, and social support. Control messages were tailored to create a mismatch between message framing and participants’ current psychological state. We evaluated intervention feasibility based on acceptance, ease of use, and usefulness measures. We also used pilot randomized controlled trial methods to test the intervention’s effect on adherence, which was measured using electronic caps that recorded pill-bottle openings. Results Acceptance was high based on 76% enrollment and 85% satisfaction. Participants found the hardware and software easy to use. However, attrition was high at 59%, and usefulness ratings were slightly lower. The most common complaint was boredom. Unexpectedly, there was no difference between mismatched and matched messages’ effects, but each group showed a 10%–15% improvement in adherence after crossing to the opposite study condition. Conclusion Although smartphone-based tailored messaging was feasible and participants had clinically meaningful improvements in adherence, the mechanisms of change require further study. Possible explanations might include novelty effects, increased receptiveness to new information after habituation, or pseudotailoring, three ways in which attentional processes can affect behavior.
Article
The idea of personalized medicine raises a series of questions. If one considers that the physician already takes into account the uniqueness of his or her patient within the medical consultation, is the description of medicine as "personalized" not a pleonasm? If not, why has this ambiguous denomination been adopted? In addition, is this form of medicine a novel discipline capable of revolutionizing therapeutic approaches, as claimed in its accompanying discourses, or is it in continuity with the molecular conception of biomedicine? Rather than attempting to answer these questions directly, we focus our attention on the organizing concepts, the technological breakthroughs, and the transformations in medical practices that characterize this medicine. Following this brief analysis, it appears that the choice of a term as equivocal as personalized medicine, and the emphasis on the antagonistic notions of revolution and continuity in medicine, are signs of a reshuffling that is emerging between actors in the health care system, in academia, and in pharmaceutical companies.
Article
The action of a drug is dictated by its pharmacokinetic and pharmacodynamic properties, both of which can vary between individuals because of environmental and genetic factors. Pharmacogenetics, the study of genetic factors determining drug response, has the potential to improve clinical outcomes through targeting therapies, individualising dosing, preventing adverse drug reactions, and potentially rescuing previously failed therapies. Although there have been significant advances in pharmacogenetics over the last decade, only a few have been translated into clinical practice. However, with new rapid genotyping technologies, regulatory modernisation, novel clinical trial designs, systems approaches, and the integration of pharmacogenetic data into decision support systems, there is hope that pharmacogenetics, as an important component of the overall drive towards personalised medicine, will advance more quickly in future. There will continue to be a need for collaboration between centres all over the world, and for multi-sector working, capitalising on the current data revolution.
Article
Cellular immune responses that protect against tumors typically have been attributed to CD8 T cells. However, CD4 T cells also play a central role. It was shown recently that, in a patient with metastatic cholangiocarcinoma, CD4 T cells specific for a peptide from a mutated region of ERBB2IP could arrest tumor progression. This and other recent findings highlight new opportunities for CD4 T cells in cancer immunotherapy. In this article, I discuss the role and regulation of CD4 T cells in response to tumor Ags. Emphasis is placed on the types of Ags and mechanisms that elicit tumor-protective responses. I discuss the advantages and drawbacks of cancer immunotherapy through personalized genomics. These considerations should help to guide the design of next-generation therapeutic cancer vaccines.
Article
In the United States the delivery of health care traditionally has been hierarchical and strictly controlled by physicians. Physicians typically provided patients with little information about their diagnosis, prognosis, and treatment plan; patients were expected to follow their physicians’ orders and ask no questions. Beginning in the 1970s, with the widespread adoption of the doctrine of informed consent to treatment, the physician-patient relationship began to be more collaborative, although the extent of the change has been subject to debate. At a minimum, physicians began to give patients more information and asked them to consent to recommended treatment, the therapeutic privilege to withhold information from patients lost support and eventually was repudiated, and physicians embraced — at least in theory — a more patient-centered conception of health care. More recently, health care and health promotion activities have moved beyond clinical encounters and the strict confines of physician-patient interactions.
Article
Full-text available
The long-standing medical tradition to "first do no harm" is reflected in population-wide evidence-based recommendations for cancer screening tests that focus primarily on reducing morbidity and mortality. The conventional cancer screening process is predicated on finding early-stage disease that can be treated effectively; yet emerging genetic and genomic testing technologies have moved the target earlier in the disease development process to identify a probabilistic predisposition to disease. Genetic risk information can have varying implications for the health and well-being of patients and their relatives, and has raised important questions about the evaluation and value of risk information. This paper explores the paradigms that are being applied to the evaluation of conventional cancer screening tests and emerging genetic and genomic tests of cancer susceptibility, and how these perspectives are shifting and evolving in response to advances in our ability to detect cancer risks. We consider several challenges germane to the evaluation of both categories of tests including defining benefits and harms in terms of personal and clinical utility, addressing healthcare consumers' information preferences, and managing scientific uncertainty. We encourage research and dialogue aimed at developing a better understanding of the value of all risk information, non-genetic and genetic, to people's lives.