Article

Harnessing Omics Sciences, Population Databases, and Open Innovation Models for Theranostics‐Guided Drug Discovery and Development

Abstract

Preclinical Research
Omics science‐driven population databases and biobanks help enable robust, large‐scale, high‐throughput biomarker discovery and validation. Because targeted drug therapies will require companion diagnostic tests to identify the patients most suitable for a given therapy, databases and biobanks represent one of the optimal and most rapidly emerging ways to enable personalized medicine with reduced development timelines. Moreover, data‐intensive omics technologies represent a new dual reconfiguration of 21st‐century science whereby communitarian value‐driven “infrastructure science” and individual entrepreneurship‐driven “discovery science” now coexist. In the hope of overcoming the “transfer problem” in omics research, which continues to hinder the full realization of concrete applications for human health, biobanks and databases are increasingly harnessing various open innovation models, such as open access, open source, expert sourcing, and patent pools. These models appear at various stages (drug repurposing, upstream, and downstream) of the research and development (R&D) process. While laudable, their inclusion will likely spur a variety of ethical, legal, and social issues (ELSI), including those revolving around consent, privacy, and property. By collectively anticipating and analyzing these issues, tensions between these innovation models and the extant laws and policies regulating biomedical research and therapeutics under the classical discovery science model can be resolved. This article does not posit which models will work best to achieve drug discovery and development breakthroughs; rather, it advocates for evidence‐based analyses that couple technical and economic data with global ELSI research to foster a more nuanced, contextualized, and thorough understanding of the new dual configuration of postgenomics pharmaceutical R&D.

... 41 The third paper argues that open data/sample sharing is necessary for scientific development, facilitates the harmonization of international database consortia infrastructures, and supports scientific community goals such as replicating results, promoting new research, improving methods of data collection and measurement, enabling the training of new researchers, and allowing more effective use of researchers' and funding agencies' limited financial resources. 42 Data sharing and integration can also be valuable for achieving values other than (direct) scientific success or transformation. One paper notes that the benefits of data sharing include verification, data replication, the ability to pool analyses, and potential cost savings. ...
... 48 Relatedly, one paper notes that 21st-century omics sciences and technology highlight the necessity of pooled data to engender value and a knowledge commons in the bio-economy, help investigate both rare diseases and common complex diseases with greater confidence, and provide much-needed statistical robustness and greater granularity. 42 Two papers relate data integration to specific ethical values. The first paper mentions that a consideration for using pooled data samples in human biomonitoring and exposomics studies is that doing so may not require asking participants for consent and communicating study results to them if, e.g., individual samples are de-identified before pooling. ...
... 95 Similarly, another paper argues that patents are a potentially contributing factor to the problems involved in fully realizing concrete applications of omics research for human health. 42 From a more positive perspective, one paper argues that it is essential to protect an established biomarker or panel of biomarkers through intellectual property protection and to provide investors with exclusive rights over their work and discovery. However, the paper also claims that stringent intellectual property regulations often pose a major hindrance to transnational sharing of scientific data, and that parallel research ventures on similar topics among multiple countries can be beneficial if these regulations are liberalized to some extent. ...
Article
Full-text available
In recent years, exposome research has been put forward as the next frontier for the study of human health and disease. Exposome research entails the analysis of the totality of environmental exposures and their corresponding biological responses within the human body. Increasingly, this is operationalized by big-data approaches to map the effects of internal as well as external exposures using smart sensors and multi-omics technologies. However, the ethical implications of exposome research are still only rarely discussed in the literature. Therefore, we conducted a systematic review of the academic literature regarding both the exposome and underlying research fields and approaches, to map the ethical aspects that are relevant to exposome research. We identify five ethical themes that are prominent in ethics discussions: the goals of exposome research, its standards, its tools, how it relates to study participants and the consequences of its products. Furthermore, we provide a number of general principles for how future ethics research can best make use of our comprehensive overview of the ethical aspects of exposome research. Lastly, we highlight three aspects of exposome research that are most in need of ethical reflection: the actionability of its findings, the epidemiological or clinical norms applicable to exposome research and the meaning and action-implications of bias.
... The linear commercialization model, led by large pharmaceutical companies and sponsored research projects [2], is an increasingly outdated and ineffective approach to remedying the problem [3][4][5]. Instead, biomedical stakeholders are implementing dispersed, network-based approaches to innovation [6,7] and emulating effective models from other high-tech industries [8,9]. Nowhere is this of more importance than in the field of neurodegenerative and neuropsychiatric disease [1]. ...
... This improves the Neuro's ability to link individual patient data, samples, and cells with clinical research and high-tech tools such as brain imaging databases without compromising researchers' independence in setting research programs and pursuing discoveries. As the Open Science initiative takes form, the research group will explicitly and transparently collect data about its concrete effects on research, collaboration, model uptake by other institutions, and the local economy [8,11]. ...
... For example, because of the Open Science initiative, Thermo Fisher Scientific agreed to partner with the Neuro to develop reagents, including antibodies and knock-out cell lines, to accelerate research into a number of neurodegenerative diseases. Potential partners recognize the promise of the Open Science model, in particular the commercial benefits of sharing the risk of early-stage research, and accelerating the speed of research progress [6,8,14,15]. ...
Article
Full-text available
Translational research is often afflicted by a fundamental problem: a limited understanding of disease mechanisms prevents effective targeting of new treatments. Seeking to accelerate research advances and reimagine its role in the community, the Montreal Neurological Institute (Neuro) announced in the spring of 2016 that it is launching a five-year experiment during which it will adopt Open Science—open data, open materials, and no patenting—across the institution. The experiment seeks to examine two hypotheses. The first is whether the Neuro’s Open Science initiative will attract new private partners. The second hypothesis is that the Neuro’s institution-based approach will draw companies to the Montreal region, where the Neuro is based, leading to the creation of a local knowledge hub. This article explores why these hypotheses are likely to be true and describes the Neuro’s approach to exploring them.
... Mashup framework should be an ideal choice for delivering the vast amount of web-based bioinformatics data -- the web becomes an operating-system-neutral platform and makes the Mashup usable by anyone through a web browser [3]. With the sharing and reuse feature, the Mashup framework provides a real incentive environment for creating specialized bioinformatics applications and eliminating the hosting barrier [25]. A biomashup community could offer a rich library of components to support a new style of bioinformatics experiment [25]. However, large data processing and provenance record maintenance are the two challenges for biomashup implementation -- appropriate and scalable algorithms are needed to achieve effective biocloud operation. ...
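To make the mashup idea concrete, the following minimal Python sketch combines two web data sources for a single gene, in the spirit of the biomashup components described above. The endpoint URLs, response field names, and merge logic are illustrative assumptions, not references to any real service.

# Minimal sketch of a "biomashup": combining two web data sources by gene symbol.
# The endpoint URLs below are hypothetical placeholders, not real services.
import json
from urllib.request import urlopen

EXPRESSION_API = "https://example.org/api/expression?gene={gene}"  # hypothetical
ANNOTATION_API = "https://example.org/api/annotation?gene={gene}"  # hypothetical

def fetch_json(url: str) -> dict:
    """Fetch a URL and parse its JSON payload."""
    with urlopen(url) as response:
        return json.load(response)

def mashup(gene: str) -> dict:
    """Merge expression values and functional annotation for one gene."""
    expression = fetch_json(EXPRESSION_API.format(gene=gene))
    annotation = fetch_json(ANNOTATION_API.format(gene=gene))
    return {"gene": gene,
            "expression": expression.get("values", []),
            "annotation": annotation.get("terms", [])}

if __name__ == "__main__":
    print(mashup("TP53"))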
... There are plenty of examples of opposite trends worth researching, examples of which would include de-commercialization or nationalization of technoscience as well as the rise of open science and open source forms of property rights (e.g. Hope 2008; Dove et al. 2012). What is meant to be evident from this brief trawl of various fields of research is the wealth of areas of emerging and potential research open to those interested in understanding the political economy of technoscience, and their insights into how we might contribute not only to a renewal of this area of research, but also to a critical engagement with the economic context in which science, technology and innovation happen (see Galis and Hansson 2012). ...
Article
This short essay presents the case for a renewed research agenda in STS focused on the political economy of technoscience. This research agenda is based on the claim that STS needs to take account of contemporary economic and financial processes and how they shape and are shaped by technoscience. This necessitates understanding how these processes might impact on science, technology and innovation, rather than turning an STS gaze on the economy.
... NEHTA was funded jointly by the Australian, state, and territory governments in 2005 to develop national standards and infrastructure for EHR across Australia [19]. Many studies have focused on the benefits of using OSS in the health sector [20-30]. In a number of studies, the characteristics of OSS systems have been compared with each other [31,32]. ...
Article
Many projects on developing Electronic Health Record (EHR) systems have been carried out in many countries. The current study was conducted to review the published data on the utilization of open source EHR systems in different countries around the world. Using free-text and keyword search techniques, six bibliographic databases were searched for related articles. The identified papers were screened and reviewed in several stages for relevance and validity. The findings showed that open source EHRs have been widely used in resource-limited regions on all continents, especially in Sub-Saharan Africa and South America. This creates opportunities to improve national healthcare, especially in developing countries with minimal financial resources. Open source technology is a solution to overcome the problems of high cost and inflexibility associated with proprietary health information systems.
... Nutrigenomics provides an approach to integrating the various available technologies for systems biology. OMICS provides reports on methods and research, including bioinformatics, computational biology, metadata, data standards, data sharing, databases, biomedical informatics, biobanks, and the statistical and algorithmic developments of cloud computing (Dove et al. 2012). Understanding the role of metabolic stress and its link to various metabolic syndromes is a key objective of nutrigenomics. ...
Chapter
The concept of “personalized medicine” has evolved based on the predicted pharmacogenetic responses of individuals. These response predictions were greatly enhanced by the Human Genome Project and the resultant detection of single nucleotide polymorphisms (SNPs) within human societies. Nutritional genomics incorporates two fields (Subbiah 2007): 1. Nutrigenomics: understanding the changes occurring in the proteome and metabolome of an individual after dietary compounds interact with the genome. 2. Nutrigenetics: an individual’s gene-based reactions to various dietary compounds, which helps in the development of a variety of nutraceuticals suited to the genetic makeup of the person. Nutrigenomics explores the effects of nutrients on the genome, proteome, and metabolome. Figure 3.1 depicts a scheme to estimate the risk of a particular disease and the health condition of an individual by utilizing several nutrigenomic techniques incorporating dietary, genetic, and metabolic knowledge. On the basis of more than one interrelated dimension, a “systems” biomarker has been designed by the amalgamation of the aforesaid data. Nutraceuticals are products derived from food sources that offer extra health benefits in addition to their basic nutritional value. They can be categorized as dietary supplements, functional foods, medicinal foods, and pharmaceuticals. Nutrigenomics has a wide range of applications and can be used to understand the consequences of nutritional compounds on the genetic constitution, function, and expression profile, along with the transcriptome of an individual. Intake of nutraceutical substitutes in the form of tablets or capsules is also known to prevent multifactorial diseases like cancer, CVD, and type II diabetes mellitus, as well as some monogenic disorders, for example, phenylketonuria, galactosemia, and lactose intolerance. Nutrigenetics and nutrigenomics are two different fields with two completely different approaches to understanding the genetic interaction with dietary compounds. However, their common final aspiration is to personalize diet, understand genetic polymorphisms, offer potent gateways, and hence augment human health (Mutch et al. 2005). Nutritional interventions like antioxidants, vitamins (e.g., vitamins A, E, D, and C), flavonoids, and omega-3 fatty acids aim to prevent the pathogenesis of diabetes mellitus, metabolic syndromes, and their complications. Plant-derived food products also show positive effects on the reduction of chronic diseases due to the presence of phytochemicals. Various food materials like green tea, vitamin E, soy, vitamin D, lycopene, and selenium are taken from natural sources and are being used to improve human health (Cencic and Chingwaru 2010). Thus, inborn errors of metabolism can be rectified artificially by giving nutraceuticals or personalized diet support, which points toward future personalized nutrigenomic medications. Data mining is the procedure of finding patterns in data to calculate a result or to predict future outputs. Using data mining tools and techniques, huge amounts of data can be handled in a short period of time. Cluster detection, memory-based reasoning, market basket analysis, genetic algorithms, link analysis, decision trees, and neural nets are some of the powerful techniques used for data mining and evaluation purposes.
Data mining is playing a major role in nutrigenomics analysis, as it helps to study the present status of nutraceuticals in the market, control the market value of those medicated products, and track customers’ responses to them.
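As a concrete illustration of one of the data mining techniques listed above, the following Python sketch trains a decision tree on synthetic genotype-and-diet data. The features, outcome model, and all numbers are invented for demonstration, and scikit-learn is assumed to be installed.

# Illustrative sketch only: a decision tree, one of the data mining techniques
# the chapter lists, fitted to invented genotype/diet data.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
# Synthetic features: two SNP genotypes coded 0/1/2 and a dietary intake score.
snp_a = rng.integers(0, 3, n)
snp_b = rng.integers(0, 3, n)
diet = rng.normal(50, 10, n)
X = np.column_stack([snp_a, snp_b, diet])
# Synthetic outcome: risk rises with snp_a dosage and falls with diet score.
risk = (0.8 * snp_a - 0.05 * diet + rng.normal(0, 1, n)) > -1.5

X_train, X_test, y_train, y_test = train_test_split(X, risk, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {tree.score(X_test, y_test):.2f}")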
Article
The use of real-world data in drug repurposing has emerged due to well-established advantages of drug repurposing in supplementing de novo drug discovery and incentives in incorporating real-world evidence in regulatory approvals. We conducted a scoping review to characterize repurposing studies using real-world data and discuss their potential challenges and solutions. A total of 250 studies met the inclusion criteria, of which 36 were original studies on hypothesis generation, 101 on hypothesis validation, and seven on safety assessment. Key challenges that should be addressed for future progress in using real-world data for repurposing include isolated data sources with poor clinical granularity, false-positive signals from data mining, the sensitivity of hypothesis validation to bias and confounding, and the lack of clear regulatory guidance.
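As one concrete example of the kind of data-mining signal generation this review discusses (and of why false-positive signals need scrutiny), the sketch below computes a reporting odds ratio with an approximate 95% confidence interval from a 2x2 contingency table. The counts are invented, and the statistic is offered as a common illustration rather than as the review's own method.

# A minimal sketch of one common signal-detection statistic used when mining
# real-world data: the reporting odds ratio (ROR). Counts are invented.
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR and approximate 95% CI for a 2x2 table:
    a = drug & outcome, b = drug & no outcome,
    c = no drug & outcome, d = no drug & no outcome."""
    ror = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(ror) - 1.96 * se_log)
    hi = math.exp(math.log(ror) + 1.96 * se_log)
    return ror, (lo, hi)

ror, ci = reporting_odds_ratio(a=40, b=960, c=200, d=9800)
print(f"ROR = {ror:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")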
Article
Full-text available
This paper reports on end users' perspectives on the use of a blockchain solution for private and secure individual “omics” health data management and sharing. This solution is one output of a multidisciplinary project investigating the social, data, and technical issues surrounding application of blockchain technology in the context of personalized healthcare research. The project studies potential ethical, legal, social, and cognitive constraints of self-sovereign healthcare data management and sharing, and whether such constraints can be addressed through careful design of a blockchain solution.
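The paper's own implementation is not reproduced here, but the core tamper-evidence property that a blockchain-backed audit log provides for data access and consent records can be sketched in a few lines of Python using hash chaining. Everything below, including the record fields and event names, is a hypothetical illustration.

# Not the paper's system: a minimal sketch of hash-chained records (the core
# of a blockchain) providing tamper evidence for a data-access audit log.
import hashlib
import json
import time

def make_block(record: dict, prev_hash: str) -> dict:
    """Create a block whose hash commits to the record and its predecessor."""
    body = {"record": record, "prev_hash": prev_hash, "timestamp": time.time()}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain: list) -> bool:
    """Recompute every hash; any edited block breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"event": "consent granted", "subject": "P001"}, prev_hash="0")]
chain.append(make_block({"event": "data accessed", "subject": "P001"}, chain[-1]["hash"]))
print(verify(chain))                            # True
chain[0]["record"]["event"] = "consent revoked"  # simulated tampering
print(verify(chain))                            # False: tampering detected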
Article
Full-text available
Breakthrough discoveries in life sciences technologies, coupled with continuing global growth in biobanks, are contributing to a core infrastructure for translational research in population genomics. At the same time, traditional intellectual property discourse, which stresses innovation as a social good and limited property rights as the proper function of law, is challenging, and being challenged by, large-scale population-based biobanks and their publicly engaged governance structures. By discussing how several particular aspects of biobank discourse differ from current intellectual property discourse (i.e., democratic engagement and governance policy, genetic sequence databases, and open data sharing), the authors show how this chasm may cause a recalibration of intellectual property law, particularly patent law.
Article
Full-text available
This paper deals with the creation of global principle‐based standards. For such standards to be accepted and effective, particular conditions must be fulfilled. One such condition, little explored, is that standard‐makers and ‐takers share knowledge about the meaning of the principles, as well as the practices through which they are likely to be applied. The paper shows that this condition is fulfilled when transnational cultural systems exist, by means of which both types of actors engage in the explication and representation of their practices so that a common, standard understanding emerges of how principles may be interpreted on the ground and informs the negotiations. A transnational cultural system is a crucial governance infrastructure to set global standards, as shown by the long history of creating a risk analysis guideline by the Codex Alimentarius, the inter‐governmental body for food standards.
Article
Full-text available
During the past two decades, computerized biomolecular databases have rapidly expanded and have been prominently integrated into laboratory work in biological sciences. This article offers an exploratory look at the potential significance of these databases as novel tools for scientific communication. The article develops the concept of a science communication regime—a sociotechnical system that constitutes a particular means of scientific communication, such as the scientific journal. This concept is then used to examine biomolecular databases and their potential implications for scientific institutions and practices. The field of genome research provides empirical examples, and the discussion explores the potential significance of these new forms of electronic communication.
Article
Full-text available
Developing Countries (DCs), particularly those in Sub-Saharan Africa (SSA), are suffering from scientific information famine. The expectation that the internet would facilitate scientific information flow does not seem to be realisable, owing to the restrictive subscription fees of the high quality sources and the beleaguering inequity in the access and use of the internet and other Information and Communication Technology (ICT) resources. This paper aims to assess and evaluate Open Access (OA) movement as a proposed solution to avoid the restrictions over accessing scientific knowledge, particularly in SSA. The paper also outlines the opportunities and challenges in implementing OA in SSA. However, there are often mismatches between what the 'donor' countries can reasonably offer and what the SSA countries can implement. Finally, the paper will discuss the slow uptake of the OA in Africa, the perception of the African scientists towards the movement, the non-expression of concern by policymakers and their implications on the scientific activities in Africa.
Article
Full-text available
This article argues that there is a mismatch between traditional intellectual property doctrine and the politics of intellectual property today. To examine the nature of this mismatch, I contrast two frameworks that both appear in contemporary debate about intellectual property: the traditional discourse, which focuses on innovation, and a newer, less clearly codified discourse that views intellectual property issues from the perspective of the politics of technology. This latter discourse focuses on the challenge of democratic governance in a world where emerging technologies have assumed a central role in constituting the future, raising far-reaching questions about how they should be fitted into social orders. The innovation discourse remains dominant in public debate, but recognizing the specific features of the politics-of-technology perspective -- and presenting its distinctive vision of what is at stake in intellectual property -- clarifies the struggles now in play. The politics-of-technology perspective rejects the traditional definition of the boundaries of intellectual property policy; first, because this perspective questions the empirical validity of a bright line distinction between creating technologies and making social choices about them; second, because it sees the traditional cartography as tending to constitute members of the public as "consumers" of prepackaged technologies rather than as "citizens" engaged in shaping them; and third, because it has a normative commitment to enabling citizens to exercise voice and choice about emerging technology before irrevocable commitments in specific directions are made. In contrast to traditional innovation discourse, the politics-of-technology perspective considers patent policy from a point of view that focuses on questions of democratic governance and political legitimacy.
Article
Full-text available
This Article attempts to map the challenges raised by recent encounters between intellectual property and development. It proposes a normative principle of global intellectual property - one that is responsive to development paradigms that have moved far beyond simple utilitarian measures of social welfare. Recent insights from the field of development economics suggest strongly that intellectual property should include a substantive equality principle, measuring its welfare-generating outcomes not only by economic growth but also by distributional effects. A new principle of substantive equality is a necessary corollary to the formal equality principles of national treatment and minimum standards that are now imposed on virtually all countries regardless of their level of development. Indeed this principle is arguably the very core of a human development-driven concept of development, a term that is highly indeterminate but lately used by many developing countries to express an equality concern within various global intellectual property regimes such as the WTO and WIPO. This proposed principle of substantive intellectual property equality would be analogous to strict scrutiny review in the judicial context of U.S. constitutional law. It would be foundational to any form of intellectual property decision making. Simply put, a decision maker would accord much less deference and exercise much more skepticism towards the proposed government action (in this case, the regulatory intervention by the state in the form of the grant of intellectual property protection, or the withholding of an exception or limitation to an intellectual property grant) when a knowledge good that affects basic human development capabilities, such as basic education or health care, is implicated. Certain foundational capacities, whether viewed as the sum of individual capabilities in knowledge or as national capacities in production of knowledge goods, should guide application and creation of intellectual property norms. This proposed substantive equality principle would match intellectual property's innovation mandate to the actual local conditions and concerns of developing countries seeking to join the global knowledge economy. It has taken on new urgency and significance given the recent agreement by WIPO member states to forward recommendations of The New WIPO Development Agenda to the General Assembly in the fall 2007.
Article
Full-text available
Biobanks are adopting various modes of public engagement to close the agency gap between participants and biobank builders. We propose a wiki-governance model for biobanks that harnesses Web 2.0, and which gives citizens the ability to collaborate in biobank governance and policymaking.
Article
Full-text available
To assess the public's perception of biobank research and the relative importance they place on concerns for privacy and confidentiality, when compared with other key variables when considering participation in biobank research. Conjoint analysis of three key attributes (research focus, research beneficiary, and privacy and confidentiality) under conditions of either blanket or specific consent. Although the majority of our participants described themselves as private individuals, they consistently ranked privacy and confidentiality as the least important of the variables they considered. The potential beneficiary of proposed research ranked the highest under conditions of both blanket and specific consent. When completing the conjoint task under conditions of blanket consent, participants tended to act more altruistically. The public tends to view biobanks as public goods designed primarily for public benefit. As such it tends to act altruistically with respect to the potential benefits that might accrue from research using biobanked samples. Participants expressed little concern about informational risks (i.e., privacy and confidentiality) should they choose to participate. The manner in which policy priorities are framed could impact participant value preferences with regard to a number of governance issues in biobanking.
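For readers unfamiliar with the method, the sketch below shows how part-worth utilities in a conjoint task like this one are commonly estimated by least squares on dummy-coded attribute levels. The profiles and ratings are invented so that the beneficiary attribute dominates and privacy matters least, mirroring the reported finding only qualitatively.

# Illustrative only: estimating part-worth utilities for three two-level
# attributes via ordinary least squares. Levels and ratings are invented.
import numpy as np

# Eight hypothetical profiles: (research_focus, beneficiary, privacy), levels 0/1.
profiles = np.array([[f, b, p] for f in (0, 1) for b in (0, 1) for p in (0, 1)])
# Invented preference ratings: beneficiary dominates, privacy matters least.
ratings = np.array([3.0, 3.5, 7.0, 7.5, 4.0, 4.5, 8.0, 8.5])

X = np.column_stack([np.ones(len(profiles)), profiles])  # intercept + dummies
coef, *_ = np.linalg.lstsq(X, ratings, rcond=None)
for name, w in zip(["intercept", "focus", "beneficiary", "privacy"], coef):
    print(f"{name:12s} part-worth = {w:+.2f}")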
Article
Full-text available
In recent years there has been a major change on the part of funders, particularly in North America, so that data sharing is now considered to be the norm rather than the exception. We believe that data sharing is a good idea. However, we also believe that it is inappropriate to prescribe exactly when or how researchers should preserve and share data, since these issues are highly specific to each study, the nature of the data collected, who is requesting it, and what they intend to do with it. The level of ethical concern will vary according to the nature of the information, and the way in which it is collected - analyses of anonymised hospital admission records may carry a quite different ethical burden than analyses of potentially identifiable health information collected directly from the study participants. It is striking that most discussions about data sharing focus almost exclusively on issues of ownership (by the researchers or the funders) and efficiency (on the part of the funders). There is usually little discussion of the ethical issues involved in data sharing, and its implications for the study participants. Obtaining prior informed consent from the participants does not solve this problem, unless the informed consent process makes it completely clear what is being proposed, in which case most study participants would not agree. Thus, the undoubted benefits of data sharing do not remove the obligations and responsibilities that the original investigators hold for the people they invited to participate in the study.
Article
Full-text available
The Wellcome Trust Sanger Institute has a strong reputation for prepublication data sharing as a result of its policy of rapid release of genome sequence data and particularly through its contribution to the Human Genome Project. The practicalities of broad data sharing remain largely uncharted, especially to cover the wide range of data types currently produced by genomic studies and to adequately address ethical issues. This paper describes the processes and challenges involved in implementing a data sharing policy on an institute-wide scale. This includes questions of governance, practical aspects of applying principles to diverse experimental contexts, building enabling systems and infrastructure, incentives and collaborative issues.
Article
Full-text available
The progress of translational research and enhancing the prospects for personalized medicine require coordinated efforts across a wide range of disciplines as well as public-private partnerships of many kinds. Advances in genomics and proteomics technologies are providing more accurate data on more targets and more types of targets than ever before. These include whole genome sequence information; identification and quantification of thousands of specific protein targets, including details of a myriad of protein modifications; epigenetic modifications; and new frontiers that include micro RNAs and protein and nucleic acid interactome complexities. Our technological advances will continue unabated, as it is unlikely that we have plumbed the depths of biological complexity governing development and disease.
Article
Full-text available
The social costs that societies pay for illness and disease are rising steadily in many countries around the world. These changes become a major financial burden that mobilizes politicians and healthcare organizations to find new solutions. Biobanks are becoming a powerful new modality within modern life science and are expected to be important in the proactive awareness of patient health status. Biobanks are also expected to promote the development of targeted treatments with personalized indicator assays, enabling effective use of personalized medicine in the near future.
Article
Full-text available
Vaccinomics is the convergence of vaccinology and population-based omics sciences. The success of knowledge-based innovations such as vaccinomics is not only contingent on access to new biotechnologies. It also requires new ways of governance of science, knowledge production, and management. This article presents a conceptual analysis of the anticipatory and adaptive approaches that are crucial for the responsible design and sustainable transition of vaccinomics to public health practice. Anticipatory governance is a new approach to manage the uncertainties embedded in an innovation trajectory with participatory foresight, in order to devise governance instruments for collective "steering" of science and technology. As a contrast to hitherto narrowly framed "downstream impact assessments" for emerging technologies, anticipatory governance adopts a broader and interventionist approach that recognizes the social construction of technology design and innovation. It includes in its process explicit mechanisms to understand the factors upstream of the innovation trajectory, such as deliberation and cocultivation of the aims, motives, funding, design, and direction of science and technology, both by experts and publics. This upstream shift from a consumer "product uptake" focus to "participatory technology design" on the innovation trajectory is an appropriately radical and necessary departure in the field of technology assessment, especially given that considerable public funds are dedicated to innovations. Recent examples of demands by research funding agencies to anticipate the broad impacts of proposed research--at a very upstream stage at the time of research funding application--suggest that anticipatory governance with foresight may be one way in which postgenomics scientific practice might transform in the future toward responsible innovation. Moreover, the present context of knowledge production in vaccinomics is such that policy making for vaccines of the 21st century is occurring in the face of uncertainties where the "facts are uncertain, values in dispute, stakes high and decisions urgent and where no single one of these dimensions can be managed in isolation from the rest." This article concludes, however, that uncertainty is not an accident of the scientific method, but its very substance. Anticipatory governance with participatory foresight offers a mechanism to respond to such inherent sociotechnical uncertainties in the emerging field of vaccinomics by making the coproduction of scientific knowledge by technology and the social systems explicit. Ultimately, this serves to integrate scientific and social knowledge, thereby steering innovations to coproduce results and outputs that are socially robust and context sensitive.
Article
Full-text available
The promise of science lies in expectations of its benefits to societies and is matched by expectations of the realisation of the significant public investment in that science. In this paper, we undertake a methodological analysis of the science of biobanking and a sociological analysis of translational research in relation to biobanking. Part of global and local endeavours to translate raw biomedical evidence into practice, biobanks aim to provide a platform for generating new scientific knowledge to inform development of new policies, systems and interventions to enhance the public's health. Effectively translating scientific knowledge into routine practice, however, involves more than good science. Although biobanks undoubtedly provide a fundamental resource for both clinical and public health practice, their potentiating ontology--that their outputs are perpetually a promise of scientific knowledge generation--renders translation rather less straightforward than drug discovery and treatment implementation. Biobanking science, therefore, provides a perfect counterpoint against which to test the bounds of translational research. We argue that translational research is a contextual and cumulative process: one that is necessarily dynamic and interactive and involves multiple actors. We propose a new multidimensional model of translational research which enables us to imagine a new paradigm: one that takes us from bench to bedside to backyard and beyond, that is, attentive to the social and political context of translational science, and is cognisant of all the players in that process be they researchers, health professionals, policy makers, industry representatives, members of the public or research participants, amongst others.
Article
Full-text available
In recent years, there has been tremendous growth and interest in translational research, particularly in cancer biology. This area of study clearly establishes the connection between laboratory experimentation and practical human application. Though it is common for laboratory and clinical data regarding patient specimens to be maintained separately, the storage of such heterogeneous data in one database offers many benefits, as it may facilitate more rapid accession of data and provide researchers access to greater numbers of tissue samples. The Thoracic Oncology Program Database Project (TOPDP) was developed to serve as a repository for well-annotated cancer specimen, clinical, genomic, and proteomic data obtained from tumor tissue studies. The TOPDP is not merely a library; it is a dynamic tool that may be used for data mining and exploratory analysis. Using the example of non-small cell lung cancer cases within the database, this study demonstrates how clinical data may be combined with proteomic analyses of patient tissue samples in determining the functional relevance of protein over- and under-expression in this disease. Clinical data for 1323 patients with non-small cell lung cancer have been captured to date. Proteomic studies have been performed on tissue samples from 105 of these patients. These tissues have been analyzed for the expression of 33 different protein biomarkers using tissue microarrays. The expression of 15 potential biomarkers was found to be significantly higher in tumor versus matched normal tissue. Proteins belonging to the receptor tyrosine kinase family were particularly likely to be over-expressed in tumor tissues. There was no difference in protein expression across various histologies or stages of non-small cell lung cancer. Though not differentially expressed between tumor and non-tumor tissues, over-expression of the glucocorticoid receptor (GR) was associated with improved overall survival. However, this finding is preliminary and warrants further investigation. Though the database project is still under development, the application of such a database has the potential to enhance our understanding of cancer biology and will help researchers identify targets to modify the course of thoracic malignancies.
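The study's tumor-versus-matched-normal comparisons are paired by patient. A minimal sketch of such an analysis, on synthetic staining scores and assuming NumPy and SciPy are available, might look like the following.

# A hedged sketch of a tumor-vs-matched-normal biomarker comparison using a
# paired t-test. The staining scores are simulated, not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_pairs = 105  # the study analyzed tissue samples from 105 patients
normal = rng.normal(loc=1.0, scale=0.3, size=n_pairs)           # synthetic scores
tumor = normal + rng.normal(loc=0.4, scale=0.3, size=n_pairs)   # simulated over-expression

t_stat, p_value = stats.ttest_rel(tumor, normal)  # paired test, one row per patient
print(f"paired t = {t_stat:.2f}, p = {p_value:.2e}")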
Article
Full-text available
Genomics research data are often widely shared through a variety of mechanisms including publication, meetings and online databases. Re-identification of research participants from sequence data has been shown to be possible, raising concerns about participants' privacy. In 2008-09, we convened 10 focus groups in Durham, N.C. to explore attitudes about how genomic research data were shared amongst the research community, communication of these practices to participants and how different policies might influence participants' likelihood to consent to a genetic/genomic study. Focus groups were audio-recorded and transcripts were complemented by a short anonymous survey. Of 100 participants, 73% were female and 76% African-American, with a median age of 40-49 years. Overall, we found that discussants expressed concerns about privacy and confidentiality of data shared through online databases. Although discussants recognized the benefits of data-sharing, they believed it was important to inform research participants of a study's data-sharing plans during the informed consent process. Discussants were significantly more likely to participate in a study that planned to deposit data in a restricted-access online database compared to an open-access database (p < 0.00001). The combination of the potential loss of privacy with concerns about data access and identity of the research sponsor warrants disclosure about a study's data-sharing plans during the informed consent process.
Article
Full-text available
This decade is witnessing the proliferation of large-scale population-based biobanks. Many biobanks have reached the stage of offering access to their collection of data and samples to the scientific community. This, however, requires that access arrangements be established to govern the relationship between biobanks and users. Access arrangements capture the convergence of all normative elements in the life cycle of a biobank: policies, laws, common practices, commitments made by the biobank to participants, the expectations of funders, and the needs of the scientific community. Furthermore, access arrangements shape new legal agreements between 'biobankers' and researchers to ensure appropriate, regulated and efficient use of biobank materials. This paper begins by examining the particularities of access arrangements, identifying the key elements of these new regulatory instruments. Second, the paper looks at various strategies used by biobanks to regulate access and surveys the underlying motivations of these strategies and the impact they can have on potential international collaboration. Third, an example of the challenges encountered in creating access policy is illustrated using the case of CARTaGENE, a biobank based in Montreal, Canada. Last, the paper presents how the Public Population Project in Genomics (P3G) facilitates the work of biobankers and improves collaboration throughout the international human genomics research community.
Article
Full-text available
Genomics is one of the key technologies enabling personalized medicine and the broader field of theragnostics (i.e., the fusion of therapeutics and diagnostic medicine). Yet other high-throughput technologies (e.g., nanotechnology and proteomics) are also rapidly emerging on the horizon in the postgenomics era since the completion of the Human Genome Project in 2003. Applications of these health technologies, too, are being diversified in personalized medicine. These include both “old” and “new” applications aimed at better understanding host-environment interactions, for example, pharmacogenomics, nutrigenomics (featured in the June and September 2009 issues of the CPPM) and pharmacoproteomics, to name a few. Importantly, all these advances are now taking place both “in” and “outside” the traditional laboratory space as personalized medicine innovations diffuse, albeit slowly, from upstream discovery oriented applications (e.g., search for genes associated with common complex diseases) to downstream health products, diagnostics, and personalized interventions in the clinic [2], although not always in that linear direction [3]. Personalized medicine in the postgenomics era calls for a transdisciplinary approach [4], and considerations for how best to develop innovation frameworks to support safe and effective deployment of the new enabling diagnostic technologies. CPPM aims to address the previously unmet needs in both pharmacogenomics and personalized medicine, for example, by moving beyond the artificial compartmentalization of biomarkers and knowledge across health technologies and disciplinary silos. This is crucial as there are important lessons to be learned from different personalized health interventions, whether they involve pharmaceuticals, nutrition, stem cell therapy, or are enabled by genomics, proteomics and nanotechnology. Indeed, these health technologies and their applications can usefully cross-inform each other and thereby help strengthen and triangulate the attendant evidentiary base for personalized medicine. This integrative vision of personalized medicine that includes and extends beyond pharmacogenomics is now being put into practice by the CPPM through vigilant and transdisciplinary horizon scanning, and rigorous peer-review with strong international outreach to expertise available in different global regions. Hence, the December issue of the Journal features two new health technologies - nanotechnology and proteomics - that are already beginning to impact the individualization of drug therapy.
Article
Full-text available
Rapid release of prepublication data has served the field of genomics well. Attendees at a workshop in Toronto recommend extending the practice to other biological data sets.
Chapter
A key element in the success of a biobank is the trust and support of the public. Building this trust is a difficult process. The field of population genomics raises many ethical and societal issues that must be addressed in order for the public to see participation as a trustworthy activity. The Public Population Project in Genomics (P3G) is an international collaboration dedicated to bringing members of the population biobanking community together to share their expertise for the advancement of population genomics research. Through its fundamental principles of promotion of the common good, responsibility, mutual respect, accountability and proportionality, P3G seeks to create tools and resources to assist those involved in biobanks to conduct their research in such a way as to inspire the trust and participation of the public.
Article
Postgenomics personalized medicine science is experiencing a "data deluge" from emerging high-throughput technologies and a myriad of sensors and exponential growth in electronic records. This new data-intensive science was named the "fourth paradigm of science," preceded by the third (last few decades: computational branch, modeling, and simulating complex phenomena), the second (last few hundred years: theoretical branch, using models leading to generalizations), and the first paradigm (a thousand years ago: empirical description of natural phenomena). These shifts in the architecture of twenty-first-century global personalized medicine science call for rethinking twenty-first-century bioethics so as to proactively steer the science and technology innovation trajectory, instead of the more narrowly framed "enabler," "protector," or "regulator" roles hitherto assigned to bioethics in the twentieth century. This chapter introduces the emerging innovative approaches to twenty-first-century bioethics for data-intensive fourth-paradigm science that form the central pillar of postgenomics personalized medicine research and development. Additionally, the chapter goes beyond the classic prescriptive approaches to bioethics, in that it also examines, in a bottom-up manner, the "ethics of bioethics": as with pharmacogenomics science, bioethics needs to be examined for a deeper, integrated, and panoptic discourse on genomics innovations and their anticipated trajectory from "lab to global society."
Article
The Lancet, Volume 377, Issue 9765, Pages 537-539.
Successful repositioning of a drug product depends on carefully considering and integrating both intellectual property and regulatory exclusivities. Patent strategies directed to protecting new formulations, indications, and methods of use, when combined with strategically repositioned products, can provide effective and long-lasting product exclusivity even where the underlying API and the original formulations, indications, and methods of use are off-patent.
Article
The idea that there is an emerging “bioeconomy” characterized by the capture of the latent value found in biological material (e.g. cells, tissues, plants, etc.) has become a popular policy agenda since the mid-2000s. A number of scholars have also written about this intersection between the life sciences and capitalism, often drawing on anthropological and sociological perspectives to conceptualize the new socialities, subjectivities, and identities brought about by new biotechnologies. While these studies are undoubtedly a fruitful academic enterprise, they have also left a gap in our understanding of the bioeconomy because they have not discussed knowledge or knowledge production. This article focuses on this immaterial side of the bioeconomy, exploring the geographies of value in the bioeconomy that are constituted by intangible and immaterial resources and labor. The core argument is that value in the bioeconomy is created from geographical processes that both embed immateriality in particular places and, at the same time, abstract it in global standards and regulations.
Article
A prominent feature of health in all industrialized countries is the social gradient in health and disease. Many observers believe that this gradient is simply a matter of poor health for the disadvantaged and good health for everyone else, but this is an inadequate analysis. The Whitehall Study documented a social gradient in mortality rates, even among people who are not poor, and this pattern has been confirmed by data from the United States and elsewhere. The social gradient in health is influenced by such factors as social position; relative versus absolute deprivation; and control and social participation. To understand causality and generate policies to improve health, we must consider the relationship between social environment and health and especially the importance of early life experiences.
Article
One of the first (conceptual) frameworks developed for understanding the relation of science and technology to the economy has been the linear model of innovation. The model postulated that innovation starts with basic research, is followed by applied research and development, and ends with production and diffusion. The precise source of the model remains nebulous, having never been documented. Several authors who have used, improved, or criticized the model in the past fifty years rarely acknowledged or cited any original source. The model usually was taken for granted. According to others, however, it comes directly from V. Bush’s Science: The Endless Frontier ([1945] 1995). This article traces the history of the linear model, suggesting that it developed in three steps corresponding to three scientific communities looking at science analytically. The article argues that statistics is a main reason the model is still alive despite criticisms, alternatives, and having been proclaimed dead.
Article
The theory advanced is that precise rules regulate simple phenomena more consistently than principles do. However, as the regulated phenomena become more complex, principles deliver more consistency than rules. A central reason is that the iterative pursuit of precision in single rules increases the imprecision of a complex system of rules. By increasing the reliance we can place on a part of the law we reduce the reliability of the law as a whole. Then it is argued that consistency in complex domains can be even better realised by an appropriate mix of rules and principles than by principles alone. A key choice here is between binding rules interpreted by non-binding principles and non-binding rules backed by binding principles. The more complex the domain, the more likely it is the latter that will deliver greater consistency. Robert Baldwin argues that the reason "Why Rules Don't Work" is that they are typically evaluated without reference to the context of their implementation. Hence we cannot understand when law is and is not consistently implemented by the police without confronting the fact that police culture is not a rulebook, but a storybook. In complex domains, when police, regulatory inspectors and judges enforce rules consistently, they do so as a result of shared sensibilities. Regulatory conversations that foreground obligatory principles buttressed by non-binding background rules are hypothesised to be the stuff of legal certainty on such complex terrain.
Article
Patents both promote the development of health technologies as well as constrain access to them. Access constraints on patented medicines, diagnostics, and agricultural innovations can severely compromise human health, particularly for low-income populations. To help address this challenge, this Article explores mechanisms for integrating distributive safeguards in the patent system in a manner consistent with strong property rights and private ordering. Finding existing patent doctrine inadequate, this Article examines solutions arising from the developmental histories of particular health technologies. In particular, this Article argues that public institutions, which contribute enormous amounts of "scientific capital" - money, labor, and bodily materials - to life sciences research and development, can effectively leverage these contributions to enhance access to downstream patented technologies. By providing vital capital, government, academic, and nonprofit entities both weaken the economic need for exclusive rights as well as obtain limited co-ownership stakes in resulting inventions. By exercising this leverage, public institutions are helping to create a "distributive commons" that enhances access to patented health technologies for low-income populations. This Article surveys existing practices, providing prescriptions to address the chilling effects and technical competence concerns that undermine distributive efforts. It concludes by challenging prevailing theoretical preferences for individual rather than communal ownership of property, highlighting the advantages of public-private co-ownership of nonrival resources.
Article
Since Hardin, law and economics scholars have launched a crusade to expose the evil of the commons - the evil, that is, of not propertizing. Progressive legal scholars have responded in kind, exposing the perils of propertization. With the rise of the Information Age, the flashpoint debates about property have moved from land to information. The public domain is now the cause célèbre among progressive intellectual property and cyber-law scholars, who extol the public domain as necessary for sustaining innovation. But scholars obscure the distributional consequences of the commons. They presume a landscape where every person can reap the riches found in the commons. This is the romance of the commons - the belief that because a resource is open to all by force of law, it will indeed be equally exploited by all. But in practice, differing circumstances - including knowledge, wealth, power, access, and ability - render some better able than others to exploit a commons. We examine this romance through the lens of the global intellectual property regime in genetic resources and traditional knowledge. The Agreement on Trade Related Aspects of Intellectual Property Rights (TRIPS) transformed a global public domain in information by propertizing the information resources of the West - from entertainment to technological advances - but leaving in the commons the information resources of the rest of the world, such as genetic resources and traditional knowledge. Just as the trope of the romantic author has served to bolster the property rights claims of the powerful, so too does the romance of the public domain. Resourcefully, the romantic public domain trope steps in exactly where the romantic author falters. Where genius cannot justify the property claims of corporations (because the knowledge pre-exists individual claims of authorship), the public domain can. We review real-world strategies for resolving the romance of the commons. Just as recognition of the tragedy of the commons is the central justification for private property, recognizing the romance of the commons may justify forms of property uncommon in Western legal traditions.
Article
In the field of biotechnology, the patent system has had its share of detractors and has come under increasing criticism of late. It has been suggested that cooperative strategies such as open source could correct the inadequacies generated by the application of the patent system to biotechnological inventions. However, most of the critics of the patent system presume its inefficiency on the basis of theoretical arguments that have not been confirmed by the existing evidence. Before discussing the feasibility of implementing cooperative strategies in the field of biotechnology, should we not ensure that such strategies are really needed to improve the system? We addressed this question by investigating the claim that the patent system has created an anticommons effect in the field of biotechnology. Our evaluation of the available empirical data demonstrates the absence of a widespread anticommons effect and thus the inadequacy of using this theory as a central argument in favour of the use of open source approaches in biotechnology. Ultimately, the use of open source approaches should be justified on the individual merits of these strategies rather than on the basis of theoretical inefficiencies of the patent system. Consequently, our article will analyze the various potential benefits of open source reported in the literature. The strategic use of collaborative approaches such as open source could facilitate the development of a dynamic and functional biomedical research sector in academia, one that continues to work in the spirit of open science.
Article
The related concepts of open innovation and user-centric innovation are currently popular in the literature on technology and innovation management. In this paper, we attempt to address two shortcomings in their practical application. First, the precise mechanisms supporting open and user innovation in different industrial contexts are poorly specified. Second, it is not clear under what circumstances they might become dysfunctional. We identify how the interaction of meso- and micro-level mechanisms contributes to project-based user-centric innovation, based on a detailed characterization of the business activities of eight technology and engineering consultancies working across a range of sectors. We develop and illustrate the notion of generative interaction, which describes a series of mechanisms that produce a self-reinforcing ecology that favours innovation and profitability. At the same time, we observe the opposite dynamics of self-reinforcing degenerative interaction, likely to produce a cycle of declining innovation and profitability. In the specific context of project-based firms, we show that user-centric, open innovation can affect performance negatively, and we discuss the consequences (positive and negative) of different patterns of interaction with clients.
Chapter
New advances in genomics, proteomics, and metabonomics have created significant excitement about the use of these relatively new technologies in drug design, discovery, development, and molecular-targeted therapeutics, by identifying new drug targets and providing better tools for safety and efficacy studies in the preclinical and clinical stages of drug development, as well as in diagnostics. In this chapter, we briefly discuss the application of genomics, proteomics, and metabonomics in drug discovery and development.
Article
Low productivity, rising R&D costs, dissipating proprietary products, and dwindling pipelines are subjecting the pharmaceutical industry to unprecedented challenges and scrutiny. In this article I reflect on the current status of the pharmaceutical industry and the reasons for its continued low productivity. An emerging 'symbiotic model of innovation', which addresses underlying causes of drug failure and attempts to narrow gaps in current drug discovery processes, is discussed as a way to boost productivity. The model emphasizes partnerships in innovation to deliver quality products in a cost-effective system. I also discuss diverse options for building a balanced research portfolio with higher potential for sustained delivery of drug molecules.
Article
This paper investigates how scientists decide whether or not to share information with their colleagues. Detailed data on the decisions of 1694 bio-scientists allow us to detect similarities and differences between academia-based and industry-based scientists. Arguments from social capital theory are applied to explain why individuals share information even at (temporary) personal cost. In both realms, the results suggest that the likelihood of sharing decreases with the competitive value of the requested information. Factors related to social capital, i.e., expected reciprocity and the extent to which a scientist's community conforms to the norm of open science, either directly affect information sharing or moderate the influence of competitive-interest considerations on it. The effect depends on the system to which a scientist belongs.
Article
The application of translational approaches (e.g., from bedside to bench and back) is gaining momentum in the pharmaceutical industry. By utilizing the rapidly increasing volume of data at all phases of drug discovery, translational bioinformatics is poised to address some of the key challenges faced by the industry. Indeed, computational analysis of clinical data and patient records has informed decision-making in multiple aspects of drug discovery and development. Here, we review key examples of translational bioinformatics approaches to emphasize its potential to enhance the quality of drug discovery pipelines, reduce attrition rates and, ultimately, lead to more effective treatments.
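To give a concrete flavor of the record-driven analysis this abstract describes, the sketch below (Python) tests whether drug exposure in a toy set of de-identified patient records is associated with a clinical event. The data and column names are invented for illustration; the cited paper does not prescribe this pipeline.

# Illustrative sketch only: a toy record-mining step of the kind translational
# bioinformatics uses. All data and column names are hypothetical.
import pandas as pd
from scipy.stats import fisher_exact

# Hypothetical de-identified patient records: drug exposure vs. a clinical event.
records = pd.DataFrame({
    "exposed": [1, 1, 1, 0, 0, 0, 1, 0, 1, 0],
    "event":   [1, 0, 1, 0, 0, 1, 1, 0, 0, 0],
})

# 2x2 contingency table of exposure status against event occurrence.
table = pd.crosstab(records["exposed"], records["event"])

# Fisher's exact test for an exposure-event association, as one might use to
# flag a candidate safety or efficacy signal for follow-up.
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")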
Article
Hepatocellular carcinoma (HCC) is a malignant tumor of the liver that causes approximately half a million deaths each year; over half of the cases are diagnosed in China. Because it is largely asymptomatic, HCC is usually diagnosed at late and advanced stages, for which there are no effective therapies. Thus, biomarkers for early detection and molecular targets for treating HCC are urgently needed. With the advent of high-throughput omics technologies, we have begun to mine the genomics and proteomics information of HCC, and most importantly, these data can be integrated with the clinical annotations of the patients. Such integrated profiling informatics has allowed us to search for and better identify clinically useful biomarkers and therapeutic targets for cancers, including HCC. Capitalizing on a large cohort of clinical samples (over 100 pairs of tumor and matched adjacent nontumor HCC tissues), we herein discuss the use of a proteomics approach to identify biomarkers that are potentially useful for (1) discriminating tumorous from nonmalignant tissues, (2) detecting small-sized and early-stage HCC, and (3) predicting early disease relapse after hepatectomy.
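As a rough illustration of aim (1) above, the following Python sketch fits a sparse classifier to a synthetic protein-expression matrix to discriminate tumor from matched nontumor samples; proteins with nonzero coefficients would be candidate biomarkers. All data, dimensions, and parameters here are assumptions, and the cited study's actual workflow is experimental proteomics, not this code.

# Toy sketch of tumor-vs-nontumor discrimination from protein expression.
# Synthetic data only; not the authors' pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic proteomics matrix: 100 tissue pairs -> 200 samples, 50 proteins,
# with tumor samples shifted upward to mimic differential expression.
n_pairs, n_proteins = 100, 50
X_tumor = rng.normal(loc=0.5, scale=1.0, size=(n_pairs, n_proteins))
X_normal = rng.normal(loc=0.0, scale=1.0, size=(n_pairs, n_proteins))
X = np.vstack([X_tumor, X_normal])
y = np.array([1] * n_pairs + [0] * n_pairs)  # 1 = tumor, 0 = adjacent nontumor

# An L1 penalty doubles as a crude biomarker selector: only a subset of
# proteins ends up with nonzero coefficients.
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())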
Article
In recent years, biobanks of human tissues have evolved from small-scale collections of pathological materials into structured resource centers for the acquisition, storage, processing, and usage of high-quality biospecimens for research. This evolution goes hand in hand with the development of highly sensitive, high-throughput methods for biomarker discovery. The complexity of the molecular patterns of diseases such as cancer provides multiple opportunities for targeted therapeutic intervention, tailored to suit the particular characteristics of each patient. Developing and evaluating such novel therapies requires access to rigorously designed and well-structured collections of biospecimens. In turn, biobanking infrastructures have a critical impact on the discovery, development and implementation of new drugs for cancer treatment. Therefore, it is essential to harmonize biobanking procedures, and to develop innovative solutions supporting biobank interoperability and specimen sharing, ensuring that new drugs may effectively reach the largest possible number of patients.
Article
Biobanking has been identified as a key area for development in order to accelerate the discovery and development of new drugs. This review describes recent advances in the field of biobanking and biospecimen research, with special reference to tumour banks, which are the biobanks of primary interest in oncology. There is a dramatic deficiency of high-quality, well-annotated cancer biospecimens. Biospecimen research is a fast-developing field that will improve biobanking methodology, and biobanking is becoming more professionally organized, with increased attention to quality management. Biobank networks are developing rapidly in order to combine and share resources. Biobanking services must improve rapidly to serve the needs of personalized medicine, and biospecimen research should be encouraged and supported at all levels, from project funding to publication of results. Biobanks need to be run to high professional standards, and the importance of adequate funding, training, and certification must be emphasized. The growing presence of national and international biobank networks will allow biobanks to synergize. The development of a biobanking community will facilitate teamwork to overcome common challenges and enhance communication with multiple stakeholder groups.
Article
Analyses of public policy regularly express certitude about the consequences of alternative policy choices. Yet policy predictions often are fragile, with conclusions resting on critical unsupported assumptions. Then the certitude of policy analysis is not credible. This paper develops a typology of incredible analytical practices and gives illustrative cases. I call these practices conventional certitudes, dueling certitudes, conflating science and advocacy, and wishful extrapolation. I contrast these practices with my vision for credible policy analysis.
Article
"The goal of this paper is to summarize the lessons learned from a large body of international, interdisciplinary research on common-pool resources (CPRs) in the past 25 years and consider its usefulness in the analysis of scholarly information as a resource. We will suggest ways in which the study of the governance and management of common-pool resources can be applied to the analysis of information and 'the intellectual public domain.' The complexity of the issues is enormous for many reasons: the vast number of players, multiple conflicting interests, rapid changes of technology, the general lack of understanding of digital technologies, local versus global arenas, and a chronic lack of precision about the information resource at hand. We suggest, in the tradition of Hayek, that the combination of time and place analysis with general scientific knowledge is necessary for sufficient understanding of policy and action. In addition, the careful development of an unambiguous language and agreed-upon definitions is imperative. "As one of the framing papers for the Conference on the Public Domain, we focus on the language, the methodology, and outcomes of research on common-pool resources in order to better understand how various types of property regimes affect the provision, production, distribution, appropriation, and consumption of scholarly information. Our analysis will suggest that collective action and new institutional design play as large a part in the shaping of scholarly information as do legal restrictions and market forces."
Article
With the radical changes in information production that the Internet has introduced, we stand at an important moment of transition, says Yochai Benkler in this thought-provoking book. The phenomenon he describes as social production is reshaping markets, while at the same time offering new opportunities to enhance individual freedom, cultural diversity, political discourse, and justice. But these results are by no means inevitable: a systematic campaign to protect the entrenched industrial information economy of the last century threatens the promise of today's emerging networked information environment. In this comprehensive social theory of the Internet and the networked information economy, Benkler describes how patterns of information, knowledge, and cultural production are changing - and shows that the way information and knowledge are made available can either limit or enlarge the ways people can create and express themselves. He describes the range of legal and policy choices that confront us and maintains that there is much to be gained - or lost - by the decisions we make today.
Article
Off-label prescribing is an integral part of contemporary medicine. Many patients benefit when they receive drugs or devices under circumstances not specified on the label approved by the Food and Drug Administration (FDA). An off-label use may provide the best available intervention for a patient, as well as the standard of care for a particular health problem. In oncology, pediatrics, geriatrics, obstetrics, and other practice areas, patient care could not proceed without off-label prescribing. When scientific and medical evidence justify off-label uses, physicians promote patients' interests by prescribing products off label. Off-label prescribing can also harm patients, however. The potential for harm is greatest when an off-label use lacks a solid evidentiary basis. A 2006 study examining prescribing practices for 169 commonly prescribed drugs found high rates of off-label use with little or no scientific support. Researchers examining off-label use in U.S. children's hospitals concluded, "[W]e still have incomplete knowledge about the safety and efficacy of many medications commonly used to treat children across a range of drug classes and clinical diagnoses."
Article
Analysis of the genome provides important information about the somatic genetic changes existing in a tissue; however, it is the proteins that do the work of the cell. Diseases such as cancer are caused by derangements in cellular protein molecular networks and cell signaling pathways. These pathways contain a large and growing collection of drug targets governing cellular survival, proliferation, invasion, and cell death. The clinical utility of reverse-phase protein microarrays (RPPA), a new technology invented in our laboratory, lies in its ability to generate a functional map of known cell signaling networks or pathways for an individual patient, obtained directly from a biopsy specimen. Coupled with laser capture microdissection (LCM), the RPPA platform immobilizes the entire cellular proteome on a substratum, with subsequent immunodetection of the total and phosphorylated (activated) levels of cell signaling proteins. The resulting picture of which pathways are "in use" can then be correlated with biological and clinical information and serve as both a diagnostic and a therapeutic guide, thus providing a "theranostic" endpoint.
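A minimal Python sketch of the correlation step described at the end of this abstract: comparing hypothetical RPPA-style phosphoprotein readouts between responders and non-responders to nominate pathways for diagnostic or therapeutic follow-up. The marker names, data, and choice of test are illustrative assumptions, not the authors' method.

# Hypothetical per-patient phosphoprotein levels, of the kind RPPA quantifies,
# compared against a binary clinical label. Synthetic data only.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
phosphoproteins = ["pAKT", "pERK", "pS6", "pEGFR"]  # assumed marker panel

# 40 hypothetical biopsies with random readouts and response labels.
levels = rng.normal(size=(40, len(phosphoproteins)))
responder = rng.integers(0, 2, size=40).astype(bool)

# Rank-based test per marker; low p-values would nominate a pathway as being
# "in use" differently between responders and non-responders.
for j, name in enumerate(phosphoproteins):
    stat, p = mannwhitneyu(levels[responder, j], levels[~responder, j])
    print(f"{name}: p = {p:.3f}")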