Publication History

  • Nature 10/2013; 502(7471):295-7.
  • ABSTRACT: Cloud computing has been increasingly adopted by users and providers to promote a flexible, scalable and tailored access to computing resources. Nonetheless, the consolidation of this paradigm has uncovered some of its limitations. Initially devised by corporations with direct control over large amounts of computational resources, cloud computing is now being endorsed by organizations with limited resources or with a more articulated, less direct control over these resources. The challenge for these organizations is to leverage the benefits of cloud computing while dealing with limited and often widely distributed computing resources. This study focuses on the adoption of cloud computing by higher education institutions and addresses two main issues: flexible and on-demand access to a large amount of storage resources, and scalability across a heterogeneous set of cloud infrastructures. The proposed solutions leverage a federated approach to cloud resources in which users access multiple and largely independent cloud infrastructures through a highly customizable broker layer. This approach allows for a uniform authentication and authorization infrastructure, a fine-grained policy specification and the aggregation of accounting and monitoring. Within a loosely coupled federation of cloud infrastructures, users can access vast amounts of data without copying them across cloud infrastructures and can scale their resource provisions when the local cloud resources become insufficient.
    Philosophical Transactions of The Royal Society A Mathematical Physical and Engineering Sciences 01/2013; 371(1983):20120067.
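The broker layer sketched in the entry above (federated access, uniform authentication and authorization, per-user policy, aggregated accounting) can be illustrated with a minimal Python sketch. Every class, method and policy field below is hypothetical and chosen only to mirror the abstract's description; it is not the software described in the paper.

```python
from abc import ABC, abstractmethod

class CloudBackend(ABC):
    """One independently administered cloud infrastructure in the federation."""

    def __init__(self, name):
        self.name = name

    @abstractmethod
    def open_dataset(self, uri):
        """Return a handle to remote data without copying it between clouds."""

    @abstractmethod
    def start_instances(self, count):
        """Provision compute resources on this infrastructure."""

class Broker:
    """Customisable broker layer: uniform authorization, per-user policy, accounting."""

    def __init__(self, backends, policy):
        self.backends = {backend.name: backend for backend in backends}
        self.policy = policy        # e.g. {"alice": {"clouds": ["campus", "partner"]}}
        self.accounting = []        # usage records aggregated across all clouds

    def authorised(self, user, cloud):
        return cloud in self.policy.get(user, {}).get("clouds", [])

    def open_dataset(self, user, cloud, uri):
        if not self.authorised(user, cloud):
            raise PermissionError(f"{user} may not access {cloud}")
        self.accounting.append((user, cloud, "open", uri))
        return self.backends[cloud].open_dataset(uri)

    def scale_out(self, user, preferred, count):
        """Burst to another federated cloud when the preferred one is unavailable."""
        candidates = [preferred] + [c for c in self.backends if c != preferred]
        for cloud in candidates:
            if self.authorised(user, cloud):
                self.accounting.append((user, cloud, "start", count))
                return self.backends[cloud].start_instances(count)
        raise PermissionError(f"no federated cloud is available to {user}")
```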
  • ABSTRACT: Energy consumption of computing systems has become a major concern. Constrained by cost, environmental concerns and policy, minimising the energy footprint of computing systems is one of the primary goals of many initiatives. As we move towards exascale computing, energy constraints become very real and are a major driver in design decisions. The issue is also apparent at the scale of desktop machines, where many-core and accelerator chips are common and offer a spectrum of opportunities for balancing energy and performance. Conventionally, approaches for reducing energy consumption have been either at the operational level (such as powering down all or part of systems) or at the hardware design level (such as utilising specialised low-energy components). In this paper, we are interested in a different approach: energy-aware software. Measuring the energy consumption of a computer application and understanding where the energy usage lies may allow changes to the software that provide opportunities for energy savings. In order to understand the complexities of this approach, we specifically look at multithreaded algorithms and applications. Through an evaluation of a benchmark suite on multiple architectures and in multiple environments, we show how basic parameters, such as threading options, compilers and frequencies, can impact energy consumption. As such, we provide an overview of the challenges that face software developers in this regard. We then offer a view of the directions that need to be taken and possible strategies needed for building energy-aware software.
    Journal of Computational Science 01/2013; 4(6):444–449.
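As a concrete, hedged illustration of the measurement loop behind "energy-aware software", the sketch below sweeps thread counts over a toy workload while reading the Intel RAPL energy counter that Linux exposes under /sys/class/powercap. This is not the paper's benchmark suite; the counter path is an assumption about the host (RAPL-capable Intel hardware, and reading it may require elevated privileges), and the workload is a placeholder.

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Package-level RAPL counter (microjoules) exposed by Linux on many Intel systems.
# The path is an assumption about the host and may need elevated privileges to read.
RAPL_COUNTER = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj():
    with open(RAPL_COUNTER) as f:
        return int(f.read())

def workload(n=200_000):
    # Toy CPU-bound kernel standing in for a real multithreaded benchmark.
    return sum(i * i for i in range(n))

def measure(threads):
    e0, t0 = read_energy_uj(), time.time()
    with ThreadPoolExecutor(max_workers=threads) as pool:
        list(pool.map(lambda _: workload(), range(threads * 4)))
    e1, t1 = read_energy_uj(), time.time()
    return (e1 - e0) / 1e6, t1 - t0  # joules, seconds (counter wrap-around ignored)

if __name__ == "__main__":
    # Pure-Python threads share the GIL, so this sweep mostly varies scheduling
    # pressure; a real study would drive a compiled multithreaded kernel.
    for threads in (1, 2, 4, 8):
        joules, secs = measure(threads)
        print(f"{threads:2d} threads: {joules:8.2f} J in {secs:6.2f} s")
```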
  • ABSTRACT: MOTIVATION: Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. RESULTS: OntoMaton is an open source solution that brings ontology look-up and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO BioPortal web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. AVAILABILITY: OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton CONTACT: isatools@googlegroups.com.
    Bioinformatics 12/2012;
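OntoMaton's ontology look-up is built on the NCBO BioPortal web services. The following minimal sketch shows the kind of term search the public BioPortal REST API supports (the documented http://data.bioontology.org/search endpoint); the API key is a placeholder and the response fields used here are not necessarily the ones OntoMaton itself consumes.

```python
import json
import urllib.parse
import urllib.request

BIOPORTAL_SEARCH = "http://data.bioontology.org/search"
API_KEY = "YOUR-BIOPORTAL-API-KEY"  # placeholder; issued with a free BioPortal account

def search_term(query, ontologies=None, max_hits=5):
    """Look up a free-text label and return candidate (prefLabel, term IRI) pairs."""
    params = {"q": query, "pagesize": max_hits}
    if ontologies:
        params["ontologies"] = ",".join(ontologies)  # e.g. ["OBI", "EFO"]
    url = BIOPORTAL_SEARCH + "?" + urllib.parse.urlencode(params)
    request = urllib.request.Request(
        url, headers={"Authorization": f"apikey token={API_KEY}"})
    with urllib.request.urlopen(request) as response:
        hits = json.load(response).get("collection", [])
    return [(hit.get("prefLabel"), hit.get("@id")) for hit in hits]

if __name__ == "__main__":
    for label, iri in search_term("mycelium"):
        print(label, iri)
```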
  • ABSTRACT: MOTIVATION: Fungi form extensive interconnected mycelial networks that scavenge efficiently for scarce resources in a heterogeneous environment. The architecture of the network is highly responsive to local nutritional cues, damage or predation, and continuously adapts through growth, branching, fusion or regression. These networks also provide an example of an experimental planar network system that can be subjected to both theoretical analysis and experimental manipulation in multiple replicates. For high-throughput measurements, with hundreds of thousands of branches on each image, manual detection is not a realistic option, especially if extended time series are captured. Furthermore, branches typically show considerable variation in contrast, as the individual cords span several orders of magnitude and the compressed soil substrate is not homogeneous in texture, making automated segmentation challenging. RESULTS: We have developed and evaluated a high-throughput automated image analysis and processing approach using Phase Congruency Tensors and watershed segmentation to characterize complex fungal networks. The performance of the proposed approach is evaluated using complex images of saprotrophic fungal networks with 10^5-10^6 edges. The results obtained demonstrate that this approach provides a fast and robust solution for detection and graph-based representation of complex curvilinear networks. AVAILABILITY AND IMPLEMENTATION: The Matlab toolbox is freely available through the Oxford e-Research Centre website: http://www.oerc.ox.ac.uk/research/bioimage/software CONTACT: boguslaw.obara@oerc.ox.ac.uk.
    Bioinformatics 06/2012; 28(18):2374-81.
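A rough approximation of the pipeline described above (curvilinear enhancement, marker-controlled watershed, skeleton extraction for a graph representation) can be assembled from scikit-image components. The sketch below is not the authors' Matlab toolbox: it substitutes a Frangi vesselness filter for the Phase Congruency Tensor step purely for illustration, and the thresholds are arbitrary.

```python
from scipy import ndimage as ndi
from skimage import filters, io, morphology, segmentation

def extract_network(path):
    """Stand-in pipeline: enhance curvilinear cords, segment, skeletonise."""
    image = io.imread(path, as_gray=True)

    # Curvilinear enhancement. The paper uses Phase Congruency Tensors;
    # a Frangi vesselness filter is used here only as an illustrative substitute.
    ridges = filters.frangi(image)

    # Foreground mask and seeds for marker-controlled watershed segmentation.
    mask = ridges > filters.threshold_otsu(ridges)
    distance = ndi.distance_transform_edt(mask)
    markers, _ = ndi.label(distance > 0.5 * distance.max())
    labels = segmentation.watershed(-distance, markers, mask=mask)

    # One-pixel-wide skeleton, the basis for a graph of edges and branch points.
    skeleton = morphology.skeletonize(labels > 0)
    return labels, skeleton
```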
  • ABSTRACT: Background: Suicide rates are elevated in the veterinary profession in several countries, yet little is known about possible contributory and preventive factors. Aims: To obtain information from veterinarians with a history of suicidal ideation or behavior about the factors associated with suicidality in their profession. Methods: We conducted a mixed-methods interview study with 21 UK veterinarians who had attempted suicide or reported recent suicidal ideation. Interview topics included work and nonwork contributory factors, coping mechanisms, and preventive factors. Results: Self-poisoning was the most common method used or considered by participants. Common contributory factors were workplace relationships, career concerns, patient issues, number of hours and volume of work, and responsibility, although two-thirds of participants reported co-occurring difficult life events. Around half had received a psychiatric diagnosis following their suicidal behavior. Several possible preventive measures were suggested by participants. Conclusions: Several work- and non-work-related contributory factors to suicidality in the veterinary profession were identified. Future preventive measures may involve better promotion of support services, formal support for recent graduates, and improving employers’ attitudes toward work-life balance.
    Crisis The Journal of Crisis Intervention and Suicide Prevention 06/2012; 33(5):280-9.
  • ABSTRACT: Computational models in physiology often integrate functional and structural information from a large range of spatiotemporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and skepticism concerning how computational methods can improve our understanding of living organisms and also how they can reduce, replace, and refine animal experiments. A fundamental requirement to fulfill these expectations and achieve the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims at informing strategies for validation by elucidating the complex interrelations among experiments, models, and simulations in cardiac electrophysiology. We describe the processes, data, and knowledge involved in the construction of whole ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations, and experiments are intertwined, in an assemblage that is a system itself, namely the model-simulation-experiment (MSE) system. We argue that validation is part of the whole MSE system and is contingent upon 1) understanding and coping with sources of biovariability; 2) testing and developing robust techniques and tools as a prerequisite to conducting physiological investigations; 3) defining and adopting standards to facilitate the interoperability of experiments, models, and simulations; and 4) understanding physiological validation as an iterative process that contributes to defining the specific aspects of cardiac electrophysiology the MSE system targets, rather than being only an external test, and that is driven by advances in experimental and computational methods and the combination of both.
    AJP Heart and Circulatory Physiology 05/2012; 303(2):H144-55.
  • ABSTRACT: In a cohort study of users of bisphosphonates, we evaluated the incidence of fragility fractures at all sites on the femur following up to 8 years of therapy with alendronate or risedronate. We did not find evidence for a reversal of fracture protection with long-term use of bisphosphonates. INTRODUCTION: Few studies have acquired adequate data with prolonged follow-up on bisphosphonate users in the general population to evaluate their long-term effects on the risk of hip fractures, including those in the subtrochanteric region. METHODS: This cohort study utilizes a large USA database (January 1, 2000 to June 30, 2009). We compared patients with higher versus lower degrees of compliance [medication possession ratio, MPR <1/3 (the reference), 1/3-<2/3, or ≥2/3]. Radiographic adjudication of fracture site and features was not performed. Hazard ratios (HR) for fracture were estimated using time-dependent Cox models. Restricted cubic splines (RCS) were used to plot HRs for fracture against duration of therapy. RESULTS: There were 3,655 incident cases of femoral fracture (764 subtrochanteric/shaft, 2,769 hip) identified during 917,741 person-years of follow-up (median = 3 years) on 287,099 patients (267,374 were women) from the date when they initiated oral bisphosphonate therapy. The corresponding HRs (95% confidence interval, CI) for overall femoral fractures associated with each additional year of therapy were 0.93 (0.86-1.01) within 5 years and 0.89 (0.77-1.03) beyond 5 years for risedronate, and 0.86 (0.81-0.91) and 0.95 (0.84-1.07) for alendronate, respectively. The corresponding estimates for subtrochanteric/shaft fractures were 1.05 (0.87-1.26) and 0.89 (0.60-1.33) for risedronate, and 0.99 (0.92-1.05) and 1.05 (0.92-1.20) for alendronate, respectively. The HRs (95% CI) for overall femoral fractures associated with each additional year of alendronate or risedronate therapy within 5 and beyond 5 years were not significantly different. CONCLUSION: Our study showed persistence of overall hip fracture protection with long-term use of alendronate or risedronate.
    Osteoporosis International 03/2012;
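The study's central statistical device, a Cox model with cumulative therapy duration as a time-dependent covariate, can be sketched with the lifelines library in Python. The long-format data below are entirely synthetic and the column names are illustrative; they are not the study database.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Entirely synthetic long-format data: one row per patient per yearly follow-up
# interval, with cumulative years on therapy as the time-dependent covariate.
rng = np.random.default_rng(0)
rows = []
for patient_id in range(200):
    years_on_therapy, age = 0.0, int(rng.integers(60, 90))
    for start in range(int(rng.integers(1, 6))):   # up to five yearly intervals
        years_on_therapy += rng.uniform(0.3, 1.0)  # exposure accrued this interval
        event = rng.random() < 0.05                # synthetic fracture hazard
        rows.append((patient_id, start, start + 1, years_on_therapy, age, int(event)))
        if event:
            break                                  # censor after first fracture
df = pd.DataFrame(
    rows, columns=["id", "start", "stop", "years_on_therapy", "age", "fracture"])

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="fracture")
ctv.print_summary()  # hazard ratio per additional year of therapy, adjusted for age
```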
  • ABSTRACT: To make full use of research data, the bioscience community needs to adopt technologies and reward mechanisms that support interoperability and promote the growth of an open 'data commoning' culture. Here we describe the prerequisites for data commoning and present an established and growing ecosystem of solutions using the shared 'Investigation-Study-Assay' framework to support that vision.
    Nature Genetics 02/2012; 44(2):121-6.
  • ABSTRACT: Several methods for density matrix propagation in parallel computing environments are proposed and evaluated. It is demonstrated that the large communication overhead associated with each propagation step (two-sided multiplication of the density matrix by an exponential propagator and its conjugate) may be avoided and the simulation recast in a form that requires virtually no inter-thread communication. Good scaling is demonstrated on a 128-core (16 nodes, 8 cores each) cluster.
    The Journal of Chemical Physics 01/2012; 136(4):044108.
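The propagation step referred to above, a two-sided multiplication of the density matrix by an exponential propagator and its conjugate, is shown in the minimal serial NumPy sketch below for a random Hermitian Hamiltonian. It illustrates only the reference update rho -> U rho U† with U = exp(-iH dt); the communication-avoiding parallel reformulation is the paper's contribution and is not reproduced here.

```python
import numpy as np
from scipy.linalg import expm

def propagate(rho, hamiltonian, dt, steps):
    """Serial reference: rho -> U rho U^dagger at each step, with U = exp(-i*H*dt)."""
    propagator = expm(-1j * hamiltonian * dt)
    propagator_dag = propagator.conj().T
    for _ in range(steps):
        rho = propagator @ rho @ propagator_dag  # the two-sided multiplication
    return rho

if __name__ == "__main__":
    n = 64
    rng = np.random.default_rng(1)
    h = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    hamiltonian = (h + h.conj().T) / 2           # random Hermitian matrix
    rho0 = np.zeros((n, n), dtype=complex)
    rho0[0, 0] = 1.0                             # pure initial state |0><0|
    rho = propagate(rho0, hamiltonian, dt=1e-3, steps=100)
    print("trace preserved:", np.isclose(np.trace(rho).real, 1.0))
```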

Top publications last week

 
Scientific Data Management: Challenges, Technology, and Deployment, edited by Arie Shoshani and Doron Rotem, pages 467-508; Chapman and Hall/CRC.
34 Downloads
 
BMC Bioinformatics 01/2012; 13 Suppl 1:S9.
3 Downloads