Publication History

  • Journal of Biomedical Informatics 08/2014; 50:1–3. DOI:10.1016/j.jbi.2014.07.010
  • ABSTRACT: In data center networks, resource allocation refers to mapping a large number of workloads to substrate networks. Existing heuristic mapping algorithms evaluate the resources of the nodes according to one resource factor or a product of resource factors, ...
    Future Generation Computer Systems 03/2014; 32:24–26. DOI:10.1016/j.future.2013.10.017
  • ABSTRACT: In order to meet demand for mobile broadband and to bridge the digital divide, WiMAX was introduced in 2004. However, to increase the financial return on investment, service operators need to make every effort to design and deploy the most cost-effective networks using this technology. This paper presents a novel dimensioning technique for WiMAX which takes the dimensioning problem to a new level and produces more accurate results than traditional methods. Furthermore, a novel decomposed optimization framework for WiMAX network planning is introduced which subdivides the overall problem into three distinct stages: a network dimensioning stage, an initial sectorization and configuration stage, and a final network configuration stage. The proposed framework also solves two fundamental problems, cell planning and frequency planning, simultaneously. Optimization is based on a tailored simulated annealing algorithm. Results, based on experiments in and around a UK city, show that the three-stage framework produces viable designs with good coverage and service. In particular, Stage 1 is efficient and dimensions the network more accurately than traditional methods.
    Ad Hoc Networks 02/2014; 13:381–403. DOI:10.1016/j.adhoc.2013.08.016
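The paper's tailored annealing algorithm is not reproduced in the abstract, but the general shape of simulated annealing it relies on can be illustrated generically. A minimal sketch; the quadratic cost and the neighbour move below are hypothetical stand-ins, since a real WiMAX objective would score coverage, capacity and interference over candidate site configurations:

```python
import math
import random

def simulated_annealing(cost, neighbour, x0, t0=1.0, cooling=0.95, steps=2000, seed=0):
    """Generic simulated annealing: always accept improving moves, accept
    worsening moves with probability exp(-delta/T), cool T geometrically."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        fy = cost(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Hypothetical 1-D stand-in: find the "site position" minimising a
# quadratic cost whose optimum sits at 3.0.
best, fbest = simulated_annealing(
    cost=lambda x: (x - 3.0) ** 2,
    neighbour=lambda x, rng: x + rng.uniform(-0.5, 0.5),
    x0=0.0,
)
```

The geometric cooling schedule means early iterations explore (worse moves are often accepted) while late iterations refine greedily, which is why annealing suits the multimodal cell- and frequency-planning objectives the paper combines.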
  • Nature 10/2013; 502(7470):171. DOI:10.1038/502171d
  • ABSTRACT: Objective: To evaluate the performance of the United Kingdom Prospective Diabetes Study (UKPDS) Risk Engine for predicting the 10-year risk of cardiovascular disease endpoints in an independent cohort of UK patients newly diagnosed with type 2 diabetes. Research Design and Methods: This was a retrospective cohort study using routine healthcare data collected between April 1998 and October 2011 from around 350 UK primary-care practices contributing to the Clinical Practice Research Datalink (CPRD). Participants comprised 79,966 patients aged between 35 and 85 years (388,269 person-years) with 4,984 cardiovascular events. Four outcomes were evaluated: first diagnosis of coronary heart disease (CHD), stroke, fatal CHD, and fatal stroke. Results: Accounting for censoring, the observed versus predicted 10-year event rates were as follows: CHD 6.1% vs 16.5%, fatal CHD 1.9% vs 10.1%, stroke 7.0% vs 10.1%, and fatal stroke 1.7% vs 1.6%. The UKPDS-RE showed moderate discrimination for all four outcomes, with concordance-index values ranging from 0.65 to 0.78. Conclusions: The UKPDS stroke equations showed calibration ranging from poor to moderate; however, the CHD equations showed poor calibration and considerably overestimated CHD risk. There is a need for revised risk equations in type 2 diabetes.
    Diabetes care 10/2013; 37(2). DOI:10.2337/dc13-1159
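The concordance index quoted in the abstract (0.65 to 0.78) measures discrimination: the probability that, of an event/non-event pair of patients, the one who had the event was assigned the higher predicted risk. A minimal sketch for uncensored binary outcomes; the published figures use Harrell's c-statistic over censored survival times, which this simplification deliberately ignores:

```python
def concordance_index(risks, events):
    """Fraction of (event, non-event) pairs in which the patient who had
    the event received the higher predicted risk; risk ties count as half."""
    concordant, pairs = 0.0, 0
    for ri, ei in zip(risks, events):
        for rj, ej in zip(risks, events):
            if ei and not ej:
                pairs += 1
                if ri > rj:
                    concordant += 1.0
                elif ri == rj:
                    concordant += 0.5
    return concordant / pairs

# Perfect ranking gives c = 1.0; uninformative (all-tied) risks give 0.5.
c_perfect = concordance_index([0.9, 0.8, 0.2, 0.1], [1, 1, 0, 0])
c_tied = concordance_index([0.5, 0.5, 0.5, 0.5], [1, 1, 0, 0])
```

A c-index of 0.5 is chance-level ranking, so the study's 0.65 to 0.78 indicates moderate discrimination even though the calibration (absolute predicted rates) was poor for CHD.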
  • ABSTRACT: Mutual information (MI) is a popular similarity measure for performing image registration between different modalities. MI makes a statistical comparison between two images by computing the entropy of the probability distribution of the data, so an accurate registration depends on an accurate estimate of the true underlying probability distribution. Within the statistics literature, many methods have been proposed for finding the 'optimal' probability density by means of optimal histogram bin size selection. This raises the common question of how many bins should actually be used when constructing a histogram, to which there is no definitive answer. The question has received little attention in the MI literature, yet it is critical to the effectiveness of the algorithm. The purpose of this paper is to highlight this fundamental element of the MI algorithm. We present a comprehensive study that introduces methods from the statistics literature and incorporates them into image registration. We demonstrate this work on the registration of multi-modal retinal images: colour fundus photographs and scanning laser ophthalmoscope images. Registering these modalities offers significant benefits for early glaucoma detection; however, traditional registration techniques fail to perform sufficiently well. We find that adaptive probability density estimation strongly affects registration accuracy and runtime, improving on traditional binning techniques.
    Computerized Medical Imaging and Graphics 08/2013; 37(7-8). DOI:10.1016/j.compmedimag.2013.08.004
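The abstract's central point, that MI is estimated from histogram-based entropies and is therefore sensitive to the bin count, can be sketched directly. A minimal illustration; the 32-bin choice below is arbitrary, which is exactly the parameter the paper argues should be set adaptively (e.g. by rules such as Scott's or Sturges'):

```python
import numpy as np

def mutual_information(a, b, bins):
    """I(A;B) = H(A) + H(B) - H(A,B), with entropies estimated from a
    joint intensity histogram; `bins` is the key tuning parameter."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    h = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))  # entropy in nats
    return h(px) + h(py) - h(pxy)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
mi_self = mutual_information(img, img, bins=32)                   # aligned: MI near H(A)
mi_indep = mutual_information(img, rng.random((64, 64)), bins=32)  # independent: MI near 0
```

In registration the transform maximising MI between the two modalities is sought; note that even the independent case yields a small positive MI (finite-sample bias that grows with the bin count), one reason bin selection matters.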
  • ABSTRACT: Three-dimensional surface registration transforms multiple three-dimensional data sets into the same coordinate system so as to align overlapping components of these sets. Recent surveys have covered different aspects of either rigid or nonrigid registration, but seldom discuss them as a whole. Our study serves two purposes: 1) to give a comprehensive survey of both types of registration, focusing on three-dimensional point clouds and meshes, and 2) to provide a better understanding of registration from the perspective of data fitting. Registration is closely related to data fitting, comprising three core interwoven components: model selection, correspondences and constraints, and optimization. Studying these components 1) provides a basis for comparing the novelties of different techniques, 2) reveals the similarity of rigid and nonrigid registration in terms of problem representation, and 3) shows how overfitting arises in nonrigid registration and why interest in intrinsic techniques is increasing. We further summarize practical issues of registration, including initialization and evaluation, and discuss some of our own observations, insights, and foreseeable research trends.
    IEEE Transactions on Visualization and Computer Graphics 07/2013; 19(7):1199–1217. DOI:10.1109/TVCG.2012.310
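The rigid side of the registration problem this survey covers has a closed-form least-squares core: the Kabsch/Procrustes step that ICP alternates with correspondence estimation. A minimal sketch assuming correspondences are already given; estimating them is the hard part in practice:

```python
import numpy as np

def rigid_fit(P, Q):
    """Least-squares R, t with R @ p_i + t ≈ q_i for corresponded point
    sets P, Q (n x 3): the closed-form Kabsch/Procrustes step that ICP
    repeats after each re-estimation of correspondences."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                   # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

# Recover a known rotation about z plus a translation (noise-free data).
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
P = np.random.default_rng(1).random((20, 3))
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, t = rigid_fit(P, Q)
```

Nonrigid registration replaces the six-parameter model above with far more flexible deformation models, which is precisely where the overfitting the survey discusses comes from.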
  • ABSTRACT: We introduce a novel stratified sampling technique for mesh surfaces that gives the user control over sampling density and anisotropy via a tensor field. Our approach is based on sampling space-filling curves mapped onto mesh segments via parametrizations aligned with the tensor field. After a short preprocessing step, samples can be generated in real time. Along with visual examples, we provide rigorous spectral analysis and differential domain analysis of our sampling. The sample distributions are of high quality: they fulfil the blue-noise criterion, so they have minimal artifacts due to regularity of sampling patterns, and they accurately represent isotropic and anisotropic densities on the plane and on mesh surfaces. They also have low discrepancy, ensuring that the surface is evenly covered.
    IEEE Transactions on Visualization and Computer Graphics 07/2013; 19(7):1143–1157. DOI:10.1109/TVCG.2012.305
  • ABSTRACT: Many non-photorealistic rendering techniques exist to produce artistic effects from given images. Inspired by various artists, interesting effects can be produced using a minimal rendering, where 'minimal' refers to the number of tones as well as the number and complexity of the primitives used for rendering. Our method is based on various computer vision techniques and uses a combination of refined lines and blocks (potentially simplified), together with a small number of tones, to produce an abstracted artistic rendering that retains sufficient elements of the original image. We also consider a variety of methods to produce different artistic styles, such as colour and two-tone drawings, and use semantic information to improve renderings of faces. By changing a few intuitive parameters, a wide range of visually pleasing results can be produced. Our method is fully automatic. We demonstrate its effectiveness with extensive experiments and a user study.
    Graphical Models 07/2013; 75(4):208–229. DOI:10.1016/j.gmod.2013.03.004
  • ABSTRACT: The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server. In bioinformatics, Taverna workflows are typically used in the areas of high-throughput omics analyses (for example, proteomics or transcriptomics), or for evidence gathering methods involving text mining or data mining. Through Taverna, scientists have access to several thousand different tools and resources that are freely available from a large range of life science institutions. Once constructed, the workflows are reusable, executable bioinformatics protocols that can be shared, reused and repurposed. A repository of public workflows is available at http://www.myexperiment.org. This article provides an update to the Taverna tool suite, highlighting new features and developments in the workbench and the Taverna Server.
    Nucleic Acids Research 05/2013; 41(Web Server issue). DOI:10.1093/nar/gkt328

Top publications last week

  • Parallel Problem Solving from Nature – PPSN VI, 6th International Conference, Paris, France, September 18–20, 2000, Proceedings; 01/2000
  • 2009 IEEE Congress on Evolutionary Computation (CEC '09); 06/2009