Publication History

  •
    ABSTRACT: Stationarity assumptions of linked human–water systems are frequently invalid given the difficult-to-predict changes affecting such systems. In this case water planning occurs under conditions of deep or severe uncertainty, where the statistical distributions of future conditions and events are poorly known. In such situations predictive system simulation models are typically run under different scenarios to evaluate the performance of future plans under different conditions. Given that there are many possible plans and many possible futures, which simulations will lead to the best designs? Robust Decision Making (RDM) and Info-Gap Decision Theory (IGDT) provide a structured approach to planning complex systems under such uncertainty. Both RDM and IGDT make repeated use of trusted simulation models to evaluate different plans under different future conditions. Both methods seek to identify robust rather than optimal decisions, where a robust decision works satisfactorily over a broad range of possible futures. IGDT efficiently charts system performance with robustness and opportuneness plots summarising system performance for different plans under the most dire and favourable sets of future conditions. RDM samples a wider range of dire, benign and opportune futures and offers a holistic assessment of the performance of different options. RDM also identifies through ‘scenario discovery’ which combinations of uncertain future stresses lead to system vulnerabilities. In our study we apply both frameworks to a water resource system planning problem: London’s water supply system expansion in the Thames basin, UK. The methods help identify which out of 20 proposed water supply infrastructure portfolios is the most robust given severely uncertain future hydrological inflows, water demands and energy prices. 
    Multiple criteria of system performance are considered: service reliability, storage susceptibility, capital and operating cost, energy use and environmental flows. Initially the two decision frameworks lead to different recommendations. We show the methods are complementary and can be beneficially used together to better understand results and reveal how the particulars of each method can skew results towards particular future plans.
    Journal of Hydrology 06/2013; 494:43–58.
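The satisficing robustness idea at the heart of RDM can be sketched in a few lines: sample many plausible futures, score each candidate plan in every future, and rank plans by the fraction of futures in which they perform acceptably. The portfolio names, capacities, toy performance model and 0.95 reliability threshold below are all invented for illustration, not taken from the study.

```python
import random

random.seed(42)

def evaluate(plan_capacity, future):
    """Hypothetical performance model: service reliability falls as demand
    outstrips inflow plus the plan's added supply capacity."""
    inflow, demand = future
    supply = inflow + plan_capacity
    return min(1.0, supply / demand)  # reliability score in [0, 1]

# Sample many severely uncertain futures (inflow, demand), RDM-style.
futures = [(random.uniform(60, 140), random.uniform(80, 160))
           for _ in range(1000)]

plans = {"portfolio_A": 10, "portfolio_B": 30, "portfolio_C": 50}
threshold = 0.95  # satisficing criterion on reliability

def robustness(capacity):
    """Fraction of sampled futures in which the plan performs
    satisfactorily -- a simple domain-of-satisficing robustness score."""
    ok = sum(evaluate(capacity, f) >= threshold for f in futures)
    return ok / len(futures)

scores = {name: robustness(c) for name, c in plans.items()}
best = max(scores, key=scores.get)
```

IGDT-style robustness curves could be recovered from the same machinery by growing an uncertainty horizon around a best-estimate future instead of sampling futures at random.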
  •
    ABSTRACT: A simple model of a circularly closed double-stranded DNA in a poor solvent is considered as an example of a semi-flexible polymer with self-attraction. To find the ground states, the conformational energy is computed as a sum of the bending and torsional elastic components and the effective self-attraction energy. The model includes a relative orientation or sequence dependence of the effective attraction forces between different pieces of the polymer chain. Two series of conformations are analysed: a multicovered circle (a toroid) and a multifold two-headed racquet. The results are presented as a diagram of state. It is suggested that the stability of particular conformations may be controlled by proper adjustment of the primary structure. Application of the model to other semi-flexible polymers is considered.
    The Journal of Chemical Physics 04/2013; 138(16):164903.
  •
    ABSTRACT: SECOA (Solutions for Environmental Contrasts in Coastal Areas) is a multi-national research project examining the effects of human mobility on urban settlements in fragile coastal environments. This paper describes the setting up of a SECOA metadata repository for non-specialist researchers such as environmental scientists and tourism experts. Conflicting usability requirements of two groups - metadata creators and metadata users - are identified along with associated limitations of current metadata standards. A description is given of a configurable metadata system designed to grow as the project evolves. This work is of relevance for similar projects such as INSPIRE.
    Applied Ergonomics 02/2013;
  •
    ABSTRACT: Motivated by continuum models for DNA supercoiling we formulate a theory for equilibria of 2-braids, i.e., structures formed by two elastic rods winding around each other in continuous contact and subject to a local interstrand interaction. No assumption is made on the shape of the contact curve. The theory is developed in terms of a moving frame of directors attached to one of the strands. The other strand is tracked by including in this frame the normalised closest-approach chord connecting the two strands. The kinematic constant-distance constraint is formulated at strain level through the introduction of what we call braid strains. As a result the total potential energy involves arclength derivatives of these strains, thus giving rise to a second-order variational problem. The Euler-Lagrange equations for this problem give balance equations for the overall braid force and moment referred to the moving frame as well as differential equations that can be interpreted as effective constitutive relations encoding the effect that the second strand has on the first as the braid deforms under the action of end loads. Hard contact models are used to obtain the normal contact pressure between strands that has to be non-negative for a physically realisable solution without the need for external devices such as clamps or glue to keep the strands together. The theory is first illustrated by a number of problems that can be solved analytically and then applied to several new problems that have not hitherto been treated.
    Journal of the Mechanics and Physics of Solids 01/2013;
  •
    ABSTRACT: London is expected to experience more frequent periods of intense rainfall and tidal surges, leading to an increase in the risk of flooding. Damp and flooded dwellings can support microbial growth, including mould, bacteria, and protozoa, as well as persistence of flood-borne microorganisms. The amount of time flooded dwellings remain damp will depend on the duration and height of the flood, the contents of the flood water, the drying conditions, and the building construction, leading to particular properties and property types being prone to lingering damp and human pathogen growth or persistence. The impact of flooding on buildings can be simulated using Heat Air and Moisture (HAM) models of varying complexity in order to understand how water can be absorbed and dry out of the building structure. This paper describes the simulation of the drying of building archetypes representative of the English building stock using the EnergyPlus based tool 'UCL-HAMT' in order to determine the drying rates of different abandoned structures flooded to different heights and during different seasons. The results are mapped out using GIS in order to estimate the spatial risk across London in terms of comparative flood vulnerability, as well as for specific flood events. Areas of South and East London were found to be particularly vulnerable to long-term microbial exposure following major flood events.
    Environment International 12/2012; 51C:182-195.
  •
    ABSTRACT: Understanding travel behaviour and travel demand is of constant importance to transportation communities and agencies in every country. Recent work has attempted to infer transportation modes automatically from positional data, such as that collected by GPS devices, so that the time and cost of conventional travel diary surveys can be significantly reduced. Limitations exist in the literature, however, in the areas of data collection (sample size, duration of study, granularity of data), selection of variables (or combinations of variables), and method of inference (the number of transportation modes used in the learning). This paper therefore attempts to address these aspects of the inference process. We aim to solve a classification problem of GPS data into different transportation modes (car, walk, cycle, underground, train and bus). We first study the variables that could contribute positively to this classification, and statistically quantify their discriminatory power. We then introduce a novel approach to carry out this inference using a framework based on Support Vector Machines (SVMs) classification. The framework was tested using coarse-grained GPS data, which has been avoided in previous studies, achieving a promising accuracy of 88% with a Kappa statistic reflecting almost perfect agreement.
    Computers, Environment and Urban Systems 11/2012; 36(6):526–537.
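Before any SVM can be trained, candidate variables with discriminatory power must be extracted from raw positional fixes. A minimal sketch of that feature stage, assuming (timestamp, lat, lon) tuples at a coarse 60 s interval; the track and the three variables chosen here are illustrative, not the paper's actual feature set:

```python
import math

def haversine_m(p, q):
    """Great-circle distance in metres between two (lat, lon) fixes."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = (math.sin(dlat / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2)
    return 2 * 6371000 * math.asin(math.sqrt(a))

def mode_features(track):
    """Candidate classification variables from a coarse GPS track:
    mean speed, maximum speed and speed variance, all in m/s units."""
    speeds = []
    for (t0, *p0), (t1, *p1) in zip(track, track[1:]):
        dt = t1 - t0
        if dt > 0:
            speeds.append(haversine_m(p0, p1) / dt)
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return {"mean_speed": mean, "max_speed": max(speeds), "speed_var": var}

# A short, hypothetical walking track in London at 60 s fix intervals.
walk = [(0, 51.5000, -0.1000), (60, 51.5007, -0.1000), (120, 51.5014, -0.1001)]
feats = mode_features(walk)
```

Feature vectors like these, labelled by true mode from a travel diary, would then be fed to an SVM classifier in the training stage.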
  • Computers, Environment and Urban Systems 11/2012; 36(6):481–487.
  •
    ABSTRACT: As more and more real time spatio-temporal datasets become available at increasing spatial and temporal resolutions, the provision of high quality, predictive information about spatio-temporal processes becomes an increasingly feasible goal. However, many sensor networks that collect spatio-temporal information are prone to failure, resulting in missing data. To complicate matters, the missing data is often not missing at random, and is characterised by long periods where no data is observed. The performance of traditional univariate forecasting methods such as ARIMA models decreases with the length of the missing data period because they do not have access to local temporal information. However, if spatio-temporal autocorrelation is present in a space–time series then spatio-temporal approaches have the potential to offer better forecasts. In this paper, a non-parametric spatio-temporal kernel regression model is developed to forecast the future unit journey time values of road links in central London, UK, under the assumption of sensor malfunction. Only the current traffic patterns of the upstream and downstream neighbouring links are used to inform the forecasts. The model performance is compared with another form of non-parametric regression, K-nearest neighbours, which is also effective in forecasting under missing data. The methods show promising forecasting performance, particularly in periods of high congestion.
    Computers, Environment and Urban Systems 11/2012; 36(6):538–550.
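The K-nearest-neighbours comparison method lends itself to a compact sketch: match the current journey-time pattern on the upstream and downstream neighbour links against historical patterns, and average the failed link's value over the k closest matches. The history table, link pairing and choice of k below are hypothetical:

```python
import math

# Hypothetical history: each row pairs the journey-time pattern of two
# neighbouring links (upstream, downstream) with the failed link's value,
# all in seconds per unit distance.
history = [
    ((120.0, 95.0), 110.0),
    ((180.0, 150.0), 170.0),
    ((125.0, 100.0), 115.0),
    ((200.0, 160.0), 185.0),
    ((115.0, 90.0), 105.0),
]

def knn_forecast(neighbour_pattern, k=3):
    """Forecast the missing link's unit journey time as the mean over the
    k historical cases whose neighbour patterns are closest (Euclidean)."""
    ranked = sorted(history,
                    key=lambda row: math.dist(row[0], neighbour_pattern))
    return sum(value for _, value in ranked[:k]) / k

# Current neighbouring-link observations during the sensor outage.
pred = knn_forecast((122.0, 96.0))
```

Here the query pattern resembles the three uncongested rows, so the forecast is the mean of their target values; the kernel regression variant would instead weight every row by a distance-decaying kernel.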
  •
    ABSTRACT: A combination of environmental measurement and mathematical modelling may provide a more quantitative method to inform the tuberculosis (TB) screening process in non-household settings following diagnosis of an infectious case. The objective was to explore different methods of environmental assessment and mathematical modelling for predicting TB transmission risk, and to devise a tool for public health practitioners for use in TB investigations. Parameters including air flow, carbon dioxide (CO2) and airborne particles were measured over 3 working days in an office with a staff member with infectious TB. The Wells-Riley model was applied to predict transmission rates. The results suggested that poor ventilation and well-mixed air led to equal exposure of staff members to airborne TB bacilli. The model's predicted attack rate (42%) was consistent with the proportion of staff actually infected (50%). This study supports the use of environmental assessment and modelling as a tool for public health practitioners to determine the extent of TB exposure and to inform TB screening strategies. CO2 and airborne particle profiles, both measured via a handheld device, provide the greatest practicality and amount of information for public health practitioners. Further studies will validate the level of screening required in relation to these measurements.
    The International Journal of Tuberculosis and Lung Disease 06/2012; 16(8):1023-9.
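The Wells-Riley model referred to above has a simple closed form: the probability of infection in a shared-air space is P = 1 - exp(-Iqpt/Q), where I is the number of infectors, q the quanta generation rate, p the pulmonary ventilation rate, t the exposure time and Q the room's outdoor-air ventilation rate. A minimal sketch with illustrative parameter values (the q, p, t and Q figures below are assumptions, not the study's measured inputs):

```python
import math

def wells_riley(infectors, quanta_rate, breathing_rate, hours, vent_rate):
    """Wells-Riley probability of infection for a shared-air space:
    P = 1 - exp(-I*q*p*t/Q), with q in quanta/h, p and Q in m^3/h."""
    return 1.0 - math.exp(-infectors * quanta_rate * breathing_rate
                          * hours / vent_rate)

# Illustrative (not the study's) parameters: one infector, q = 13 quanta/h,
# p = 0.48 m^3/h breathing rate, 40 h cumulative office exposure, and a
# poorly ventilated space supplying Q = 250 m^3/h of outdoor air.
attack_rate = wells_riley(1, 13.0, 0.48, 40.0, 250.0)
```

Doubling the ventilation rate Q lowers the predicted attack rate, which is why the measured ventilation and CO2 profiles feed directly into the screening decision.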
  •
    ABSTRACT: This paper introduces a novel methodology, based on disaggregate analysis of two-car crash data, to estimate the partial effects of mass (through the velocity change) on absolute driver injury risk in each of the vehicles involved, where absolute injury risk is defined as the probability of injury given involvement in a two-car crash. The novelty of the methodology lies in addressing two gaps: the lack of data on vehicle speeds prior to the crash, which are required to calculate the velocity change, and the lack of information on non-injury two-car crashes in national accident data. These gaps have often forced researchers to focus on relative measures of injury risk that are not independent of risk in the colliding cars. The methodology is also used to investigate whether vehicle size has any effect above and beyond that of mass ratio, and whether there are effects associated with driver gender and age. Applying the methodology to two-car crashes confirmed that, in a two-car collision, vehicle mass has a protective effect on its own driver's injury risk and an aggressive effect on the injury risk of the colliding vehicle's driver. The results also confirmed a protective effect of vehicle size above and beyond that of vehicle mass for frontal and front-to-side collisions.
    Accident Analysis and Prevention 05/2012;