
Publication History

  •
    ABSTRACT: The biochemical paradigm is well suited to modelling autonomous systems, and new programming languages are emerging from this approach. However, in order to validate such programs, we need to define their semantics precisely and to provide verification techniques. In this paper, we consider a higher-order biochemical calculus that models the structure of system states and their dynamics by means of rewriting abstractions, namely rules and strategies. We extend this calculus with a runtime verification technique in order to perform automatic discovery of property satisfaction failure. The property specification language covers a subclass of LTL safety and liveness properties.
    Electronic Notes in Theoretical Computer Science 12/2013; 297:27–46.
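    The paper's property language itself is not reproduced in the abstract. As a general illustration of the two property classes it mentions: a safety property forbids a bad state from ever being reached, while a response-style liveness property requires that something good eventually happens. In LTL these take shapes such as the following, where the atomic propositions error, request and granted are hypothetical placeholders rather than terms from the calculus, and \Box ("always") and \Diamond ("eventually") are the standard temporal operators:
      \mathbf{Safety:}\quad \Box\,\neg\,\mathit{error}
      \mathbf{Liveness:}\quad \Box\,(\mathit{request} \rightarrow \Diamond\,\mathit{granted})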
  •
    ABSTRACT: We address two distinct problems with de facto mobile device authentication, as provided by a password or sketch. Firstly, device activity is permitted on an all-or-nothing basis, depending on whether the user successfully authenticates at the beginning of a session. This ignores the fact that tasks performed on a mobile device have a range of sensitivities, depending on the nature of the data and services accessed. Secondly, users are forced to re-authenticate frequently due to the bursty nature that characterizes mobile device use. Owners react to this by disabling the mechanism, or by choosing a weak “secret”. To address both issues, we propose an extensible Transparent Authentication Framework that integrates multiple behavioral biometrics with conventional authentication to implement an effortless and continuous authentication mechanism. Our security and usability evaluation of the proposed framework showed that a legitimate device owner can perform all device tasks, while being asked to authenticate explicitly 67% less often than without a transparent authentication method. Furthermore, our evaluation showed that attackers are soon denied access to on-device tasks as their behavioral biometrics are collected. Our results support the creation of a working prototype of our framework, and provide support for further research into transparent authentication on mobile devices.
    Computers & Security 11/2013; 39:127–136.
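    The abstract does not specify how the framework combines its signals. Purely as an illustration of the general idea it describes — fusing behavioural-biometric confidence scores and gating each task by its sensitivity — a minimal Python sketch might look as follows. All names, weights and thresholds below are invented, not taken from the paper.

      # Minimal sketch of confidence fusion for transparent authentication.
      # All names, weights and thresholds are hypothetical illustrations,
      # not the framework described in the paper.
      from dataclasses import dataclass

      @dataclass
      class BiometricScore:
          name: str          # e.g. "keystroke", "gait"
          confidence: float  # 0.0 (certainly not the owner) .. 1.0 (certainly the owner)
          weight: float      # how much this signal is trusted

      # Each task carries a sensitivity threshold: low for a game,
      # high for banking or changing security settings.
      TASK_SENSITIVITY = {
          "play_game": 0.2,
          "read_email": 0.5,
          "mobile_banking": 0.9,
      }

      def fused_confidence(scores):
          """Weighted average of the available behavioural signals."""
          total_w = sum(s.weight for s in scores)
          if total_w == 0:
              return 0.0
          return sum(s.confidence * s.weight for s in scores) / total_w

      def may_proceed(task, scores):
          """True if transparent evidence suffices; otherwise ask the user to authenticate explicitly."""
          return fused_confidence(scores) >= TASK_SENSITIVITY[task]

      if __name__ == "__main__":
          evidence = [BiometricScore("keystroke", 0.8, 1.0), BiometricScore("gait", 0.6, 0.5)]
          for task in TASK_SENSITIVITY:
              print(task, "allowed" if may_proceed(task, evidence) else "explicit authentication required")

    Under such a scheme, low-sensitivity tasks remain usable on weak evidence, while high-sensitivity tasks still trigger explicit authentication, which is one way to reconcile the usability and security goals stated in the abstract.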
  •
    ABSTRACT: We identify four roles that social networking plays in the 'attribution problem', which obscures whether or not cyber-attacks were state-sponsored. First, social networks motivate individuals to participate in Distributed Denial of Service attacks by providing malware and identifying potential targets. Second, attackers use an individual's social network to focus attacks, through spear phishing. Recipients are more likely to open infected attachments when they come from a trusted source. Third, social networking infrastructures create disposable architectures to coordinate attacks through command and control servers. The ubiquitous nature of these architectures makes it difficult to determine who owns and operates the servers. Finally, governments recruit anti-social criminal networks to launch attacks on third-party infrastructures using botnets. The closing sections identify a roadmap to increase resilience against the 'dark side' of social networking. Practitioner Summary: This paper provides readers with an overview of state-sponsored cyber-attacks. I show how many of these threats have exploited social networks and social media. The aim was to alert practitioners to the dark side of computing, where attackers learn to exploit new interaction techniques and new forms of working.
    Ergonomics 07/2013;
  •
    ABSTRACT: We present formal specification and verification of a robot moving in a complex network, using temporal sequence learning to avoid obstacles. Our aim is to demonstrate the benefit of using a formal approach to analyze such a system as a complementary approach to simulation. We first describe a classical closed-loop simulation of the system and compare this approach to one in which the system is analyzed using formal verification. We show that formal verification has some advantages over classical simulation and finds deficiencies our classical simulation did not identify. Specifically, we present a formal specification of the system, defined in the Promela modeling language, and show how the associated model is verified using the Spin model checker. We then introduce an abstract model that is suitable for verifying the same properties for any environment with obstacles under a given set of assumptions. We outline how we can prove that our abstraction is sound: any property that holds for the abstracted model will hold in the original (unabstracted) model.
    Neural Computation 06/2013;
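    The Promela/Spin models are not included in the abstract. As a language-neutral illustration of what verifying a safety property by exhaustive state exploration amounts to, the toy Python sketch below enumerates every state reachable by a grid robot and checks the invariant "the robot never occupies an obstacle cell". The grid, obstacle positions and avoidance rule are invented; with this avoidance rule the check passes, and removing the rule would produce a counterexample state.

      # Toy explicit-state exploration of a grid robot, illustrating the kind of
      # safety check that a model checker such as Spin automates. The grid,
      # obstacles and controller rule are invented, not the model from the paper.
      from collections import deque

      WIDTH, HEIGHT = 5, 5
      OBSTACLES = {(2, 2), (3, 1)}
      START = (0, 0)

      def successors(pos):
          """Non-deterministic controller: try every move that stays on the grid
          and is not blocked by an obstacle (a very simple avoidance rule)."""
          x, y = pos
          for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
              nx, ny = x + dx, y + dy
              if 0 <= nx < WIDTH and 0 <= ny < HEIGHT and (nx, ny) not in OBSTACLES:
                  yield (nx, ny)

      def check_invariant():
          """Breadth-first search over all reachable states; fail if any reached
          state violates the invariant."""
          seen, queue = {START}, deque([START])
          while queue:
              state = queue.popleft()
              if state in OBSTACLES:          # invariant violated
                  return False, state
              for nxt in successors(state):
                  if nxt not in seen:
                      seen.add(nxt)
                      queue.append(nxt)
          return True, None

      if __name__ == "__main__":
          ok, witness = check_invariant()
          print("invariant holds" if ok else f"violation reached at {witness}")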
  •
    ABSTRACT: We consider the Bayesian analysis of mechanistic models describing the dynamic behavior of ligand-gated ion channels. The opening of the transmembrane pore in an ion channel is brought about by conformational changes in the protein, which results in a flow of ions through the pore. Remarkably, given the diameter of the pore, the flow of ions from a small number of channels or indeed from a single ion channel molecule can be recorded experimentally. This produces a large time-series of high-resolution experimental data, which can be used to investigate the gating process of these channels. We give a brief overview of the achievements and limitations of alternative maximum-likelihood approaches to this type of modeling, before investigating the statistical issues associated with analyzing stochastic model reaction mechanisms from a Bayesian perspective. Finally, we compare a number of Markov chain Monte Carlo algorithms that may be used to tackle this challenging inference problem.
    Methods in molecular biology (Clifton, N.J.) 01/2013; 1021:247-272.
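    No algorithmic detail is given in the abstract. One ingredient it mentions, a Markov chain Monte Carlo sampler, can be sketched for the simplest possible gating model: a two-state (closed/open) channel whose dwell times are idealized as exponentially distributed. The synthetic data, prior and proposal scale below are invented; this is not the inference scheme from the chapter.

      # Minimal random-walk Metropolis sketch for a two-state (closed <-> open)
      # channel, assuming idealized, exponentially distributed dwell times.
      # Synthetic data, prior and proposal scale are invented for illustration.
      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic dwell times (seconds) generated with "true" rates.
      true_open_rate, true_close_rate = 50.0, 200.0       # closed->open, open->closed
      closed_dwells = rng.exponential(1.0 / true_open_rate, size=500)
      open_dwells = rng.exponential(1.0 / true_close_rate, size=500)

      def log_posterior(log_rates):
          """Exponential dwell-time likelihood plus a broad normal prior on log-rates."""
          k_open, k_close = np.exp(log_rates)
          loglik = (len(closed_dwells) * np.log(k_open) - k_open * closed_dwells.sum()
                    + len(open_dwells) * np.log(k_close) - k_close * open_dwells.sum())
          logprior = -0.5 * np.sum((log_rates - np.log(100.0)) ** 2 / 4.0)
          return loglik + logprior

      def metropolis(n_iter=20000, step=0.05):
          current = np.log([10.0, 10.0])                   # arbitrary starting point
          current_lp = log_posterior(current)
          samples = np.empty((n_iter, 2))
          for i in range(n_iter):
              proposal = current + rng.normal(scale=step, size=2)
              proposal_lp = log_posterior(proposal)
              if np.log(rng.uniform()) < proposal_lp - current_lp:
                  current, current_lp = proposal, proposal_lp
              samples[i] = current
          return np.exp(samples)

      if __name__ == "__main__":
          rates = metropolis()[5000:]                      # discard burn-in
          print("posterior mean opening rate :", rates[:, 0].mean())
          print("posterior mean closing rate :", rates[:, 1].mean())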
  •
    ABSTRACT: Signalling pathways are well-known abstractions that explain the mechanisms whereby cells respond to signals. Collections of pathways form networks, and interactions between pathways in a network, known as cross-talk, enable further complex signalling behaviours. While there are several formal modelling approaches for signalling pathways, none make cross-talk explicit; the aim of this paper is to define and categorise cross-talk in a rigorous way. We define a modular approach to pathway and network modelling, based on the module construct in the PRISM modelling language, and a set of generic signalling modules. Five different types of cross-talk are defined according to various biologically meaningful combinations of variable sharing, synchronisation labels and reaction renaming. The approach is illustrated with a case-study analysis of cross-talk between the TGF-β, WNT and MAPK pathways.
    Theoretical Computer Science 10/2012; 456:30–50.
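    The paper's PRISM encodings are not reproduced here. As a rough, language-neutral illustration of one ingredient it mentions — cross-talk arising through a shared variable — the Python sketch below stochastically simulates two hypothetical reaction modules coupled only through the copy number of a shared species X. Species, reactions and rate constants are invented.

      # Minimal Gillespie-style simulation of two hypothetical "pathway" modules
      # coupled through one shared species X (variable-sharing cross-talk).
      # Everything below is invented; this is not the PRISM model from the paper.
      import random

      random.seed(1)

      # State: copy numbers. X is produced by module A and consumed by module B.
      state = {"A_active": 10, "X": 0, "B_active": 0}

      # Each reaction: (rate constant, propensity function, state update).
      reactions = [
          # Module A: active A produces the shared signal X.
          (0.5, lambda s: s["A_active"], lambda s: s.update(X=s["X"] + 1)),
          # Module B: X activates B (the cross-talk happens here, through shared X).
          (0.2, lambda s: s["X"], lambda s: (s.update(X=s["X"] - 1),
                                             s.update(B_active=s["B_active"] + 1))),
          # Deactivation of B.
          (0.1, lambda s: s["B_active"], lambda s: s.update(B_active=s["B_active"] - 1)),
      ]

      t, t_end = 0.0, 50.0
      while t < t_end:
          propensities = [k * f(state) for k, f, _ in reactions]
          total = sum(propensities)
          if total == 0:
              break
          t += random.expovariate(total)              # time to the next reaction
          r = random.uniform(0, total)                # choose which reaction fires
          for p, (_, _, fire) in zip(propensities, reactions):
              if r < p:
                  fire(state)
                  break
              r -= p

      print("final state:", state)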
  •
    ABSTRACT: The primary users of assisted living technology are older people, who are likely to have one or more sensory impairments. Multimodal technology allows users to interact via non-impaired senses and provides alternative ways to interact if primary interaction methods fail. An empirical user study with older participants evaluated the performance, disruptiveness and subjective workload of visual, audio, tactile and olfactory notifications, and compared the results with earlier findings in younger participants. It was found that disruption and subjective workload were not affected by modality, although some modalities were more effective at delivering information accurately. It is concluded that, although further studies need to be carried out in real-world settings, the findings support the argument for multiple modalities in assisted living technology.
    Health Informatics Journal 09/2012; 18(3):181-90.
  •
    ABSTRACT: In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common IR approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double slit experiment. While PRP explicitly assumes that the relevance of a document is independent of that of other documents, we suggest that qPRP implicitly models interdependent document relevance through quantum interference and is thus suited to those document ranking tasks where the independence assumption fails. Throughout the thesis, we also suggest how quantum interference can be estimated for effective document ranking. To validate our proposal and to gain more insight into approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, and (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e. by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when evaluation contexts that account for interdependent document relevance are examined (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and in retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations.
    Furthermore, the thesis opens up a number of lines of future research. These include investigating estimations and approximations of quantum interference in qPRP, exploiting complex numbers for the representation of documents and queries, and applying the concepts underlying qPRP to tasks other than document ranking. This dissertation was completed at the School of Computing Science, University of Glasgow, under the supervision of Dr. Leif Azzopardi and Prof. Keith van Rijsbergen. Prof. Norbert Fuhr, Dr. Iadh Ounis and Dr. John O'Donnell served as dissertation committee members. For the full dissertation, visit: http://theses.gla.ac.uk/3463.
    School of Computing Science, University of Glasgow, 06/2012, Degree: PhD
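    The thesis itself should be consulted for the precise formulation; in outline, the interference the abstract refers to is the extra term by which the quantum law of total probability departs from Kolmogorovian additivity for disjoint events. With p_A and p_B the probabilities associated with two alternatives and θ a phase difference (notation chosen here purely for illustration):
      p_{K} \;=\; p_A + p_B \qquad \text{(Kolmogorovian, disjoint events)}
      p_{Q} \;=\; p_A + p_B + 2\sqrt{p_A\,p_B}\,\cos\theta \qquad \text{(quantum, with interference term } 2\sqrt{p_A\,p_B}\cos\theta\text{)}
    When cos θ = 0 the interference term vanishes and the quantum expression reduces to the classical one.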
  •
    ABSTRACT: An instance of the classical Stable Roommates problem (SR) need not admit a stable matching. Previous work has considered the problem of finding a matching that is "as stable as possible", i.e., one that admits the fewest blocking pairs. It is known that this problem is NP-hard and not approximable within n^(1/2−ε), for any ε > 0, unless P=NP, where n is the number of agents in a given SR instance. In this paper we extend the study to the Stable Roommates problem with Incomplete lists. In particular, we consider the case that the lengths of the lists are bounded by some integer d. We show that, even if d = 3, there is some c > 1 such that the problem of finding a matching with the minimum number of blocking pairs is not approximable within c unless P=NP. On the other hand, we show that the problem is solvable in polynomial time for d ≤ 2, and we give a (2d − 3)-approximation algorithm for fixed d ≥ 3. If the given lists satisfy an additional condition (namely the absence of a so-called elitist odd party – a structure that is unlikely to exist in general), the performance guarantee improves to 2d − 4.
    Theoretical Computer Science 05/2012; 432.
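    To make the objective concrete, the Python sketch below counts the blocking pairs of a given matching in a Stable Roommates instance with incomplete preference lists: a blocking pair is two agents, acceptable to each other, who are not matched together and who each prefer the other to their current situation. The tiny instance and matching are invented; the paper concerns the hardness of minimising this count, not this naive check.

      # Counting the blocking pairs of a matching in a Stable Roommates instance
      # with incomplete preference lists. The instance below is invented.
      def rank(prefs, agent, other):
          """Position of `other` in `agent`'s list, or None if unacceptable."""
          lst = prefs[agent]
          return lst.index(other) if other in lst else None

      def prefers(prefs, agent, new, current):
          """Does `agent` prefer `new` to their current situation?
          An unmatched agent (current is None) prefers any acceptable partner."""
          r_new = rank(prefs, agent, new)
          if r_new is None:
              return False
          if current is None:
              return True
          return r_new < rank(prefs, agent, current)

      def blocking_pairs(prefs, matching):
          """All unordered pairs {x, y}, not matched together, in which both
          agents would rather be with each other."""
          agents = sorted(prefs)
          pairs = []
          for i, x in enumerate(agents):
              for y in agents[i + 1:]:
                  if matching.get(x) == y:
                      continue
                  if prefers(prefs, x, y, matching.get(x)) and prefers(prefs, y, x, matching.get(y)):
                      pairs.append((x, y))
          return pairs

      if __name__ == "__main__":
          # Preference lists of length at most d = 3 (most preferred first).
          prefs = {
              "a": ["b", "c", "d"],
              "b": ["c", "a"],
              "c": ["a", "b", "d"],
              "d": ["c", "a"],
          }
          matching = {"a": "b", "b": "a", "c": "d", "d": "c"}
          print("blocking pairs:", blocking_pairs(prefs, matching))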
  •
    (No abstract available.)
    04/2012; 4(2):71-72.