Malcolm Crowe · University of the West of Scotland | UWS · School of Computing, Engineering and Physical Sciences
Malcolm Crowe
D.Phil (Oxon)
About
88 Publications
8,864 Reads
333 Citations
Introduction
Database Technology: optimistic concurrency and loosely-coupled distributed data.
Additional affiliations
May 2018 - present
April 1985 - May 2018
University of Paisley / University of the West of Scotland
Position
- Professor (Full)
Description
- Head of Department from 1985 to 2000.
August 1972 - April 1985
Paisley College of Technology
Position
- Lecturer
Education
September 1968 - August 1972
September 1965 - August 1969
Publications (88)
This paper reviews the changes for database technology represented by the current development of the draft international standard ISO 39075 (Database Languages - GQL), which seeks a unified specification for property graphs and knowledge graphs. This paper examines these current developments as part of our review of the evolution of database technolo...
Recent standardization work for database languages has reflected the growing use of typed graph models (TGM) in application development. Such data models are frequently used only early in the design process, and not reflected directly in the underlying physical database. In previous work, we have added support to a relational database management system...
The International Standards Organization (ISO) is developing a new standard for Graph Query Language, with a particular focus on graph patterns with repeating paths. The Linked Database Benchmark Council (LDBC) has developed benchmarks to test proposed implementations. Their Financial Benchmark includes a novel requirement for truncation of results...
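The truncation requirement mentioned above can be illustrated with a small sketch: a breadth-first enumeration of simple paths that stops producing results once a fixed limit is reached. This is an illustrative simplification, not the GQL or LDBC Financial Benchmark semantics; the function and parameter names are invented for the example.

```python
from collections import deque

def paths_with_truncation(graph, start, goal, max_results, max_hops=10):
    """Enumerate simple paths from start to goal breadth-first,
    truncating once max_results paths have been produced.
    `graph` is an adjacency dict: node -> list of neighbour nodes."""
    results = []
    queue = deque([[start]])
    while queue and len(results) < max_results:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            results.append(path)
            continue
        if len(path) > max_hops:
            continue
        for nxt in graph.get(node, []):
            if nxt not in path:        # keep paths simple (no repeated nodes)
                queue.append(path + [nxt])
    return results

g = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(paths_with_truncation(g, "a", "d", max_results=1))  # → [['a', 'b', 'd']]
```

Because the search is breadth-first, truncation keeps the shortest paths found so far, which is one plausible (though not mandated) truncation policy.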
Recent work on database application development platforms has sought to include a declarative formulation of a conceptual data model in the application code, using annotations or attributes. Some recent work has used metadata to include the details of such formulations in the physical database, and this approach brings significant advantages in tha...
This paper reviews suggestions for changes to database technology coming from the work of many researchers, particularly those working with evolving big data. We discuss new approaches to remote data access and standards that better provide for durability and auditability in settings including business and scientific computing. We propose ways in w...
The Fifteenth International Conference on Advances in Databases, Knowledge, and Data Applications (DBKDA 2023), March 13–17, 2023, Barcelona, Spain. ISBN: 978-1-68558-056-8
At DBKDA 2019, we demonstrated that StrongDBMS with simple but rigorous optimistic algorithms, provides better performance in situations of high concurrency than major commercial database management systems (DBMS). The demonstration was convincing but the reasons for its success were not fully analysed. There is a brief account of the results below...
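The optimistic approach referred to above can be sketched in a few lines: a transaction records the version of everything it reads, and at commit its writes are installed only if those versions are unchanged. This is a minimal illustrative model of commit-time validation, not StrongDBMS's actual algorithms or data structures.

```python
class OptimisticStore:
    """Toy optimistic concurrency control: values carry version numbers,
    and a transaction's writes are installed only if everything it read
    is still at the version it saw (first-committer-wins)."""
    def __init__(self):
        self.data = {}      # key -> (value, version)

    def begin(self):
        return {"reads": {}, "writes": {}}

    def read(self, tx, key):
        value, version = self.data.get(key, (None, 0))
        tx["reads"][key] = version
        return value

    def write(self, tx, key, value):
        tx["writes"][key] = value

    def commit(self, tx):
        # Validate: every key read must still be at the version we saw.
        for key, seen in tx["reads"].items():
            if self.data.get(key, (None, 0))[1] != seen:
                return False            # conflict detected: abort
        # Install writes, bumping each key's version.
        for key, value in tx["writes"].items():
            _, version = self.data.get(key, (None, 0))
            self.data[key] = (value, version + 1)
        return True
```

For example, if two transactions read the same key and both try to update it, the first commit succeeds and the second fails validation and must retry — no locks are held while the transactions run.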
Data Integration of heterogeneous data sources relies either on periodically transferring large amounts of data to a physical Data Warehouse or retrieving data from the sources on request only. The latter results in the creation of what is referred to as a virtual Data Warehouse, which is preferable when the use of the latest data is paramount. How...
Schema and data integration have been a challenge for more than 40 years. While data warehouse technologies are quite a success story, there is still a lack of information integration methods, especially if the data sources are based on different data models or do not have a schema. Enterprise Information Integration has to deal with heterogeneous...
TPC-C benchmark using multiple clerks for a single warehouse.
StrongDBMS is a new relational Database Management System (DBMS). Atomicity, Consistency, Isolation and Durability (ACID) properties are guaranteed through the use of an explicit transaction log and immutable software components. The shareable data structures used allow instant snapshots and provide thread-safety even for iterators, and minimize th...
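The "instant snapshots" property of shareable, immutable structures can be illustrated with a persistent binary search tree: an update rebuilds only the nodes on the path to the change and shares every other subtree with older versions, so keeping a snapshot is just keeping a root reference. This is a generic sketch of the technique, not StrongDBMS's actual structures.

```python
class Node:
    """Immutable BST node: updates build new nodes along the search path
    and share every untouched subtree with older versions."""
    __slots__ = ("key", "left", "right")
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def insert(root, key):
    # Returns a new root; the old root still describes the old tree.
    if root is None:
        return Node(key)
    if key < root.key:
        return Node(root.key, insert(root.left, key), root.right)
    if key > root.key:
        return Node(root.key, root.left, insert(root.right, key))
    return root          # key already present: same tree

def keys(root):
    # In-order traversal: returns keys in sorted order.
    return [] if root is None else keys(root.left) + [root.key] + keys(root.right)

v1 = insert(insert(insert(None, 2), 1), 3)
v2 = insert(v1, 4)       # new version; v1 remains a valid instant snapshot
```

Readers and iterators over `v1` can never observe the update that produced `v2`, which is why such structures are thread-safe without locking.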
One major goal of database management systems (DBMS) is to shield all the programming difficulties like concurrency control, atomicity, checking consistency, durability, physical data access, indices, etc. from the programmer. For client programs that directly access the database API this goal has been mainly achieved by developing SQL level interf...
Big Data collected from different sources is always out of date. With Big Live Data, data is obtained from the sources only when requested. The sources agree to provide a simple public view of their data, and the big live data processor generates queries derived from the view definition to minimise data transfer. This allows a large number of repor...
A proposed implementation for Big Live Data using REST and a simple extension to SQL called RESTView. The ideas were presented at an international conference at UWS (http://iima.org/wp/wp-content/uploads/2017/04/FINAL-Programme-8-Sept.pdf) and have since been implemented at PyrrhoDB.com.
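The idea of deriving source queries from a view definition to minimise data transfer amounts to pushing column and row selection down to the REST source instead of fetching the whole view. A minimal sketch follows; the URL parameter names are illustrative assumptions, not the actual RESTView syntax.

```python
from urllib.parse import urlencode

def push_down(base_url, columns, predicate):
    """Build a GET request that asks a REST data source for only the
    columns and rows a query needs. Parameter names ('select', 'where')
    are hypothetical, chosen for the example."""
    query = {"select": ",".join(columns), "where": predicate}
    return base_url + "?" + urlencode(query)

url = push_down("https://example.org/view/orders", ["id", "total"], "total>100")
```

The source then evaluates the selection itself, so only the qualifying rows and columns cross the network — the property the abstract identifies as essential when many reports draw on live remote data.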
Purpose – The purpose of this paper is to show how description logics can be applied to formalizing the Information Bearing Capability (IBC) of paths in ER schemata.
Design/Methodology/Approach – The approach follows and extends the idea presented in Xu and Feng 2004, which applies description logics to classifying paths in an ER schema. To verify...
This book focuses on a number of aspects of database management systems that are important in real life application, but in which current products fall short of what is required. The book makes a contribution to DBMS design by suggesting a number of fundamental improvements to DBMS architecture, and validating them by means of a working proof-of-co...
This paper presents an approach to Reinforcement Learning that seems to work very well in changing environments. The experiments are based on an unmanned vehicle problem where the vehicle is equipped with navigation cameras and uses a multilayer perceptron (MLP). The route can change and obstacles can be added without warning. In the steady state,...
Affiliate Networks are the main source of communication between publishers and advertisers where publishers normally subscribe as a service provider and advertisers as an employer. These networks are helping both the publishers and advertisers in terms of providing them with a platform where they can build an automated affiliate connection with eac...
Curvilinear Component Analysis (CCA) is a useful data visualisation method. CCA has the technical property that its optimisation surface, as defined by its stress function, changes during the optimisation according to a decreasing parameter. CCA uses a variant of the stochastic gradient descent method to create a mapping of data. In the optimisatio...
This paper develops a new theoretical model to represent the important knowledge sharing factors and their role in development of trust in professional learning organisations. These factors are built upon the assumptions of organisations knowledge sharing behaviour and the role of technological advancement/awareness based on literature. The factors...
Reinforcement learning is one of the major strands of current computational intelligence: it is used to enable an agent to explore an environment in order to ascertain the best actions in that environment. Genetic programming is a method to evolve programs and given the similarity between genetic algorithms and reinforcement learning, it is perhaps...
The Sammon mapping has been one of the most successful nonlinear metric multidimensional scaling methods since its advent in 1969, but effort has been focused on algorithm improvement rather than on the form of the stress function. This paper further investigates using left Bregman divergences to extend the Sammon mapping and by analogy develops ri...
Composition of software components via Web technologies, scalability demands, and Mobile Computing has led to a questioning of the classical transaction concept. Some researchers have moved away from a synchronous model with strict atomicity, consistency, isolation and durability (ACID) to an asynchronous, disconnected one with possibly weaker ACID...
We consider means of extracting information from two data streams simultaneously when each data stream contains information about the other, i.e., there is redundancy in the data streams and we wish to identify the commonality between the data streams. The standard statistical method for doing this is canonical correlation analysis and so we consid...
We discuss an approach to formalise game playing behaviour for human and AI players. The presented outline for an implementation facilitates a high level description to steer the AI in games designed for behavioural experiments.
Sum of weighted square distance errors has been a popular way of defining stress function for metric multidimensional scaling (MMDS) like the Sammon mapping. In this paper we generalise this popular MMDS with Bregman divergences, as an example we show that the Sammon mapping can be thought of as a truncated Bregman MMDS (BMMDS) and we show that the...
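The sense in which the squared-error stress is a special case of a Bregman divergence can be checked directly: with the convex generator F(x) = x², the divergence between a high-dimensional distance p and its low-dimensional counterpart q collapses to (p − q)², the classical metric-MDS error term. The sketch below illustrates only this reduction, not the full BMMDS construction of the paper.

```python
def bregman(F, dF, p, q):
    """Bregman divergence d_F(p, q) = F(p) - F(q) - dF(q) * (p - q),
    for a scalar convex generator F with derivative dF."""
    return F(p) - F(q) - dF(q) * (p - q)

F  = lambda x: x * x          # generator F(x) = x^2
dF = lambda x: 2 * x          # its derivative

# With F(x) = x^2:  x^2 - y^2 - 2y(x - y) = (x - y)^2,
# recovering the familiar squared-error stress term.
print(bregman(F, dF, 5.0, 3.0))   # (5 - 3)^2 = 4.0
```

Choosing other generators F yields asymmetric divergences, which is what lets the family weight over- and under-estimated distances differently from the symmetric squared error.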
This work presents a disconnected transaction model able to cope with the increased complexity of long-living, hierarchically structured, and disconnected transactions. We combine an Open and Closed Nested Transaction Model with Optimistic Concurrency Control and interrelate flat transactions with the aforementioned complex nature. Despite temporar...
Reliable broadband communications are required during disaster recovery and emergency response operations, when infrastructure-based communications have been destroyed or are not available. Wireless Mesh Networks (WMNs) are multi-hop wireless networks with instant deployment, self-healing, self-organization and self-configuration features. These ca...
The last decade has witnessed rapid development of diverse emerging wireless systems such as the third-generation (3G) and beyond wide-area cellular networks, municipal area networks and local area hotspots. It is widely envisioned that these coexisting heterogeneous networks could complement each other in capacity and coverage and would converge in...
Curvilinear Component Analysis (CCA) is an interesting flavour of multidimensional scaling. In this paper one version of CCA is proved to be related to the mapping found by a specific Bregman divergence; its stress function is redefined based on this insight, and its parameter (the neighbourhood radius) is explained.
Presenting Web-content inside a 3D shared virtual space has strong potential as a tool for collaboration, but there are a number of issues to be resolved - an array of factors both technical and human. In this short paper we outline work on the SLOODLE browser, a collaborative browser for Second Life. We hope that this work has the potential to enh...
The problem of 'information content' of an information system appears elusive. In the field of databases, the information content of a database has been taken as the instance of a database. We argue that this view misses two fundamental points. One is a convincing conception of the phenomenon concerning information in databases, especially a proper...
Contents: Introduction · The Journey of Discovery · Objectivity and Context · The Academic Community · Conclusions · References
An important book by researchers from across disciplines introducing varying ideas on research, important in these days of inter-disciplinary and multi-centered investigation. The book introduces academics to new areas of endeavour and encourages researchers and students to think broadly when devising their studies. Linking chapters present the con...
Recent articles and research issues (for example, Date [1], Zimanyi [2]) suggest that temporal data remains a source of difficulty to database designers and implementers. This paper analyses some of the issues and suggests some mechanisms that could help simplify the problems, notably a new derived column concept for NEXT, a new temporal join opera...
This report documents the details of a prototype called IIR-Reasoning, which exploits the notion of information content inclusion relationship in a database setting. We introduce the prototype, the database we used, business rules used within the reasoning process and show some test results.
We review the performance function associated with the familiar K-Means algorithm and that of the recently developed K-Harmonic Means. The inadequacies in these algorithms lead us to investigate a family of performance functions which exhibit superior clustering on a variety of data sets over a number of different initial conditions. In each case,...
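The two performance functions being compared can be written down directly: K-Means charges each point its squared distance to the nearest centre, while K-Harmonic Means charges the harmonic mean of its squared distances to all K centres, giving every centre a soft influence on every point. This is a generic sketch of the standard formulations, not the paper's extended family of functions.

```python
def sq_dists(x, centres):
    """Squared Euclidean distance from point x to each centre."""
    return [sum((xi - ci) ** 2 for xi, ci in zip(x, c)) for c in centres]

def kmeans_perf(data, centres):
    """K-Means performance: each point pays its squared distance
    to the nearest centre only (hard membership)."""
    return sum(min(sq_dists(x, centres)) for x in data)

def kharmonic_perf(data, centres, eps=1e-12):
    """K-Harmonic Means performance: each point pays the harmonic mean
    of its squared distances to all K centres (soft membership).
    eps guards against division by zero at a centre."""
    K = len(centres)
    return sum(K / sum(1.0 / (d + eps) for d in sq_dists(x, centres))
               for x in data)
```

The harmonic mean is dominated by the smallest distance but never ignores the others, which is why K-Harmonic Means is markedly less sensitive to the initial placement of centres than K-Means.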
Song and Bruza introduce a framework for Information Retrieval (IR) based on Gärdenfors's three-tiered cognitive model, Conceptual Spaces. They instantiate a conceptual space using Hyperspace Analogue to Language (HAL) to generate higher-order concepts which are later used for ad-hoc retrieval. In this poster, we propose an alternative implementation...
This paper reports on the COSMOS system which provides an integrated "COnstruction Sites Mobile Operations Support" solution by means of wireless networks as well as technical and business applications designed for mobile use. Since remote construction sites are often not easily reached by terrestrial communication paths, a satellite link is establ...
This paper will examine the shifts in perspective that have taken place in the Information Systems community in the last decade, as reflected in the published literature. Specifically, a trend is detectable where authors increasingly treat information almost as if it were data, rather than being inside the heads of the participants. Currently, it w...
Two features of this dynamic workflow system make it suitable for the use of quasi-intelligent agents: (a) workflow processes need not be fully specified, and so can be non-prescriptive in approach, and (b) a job can be modified independently of the process of which it is an instance, and so some participants may have permissions to change its cour...
This paper presents Network-Training Collaboration in Europe and China (NCEC), a joint project between two European and four Chinese institutions aiming at designing and developing network-based course production, delivery and presentation systems for China. Sponsored by the European Union, the NCEC project is designated to provide on-line collabor...
In Entity-Relationship (ER) modeling, connection traps are a known problem, but the literature does not seem to have provided an adequate treatment of them. Moreover, they seem to be only a special case of a more fundamental problem of whether a piece of information can be represented by a database that is specified by an ER schema. To develop a syste...
The conventional wisdom in the development of software systems is that such systems should be built against a set of requirements that have been captured during the requirements analysis phase of the development life cycle. Textbooks on software engineering describe methods for system design and implementation that are predicated upon the idea that...
Conclusions: A conceptual apparatus has been described that attempts to show how consideration of business processes can help to bridge the gap between high-level consideration of the nature and purposes of organisations and the need for computer systems analysts to consider data and data flows.
The purpose of this document is to present a set of mechanisms and concepts for object systems based on an external relational database. The object space may be shared among a set of applications which use the standard query language SQL as its principal data access mechanism. Methods are not a concern of this paper and may be handled by callout to...
The purpose of this paper is to attempt to respond to the systems approach from an engineering standpoint, in particular that of a computing scientist. Most computing scientists, and even most engineers, spend regrettably little time reflecting on the nature of the systems they work with.
This paper describes an incremental environment which has been developed to meet the needs of developers and maintainers of large Ada projects. This environment assists the user to develop valid Ada text while consulting other Ada packages and documents. It is built around an embedded syntax-directed editor which controls text inspection, modificati...
A system for dynamic compilation under the Unix operating system is described. The basis of the system is an incremental assembler that can be used statically or during program execution to insert or replace a module in an executable image. All cross-module references are via offsets into a run-time symbol table. All generated code is independent of...
This paper describes an approach to project management which is sufficiently flexible to allow development methodologies and objectives to vary from one project to another. Enhancements to the UNIX C library are described which allow all the usual UNIX tools to support hierarchical file attributes, version control and controlled access, by using...
An abstract is not available.