Ross W Gayler

Independent Researcher · https://www.rossgayler.com · Cognitive Science & Data Science

BSc, BSc(Hons), PhD

About

59 Publications · 35,289 Reads · 1,777 Citations
Introduction
Credit scoring: applied statistical modelling for operational prediction of customer behaviour in consumer finance.
Cognitive science: developing a practical, implementable, connectionist architecture for compositional memory. This concerns the ability to recognise novel situations and objects in terms of the novel pattern of structural relationships between their familiar component parts. The work effectively treats analogy as a primitive capability of memory.
Additional affiliations
March 2009 - May 2012
La Trobe University
Position
  • Honorary Associate
January 2004 - December 2006
Monash University (Australia)
Project
  • Intelligent techniques to exploit the dynamic temporal structure in detection of attacks in credit application fraud
October 2003 - December 2006
La Trobe University
Position
  • Honorary Associate
Education
January 1978 - December 1987
The University of Queensland
Field of study
  • Psychology
January 1977 - December 1977
The University of Queensland
Field of study
  • Psychology
January 1974 - December 1976
The University of Queensland
Field of study
  • Psychology & Computer Science

Publications (59)
Preprint
Full-text available
This correspondence comments on the findings reported in a recent Science Robotics article by Mitrokhin et al. [1]. The main goal of this commentary is to expand on some of the issues touched on in that article. Our experience is that hyperdimensional computing is very different from other approaches to computation and that it can take considerable...
Article
Full-text available
By pointing to deep philosophical confusions endemic to cognitive science, Wittgenstein might seem an enemy of computational approaches. We agree (with Mills 1993) that while Wittgenstein would reject the classicist’s symbols and rules approach, his observations align well with connectionist or neural network approaches. While many connectionisms t...
Presentation
Full-text available
It has been argued that analogy is at the core of cognition [7, 1]. My work in VSA is driven by the goal of building a practical, effective analogical memory/reasoning system. Analogy is commonly construed as structure mapping between a source and target [5], which in turn can be construed as representing the source and target as graphs and finding...
Presentation
Full-text available
Score calibration is the process of empirically determining the relationship between a score and an outcome on some population of interest, and scaling is the process of expressing that relationship in agreed units. Calibration is often treated as a simple matter and attacked with simple tools – typically, either assuming the relationship between s...
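The two steps named in this abstract, empirical calibration and scaling, can be sketched in a few lines. This is an illustrative sketch only, not the method from the presentation: the bin-based calibration, the base score of 600 at 50:1 good:bad odds, and the 20 points-to-double-odds scaling are all assumed example parameters.

```python
import math

def calibrate_by_binning(scores, outcomes, n_bins=4):
    """Empirically estimate the bad rate per score bin
    (a deliberately simple, assumption-light calibration)."""
    pairs = sorted(zip(scores, outcomes))
    size = len(pairs) // n_bins
    bins = []
    for i in range(n_bins):
        chunk = pairs[i * size:] if i == n_bins - 1 else pairs[i * size:(i + 1) * size]
        mean_score = sum(s for s, _ in chunk) / len(chunk)
        bad_rate = sum(o for _, o in chunk) / len(chunk)
        bins.append((mean_score, bad_rate))
    return bins

def scale_to_points(p_bad, pdo=20.0, base_points=600.0, base_odds=50.0):
    """Express the calibrated probability in agreed units:
    base_points at base_odds (good:bad); each pdo points doubles the odds."""
    odds = (1.0 - p_bad) / p_bad
    factor = pdo / math.log(2.0)
    offset = base_points - factor * math.log(base_odds)
    return offset + factor * math.log(odds)
```

With these example parameters, a case with 50:1 odds (p_bad = 1/51) scores exactly 600 points, and halving p_bad roughly doubles the odds, adding about 20 points.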
Article
Full-text available
A Bloom filter is a special case of an artificial neural network with two layers. Traditionally, it is seen as a simple data structure supporting membership queries on a set. The standard Bloom filter does not support the delete operation, and therefore, many applications use a counting Bloom filter to enable deletion. This paper proposes a general...
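The delete problem mentioned in the abstract is easy to see in a toy counting Bloom filter, where each slot holds a counter instead of a single bit. This sketch uses assumed parameters (m = 64 slots, k = 3 hashes) and SHA-256 as a stand-in hash; it illustrates the standard data structure, not the generalisation the paper proposes.

```python
import hashlib

class CountingBloomFilter:
    """Counting Bloom filter: per-slot counters make deletion possible,
    which the standard bit-vector Bloom filter does not support."""

    def __init__(self, m=64, k=3):
        self.m, self.k = m, k
        self.counts = [0] * m

    def _indices(self, item):
        # k independent hash functions, simulated by salting one hash.
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.m

    def add(self, item):
        for j in self._indices(item):
            self.counts[j] += 1

    def remove(self, item):
        # Only valid for items previously added; otherwise counters corrupt.
        for j in self._indices(item):
            self.counts[j] = max(0, self.counts[j] - 1)

    def __contains__(self, item):
        # Membership: all k slots non-zero (false positives possible).
        return all(self.counts[j] > 0 for j in self._indices(item))
```

As with any Bloom filter, membership answers can be false positives but never false negatives for items still present.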
Article
Full-text available
Financial institutions use credit scorecards for risk management. A scorecard is a data-driven model for predicting default probabilities. Scorecard assessment concentrates on how well a scorecard discriminates good and bad risk. Whether predicted and observed default probabilities agree (i.e., calibration) is an equally important yet often overloo...
Article
Full-text available
Introduction. Neural network models of autoassociative, distributed memory allow storage and retrieval of many items (vectors) where the number of stored items can exceed the vector dimension (the number of neurons in the network). This opens the possibility of a sublinear time search (in the number of stored items) for approximate nearest neighbor...
Article
Full-text available
This paper proposes a simple encoding scheme for words using principles of Vector Symbolic Architectures. The proposed encoding allows finding a valid word in the dictionary for a given permuted word (represented using the proposed approach) using only a single operation – calculation of Hamming distance to the distributed representations of valid...
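The core idea, permutation-tolerant word codes compared by Hamming distance, can be sketched minimally. This is an illustration of the principle rather than the paper's exact scheme, with assumed parameters (1024-bit vectors, fixed seed): bundling letter hypervectors by bitwise majority ignores letter order, so any permutation of a word encodes to the same vector.

```python
import random

DIM = 1024
_rng = random.Random(42)
# One random binary hypervector per letter (assumed alphabet: a-z).
LETTER = {c: [_rng.randint(0, 1) for _ in range(DIM)]
          for c in "abcdefghijklmnopqrstuvwxyz"}

def encode(word):
    """Bundle the word's letter hypervectors by bitwise majority (ties -> 0).
    Order-free bundling makes the code invariant to letter permutation."""
    sums = [0] * DIM
    for c in word:
        vec = LETTER[c]
        for i in range(DIM):
            sums[i] += vec[i]
    n = len(word)
    return [1 if 2 * s > n else 0 for s in sums]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def nearest_word(query, dictionary):
    """Lookup by a single operation per candidate: Hamming distance."""
    q = encode(query)
    return min(dictionary, key=lambda w: hamming(q, encode(w)))
```

Since "pleap" and "apple" contain the same multiset of letters, their codes are identical and the lookup recovers the valid word at distance zero.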
Article
Textual databases are ubiquitous in many application domains. Examples of textual data range from names and addresses of customers to social media posts and bibliographic records. With online services, individuals are increasingly required to enter their personal details for example when purchasing products online or registering for government servi...
Article
Full-text available
Real-time Entity Resolution (ER) is the process of matching query records in subsecond time with records in a database that represent the same real-world entity. Indexing techniques are generally used to efficiently extract a set of candidate records from the database that are similar to a query record, and that are to be compared with the query re...
Conference Paper
Full-text available
We advocate for a novel connectionist modeling framework as an answer to a set of challenges to AGI and cognitive science put forth by classical formal systems approaches. We show how this framework, which we call Vector Symbolic Architectures, or VSAs, is also the kind of model of mental activity that we arrive at by taking Ludwig Wittgenstein’s c...
Conference Paper
Entity resolution is the process of identifying records in one or multiple data sources that represent the same real-world entity. This process needs to deal with noisy data that contain for example wrong pronunciation or spelling errors. Many real world applications require rapid responses for entity queries on dynamic datasets. This brings challe...
Presentation
Full-text available
Melbourne Users of R Network
Conference Paper
Full-text available
Analogy-making is a key function of human cognition. Therefore, the development of computational models of analogy that automatically learn from examples can lead to significant advances in cognitive systems. Analogies require complex, relational representations of learned structures, which is challenging for both symbolic and neurally inspired mod...
Conference Paper
Full-text available
Entity resolution is the process of identifying groups of records in a single or multiple data sources that represent the same real-world entity. It is an important tool in data de-duplication, in linking records across databases, and in matching query records against a database of existing entities. Most existing entity resolution techniques compl...
Data
This survey paper categorises, compares, and summarises from almost all published technical and review articles in automated fraud detection within the last 10 years. It defines the professional fraudster, formalises the main types and subtypes of known fraud, and presents the nature of data evidence collected within affected industries. Within the...
Conference Paper
Entity resolution is the process of matching records that refer to the same entities from one or several databases in situations where the records to be matched do not include unique entity identifiers. Matching therefore has to rely upon partially identifying information, such as names and addresses. Traditionally, entity resolution has been appli...
Article
Full-text available
We propose a knowledge-representation architecture allowing a robot to learn arbitrarily complex, hierarchical / symbolic relationships between sensors and actuators. These relationships are encoded in high-dimensional, low-precision vectors that are very robust to noise. Low-dimensional (single-bit) sensor values are projected onto the high-dimens...
Article
Full-text available
In October 2004, approximately 30 connectionist and nonconnectionist researchers gathered at an AAAI symposium to discuss and debate a topic of central concern in artificial intelligence and cognitive science: the nature of compositionality. The symposium offered participants an opportunity to confront the persistent belief among traditional cogniti...
Article
Full-text available
This survey paper categorises, compares, and summarises from almost all published technical and review articles in automated fraud detection within the last 10 years. It defines the professional fraudster, formalises the main types and subtypes of known fraud, and presents the nature of data evidence collected within affected industries. Within the...
Conference Paper
Full-text available
We argue for a high standard of explanation in cognitive neuroscience and find that most models fail to deliver adequate cognitive functionality to meet that standard. Those models that are functionally adequate fail to scale up. The scandal of cognitive neuroscience is that these limitations are not widely recognised. We propose a tool stack that...
Conference Paper
Full-text available
Credit scoring is the use of predictive modelling techniques to support decision making in lending. It is a field of immense practical value that also supports a modest amount of academic research. Interestingly, the academic research tends not to be put into practice. This is not a result of insularity and arrogance on the part of the practitioner...
Article
This paper describes a rapid technique: communal analysis suspicion scoring (CASS), for generating numeric suspicion scores on streaming credit applications based on implicit links to each other, over both time and space. CASS includes pair-wise communal scoring of identifier attributes for applications, definition of categories of suspiciousness f...
Conference Paper
Full-text available
We are concerned with the practical feasibility of the neural basis of analogical mapping. All existing connectionist models of analogical mapping rely to some degree on localist representation (each concept or relation is represented by a dedicated unit/neuron). These localist solutions are implausible because they need too many units for...
Conference Paper
Full-text available
Analogy has been considered a crucial cognitive process since the seminal work on Structure Mapping Theory by Dedre Gentner in the 1980s. In the following years, many different approaches have been proposed for analogy making. Although these approaches were successful in finding explanations for the cognitive ability of analogy making (including...
Conference Paper
Full-text available
We present a fully distributed connectionist architecture supporting lateral inhibition / winner-takes all competition. All items (individuals, relations, and structures) are represented by high-dimensional distributed vectors, and (multi)sets of items as the sum of such vectors. The architecture uses a neurally plausible permutation circuit to sup...
Article
Full-text available
We are concerned with the practical feasibility of the neural basis of analogical mapping. All existing connectionist models of analogical mapping rely to some degree on localist representation (each concept or relation is represented by a dedicated unit/neuron). These localist solutions are implausible because they need too many units for human-le...
Conference Paper
Full-text available
We provide an overview of Vector Symbolic Architectures (VSA), a class of structured associative memory models that offers a number of desirable features for artificial general intelligence. By directly encoding structure using familiar, computationally efficient algorithms, VSA bypasses many of the problems that have consumed unnecessary effort and...
Conference Paper
Full-text available
Most research into entity resolution (also known as record linkage or data matching) has concentrated on the quality of the matching results. In this paper, we focus on matching time and scalability, with the aim to achieve large-scale real-time entity resolution. Traditional entity resolution techniques have assumed the matching of two static data...
Conference Paper
Full-text available
Automated adversarial detection systems can fail when under attack by adversaries. As part of a resilient data stream mining system to reduce the possibility of such failure, adaptive spike detection is attribute ranking and selection without class-labels. The first part of adaptive spike detection requires weighing all attributes for spikiness...
Article
Full-text available
This paper is on adaptive real-time searching of credit application data streams for identity crime with many search parameters. Specifically, we concentrated on handling our domain-specific adversarial activity problem with the adaptive Communal Analysis Suspicion Scoring (CASS) algorithm. CASS's main novel theoretical contribution is in the formu...
Article
Full-text available
Comment on Classifier Technology and the Illusion of Progress--Credit Scoring [math.ST/0606441]
Article
Full-text available
The authors, on the basis of brief arguments, have dismissed tensor networks as a viable response to Jackendoff's challenges. However, there are reasons to believe that connectionist approaches descended from tensor networks are actually very well suited to answering Jackendoff's challenges. I rebut their arguments for dismissing tensor networks an...
Article
Full-text available
The purpose of this paper is to outline some of the major developments of an identity crime/fraud stream mining system. Communal detection is about finding real communities of interest. The algorithm itself is unsupervised, single-pass, differentiates between normal and anomalous links, and mitigates the suspicion of normal links with a dynamic glo...
Conference Paper
Full-text available
Identity crime has increased enormously over the recent years. Spike detection is important because it highlights sudden and sharp rises in intensity relative to the current identity attribute value (which can be indicative of abuse). This paper proposes the new spike analysis framework for monitoring sparse personal identity streams. For each iden...
Article
Full-text available
This paper describes a technique for generating numeric suspicion scores on credit applications based on implicit links to each other, and over time and space. Its contributions include pair-wise communal scoring of identifier attributes for applications, definition of categories of suspiciousness for application-pairs, smoothed k-wise scoring of m...
Article
Full-text available
The American Association for Artificial Intelligence presented its 2004 Fall Symposium Series Friday through Sunday, October 22-24 at the Hyatt Regency Crystal City in Arlington, Virginia, adjacent to Washington, DC. The symposium series was preceded by a one-day AI funding seminar. The topics of the eight symposia in the 2004 Fall Symposia Series...
Article
Full-text available
Jackendoff (2002) posed four challenges that linguistic combinatoriality and rules of language present to theories of brain function. The essence of these problems is the question of how to neurally instantiate the rapid construction and transformation of the compositional structures that are typically taken to be the domain of symbolic processing....
Conference Paper
Full-text available
Jackendoff (2002) posed four challenges that linguistic combinatoriality and rules of language present to theories of brain function. The essence of these problems is the question of how to neurally instantiate the rapid construction and transformation of the compositional structures that are typically taken to be the domain of symbolic processing....
Article
Full-text available
The contributions to this article are of varying nature: position summaries, individual research summaries, historical accounts, discussion of controversial issues, etc. We have not attempted to connect the various pieces together, or to organize them within a coherent framework. Despite this, we think the reader will find this collection useful.
Conference Paper
Full-text available
This paper claims that higher cognition implemented by a connectionist system will be essentially analogical, with analogical mapping by continuous systematic substitution as the core cognitive process. The centrality of analogy is argued to be necessary in order for a connectionist system to use representations that are effectively symbolic. In tu...
Conference Paper
Full-text available
This paper introduces a novel implementation of the bind() operator that is simple, can be efficiently implemented, and highlights the relationship between retrieval queries and analogical mapping. A frame of role/filler bindings can easily be represented using bind() and bundle(). However, typical binding systems are unable to adequately represent mul...
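The role/filler frame mentioned here can be illustrated with the Binary Spatter Code flavour of VSA, in which bind() is elementwise XOR (its own inverse) and bundle() is a bitwise majority vote. The vectors and names below are made-up examples, and real bundling of an even number of vectors usually adds a random tie-break vector, which this sketch skips.

```python
import random

DIM = 2048
_rng = random.Random(7)

def randvec():
    return [_rng.randint(0, 1) for _ in range(DIM)]

def bind(a, b):
    """Binding as elementwise XOR; since XOR is self-inverse,
    bind(bind(a, b), a) recovers b exactly."""
    return [x ^ y for x, y in zip(a, b)]

def bundle(vectors):
    """Bundling as bitwise majority vote (ties broken toward 0)."""
    return [1 if 2 * sum(col) > len(vectors) else 0 for col in zip(*vectors)]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

# A frame of role/filler bindings: agent=alice, object=book (example names).
AGENT, OBJECT = randvec(), randvec()
ALICE, BOOK = randvec(), randvec()
frame = bundle([bind(AGENT, ALICE), bind(OBJECT, BOOK)])

def query(frame, role, cleanup):
    """Retrieval query: unbind the role, then clean up the noisy result
    against an item memory of known fillers."""
    noisy = bind(frame, role)
    return min(cleanup, key=lambda name: hamming(noisy, cleanup[name]))

ITEMS = {"alice": ALICE, "book": BOOK}
```

Unbinding a role from the bundled frame yields a noisy copy of its filler, much closer in Hamming distance to the true filler than to any other stored item.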
Article
Full-text available
Analogical inference depends on systematic substitution of the components of compositional structures. Simple systematic substitution has been achieved in a number of connectionist systems that support binding (the ability to create connectionist representations of the combination of component representations). These systems have used various imple...
Chapter
Full-text available
There is now a reasonable amount of consensus that an analogy entails a mapping from one structure, the base or source, to another structure, the target (Gentner, 1983, 1989; Holyoak & Thagard, 1989). Theories of human analogical reasoning have been reviewed by Gentner (1989), who concludes that there is basic agreement on the one-to-one mapping of...
Article
Full-text available
The ROC curve is useful for assessing the predictive power of risk models and is relatively well known for this purpose in the credit scoring community. The ROC curve is a component of the Theory of Signal Detection (TSD), a theory which has pervasive links to many issues in model building. However, these conceptual links and their associated insig...
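The ROC curve described here is cheap to compute from scored cases. A minimal sketch, assuming binary labels (1 = event) and higher scores meaning higher predicted risk; practical scorecard assessment would also handle tied scores properly and report confidence bands.

```python
def roc_points(scores, labels):
    """Trace the ROC curve: (FPR, TPR) at every score threshold,
    sweeping from the strictest threshold to the loosest.
    Tied scores are handled naively here (one case at a time)."""
    pairs = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    pts = [(0.0, 0.0)]
    for _, y in pairs:
        if y:
            tp += 1
        else:
            fp += 1
        pts.append((fp / neg, tp / pos))
    return pts

def auc(pts):
    """Area under the ROC curve by the trapezoidal rule."""
    return sum((x2 - x1) * (y1 + y2) / 2.0
               for (x1, y1), (x2, y2) in zip(pts, pts[1:]))
```

A perfectly discriminating score gives AUC 1.0; a score no better than chance gives about 0.5.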

Projects (5)
Project
ARC Linkage Project - Detection of fraud in credit application stream processing
Project
Development of methods for record linkage