
James Alfred Walker
- Doctor of Philosophy
- Senior Lecturer at University of York
About
- Publications: 110
- Reads: 28,505
- Citations: 1,288
Publications (110)
Data analytics is commonly used to enable storytelling and enhance esport coverage. One prominent use is win prediction, where machine learning models predict the winner of the game before its conclusion. However, predictions are most commonly the results of black-box systems, forcing commentators to produce ad-hoc interpretations. Additionally,...
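As an aside on the kind of model this line of work builds on, the sketch below shows a deliberately simple, interpretable win predictor trained on hypothetical mid-game features. It is not the paper's model: the feature names and data are invented, and scikit-learn is assumed to be available.

```python
# Minimal sketch of an interpretable win predictor (hypothetical features/data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                        # standardised mid-game snapshots
y = (X @ np.array([1.5, 0.8, 1.0]) + rng.normal(size=500) > 0).astype(int)

model = LogisticRegression().fit(X, y)
for name, w in zip(["gold_lead", "kill_lead", "objective_lead"], model.coef_[0]):
    print(f"{name}: weight {w:+.2f}")                # coefficients give a direct reading of each feature's pull
print("P(team A wins) given a 1-sigma gold lead:",
      round(model.predict_proba([[1.0, 0.0, 0.0]])[0, 1], 2))
```

Unlike a black-box predictor, the fitted weights can be read off directly, which is the kind of interpretability the abstract alludes to.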
The three-dimensional swimming tracks of motile microorganisms can be used to identify their species, which holds promise for the rapid identification of bacterial pathogens. The tracks also provide detailed information on the cells’ responses to external stimuli such as chemical gradients and physical objects. Digital holographic microscopy (DHM)...
Many constraint satisfaction and optimisation problems can be solved effectively by encoding them as instances of the Boolean Satisfiability problem (SAT). However, even the simplest types of constraints have many encodings in the literature with widely varying performance, and the problem of selecting suitable encodings for a given problem instanc...
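To illustrate why encoding choice matters, the sketch below shows two classic CNF encodings of the simple "at most one" constraint, which differ markedly in clause count and structure. These are standard textbook encodings, not the selection method proposed in the paper.

```python
# Two standard SAT encodings of the "at most one" (AMO) constraint over variables x1..xn.
from itertools import combinations

def amo_pairwise(xs):
    """Pairwise encoding: one binary clause per pair of variables; O(n^2) clauses."""
    return [[-a, -b] for a, b in combinations(xs, 2)]

def amo_sequential(xs, aux_start):
    """Sequential (Sinz) encoding: O(n) clauses using auxiliary variables s1..s(n-1)."""
    n = len(xs)
    if n <= 1:
        return []
    s = list(range(aux_start, aux_start + n - 1))   # auxiliary "prefix is set" variables
    clauses = [[-xs[0], s[0]]]
    for i in range(1, n - 1):
        clauses += [[-xs[i], s[i]],
                    [-s[i - 1], s[i]],
                    [-xs[i], -s[i - 1]]]
    clauses.append([-xs[n - 1], -s[n - 2]])
    return clauses

xs = [1, 2, 3, 4, 5]
print(len(amo_pairwise(xs)), "pairwise clauses")                  # 10
print(len(amo_sequential(xs, aux_start=6)), "sequential clauses")  # 11
```

For small n the pairwise encoding is smaller, while the sequential encoding scales linearly; which performs better on a given instance is exactly the kind of question encoding selection addresses.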
Esport games comprise a sizeable fraction of the global games market, and are the fastest growing segment in games. This has given rise to the domain of esports analytics, which uses telemetry data from games to inform players, coaches, broadcasters and other stakeholders. Compared to traditional sports, esport titles change rapidly, in terms of mec...
Natural language instruction following is paramount to enable collaboration between artificial agents and human beings. Natural language-conditioned reinforcement learning (RL) agents have shown how natural languages' properties, such as compositionality, can provide a strong inductive bias to learn complex policies. Previous architectures like HIG...
Natural languages are powerful tools wielded by human beings to communicate information. Among their desirable properties, compositionality has been the main focus in the context of referential games and variants, as it promises to enable greater systematicity to the agents which would wield it. The concept of disentanglement has been shown to be o...
Competitive video game playing, an activity called esports, is increasingly popular to the point that there are now many professional competitions held for a variety of games. These competitions are broadcast in a professional manner similar to traditional sports broadcasting. Esports games are generally fast paced, and due to the virtual nature of...
Human beings use compositionality to generalise from past experiences to actual or fictive, novel experiences. To do so, we separate our experiences into fundamental atomic components. These atomic components can then be recombined in novel ways to support our ability to imagine and engage with novel experiences. We frame this as the ability to lea...
Every minute, 500 h of footage is uploaded to Youtube.com, and ∼1900 h of footage is livestreamed on Twitch.tv. It can therefore be challenging for viewers to find the content they are most likely to enjoy. Highlight videos can entertain users who did not watch a broadcast, e.g. due to a lack of awareness, availability, or willingness. Furthermore,...
Competitive video game playing, an activity called esports, is increasingly popular to the point that there are now many professional competitions held for a variety of games. These professional competitions are often broadcast in a professional manner similar to traditional sports broadcasts. The ability to perform moment-to-moment prediction with...
Conversational Artificial Intelligence (CAI) systems and Intelligent Personal Assistants (IPA), such as Alexa, Cortana, Google Home and Siri, are becoming ubiquitous in our lives, including those of children, the implications of which are receiving increased attention, specifically with respect to the effects of these systems on children's cognitive,...
Indirect encoding is a promising area of research in machine learning/evolutionary computation; however, it is rarely able to achieve performance on par with state-of-the-art directly encoded methods. One of the most important properties of indirect encoding is the ability to control exploration during learning by transforming random genotypic vari...
One of the most important lessons from the success of deep learning is that learned representations tend to perform much better at any task compared to representations we design by hand. Yet evolution of evolvability algorithms, which aim to automatically learn good genetic representations, have received relatively little attention, perhaps because...
The notion of self-play, albeit often cited in multiagent Reinforcement Learning as a process by which to train agent policies from scratch, has received little effort towards taxonomization within a formal model. We present a formalized framework, with clearly defined assumptions, which encapsulates the meaning of self-play as abstracted from various...
The drivers of compositionality in artificial languages that emerge when two (or more) agents play a non-visual referential game have previously been investigated using approaches based on the REINFORCE algorithm and the (Neural) Iterated Learning Model. Following the more recent introduction of the Straight-Through Gumbel-Softmax (ST-GS) a...
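For readers unfamiliar with the ST-GS trick mentioned above, the sketch below shows how a speaker can sample discrete message symbols while keeping the sampling step differentiable. It assumes PyTorch and is an illustrative implementation of the general technique, not the authors' code.

```python
# Straight-Through Gumbel-Softmax: discrete forward pass, relaxed backward pass.
import torch
import torch.nn.functional as F

def st_gumbel_softmax(logits, tau=1.0):
    gumbel = -torch.log(-torch.log(torch.rand_like(logits) + 1e-20) + 1e-20)
    y_soft = F.softmax((logits + gumbel) / tau, dim=-1)          # relaxed sample
    index = y_soft.argmax(dim=-1, keepdim=True)
    y_hard = torch.zeros_like(y_soft).scatter_(-1, index, 1.0)   # discrete one-hot sample
    # Straight-through: forward uses the one-hot sample, gradients flow through y_soft.
    return (y_hard - y_soft).detach() + y_soft

logits = torch.randn(2, 10, requires_grad=True)   # e.g. speaker logits over a 10-symbol vocabulary
msg = st_gumbel_softmax(logits, tau=0.5)
msg.sum().backward()                              # gradients reach the logits despite discrete sampling
```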
Natural languages are powerful tools wielded by human beings to communicate information and co-operate towards common goals. Their value lies in key properties such as compositionality, hierarchy and recurrent syntax; computational linguists have been researching the emergence of these properties in artificial languages induced by language games. Only rela...
Decentralised gambling applications are a new way for people to gamble online. Decentralised gambling applications are distinguished from traditional online casinos in that players use cryptocurrency as a stake. Also, rather than being stored on a single centralised server, decentralised gambling applications are stored on a cryptocurrency’s blockc...
Most natural language processing research focuses on modelling and understanding text formed of complete sentences with correct spelling and grammar. However, livestream chat is drastically different. Viewers are typically writing short messages while responding to in-stream events, often with incorrect grammar and many repeated tokens. Additionall...
The automated evaluation of creative products promises both good-and-scalable creativity assessments and new forms of visual analysis of whole corpora. Where creative works are not ‘born digital’, such automated evaluation requires fast and frugal ways of transforming them into data representations that can be meaningfully assessed with common crea...
Throughout scientific history, overarching theoretical frameworks have allowed researchers to grow beyond personal intuitions and culturally biased theories. They make it possible to verify and replicate existing findings, and to link disconnected results. The notion of self-play, albeit often cited in multiagent Reinforcement Learning, has never been grounde...
Automated game balancing has often focused on single-agent scenarios. In this paper we present a tool for balancing multi-player games during game design. Our approach requires a designer to construct an intuitive graphical representation of their meta-game target, representing the relative scores that high-level strategies (or decks, or character...
Player clustering, when applied to the field of video games, has several potential applications, for example the evaluation of the composition of a player base or the generation of AI agents with identified playing styles. These agents can then be used either for the testing of new game content or directly to enhance a player’s gaming experienc...
Decentralised gambling applications are a new way for individuals to engage in online gambling. Decentralised gambling applications are distinguished from traditional online casinos in that individuals use cryptocurrency as a stake. Furthermore, rather than being stored on a traditional server, decentralised gambling applications are stored on a cr...
In computer graphics and virtual environment development, a large portion of time is spent creating assets - one of these being the terrain environment, which usually forms the basis of many large graphical worlds. The texturing of height maps is usually performed as a post-processing step - with software requiring access to the height and gradient...
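Since the abstract notes that texturing tools typically rely on height and gradient, the sketch below shows one simple way a heightmap can be classified into texture regions from those two quantities alone. The thresholds, class names and the synthetic test terrain are hypothetical, and this is not the paper's method.

```python
# Assign terrain texture classes from height and slope (gradient magnitude) only.
import numpy as np

def classify_terrain(heightmap, water_level=0.3, snow_level=0.8, rock_slope=0.02):
    gy, gx = np.gradient(heightmap)
    slope = np.hypot(gx, gy)                         # per-cell gradient magnitude
    classes = np.full(heightmap.shape, "grass", dtype=object)
    classes[heightmap < water_level] = "water"
    classes[slope > rock_slope] = "rock"             # steep cells become rock regardless of height
    classes[(heightmap > snow_level) & (slope <= rock_slope)] = "snow"
    return classes

# a single smooth hill as a stand-in heightmap, values in [0, 1]
ys, xs = np.mgrid[0:64, 0:64]
heightmap = np.exp(-((xs - 32) ** 2 + (ys - 32) ** 2) / (2 * 18.0 ** 2))
labels, counts = np.unique(classify_terrain(heightmap), return_counts=True)
print(dict(zip(labels, counts)))
```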
Ethereum crypto-games are a booming and relatively unexplored area of the games industry. While there is no consensus definition yet, 'crypto-games' commonly denotes games that store tokens, e.g. in-game items, on a distributed ledger atop a cryptocurrency network. This enables the trading of game items for cryptocurrency, which can then be exchang...
Decentralised gambling can be described as a type of online gambling. This submission evidences the form and prevalence of decentralised gambling applications, and shows how this information can help address several of the committee’s questions. This document was prepared for, accepted, and published by the House of Lords Select Committee on the S...
Esports is an organised form of video games played competitively. The esports industry has grown rapidly in recent years, with global audiences estimated at the hundreds of millions. One of the most popular esports formats is the Multi-Player Online Battle Arena (MOBA), which sees two teams of players competing. In MOBAs and other team-based games,...
Hyperinflation and price volatility in virtual economies have the potential to reduce player satisfaction and decrease developer revenue. This paper describes intuitive analytical methods for monitoring volatility and inflation in virtual economies, with worked examples on the increasingly popular multiplayer game Old School Runescape. Analytical me...
This presentation covers some of the key ideas in the corresponding paper, including interpreting bivariate plots of market data, creating virtual economic indexes, and applying statistical tests to indexes (and returns). This presentation was given on Wednesday at IEEE COG 2019 under the 'Analytics and Player Modelling' track.
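To make the idea of a virtual economic index and its returns concrete, the sketch below builds an equal-weighted geometric price index from a few item price histories and reads off inflation and volatility from its log returns. The item names and prices are invented, and this is an illustration of the general approach rather than the paper's exact methodology.

```python
# Build a simple price index for a virtual economy and measure inflation and volatility.
import math

prices = {                      # hypothetical daily median trade prices, arbitrary units
    "ore":    [100, 102, 101, 105, 110],
    "logs":   [ 50,  49,  51,  55,  54],
    "weapon": [900, 905, 930, 950, 945],
}

def index_series(prices):
    """Equal-weighted geometric index, normalised to 100 on day 0."""
    days = len(next(iter(prices.values())))
    idx = []
    for t in range(days):
        rel = [p[t] / p[0] for p in prices.values()]
        idx.append(100 * math.exp(sum(map(math.log, rel)) / len(rel)))
    return idx

def log_returns(series):
    return [math.log(b / a) for a, b in zip(series, series[1:])]

idx = index_series(prices)
rets = log_returns(idx)
mean_ret = sum(rets) / len(rets)
inflation = idx[-1] / idx[0] - 1                                        # cumulative index change
volatility = (sum((r - mean_ret) ** 2 for r in rets) / (len(rets) - 1)) ** 0.5
print("index:", [round(v, 1) for v in idx])
print(f"inflation over the window: {inflation:.1%}, per-period volatility: {volatility:.3f}")
```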
Data driven gambling research has historically been limited by the availability of rich large scale data sets, which have seldom been published for replication studies. In contrast, new gambling services built using distributed ledger technology, as underpins cryptocurrencies, allow inspection of gambling activity through these services at the indi...
This poster describes in plain terms the key findings from the talk given at the Current Advances in Gambling Research (CAGR) conference in 2019. This poster exists purely digitally and was not formally presented, instead acting to supplement the content of the presentation and corresponding pre-print, which described decentralised gambling and the...
This paper describes the York Combined Transaction Set (YCTS), which offers a single consolidated list of publicly available gambling related transactions derived from the Ethereum blockchain. This data includes over 1.4M individual transactions across 17,000+ unique addresses, which represent spending on decentralised gambling smart contracts. The...
Video game streaming provides the viewer with a rich set of audio-visual data, conveying information both with regards to the game itself, through game footage and audio, as well as the streamer's emotional state and behaviour via webcam footage and audio. Analysing player behaviour and discovering correlations with game context is crucial for mode...
Heightmap generation is currently a tedious task, with the majority of approaches using Perlin noise, which produces reliable but sometimes repetitive output. In this paper, a method of generating height maps from real-world digital elevation data taken from specific regions of the planet is proposed. Raw elevation data sourced from NASA's SRTM (30m...
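As a rough illustration of what turning raw SRTM elevation data into a game-ready heightmap involves, the sketch below reads a raw SRTM1 tile and normalises it to [0, 1]. The tile file name is hypothetical and the void handling is a simplification; this is not the authors' pipeline.

```python
# Convert a raw SRTM1 tile (.hgt, 3601x3601 big-endian int16 samples, ~30 m spacing)
# into a normalised heightmap.
import numpy as np

def load_srtm_hgt(path, size=3601):
    data = np.fromfile(path, dtype=">i2").reshape(size, size).astype(np.float32)
    data[data == -32768] = np.nan                          # SRTM void value
    return np.where(np.isnan(data), np.nanmean(data), data)  # crude void fill

def to_heightmap(elevation):
    lo, hi = elevation.min(), elevation.max()
    return (elevation - lo) / (hi - lo + 1e-9)             # normalise to [0, 1]

if __name__ == "__main__":
    hm = to_heightmap(load_srtm_hgt("N54W002.hgt"))        # hypothetical tile name
    print(hm.shape, float(hm.min()), float(hm.max()))
```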
Esports have become major international sports with hundreds of millions of spectators. Esports games generate massive amounts of telemetry data. Using these to predict the outcome of esports matches has received considerable attention, but micro-predictions, which seek to predict events inside a match, are as yet unknown territory. Micro-prediction...
Circuit clustering algorithms fit synthesised circuits into FPGA configurable logic blocks (CLBs) efficiently. This fundamental process in FPGA CAD flow directly impacts both effort required and performance achievable in subsequent place-and-route processes. Circuit clustering is limited by hardware constraints of specific target architectures. Hen...
Measuring wave heights has traditionally been associated with physical buoy tools that aim to measure and average multiple wave heights over a period of time. With our method, we demonstrate a process of utilizing large-scale satellite images to estimate wave height with a continuous regression output using a corresponding input for close shore s...
We propose a memetic approach to find bottlenecks in complex networks based on searching for a graph partitioning with minimum conductance. Finding the optimum of this problem, also known in statistical mechanics as the Cheeger constant, is one of the most interesting NP-hard network optimisation problems. The existence of low conductance minima in...
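The quantity being minimised here, the conductance of a cut, is easy to state directly; the sketch below computes it for an obvious bottleneck in a toy graph. It assumes networkx is available and illustrates only the objective, not the memetic search itself.

```python
# Conductance of a vertex partition: the objective minimised when hunting for bottlenecks.
import networkx as nx

def conductance(G, S):
    """phi(S) = cut(S, V\\S) / min(vol(S), vol(V\\S)) for an unweighted graph."""
    S = set(S)
    T = set(G) - S
    cut = sum(1 for u, v in G.edges() if (u in S) != (v in S))
    vol_S = sum(d for _, d in G.degree(S))
    vol_T = sum(d for _, d in G.degree(T))
    return cut / min(vol_S, vol_T)

# two dense cliques joined by a single bridge edge form an obvious bottleneck
G = nx.barbell_graph(10, 0)
S = set(range(10))                        # one of the two cliques
print(f"conductance of the bottleneck cut: {conductance(G, S):.4f}")
```

Low-conductance cuts like this one correspond to the bottlenecks the memetic search is looking for.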
This chapter describes the use of biologically inspired Evolutionary Algorithms (EAs) to create designs for implementation on a reconfigurable logic device. Previous work on Evolvable Hardware (EHW) is discussed with a focus on timing problems for digital circuits. An EA is developed that describes the circuit using a Hardware Description Language...
A novel hierarchical fault-tolerance methodology for reconfigurable devices is presented. A bespoke multi-reconfigurable FPGA architecture, the programmable analogue and digital array (PAnDA), is introduced allowing fine-grained reconfiguration beyond any other FPGA architecture currently in existence. Fault blind circuit repair strategies, which r...
Semiconductor devices have rapidly improved in performance and function density over the past 25 years enabled by the continuous shrinking of technology feature sizes. Fabricating transistors that small, even with advanced processes, results in structural irregularities at the atomic scale, which affect device characteristics in a random manner. To...
In this study, the authors present a design optimisation case study of D-type flip-flop timing characteristics that are degraded as a result of intrinsic stochastic variability in a 25 nm technology process. What makes this work unique is that the design is mapped onto a multi-reconfigurable architecture, which is, like a field programmable gate ar...
The speed and function density of field-programmable gate arrays (FPGAs) are increasing as transistor sizes shrink to the nano-scale. As transistors reduce in size and approach the atomic scale, the presence or absence of single doping atoms and structural irregularities are likely to affect the behaviour of the device in a random manner (Papanikol...
The modeling and simulation of semiconductor devices is a difficult and computationally intensive task. However the expense of fabrication and testing means that accurate modeling and simulation are crucial to the continued progress of the industry. To create these models and then perform the simulations requires parameters from accurate physical m...
This paper presents the concept of hierarchical reconfiguration strategies that can be applied to a circuit on a reconfigurable architecture to change the implementation without changing the functionality, and their use to overcome faults in a source agnostic way. The Programmable Analogue and Digital Array (PAnDA) is a novel FPGA-like reconfigurab...
Field programmable gate arrays (FPGAs) are widely used in applications where online reconfigurable signal processing is required. Speed and function density of FPGAs are increasing as transistor sizes shrink to the nanoscale. As these transistors reduce in size, intrinsic variability becomes more of a problem, and to reliably create electronic design...
This paper explores the potential for transistor level fault tolerance on a new Programmable Analogue and Digital Array (PAnDA) architecture. In particular, this architecture features Combinatorial Configurable Analogue Blocks (CCABs) that can implement a number of combinatorial functions similar to FPGAs. In addition, PAnDA allows one to reconfig...
Field programmable gate arrays (FPGAs) are widely used in applications where on-line reconfigurable signal processing is required. Speed and function density of FPGAs are increasing as transistor sizes shrink to the nano-scale. As these transistors reduce in size, intrinsic variability becomes more of a problem, as every physical instance of a desi...
The Programmable Analogue and Digital Array (PAnDA) is a novel reconfigurable architecture, which allows variability aware design and rapid prototyping of digital systems. Exploiting the configuration options of the architecture allows the post-fabrication correction and optimisation of circuits directly in hardware using bio-inspired techniques. I...
Transport triggered architectures are used for implementing bio-inspired systems due to their simplicity, modularity and fault-tolerance. However, producing efficient, optimised machine code for such architectures is extremely difficult, since computational complexity has moved from the hardware-level to the software-level. Presented is the applica...
This book constitutes the refereed proceedings of the 9th International Conference on Information in Cells and Tissues, IPCAT 2012, held in Cambridge, UK, in March/April 2012. The 13 revised full papers presented together with 26 extended abstracts were carefully reviewed and selected from numerous submissions. The papers cover a wide range of topi...
Field programmable gate arrays (FPGAs) are widely used in applications where on-line reconfigurable signal processing is required. Speed and function density of FPGAs are increasing as transistor sizes shrink to the nano-scale. Unfortunately, in order to reliably create electronic designs according to specification, time-consuming statistical s...
Cartesian Genetic Programming (CGP) is now attracting considerable recognition as an evolutionary algorithm that not only delivers high performance, but one that has a representation that is flexible and easy to adapt to a range of applications. Problems based in medicine stand to benefit greatly due to their diverse and highly non-linear nature, whic...
Evolvable Hardware has been a discipline for over 15 years. Its application has ranged from simple circuit design to antenna design. However, research in the field has often been criticised for not addressing real world problems. Intrinsic variability has been recognised as one of the major challenges facing the semiconductor industry. This paper d...
In the past decades, a number of genetic programming techniques have been developed to evolve machine instructions. However, these approaches typically suffer from a lack of scalability that seriously impairs their applicability to real-world scenarios. In this paper, a novel self-scaling instruction generation method is introduced, which tries to...
From the outset, the field of Evolvable Hardware has proved its potential for evolution of innovative circuits and hardware adaptation and repair. CGP has been utilized in both tasks of hardware evolution as well as adaptation. This chapter illustrates some typical applications of CGP, comprising gate-level evolution of ordinary and polymorphic cir...
Scalability has become a major issue and a hot topic of research for the GP community, as researchers are moving on to investigate more complex problems. Throughout nature and conventional human design principles, modular structures are extensively used to tackle complex problems by decomposing them into smaller, simpler subproblems, which can be i...
This work is a study of the viability of using complex building blocks (termed molecules) within the evolutionary computation paradigm of CGP; extending it to MolCGP. Increasing the complexity of the building blocks increases the design space that is to be explored to find a solution; thus, experiments were undertaken to find out whether this chang...
This paper presents for the first time the application of Cartesian Genetic Programming to the evolution of machine code for a simple implementation of a MOVE processor. The effectiveness of the algorithm is demonstrated by evolving machine code for a 4-bit multiplier with three different levels of parallelism. The results show that 100% successful...
This paper presents a comparison between conventional and multi-objective Cartesian Genetic Programming evolved designs for a 2-bit adder and a 2-bit multiplier. Each design is converted from a gate-level schematic to a transistor level implementation, through the use of an open-source standard cell library, and simulated in NGSPICE in order to gen...
The project Meeting the Design Challenges of nano-CMOS Electronics (http://www.nanocmos.ac.uk) was funded by the Engineering and Physical Sciences Research Council to tackle the challenges facing the electronics industry caused by the decreasing scale of transistor devices, and the inherent variability that this exposes in devices and in the circui...
This paper describes an approach of using a multi-objective fitness function to improve the performance of digital circuits evolved using CGP. Circuits are initially evolved for correct functionality using conventional CGP before the NSGA-II algorithm is used to extract circuits which are more efficient in terms of design complexity and delay. This...
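The selection step underlying multi-objective methods such as NSGA-II is Pareto dominance over the competing objectives; the sketch below extracts the Pareto front from hypothetical candidate circuits scored on design complexity (gate count) and delay. The candidate data are made up and this is not the paper's implementation.

```python
# Extract the Pareto front of candidate circuits under two minimisation objectives.
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other is not c)]

# (gate_count, delay_ns) for hypothetical functionally correct designs
circuits = [(12, 3.1), (10, 3.5), (14, 2.8), (10, 3.2), (16, 2.8), (11, 3.4)]
print(pareto_front(circuits))   # -> [(12, 3.1), (14, 2.8), (10, 3.2)]
```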
Parallel and distributed methods for evolutionary algorithms have concentrated on maintaining multiple populations of genotypes, where each genotype in a population encodes a potential solution to the problem. In this paper, we investigate the parallelisation of the genotype itself into a collection of independent chromosomes which can be evaluated...
This paper describes an approach to create novel, robust logic-circuit topologies, using several evolution-inspired techniques over a number of design stages. A library of 2-input logic gates are evolved and optimised for tolerance to the effects of intrinsic variability. Block-level designs are evolved using evolutionary methods (CGP). A method of...
This paper describes an approach to optimise transistor dimensions within a standard cell library. The goal is to extract high-speed and low-power circuits which are more tolerant to the random fluctuations that will be prevalent in future technology nodes. Using statistically enhanced SPICE models based on 3D-atomistic simulations, a genetic algor...
As the size of CMOS devices is approaching the atomic level, the increasing intrinsic device variability is leading to higher failure rates in conventional CMOS designs. This paper introduces a design tool capable of evolving CMOS topologies using a modified form of Cartesian genetic programming and a multi-objective strategy. The effect of intrins...
This paper describes a proposed approach to optimise cell library logic functions for improved performance when future-generation transistor models are used. A multi-objective Genetic Algorithm is used to determine optimal values for transistor widths and supply voltages, utilising multiple supply rails within each cell. Circuits are assessed for t...
As the size of CMOS devices is approaching the atomic level, the increasing intrinsic device variability is leading to higher failure rates in conventional CMOS designs. In this paper, two approaches are proposed for evolving unconventional variability-tolerant CMOS designs: one uses a simple Genetic Algorithm, whilst the other uses Cartesian Genet...
This paper presents a generalization of the graph- based genetic programming (GP) technique known as Cartesian genetic programming (CGP). We have extended CGP by utilizing automatic module acquisition, evolution, and reuse. To benchmark the new technique, we have tested it on: various digital circuit problems, two symbolic regression problems, the...
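For readers new to CGP, which recurs throughout these publications, the sketch below decodes and evaluates a tiny CGP genotype for a Boolean circuit. The genotype and function set are hypothetical and deliberately minimal; module acquisition and evolution, the paper's actual contribution, are not shown.

```python
# Decode and evaluate a minimal Cartesian Genetic Programming (CGP) genotype.
# Each node is a (function_id, input_a, input_b) triple whose inputs index earlier
# nodes or the program inputs; the output gene selects which node is the output.
FUNCS = [lambda a, b: a & b,    # 0: AND
         lambda a, b: a | b,    # 1: OR
         lambda a, b: a ^ b]    # 2: XOR

def evaluate_cgp(genotype, output_gene, inputs):
    values = list(inputs)                       # node values start with the program inputs
    for func_id, a, b in genotype:
        values.append(FUNCS[func_id](values[a], values[b]))
    return values[output_gene]

# hypothetical genotype computing (x0 AND x1) XOR x2 over inputs x0, x1, x2
genotype = [(0, 0, 1),      # node 3 = x0 AND x1
            (2, 3, 2)]      # node 4 = node3 XOR x2
for x0 in (0, 1):
    for x1 in (0, 1):
        for x2 in (0, 1):
            print((x0, x1, x2), "->", evaluate_cgp(genotype, output_gene=4, inputs=(x0, x1, x2)))
```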