About
9 Publications
1,846 Reads
81 Citations
Introduction
I am a PhD student in the Department of Electronic and Electrical Engineering at University College London. I investigate how memristors can be used to physically implement the artificial neural networks (ANNs) employed in machine learning applications. Such implementations could reduce the computation time and power consumption of ANNs by orders of magnitude. My work consists of processing experimental data from memristors, which I then use to simulate physical implementations of ANNs.
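The core idea behind such implementations can be sketched in a few lines: an ideal memristive crossbar computes a vector-matrix multiplication in a single step, with each device's conductance encoding one synaptic weight. This is a minimal illustrative sketch assuming ideal devices and made-up values, not a model of any specific work:

```python
import numpy as np

# Sketch of an ideal 4x3 memristive crossbar performing vector-matrix
# multiplication. Each conductance G[i, j] encodes one synaptic weight;
# applying voltages V to the rows yields column currents I = V @ G
# (Ohm's law plus Kirchhoff's current law at each column).
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # device conductances, in siemens
V = np.array([0.1, 0.2, -0.1, 0.05])      # input voltages, in volts

I = V @ G  # output currents at the column terminals, in amperes
print(I)
```

Because the multiply-accumulate happens in the analog domain, the whole layer is evaluated in one parallel step rather than one multiplication at a time, which is where the projected time and energy savings come from.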
Skills and Expertise
Education
September 2018 - September 2022
September 2015 - July 2018
Publications (9)
Recent years have seen a rapid rise of artificial neural networks being employed in a number of cognitive tasks. The ever‐increasing computing requirements of these structures have contributed to a desire for novel technologies and paradigms, including memristor‐based hardware accelerators. Solutions based on memristive crossbars and analog data pr...
In a data-driven economy, virtually all industries benefit from advances in information technology—powerful computing systems are critically important for rapid technological progress. However, this progress might be at risk of slowing down if we do not address the discrepancy between our current computing power demands and what the existing techno...
Artificial neural networks are notoriously power- and time-consuming when implemented on conventional von Neumann computing systems. Consequently, recent years have seen an emergence of research in machine learning hardware that strives to bring memory and computing closer together. A popular approach is to realise artificial neural networks in har...
Crossbar arrays are a popular solution when implementing systems that have array-like architecture. With the recent developments in the field of neuromorphic engineering, crossbars are now routinely used to implement artificial neural networks or, more generally, to perform vector–matrix multiplication in hardware. However, the interconnect resista...
Resistive Random Access Memory (RRAM) is a promising technology for power efficient hardware in applications of artificial intelligence (AI) and machine learning (ML) implemented in non-von Neumann architectures. However, there is an unanswered question if the device non-idealities preclude the use of RRAM devices in this potentially disruptive tec...
We report a study of the relationship between oxide microstructure at the scale of tens of nanometres and resistance switching behaviour in silicon oxide. In the case of sputtered amorphous oxides, the presence of columnar structure enables efficient resistance switching by providing an intial structured distribution of defects that can act as prec...
Projects
Project (1)
Using a combination of transmission electron microscopy studies and density functional theory modelling, we aim to further understand the fundamental mechanism behind resistive switching in silicon suboxides, and how device fabrication affects device performance.