Dovydas Joksas

University College London · Department of Electronic and Electrical Engineering

Doctor of Philosophy

About

15 Publications · 5,041 Reads · 231 Citations
Introduction
I'm a researcher in the Department of Electronic and Electrical Engineering at University College London, where I investigate memristive neural networks and adversarial attacks against them. Visit my website yoshke.org for more information.
Education
September 2018 - September 2022
University College London
Field of study: Electronic and Electrical Engineering

September 2015 - July 2018
University College London
Field of study: Electronic and Electrical Engineering

Publications

Preprint
Neural networks are now deployed in a wide range of areas, from object classification to natural language systems. Implementations using analog devices like memristors promise better power efficiency, potentially bringing these applications to a greater number of environments. However, such systems suffer from more frequent device faults and overal...
Chapter
Advanced memory technologies are impacting the information era, representing a vibrant research area of great interest to the electronics industry. The demand for data storage, computing performance and energy efficiency is increasing exponentially and will exceed the capabilities of current information technologies. Alternatives to traditional silicon tech...
Preprint
Digital computers have been getting exponentially faster for decades, but huge challenges exist today. Transistor scaling, described by Moore's law, has been slowing down over the last few years, ending the era of fully predictable performance improvements. Furthermore, the data-centric computing demands fueled by machine learning applications are...
Thesis
Full-text available
Digital electronics has given rise to reliable, affordable, and scalable computing devices. However, new computing paradigms present challenges. For example, machine learning requires repeatedly processing large amounts of data; this creates a bottleneck in conventional computers, where computing and memory are separated. To add to that, Moore’s “l...
Article
Full-text available
In a data-driven economy, virtually all industries benefit from advances in information technology; powerful computing systems are critically important for rapid technological progress. However, this progress might be at risk of slowing down if the discrepancy between the current computing power demands and what the existing technologies can offer i...
Article
Full-text available
Recent years have seen a rapid rise of artificial neural networks being employed in a number of cognitive tasks. The ever-increasing computing requirements of these structures have contributed to a desire for novel technologies and paradigms, including memristor-based hardware accelerators. Solutions based on memristive crossbars and analog data pr...
Preprint
Full-text available
In a data-driven economy, virtually all industries benefit from advances in information technology; powerful computing systems are critically important for rapid technological progress. However, this progress might be at risk of slowing down if we do not address the discrepancy between our current computing power demands and what the existing techno...
Preprint
Full-text available
Recent years have seen a rapid rise of artificial neural networks being employed in a number of cognitive tasks. The ever-increasing computing requirements of these structures have contributed to a desire for novel technologies and paradigms, including memristor-based hardware accelerators. Solutions based on memristive crossbars and analog data pr...
Preprint
Full-text available
Artificial neural networks are notoriously power- and time-consuming when implemented on conventional von Neumann computing systems. Consequently, recent years have seen an emergence of research in machine learning hardware that strives to bring memory and computing closer together. A popular approach is to realise artificial neural networks in har...
Article
Full-text available
Artificial neural networks are notoriously power- and time-consuming when implemented on conventional von Neumann computing systems. Consequently, recent years have seen an emergence of research in machine learning hardware that strives to bring memory and computing closer together. A popular approach is to realise artificial neural networks in har...
Article
Full-text available
Crossbar arrays are a popular solution when implementing systems that have array-like architecture. With the recent developments in the field of neuromorphic engineering, crossbars are now routinely used to implement artificial neural networks or, more generally, to perform vector–matrix multiplication in hardware. However, the interconnect resista...
Article
Full-text available
Resistive Random Access Memory (RRAM) is a promising technology for power-efficient hardware in applications of artificial intelligence (AI) and machine learning (ML) implemented in non-von Neumann architectures. However, there is an unanswered question of whether the device non-idealities preclude the use of RRAM devices in this potentially disruptive tec...
Article
Full-text available
We report a study of the relationship between oxide microstructure at the scale of tens of nanometres and resistance switching behaviour in silicon oxide. In the case of sputtered amorphous oxides, the presence of columnar structure enables efficient resistance switching by providing an initial structured distribution of defects that can act as prec...
