Alessio Micheli
Professor, University of Pisa | UNIPI · Department of Computer Science
About
222 Publications
68,862 Reads
6,020 Citations
Introduction
Neural Networks for Graphs and Structured Data (sequences, trees, and graphs);
Deep Learning on Graphs;
Reservoir Computing and Echo State Networks.
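For orientation on the Reservoir Computing theme above, the following is a minimal Echo State Network state-update sketch in Python/NumPy. All names, sizes, and scalings are illustrative assumptions and are not taken from any specific publication listed below.

```python
# Minimal Echo State Network state update (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_reservoir = 3, 100

# Untrained input and recurrent weights; the recurrent matrix is rescaled
# so that its spectral radius is below 1 (a common echo-state heuristic).
W_in = rng.uniform(-0.1, 0.1, size=(n_reservoir, n_inputs))
W = rng.uniform(-1.0, 1.0, size=(n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs, leak=1.0):
    """Drive the reservoir with an input sequence and return all states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

states = run_reservoir(rng.normal(size=(50, n_inputs)))
print(states.shape)  # (50, 100)
```

The key point is that `W_in` and `W` stay untrained; typically only a linear readout on the collected states is fitted (see the ridge-regression sketch further down).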
Publications (222)
Artificial Intelligence (AI) applications and Machine Learning (ML) methods have gained much attention in recent years for their ability to automatically detect patterns in data without being explicitly taught rules. Specific features characterise the ECGs of patients with Brugada Syndrome (BrS); however, there is still ambiguity regarding the corr...
Motivation
Dynamical properties of biochemical pathways (BPs) help in understanding the functioning of living cells. Their in silico assessment requires simulating a dynamical system with a large number of parameters such as kinetic constants and species concentrations. Such simulations are based on numerical methods that can be time-expensive for...
In this research, we present a novel approach to evaluate and interpret Convolutional Neural Networks (CNNs) for the diagnosis of Brugada Syndrome (BrS), a rare heart rhythm disease, from the electrocardiogram (ECG) time series. First, the model is assessed on the ECG classification of type-1 BrS. Then, we define a method to interpret the BrS predi...
We propose an extension of the Contextual Graph Markov Model, a deep and probabilistic machine learning model for graphs, to model the distribution of edge features. Our approach is architectural, as we introduce an additional Bayesian network mapping edge features into discrete states to be used by the original model. In doing so, we are also able...
Graph neural networks compute node representations by performing multiple message-passing steps that consist in local aggregations of node features. Having deep models that can leverage longer-range interactions between nodes is hindered by the issues of over-smoothing and over-squashing. In particular, the latter is attributed to the graph topolog...
Node classification tasks on graphs are addressed via fully-trained deep message-passing models that learn a hierarchy of node representations via multiple aggregations of a node's neighbourhood. While effective on graphs that exhibit a high ratio of intra-class edges, this approach poses challenges in the opposite case, i.e. heterophily, where nod...
Graph Echo State Networks (GESN) have already demonstrated their efficacy and efficiency in graph classification tasks. However, semi-supervised node classification brought out the problem of over-smoothing in end-to-end trained deep models, which causes a bias towards high homophily graphs. We evaluate for the first time GESN on node classificatio...
Reservoir computing (RC) is a popular approach to the efficient design of recurrent neural networks (RNNs), where the dynamical part of the model is initialized and left untrained. Deep echo state networks (ESNs) combined the deep learning approach with RC, by structuring the reservoir in multiple layers, thus offering the striking advantage of enc...
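The layering idea can be sketched by stacking untrained reservoirs so that each layer is driven by the states of the layer below, with the concatenation of the layer states used as the deep representation. This is a generic illustration under assumed sizes and scalings, not the DeepESN implementation analysed in the paper.

```python
# Sketch of a deep (layered) reservoir: each layer is an untrained recurrent
# network driven by the states of the layer below. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def make_layer(n_in, n_units, rho=0.9, scale=0.1):
    W_in = rng.uniform(-scale, scale, size=(n_units, n_in))
    W = rng.uniform(-1, 1, size=(n_units, n_units))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_layer(W_in, W, inputs):
    x, out = np.zeros(W.shape[0]), []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        out.append(x.copy())
    return np.array(out)

sequence = rng.normal(size=(50, 3))   # (time steps, input features)
sizes = [3, 100, 100, 100]            # input dim followed by 3 reservoir layers
signal, all_states = sequence, []
for n_in, n_units in zip(sizes[:-1], sizes[1:]):
    layer = make_layer(n_in, n_units)
    signal = run_layer(*layer, signal)     # states of layer i drive layer i+1
    all_states.append(signal)

deep_state = np.concatenate(all_states, axis=1)   # concatenated hierarchy
print(deep_state.shape)                           # (50, 300)
```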
The Contextual Graph Markov Model (CGMM) is a deep, unsupervised, and probabilistic model for graphs that is trained incrementally on a layer-by-layer basis. As with most Deep Graph Networks, an inherent limitation is the need to perform an extensive model selection to choose the proper size of each layer's latent representation. In this paper, we...
Funding Acknowledgements
Type of funding sources: Public grant(s) – National budget only. Main funding source(s): This research project is funded by Tuscany Region
Background/Introduction
Electrocardiograms (ECGs) are rapidly moving from analog to digital versions. Consequently, a series of automatic analyses of standard 12-lead ECGs are attractin...
The increasing digitization and datafication of all aspects of people’s daily life, and the consequent growth in the use of personal data, are increasingly challenging the current development and adoption of Machine Learning (ML). First, the sheer complexity and amount of data available in these applications strongly demand ML algorithms that...
Reservoir computing (RC) is a popular class of recurrent neural networks (RNNs) with untrained dynamics. Recently, advancements on deep RC architectures have shown a great impact in time-series applications, showing a convenient trade-off between predictive performance and required training complexity. In this paper, we go more in depth into the an...
We present in silico modeling methods for the investigation of dynamical properties of biochemical pathways, that are chemical reaction networks underlying cell functioning. Since pathways are (complex) dynamical systems, in-silico models are often studied by applying numerical integration techniques for Ordinary Differential Equations (ODEs), or s...
Monitoring of human states from streams of sensor data is an appealing applicative area for Recurrent Neural Network (RNN) models. In such a scenario, Echo State Network (ESN) models from the Reservoir Computing paradigm can represent good candidates due to the efficient training algorithms, which, compared to fully trainable RNNs, definitely ease...
This chapter surveys the recent advancements on the extension of Reservoir Computing toward deep architectures, which is gaining increasing research attention in the neural networks community. Within this context, we focus on describing the major features of Deep Echo State Networks based on the hierarchical composition of multiple reservoirs. The...
This paper discusses the perspective of the H2020 TEACHING project on the next generation of autonomous applications running in a distributed and highly heterogeneous environment comprising both virtual and physical resources spanning the edge-cloud continuum. TEACHING puts forward a human-centred vision leveraging the physiological, emotional, and...
We propose a deep Graph Neural Network (GNN) model that alternates two types of layers. The first type is inspired by Reservoir Computing (RC) and generates new vertex features by iterating a non-linear map until it converges to a fixed point. The second type of layer implements graph pooling operations that gradually reduce the support graph and...
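A generic sketch of the first layer type, under assumed sizes: vertex states are computed by iterating an untrained, contractive message-passing map to its fixed point, followed here by a simple mean pooling in place of the learned pooling operators of the paper.

```python
# Reservoir-style graph layer: iterate an untrained, contractive
# message-passing map to a fixed point, then pool. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(2)
n_nodes, n_feat, n_hidden = 6, 4, 16

A = (rng.random((n_nodes, n_nodes)) < 0.4).astype(float)
A = np.maximum(A, A.T)                           # symmetric random adjacency
U = rng.normal(size=(n_nodes, n_feat))           # input node features

W_in = rng.uniform(-0.1, 0.1, size=(n_feat, n_hidden))
W = rng.uniform(-1, 1, size=(n_hidden, n_hidden))
# Rescale so the map is a contraction (rough sufficient condition).
W *= 0.9 / (np.linalg.norm(W, 2) * max(1.0, np.linalg.norm(A, 2)))

X = np.zeros((n_nodes, n_hidden))
for _ in range(100):
    X_new = np.tanh(U @ W_in + A @ X @ W)        # aggregate neighbour states
    if np.max(np.abs(X_new - X)) < 1e-6:         # stop at the fixed point
        X = X_new
        break
    X = X_new

graph_embedding = X.mean(axis=0)                 # simple mean pooling
print(graph_embedding.shape)                     # (16,)
```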
The limits of molecular dynamics (MD) simulations of macromolecules are steadily pushed forward by the relentless development of computer architectures and algorithms. The consequent explosion in the number and extent of MD trajectories induces the need for automated methods to rationalize the raw data and make quantitative sense of them. Recently,...
Artificial Recurrent Neural Networks are a powerful information processing abstraction, and Reservoir Computing provides an efficient strategy to build robust implementations by projecting external inputs into high dimensional dynamical system trajectories. In this paper, we propose an extension of the original approach, a local unsupervised learni...
Dynamical properties of biochemical pathways are often assessed by performing numerical (ODE-based) or stochastic simulations. These methods are often computationally very expensive and require reliable quantitative parameters, such as kinetic constants and initial concentrations, to be available. Biochemical pathways are often represented as graph...
Recurrent Neural Networks (RNNs) represent a natural paradigm for modeling sequential data like text written in natural language. In fact, RNNs and their variations have long been the architecture of choice in many applications, however in practice they require the use of labored architectures (such as gating mechanisms) and computationally heavy t...
We introduce the Graph Mixture Density Network, a new family of machine learning models that can fit multimodal output distributions conditioned on arbitrary input graphs. By combining ideas from mixture models and graph representation learning, we address a broad class of challenging regression problems that rely on structured data. Our main contr...
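As a hedged illustration of the mixture-density idea (conditioning a multimodal output distribution on a graph embedding), the sketch below implements a Gaussian mixture output head in PyTorch; the graph encoder is abstracted away, and `graph_emb` is a hypothetical placeholder for its output.

```python
# Mixture-density output head: predict mixing weights, means, and variances
# of a Gaussian mixture, trained by negative log-likelihood. Sketch only.
import torch
import torch.nn as nn

class MixtureDensityHead(nn.Module):
    def __init__(self, emb_dim, n_components):
        super().__init__()
        self.logits = nn.Linear(emb_dim, n_components)   # mixing weights
        self.means = nn.Linear(emb_dim, n_components)
        self.log_std = nn.Linear(emb_dim, n_components)

    def neg_log_likelihood(self, graph_emb, target):
        log_pi = torch.log_softmax(self.logits(graph_emb), dim=-1)
        mu = self.means(graph_emb)
        std = torch.exp(self.log_std(graph_emb))
        comp = torch.distributions.Normal(mu, std)
        log_prob = comp.log_prob(target.unsqueeze(-1))   # per-component density
        return -torch.logsumexp(log_pi + log_prob, dim=-1).mean()

head = MixtureDensityHead(emb_dim=32, n_components=3)
graph_emb = torch.randn(8, 32)   # stand-in for 8 graph embeddings
target = torch.randn(8)          # scalar regression targets
loss = head.neg_log_likelihood(graph_emb, target)
loss.backward()
```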
In this work we investigate the use of machine learning models for the management and monitoring of sustainable mobility, with particular reference to transport mode recognition. The specific aim is to automate the detection of the user’s means of transport among those considered in the data collected with an App installed on the users’ smartp...
This paper proposes a method for clustering of time series, based upon the ability of deep Reservoir Computing networks to grasp the dynamical structure of the series that is presented as input. A standard clustering algorithm, such as k-means, is applied to the network states, rather than the input series themselves. Clustering is thus embedded in...
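A minimal sketch of this recipe, assuming each series is summarised by the mean of its reservoir states (one of several reasonable choices, not necessarily the one used in the paper):

```python
# Cluster time series via reservoir states: drive an untrained reservoir with
# each series, summarise the visited states, and apply k-means. Sketch only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_reservoir, n_inputs = 100, 1

W_in = rng.uniform(-0.1, 0.1, size=(n_reservoir, n_inputs))
W = rng.uniform(-1, 1, size=(n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def embed_series(series):
    """Return the mean reservoir state visited while reading the series."""
    x, states = np.zeros(n_reservoir), []
    for u in series:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.mean(states, axis=0)

# Toy dataset: two families of sinusoids with different frequencies.
series = [np.sin(np.linspace(0, 2 * np.pi * f, 100)) for f in [1] * 10 + [5] * 10]
embeddings = np.stack([embed_series(s) for s in series])
labels = KMeans(n_clusters=2, n_init=10).fit_predict(embeddings)
print(labels)
```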
The increase in computational power of embedded devices and the latency demands of novel applications have brought a paradigm shift in how and where computation is performed. Although AI inference is slowly moving from the cloud to end-devices with limited resources, time-centric recurrent networks like Long Short-Term Memory remain too complex to b...
The limits of molecular dynamics (MD) simulations of macromolecules are steadily pushed forward by the relentless developments of computer architectures and algorithms. This explosion in the number and extent (in size and time) of MD trajectories induces the need of automated and transferable methods to rationalise the raw data and make quantitativ...
The adaptive processing of graph data is a long-standing research topic that has been lately consolidated as a theme of major interest in the deep learning community. The snap increase in the amount and breadth of related research has come at the price of little systematization of knowledge and attention to earlier literature. This work is a tutori...
Reservoir computing (RC) is a machine learning framework for temporal (sequential) pattern recognition, which originates from specific types of recurrent neural network models including echo state networks and liquid state machines. An RC system consists of a dynamical reservoir for mapping inputs into a high-dimensional temporal representation spa...
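The standard RC training recipe can be sketched as follows: the reservoir stays untrained and only a linear readout is fitted on the collected states, here with ridge regression on a toy one-step-ahead prediction task. All parameters are illustrative assumptions.

```python
# Fit only the linear readout of an untrained reservoir. Sketch only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(4)
n_reservoir, n_inputs = 100, 1

W_in = rng.uniform(-0.1, 0.1, size=(n_reservoir, n_inputs))
W = rng.uniform(-1, 1, size=(n_reservoir, n_reservoir))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def collect_states(inputs):
    x, states = np.zeros(n_reservoir), []
    for u in inputs:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a noisy sine wave.
signal = np.sin(np.linspace(0, 20 * np.pi, 2001)) + 0.01 * rng.normal(size=2001)
states = collect_states(signal[:-1])
targets = signal[1:]

washout = 100                                    # discard transient states
readout = Ridge(alpha=1e-6).fit(states[washout:], targets[washout:])
print(readout.score(states[washout:], targets[washout:]))
```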
Machine Learning for graphs is nowadays a research topic of consolidated relevance. Common approaches in the field typically resort to complex deep neural network architectures and demanding training algorithms, highlighting the need for more efficient solutions. The class of Reservoir Computing (RC) models can play an important role in this contex...
Experimental reproducibility and replicability are critical topics in machine learning. Authors have often raised concerns about the lack of both in scientific publications, with the aim of improving the quality of the field. Recently, the graph representation learning field has attracted the attention of a wide research community, which resulted in a large stream of wo...
We address the efficiency issue for the construction of a deep graph neural network (GNN). The approach exploits the idea of representing each input graph as a fixed point of a dynamical system (implemented through a recurrent neural network), and leverages a deep architectural organization of the recurrent units. Efficiency is gained by many aspec...
Graph generation with Machine Learning is an open problem with applications in various research fields. In this work, we propose to cast the generative process of a graph into a sequential one, relying on a node ordering procedure. We use this sequential process to design a novel generative model composed of two recurrent neural networks that learn...
We introduce an overview of methods for learning in structured domains covering foundational works developed within the last twenty years to deal with a whole range of complex data representations, including hierarchical structures, graphs and networks, and giving special attention to recent deep learning models for graphs. While we provide a gener...
Molecule generation is a challenging open problem in cheminformatics. Currently, deep generative approaches addressing the challenge belong to two broad categories, differing in how molecules are represented. One approach encodes molecular graphs as strings of text, and learns their corresponding character-based language model. Another, more expres...
Systems developed in wearable devices with sensors onboard are widely used to collect data on human and animal activities, with the perspective of on-board automatic classification of the data. An interesting application of these systems is to support the monitoring of animals' behaviour through the analysis of sensor data. This is a challenging area and i...
We propose a new Graph Neural Network that combines recent advancements in the field. We give theoretical contributions by proving that the model is strictly more general than the Graph Isomorphism Network and the Gated Graph Neural Network, as it can approximate the same functions and deal with arbitrary edge values. Then, we show how a single nod...
In this paper we address the problem of grounded weights initialization for Recurrent Neural Networks. Specifically, we propose a method, rooted in the field of Random Matrix theory, to perform a fast initialization of recurrent weight matrices that meet specific constraints on their spectral radius. Focusing on the Reservoir Computing (RC) framewo...
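One classical random-matrix result (the circular law) yields a fast, eigendecomposition-free way to hit a target spectral radius: for an N x N matrix with i.i.d. zero-mean entries of standard deviation sigma, the spectral radius concentrates around sigma * sqrt(N), so choosing sigma = rho / sqrt(N) gives approximately radius rho. The sketch below illustrates this generic recipe; it is not necessarily the exact method of the paper.

```python
# Spectral-radius-aware initialization via the circular law: no eigenvalue
# computation is needed at construction time. Illustrative sketch only.
import numpy as np

def fast_spectral_init(n_units, rho, rng):
    sigma = rho / np.sqrt(n_units)
    return rng.normal(0.0, sigma, size=(n_units, n_units))

rng = np.random.default_rng(5)
W = fast_spectral_init(1000, rho=0.9, rng=rng)
print(max(abs(np.linalg.eigvals(W))))   # close to 0.9 for large n_units
```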
Biochemical pathways are often represented as graphs, in which nodes and edges give a qualitative description of the modeled reactions, while node and edge labels provide quantitative details such as kinetic and stoichiometric parameters. Dynamical properties of biochemical pathways are usually assessed by performing numerical (ODE-based) or stocha...
The adaptive processing of graph data is a long-standing research topic which has been lately consolidated as a theme of major interest in the deep learning community. The snap increase in the amount and breadth of related research has come at the price of little systematization of knowledge and attention to earlier literature. This work is designe...
Experimental reproducibility and replicability are critical topics in machine learning. Authors have often raised concerns about such scholarship issues, with the aim of improving the quality of the field. Recently, the graph representation learning field has attracted the attention of a wide research community, which resulted in a large stream...
Slides of my invited talk at MLDM.it workshop @ AIxIA 2019.
Recurrent Neural Networks (RNNs) are at the foundation of many state-of-the-art results in text classification. However, to be effective in practical applications, they often require the use of sophisticated architectures and training techniques, such as gating mechanisms and pre-training by autoencoders or language modeling, with typically high co...
Deep Echo State Networks (DeepESNs) recently extended the applicability of Reservoir Computing (RC) methods towards the field of deep learning. In this paper we study the impact of constrained reservoir topologies in the architectural design of deep reservoirs, through numerical experiments on several RC benchmarks. The major outcome of our investi...
Technology is pushing cardiology towards non-invasive recording of continuous blood pressure, with methods that do not require the insertion of a pressure transducer in the Aorta. Although novel analyses based on the Electrocardiogram (ECG) and Photoplethysmography (PPG) provided an elegant model of the interaction between the heart and blood vesse...
Smart robotic environments combine traditional (ambient) sensing devices and mobile robots. This combination extends the type of applications that can be considered, reduces their complexity, and enhances the individual values of the devices involved by enabling new services that cannot be performed by a single device. To reduce the amount of prepa...
Slides for the presentation of the paper "Richness of Deep Echo State Network Dynamics" @ IWANN 2019, Gran Canaria (Spain) - 13 June 2019
Reservoir Computing (RC) is a popular methodology for the efficient design of Recurrent Neural Networks (RNNs). Recently, the advantages of the RC approach have been extended to the context of multi-layered RNNs, with the introduction of the Deep Echo State Network (DeepESN) model. In this paper, we study the quality of state dynamics in progressiv...
Performing machine learning on structured data is complicated by the fact that such data does not have vectorial form. Therefore, multiple approaches have emerged to construct vectorial representations of structured data, from kernel and distance approaches to recurrent, recursive, and convolutional neural networks. Recent years have seen heightene...
Reservoir Computing (RC) is a leading edge paradigm for the design of efficiently trainable Recurrent Neural Network models. The first International Workshop on Reservoir Computing will bring together researchers to discuss the state-of-the-art and open challenges for the field of RC, in all its declinations. These include, but are not limited to,...
Recently, studies on deep Reservoir Computing (RC) highlighted the role of layering in deep recurrent neural networks (RNNs). In this paper, the use of linear recurrent units allows us to bring more evidence on the intrinsic hierarchical temporal representation in deep RNNs through frequency analysis applied to the state signals. The potentiality o...
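A rough sketch of this kind of analysis: drive two stacked untrained linear recurrent layers with white noise and compare the magnitude spectra of their state signals. All parameters are arbitrary illustrative choices, not those of the study.

```python
# Frequency analysis of state signals across stacked linear recurrent layers.
import numpy as np

rng = np.random.default_rng(6)

def linear_layer(n_in, n_units, rho=0.9):
    W_in = rng.uniform(-0.1, 0.1, size=(n_units, n_in))
    W = rng.uniform(-1, 1, size=(n_units, n_units))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_linear(W_in, W, inputs):
    x, states = np.zeros(W.shape[0]), []
    for u in inputs:
        x = W_in @ u + W @ x              # linear recurrent units: no tanh
        states.append(x.copy())
    return np.array(states)

noise = rng.normal(size=(2000, 1))
layer1 = run_linear(*linear_layer(1, 50), noise)
layer2 = run_linear(*linear_layer(50, 50), layer1)

for name, states in [("layer 1", layer1), ("layer 2", layer2)]:
    spectrum = np.abs(np.fft.rfft(states, axis=0)).mean(axis=1)
    centroid = (np.arange(len(spectrum)) * spectrum).sum() / spectrum.sum()
    print(name, "spectral centroid:", centroid)  # compare spectra across layers
```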
As you open this January issue of the IEEE Transactions on Neural Networks and Learning Systems (TNNLS), I hope everyone enjoyed a great holiday season and is excited for the new year of 2019. I am very delighted and honored to report several key metrics of IEEE TNNLS to the community.
We propose an experimental comparison between Deep Echo State Networks (DeepESNs) and gated Recurrent Neural Networks (RNNs) on multivariate time-series prediction tasks. In particular, we compare reservoir and fully-trained RNNs able to represent signals featured by multiple time-scales dynamics. The analysis is performed in terms of efficiency an...
Tree structured data are a flexible tool to properly express many forms of hierarchical information. However, learning of such data through deep recursive models is particularly demanding. We will show through the introduction of the Deep Tree Echo State Network model (DeepTESN) that the randomized Neural Networks framework offers a formidable appr...
Echo State Networks (ESNs) represent a successful methodology for the efficient modeling of Recurrent Neural Networks. The untrained recurrent dynamics in ESNs apparently need to comply with a trade-off between the two desirable features of implementing a long memory over past inputs and the ability to model non-linear dynamics. In this paper, we analyze suc...
Estimation of mortality risk of very preterm neonates is carried out in clinical and research settings. We aimed at elaborating a prediction tool using machine learning methods. We developed models on a cohort of 23747 neonates <30 weeks gestational age, or <1501 g birth weight, enrolled in the Italian Neonatal Network in 2008-2014 (development set...