January 2025
Publications (110)
October 2024 · 30 Reads
This paper presents Grammar Reinforcement Learning (GRL), a reinforcement learning algorithm that combines Monte Carlo Tree Search (MCTS) with a transformer architecture modeling a Pushdown Automaton (PDA) within a context-free grammar (CFG) framework. Taking as a use case the problem of efficiently counting paths and cycles in graphs, a key challenge in network analysis, computer science, biology, and the social sciences, GRL discovers new matrix-based formulas for path/cycle counting that improve computational efficiency by factors of two to six with respect to state-of-the-art approaches. Our contributions include: (i) a framework for generating gramformers that operate within a CFG, (ii) the development of GRL for optimizing formulas within grammatical structures, and (iii) the discovery of novel formulas for graph substructure counting, leading to significant computational improvements.
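The matrix-based counting formulas GRL searches over generalize classic identities. As a minimal sketch (this is textbook material, not one of the paper's newly discovered formulas): closed walks of length k are counted by trace(A^k), so triangles in a simple undirected graph equal trace(A^3)/6.

```python
# Classic matrix-based substructure counting, the simplest instance of
# the kind of formula contained in GRL's search space.
def matmul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(X):
    return sum(X[i][i] for i in range(len(X)))

def triangle_count(A):
    A3 = matmul(matmul(A, A), A)
    # each triangle is counted 6 times: 3 starting vertices x 2 directions
    return trace(A3) // 6

# Complete graph K4 contains C(4,3) = 4 triangles.
K4 = [[0 if i == j else 1 for j in range(4)] for i in range(4)]
print(triangle_count(K4))  # -> 4
```

Formulas for longer paths and cycles require correction terms for repeated vertices, which is exactly where more efficient closed forms become valuable.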
August 2024 · 6 Reads · 1 Citation
Pattern Recognition Letters
July 2024 · 14 Reads
Dynamic link prediction is a critical task in the analysis of evolving networks, with applications ranging from recommender systems to economic exchanges. However, the concept of the temporal receptive field, which refers to the temporal context that models use for making predictions, has been largely overlooked and insufficiently analyzed in existing research. In this study, we present a comprehensive analysis of the temporal receptive field in dynamic graph learning. By examining multiple datasets and models, we formalize the role of the temporal receptive field and highlight its crucial influence on predictive accuracy. Our results demonstrate that an appropriately chosen temporal receptive field can significantly enhance model performance, while for some models, overly large windows may introduce noise and reduce accuracy. We conduct extensive benchmarking to validate our findings, ensuring that all experiments are fully reproducible.
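In event-stream terms, a temporal receptive field simply limits how far back the model can see before each prediction. A hypothetical sketch (the names `Event` and `window_events` are illustrative, not from the paper):

```python
# Before predicting a link at time t, restrict the model's input to the
# events inside a temporal receptive field of width w, i.e. [t - w, t).
from collections import namedtuple

Event = namedtuple("Event", ["src", "dst", "t"])

def window_events(events, t, w):
    """Keep only events inside the temporal receptive field [t - w, t)."""
    return [e for e in events if t - w <= e.t < t]

history = [Event(0, 1, 1.0), Event(1, 2, 3.5), Event(0, 2, 7.0)]
print(len(window_events(history, t=8.0, w=2.0)))  # -> 1 (only the t=7.0 event)
```

The study's question then becomes how predictive accuracy varies as `w` grows: too small and relevant history is cut off, too large and stale events act as noise.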
January 2024 · 157 Reads · 26 Citations
IEEE Access
In recent years, Dynamic Graph (DG) representations have been increasingly used for modeling dynamic systems due to their ability to integrate both topological and temporal information in a compact representation. Dynamic graphs efficiently handle applications such as social network prediction, recommender systems, traffic forecasting, or electroencephalography analysis, which cannot be addressed using standard numerical representations. As a direct consequence, dynamic graph learning has emerged as a new machine learning problem, combining challenges from both sequential/temporal data processing and static graph learning. In this research area, the Dynamic Graph Neural Network (DGNN) has become the state-of-the-art approach, and a plethora of models have been proposed in recent years. This paper aims to provide a review of the problems and models related to dynamic graph learning. The various dynamic graph supervised learning settings are analyzed and discussed. We identify the similarities and differences between existing models concerning the way time information is modeled. Finally, we provide guidelines for DGNN design and optimization, and review public datasets for evaluating model performance on various tasks, along with the corresponding publications.
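Surveys in this area commonly distinguish two encodings of a dynamic graph: a continuous-time stream of timestamped edges and a discrete-time sequence of snapshots. A small conversion sketch with illustrative names (not a specific DGNN API):

```python
# Bucket a continuous-time edge stream (u, v, t) into discrete snapshots,
# each snapshot being the set of edges observed during one time interval.
def events_to_snapshots(events, num_steps, step):
    snapshots = [set() for _ in range(num_steps)]
    for u, v, t in events:
        idx = min(int(t // step), num_steps - 1)  # clamp late events
        snapshots[idx].add((u, v))
    return snapshots

stream = [(0, 1, 0.2), (1, 2, 0.9), (0, 2, 1.4)]
snaps = events_to_snapshots(stream, num_steps=2, step=1.0)
print([sorted(s) for s in snaps])  # -> [[(0, 1), (1, 2)], [(0, 2)]]
```

The choice of `step` is a modeling decision: coarse snapshots lose fine temporal ordering, while event streams preserve it at higher processing cost.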
April 2023 · 84 Reads · 1 Citation
In recent years, Dynamic Graph (DG) representations have been increasingly used for modeling dynamic systems due to their ability to integrate both topological and temporal information in a compact representation. Dynamic graphs make it possible to efficiently handle applications such as social network prediction, recommender systems, traffic forecasting, or electroencephalography analysis, which cannot be addressed using standard numeric representations. As a direct consequence of the emergence of dynamic graph representations, dynamic graph learning has emerged as a new machine learning problem, combining challenges from both sequential/temporal data processing and static graph learning. In this research area, the Dynamic Graph Neural Network (DGNN) has become the state-of-the-art approach, and a plethora of models have been proposed in recent years. This paper aims to provide a review of problems and models related to dynamic graph learning. The various dynamic graph supervised learning settings are analyzed and discussed. We identify the similarities and differences between existing models with respect to the way time information is modeled. Finally, general guidelines are provided for a DGNN designer faced with a dynamic graph learning problem.
March 2023 · 117 Reads
This paper proposes a new GNN design strategy. The strategy relies on a Context-Free Grammar (CFG) generating the matrix language MATLANG. It enables us to ensure WL-expressive power, substructure counting abilities, and spectral properties. Applying our strategy, we design the Grammatical Graph Neural Network (G2N2), a provably 3-WL GNN able to count, at edge level, cycles of length up to 6, and able to realize band-pass filters. A large number of experiments covering these properties corroborate the presented theoretical results.
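To make "counting cycles at edge level with matrix-language expressions" concrete, here is the simplest such formula, expressible in MATLANG: the number of 3-cycles through edge (i, j) is (A ⊙ A²)ᵢⱼ, where ⊙ is the elementwise (Hadamard) product. The paper's formulas for longer cycles are more involved; this sketch only shows the base case.

```python
# Edge-level triangle counting: (A ⊙ A^2)_{ij} counts the common
# neighbors of i and j, i.e. the 3-cycles through edge (i, j).
def edge_triangles(A):
    n = len(A)
    A2 = [[sum(A[i][k] * A[k][j] for k in range(n)) for j in range(n)]
          for i in range(n)]
    return [[A[i][j] * A2[i][j] for j in range(n)] for i in range(n)]

# Triangle graph: every edge lies on exactly one 3-cycle.
A = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
print(edge_triangles(A)[0][1])  # -> 1
```

Note how the expression uses only matrix product, Hadamard product, and the adjacency matrix, i.e. operations generated by a small grammar over matrices.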
December 2021 · 8 Reads
This paper reports results from the design phase of EurHisFirm. Its goal is to integrate isolated and poorly accessible financial data sets on 19th- and 20th-century European companies so that users can query the data as if it resided in one large database. In addition, the project aims to stimulate database construction by providing not only methodology and tools to connect to and collaborate with existing databases, but also a collaborative platform, based on machine learning and artificial intelligence, that allows harvesting data in a semi-automatic way. We present proof-of-concept results for this platform, together with the performance of the matching algorithms needed to connect and collate the constituent databases as well as to link them to contemporary commercial databases.
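The record-linkage step hinges on scoring candidate matches between historical company names that are spelled inconsistently across sources. An illustrative sketch (not EurHisFirm's actual matcher) using a normalized string-similarity score:

```python
# Score a pair of historical company names after light normalization
# (lowercasing, dropping periods, collapsing whitespace).
from difflib import SequenceMatcher

def name_similarity(a, b):
    norm = lambda s: " ".join(s.lower().replace(".", " ").split())
    return SequenceMatcher(None, norm(a), norm(b)).ratio()

score = name_similarity("Cie. Generale des Eaux", "Compagnie Generale des Eaux")
print(score > 0.8)  # -> True: abbreviated and full forms score as a likely match
```

Real matchers typically add blocking (e.g. by country or sector) and combine several fields, since name similarity alone produces false positives on generic company names.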
October 2021 · 60 Reads · 10 Citations
Pattern Recognition Letters
In this paper, we propose a method to both extract and classify symbols in floorplan images. The method relies on very recent developments in Graph Neural Networks (GNNs). In the proposed approach, floorplan images are first converted into Region Adjacency Graphs (RAGs). To achieve both classification and extraction, two different GNNs are used: the first classifies each node of the graph, while the second extracts the clusters corresponding to symbols. In both cases, the model is able to take edge features into account. Each model is first evaluated independently before both tasks are combined, which speeds up obtaining the results.
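A Region Adjacency Graph connects segmented regions that touch in the image. As a minimal stand-in for the floorplan-to-RAG step (real pipelines segment the raster image first; this sketch starts from an already-labeled grid):

```python
# Build the edge set of a Region Adjacency Graph from a 2D grid of
# region labels: two regions are adjacent if any of their pixels are
# 4-neighbors.
def build_rag(labels):
    edges = set()
    rows, cols = len(labels), len(labels[0])
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and down neighbors
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols and labels[r][c] != labels[nr][nc]:
                    edges.add(frozenset((labels[r][c], labels[nr][nc])))
    return edges

grid = [[1, 1, 2],
        [1, 3, 2]]
print(len(build_rag(grid)))  # -> 3 (regions 1-2, 1-3, 2-3 touch)
```

In the paper's setting, the GNNs then operate on such a graph, with node features describing each region and edge features describing the boundary between adjacent regions.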
June 2021 · 700 Reads · 1 Citation
Since Message Passing (Graph) Neural Networks (MPNNs) have a linear complexity with respect to the number of nodes when applied to sparse graphs, they have been widely implemented and still attract a lot of interest, even though their theoretical expressive power is limited to the first-order Weisfeiler-Lehman test (1-WL). In this paper, we show that if the graph convolution supports are designed in the spectral domain by a non-linear custom function of eigenvalues and masked with an arbitrarily large receptive field, the MPNN is theoretically more powerful than the 1-WL test and experimentally as powerful as existing 3-WL models, while remaining spatially localized. Moreover, by designing custom filter functions, outputs can have various frequency components that allow the convolution process to learn different relationships between a given input graph signal and its associated properties. So far, the best 3-WL-equivalent graph neural networks have a computational complexity in O(n³) with memory usage in O(n²), rely on a non-local update mechanism, and do not provide the spectral richness of output profiles. The proposed method overcomes all of these problems and reaches state-of-the-art results on many downstream tasks.
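The spectral-designed support described above can be sketched as follows (the notation here is a paraphrase of the abstract, not copied verbatim from the paper):

```latex
% Given the eigendecomposition of the graph Laplacian
%   L = U \Lambda U^\top,
% each convolution support C^{(s)} is obtained by applying a custom
% (possibly non-linear) filter function f_s to the eigenvalues and
% masking the result to keep it spatially localized:
\[
  C^{(s)} \;=\; M^{(s)} \odot
  \Big( U \,\mathrm{diag}\big(f_s(\lambda_1), \dots, f_s(\lambda_n)\big)\, U^\top \Big),
\]
% where M^{(s)} is the (arbitrarily large) receptive-field mask and
% \odot denotes the elementwise product. Different choices of f_s give
% low-pass, band-pass, or high-pass output profiles.
```

This makes explicit why the method is localized (the mask) yet spectrally rich (the non-linear functions of the eigenvalues).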
Citations (72)
... The Graph Edit Distance (GED) [4] is a state-of-the-art method for this purpose; however, it suffers from NP-hard complexity. Recently, several architectures have been proposed to address this limitation [9,7,8,11,12] in a learning framework. These architectures generally consist of two main components. ...
Reference:
3-WL GNNs for Metric Learning on Graphs
- Citing Article
August 2024
Pattern Recognition Letters
... Consequently, researchers have proposed the construction of dynamic graphs to model the dynamic relationships contained in historical time series data. Dynamic graphs, the topological structure of which can be adjusted over time, offer significant advantages in capturing the evolutionary features and dynamic dependencies of spatio-temporal data [27][28][29]. For example, Li et al. [30] developed a DSTGN model that used dynamic graph structures (SDs) to effectively capture dynamic associations within the data, enabling predictions based on the extraction of dynamic features from multivariate time series. ...
- Citing Article
January 2024
IEEE Access
... Most research focused on one type of object detector, such as You Only Look Once (YOLO) [28] based approaches [11,15,24,[29][30][31][32] or Faster Region-based Convolutional Neural Network (Faster R-CNN) [33] based approaches [2,6,34,35,55]. Other approaches were based on Fully Convolutional Network (FCN) [37] segmentation models [38,39] or graph-based methods [40][41][42]. ...
- Citing Article
October 2021
Pattern Recognition Letters
... The difference is that GCN uses the Laplacian matrix, while GAT uses attention coefficients. Unlike these two methods, we adopt a simpler method [39] to aggregate neighbor information, which depends on the node update mode given in the following equation: ...
- Citing Preprint
June 2021
... Spectral decomposition. In spectral decomposition/analysis (Balcilar et al. 2021; Gao et al. 2021), the normalized Laplacian matrix L of graph G is defined as L = I − D^{−1/2} A D^{−1/2}, where A and D are the adjacency matrix and the degree matrix, respectively. We decompose L as L = UΛU^⊤ via a spectral decomposition, where Λ is a diagonal matrix Λ = diag(λ₁, . . . ...
Reference:
Dynamic Spectral Graph Anomaly Detection
- Citing Conference Paper
May 2021
... We subsequently used the depth labels provided by the ChebNet in the link prediction module. The reason ChebNet performs better than the remaining two methods could be attributed to the fact that ChebNet covers the full spectrum profile compared to GCN, as indicated in [53]. HGNN builds on top of the linear approximation used in GCN and shows similar performance. ...
Reference:
Detecting Near-Duplicate Face Images
- Citing Conference Paper
July 2020
... [91] use modern char-level LM embeddings (Flair); [184] use modern word-level LM embeddings (BERT, ELMo); [70,152,203] use a stack of modern embeddings [105,137,162]. Transfer learning: how well can modern embeddings transfer to historical texts? What is the impact of in-domain embeddings? ...
- Citing Chapter
August 2020
Lecture Notes in Computer Science
... [12][13][14] Moreover, several machine learning (ML) approaches have recently emerged based on Mayr's database, which currently holds experimental reactivity parameters for 355 electrophiles and 1300 nucleophiles. 6,[15][16][17][18][19][20][21] Our recent work introduces ESNUEL, 22 a fully automated quantum chemistry (QM)-based workflow for EStimating NUcleophilicity and ELectrophilicity. The workflow builds upon studies by van Vranken and Baldi showing that calculated methyl cation affinities (MCAs) and methyl anion affinities (MAAs) of structurally different molecules correlate with Mayr's N × s_N and E, respectively, when accounting for solvent effects. ...
- Citing Article
July 2020
Journal of Computational Chemistry
... The GraphSAGE network is a spatial graph neural network introduced in Reference [42]. Graph neural networks are currently widely applied in various fields, including link prediction [43], node classification [44], community detection [45], graph classification [46], and graph embedding [47]. ...
- Citing Preprint
March 2020
... Ferrando (2018) adopts a heuristic approach to identify room connections as edges and room regions as nodes, under the assumption that all doors are axis-aligned, making it difficult to generalize to complex floor plans. Renton et al. (2019) apply a graph neural network to detect and classify indoor elements from a floor plan image and transform them into adjacency graphs, excluding other important architectural elements such as walls and rooms. Song and Yu (2021) convert floor plan images into polygon graph data using graph neural networks. ...
Reference:
CAADRIA 2023: HUMAN-CENTRIC - Volume 2
- Citing Conference Paper
September 2019