Figure 5
Source publication
Network representation learning has attracted widespread attention as a pre-processing step for some machine learning and deep learning tasks. However, most existing methods only consider the influence of nodes' low-order neighbors when representing them. Either nodes' high-order neighbors or the intrinsic characteristic attributes of nodes are ignored, l...
Contexts in source publication
Context 1
... even though there may not be a strong relationship between two nodes in the initial network, because they are sufficiently similar in structure and attributes, this weak relationship eventually turns into a strong relationship. For example, there is no connection between node 2 and node 5 in Figure 5, but after the structure is strengthened, a new connection appears between them. After the structure enhancement algorithm has been applied, the adjacency relationship of the entire graph can be represented by L = (rel_ij)_{n×n} ∈ R^{n×n}, which is combined with the initial feature matrix A ∈ R^{n×d} of the network and fed into the graph convolutional neural network for representation learning. ...
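As a reading aid, here is a minimal sketch of the structure-enhancement step described above, assuming a precomputed similarity matrix and a simple threshold rule; the matrix name `sim`, the threshold value, and the toy 5-node network are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def enhance_structure(adj, sim, threshold=0.5):
    # Promote "weak ties" to edges: node pairs that are unconnected in the
    # initial network but sufficiently similar get a new weighted link.
    # The similarity matrix `sim` and the threshold rule stand in for the
    # paper's measure, which is not reproduced here.
    rel = adj.astype(float)
    new_edges = (sim >= threshold) & (adj == 0)
    np.fill_diagonal(new_edges, False)        # no self-loops from similarity
    rel[new_edges] = sim[new_edges]           # entries rel_ij of L
    return rel                                # L = (rel_ij) in R^{n x n}

# Toy 5-node network: nodes 2 and 5 (indices 1 and 4) start unconnected
# but are highly similar, so the enhanced graph links them.
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]])
sim = np.full((5, 5), 0.1)
sim[1, 4] = sim[4, 1] = 0.8
L = enhance_structure(adj, sim)
print(L[1, 4])   # 0.8 -> a new connection appears after structure enhancement
```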
Context 2
... based on the similarity matrix calculated above, we follow Kipf and Welling [25] and propose a Graph Convolutional Neural Network based on Structure Enhancement (SEGCN). Figure 5 illustrates how the generated structure-enhanced network is used with a GCN to complete node representation learning. ...
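The propagation step itself follows the familiar Kipf-and-Welling GCN layer; the sketch below shows that rule with the structure-enhanced matrix L used in place of the ordinary adjacency. The weight shapes, the activation, and the random stand-in matrices are assumptions for illustration only.

```python
import numpy as np

def gcn_layer(L, H, W, activation=np.tanh):
    # One Kipf-and-Welling-style propagation step,
    #   H' = sigma( D^{-1/2} (L + I) D^{-1/2} H W ),
    # with the structure-enhanced matrix L standing in for the plain adjacency.
    n = L.shape[0]
    L_hat = L + np.eye(n)                           # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(L_hat.sum(axis=1))   # D^{-1/2} on the diagonal
    L_norm = np.diag(d_inv_sqrt) @ L_hat @ np.diag(d_inv_sqrt)
    return activation(L_norm @ H @ W)

# Hypothetical shapes: n nodes, d input attributes, h hidden units.
rng = np.random.default_rng(0)
n, d, h = 5, 8, 4
L = rng.random((n, n)); L = (L + L.T) / 2           # stand-in for the enhanced matrix
A = rng.random((n, d))                              # initial feature matrix A in R^{n x d}
W0 = rng.standard_normal((d, h)) * 0.1
Z = gcn_layer(L, A, W0)                             # node representations, shape (n, h)
print(Z.shape)
```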
Similar publications
We study the dynamics of the weights of a Graph Neural Network layer during training, and we employ the concept of neural network temperature to prune unimportant input features. We give a theoretical understanding of the results and make comparisons with similar experiments on Convolutional Neural Networks.
Nuclei classification is a critical step in computer-aided diagnosis with histopathology images. In the past, various methods have employed graph neural networks (GNN) to analyze cell graphs that model inter-cell relationships by considering nuclei as vertices. However, they are limited by the GNN mechanism that only passes messages among local nod...
Due to the complexity of natural language, current relation extraction methods can no longer meet people's requirements. Dependency trees have been proved able to capture long-distance relations between target entity pairs. This study mainly focuses on how to prune the dependency tree more effectively and improve the model's perfo...
The purpose of text classification is to label text with known labels. In recent years, methods based on graph neural networks (GNN) have achieved good results. However, the existing GNN-based methods only regard the text as a set of co-occurring words, without considering the position information of each word in the statement. At the sam...
Citations
... Deep learning techniques, with their ability to effectively capture intricate relationships among users, have revolutionized the field. In particular, methods based on Graph Neural Networks (GNNs) have gained widespread popularity for obtaining user embeddings that aid in user clustering [7,8,9,10,11]. However, it is important to acknowledge that these GNN-based methods often require a substantial amount of time and computational resources, rendering them impractical for application on large-scale social networks. ...
... To begin, we calculate the information entropy of each attribute using Equation (8). Any attributes with an information entropy greater than the threshold value τ are eliminated. ...
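A minimal sketch of this entropy-based attribute filtering, assuming plain Shannon entropy over each attribute's value distribution in place of the cited Equation (8); the threshold τ and the toy categorical data are illustrative.

```python
import numpy as np

def shannon_entropy(column):
    # Shannon entropy (base 2) of one categorical attribute column.
    # A stand-in for the cited Equation (8), which is not reproduced here.
    _, counts = np.unique(column, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def filter_attributes(X, tau=1.5):
    # Eliminate attribute columns whose information entropy exceeds tau;
    # tau and the categorical encoding of X are illustrative assumptions.
    keep = [j for j in range(X.shape[1]) if shannon_entropy(X[:, j]) <= tau]
    return X[:, keep], keep

# Toy attribute matrix: 6 nodes x 3 categorical attributes.
X = np.array([[0, 1, 3],
              [0, 2, 1],
              [1, 0, 2],
              [0, 1, 0],
              [1, 2, 3],
              [0, 0, 1]])
X_kept, kept_idx = filter_attributes(X, tau=1.5)
print(kept_idx)   # indices of attributes that survive the entropy threshold
```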
With the increasing diversity and complexity of online social networks, effectively dividing communities presents a growing challenge. These networks are characterized by their large scale, sparse structure, and numerous isolated points. Traditional community detection methods lack consideration of node attribute information, thereby negatively impacting the accuracy of community detection. To address these challenges, this paper presents a novel Louvain-FTAS community detection algorithm that integrates topology and attribute structure. The proposed algorithm first selects attributes with positive effects to account for attribute heterogeneity. Subsequently, it utilizes a semi-local strategy to calculate topology similarity and information entropy to calculate attribute similarity. These values are combined to obtain the final node similarity matrix, which is then fed into the Louvain algorithm to maximize modularity and incorporate multi-dimensional attribute features to enhance community detection accuracy. The proposed model is evaluated through comparative experiments on two real datasets and artificial synthetic networks, demonstrating its rationality and effectiveness.
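A rough sketch of the pipeline described in this abstract, assuming a simple convex combination of topology and attribute similarity and the Louvain implementation shipped with networkx (>= 2.8); the blending weight, edge threshold, and toy matrices are illustrative, not the Louvain-FTAS semi-local and entropy-based measures themselves.

```python
import numpy as np
import networkx as nx
from networkx.algorithms.community import louvain_communities

def combined_similarity(topo_sim, attr_sim, alpha=0.5):
    # Blend topology and attribute similarity into one node-similarity matrix.
    # alpha and the convex combination are assumptions for illustration.
    return alpha * topo_sim + (1 - alpha) * attr_sim

def louvain_on_similarity(sim, threshold=0.3):
    # Build a weighted graph from the similarity matrix and maximise
    # modularity with the Louvain method.
    n = sim.shape[0]
    G = nx.Graph()
    G.add_nodes_from(range(n))
    for i in range(n):
        for j in range(i + 1, n):
            if sim[i, j] >= threshold:
                G.add_edge(i, j, weight=float(sim[i, j]))
    return louvain_communities(G, weight="weight", seed=0)

# Toy symmetric similarity matrices for 6 nodes (values are made up).
rng = np.random.default_rng(0)
topo_sim = rng.random((6, 6)); topo_sim = (topo_sim + topo_sim.T) / 2
attr_sim = rng.random((6, 6)); attr_sim = (attr_sim + attr_sim.T) / 2
communities = louvain_on_similarity(combined_similarity(topo_sim, attr_sim))
print(communities)   # list of node sets, one per detected community
```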
... GCN [32] is also used to detect communities; it efficiently captures intricate features from network topology and node attributes by employing a series of convolutional operations, similar to how a CNN (Convolutional Neural Network) [33] operates. Thus, some methods employ it to represent node information [34]. Zhao et al. employ a Graph Attention Network to aggregate neighborhood information by computing attention coefficients for neighboring nodes [35], [36], which allows the algorithm to selectively emphasize the most relevant nodes during information aggregation. ...
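For reference, a minimal sketch of how such attention coefficients are typically computed in a single-head graph attention layer; the projection W, the attention vector a, and the toy features are illustrative assumptions, and multi-head attention is omitted.

```python
import numpy as np

def attention_coefficients(H, W, a, neighbors, i, negative_slope=0.2):
    # Graph-attention-style coefficients for node i over its neighbors:
    #   e_ij = LeakyReLU(a^T [W h_i || W h_j]),  alpha_ij = softmax_j(e_ij).
    Wh = H @ W                                         # projected node features
    e = []
    for j in neighbors:
        z = np.concatenate([Wh[i], Wh[j]])
        s = a @ z
        e.append(s if s > 0 else negative_slope * s)   # LeakyReLU
    e = np.array(e)
    e = e - e.max()                                    # numerical stability
    alpha = np.exp(e) / np.exp(e).sum()
    return alpha                                       # weights used to aggregate neighbors

# Toy example: 4 nodes with 3-dim features, node 0 attends over neighbors {1, 2, 3}.
rng = np.random.default_rng(1)
H = rng.random((4, 3)); W = rng.random((3, 2)); a = rng.random(4)
print(attention_coefficients(H, W, a, neighbors=[1, 2, 3], i=0))
```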
Heterogeneous information networks provide abundant structural and semantic information. Two main strategies for leveraging this data are meta-path-based and meta-path-free methods. The effectiveness of the former depends heavily on the quality of manually defined meta-paths, which may lead to model instability. However, the existing meta-path-free methods lack neighbor screening during aggregation, and they also overemphasize attribute information. To address these issues, we propose a Heterogeneous Graph Neural Network model incorporating Quantitative Sampling and Structure-aware Attention. We introduce a Quantitative Sampling module that calculates the similarity between the target nodes and their neighbors, enabling us to select the top k nodes most relevant to the target node based on this measure, and we incorporate a Structure-aware Attention module during the aggregation of neighbor information. This module combines both structural and attribute information to aggregate neighbor information effectively. With these improvements, our proposed model outperforms several state-of-the-art methods on two real-world datasets.
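A small sketch of the quantitative-sampling idea, assuming a generic node-similarity matrix and a hypothetical top-k rule; the names, the similarity values, and k are illustrative, not the paper's exact module.

```python
import numpy as np

def quantitative_sampling(sim, target, neighbors, k=3):
    # Keep only the top-k neighbors most similar to the target node,
    # mirroring the neighbor-screening idea described above.
    scores = np.array([sim[target, j] for j in neighbors])
    order = np.argsort(-scores)[:k]          # indices of the k largest scores
    return [neighbors[idx] for idx in order]

# Toy symmetric similarity matrix for 6 nodes; node 0 has five candidate neighbors.
rng = np.random.default_rng(2)
sim = rng.random((6, 6)); sim = (sim + sim.T) / 2
print(quantitative_sampling(sim, target=0, neighbors=[1, 2, 3, 4, 5], k=3))
```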