Presentation Information
[3E1-GS-2d-03]Correction of Structural Embeddings via Relative Degree: Eliminating Scale Bias in Cross-Network Alignment
〇Shu Liu1 (1. Japan Data Science Consortium Co. Ltd.)
Keywords:
Representation Learning, Structural Embedding, Cross-Network Alignment
In this study, we investigate alignment methods based on Structural Equivalence and their limitations in cross-network analysis, where two independent graphs are embedded into a single latent space. While existing embedding methods are broadly categorized into Proximity-based and Structure-based approaches, structural approaches that capture local connectivity patterns are particularly effective for cross-network analysis. However, the behavior of these methods across graphs with differing scales remains largely unexplored.
First, we demonstrate that role identity evaluation based on k-nearest neighbors (kNN) is the most stable method for assessing embeddings. Next, through experiments on standard graph models, namely the Barbell graph, Barabási–Albert (BA), Watts–Strogatz (WS), and the stochastic block model (SBM), we reveal a bias whereby the joint embedding space is dominated by the characteristics of the network with the higher maximum degree. Finally, we propose a modified version of struc2vec that mitigates this scale discrepancy and validate its effectiveness on both synthetic graphs and real-world datasets.
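The kNN-based role identity evaluation mentioned above can be sketched as follows: for each node, check what fraction of its k nearest neighbors in embedding space share its structural role label. This is an illustrative sketch under assumed inputs (`emb`, an (n, d) array of node embeddings, and `roles`, an (n,) array of role labels), not the authors' exact evaluation protocol.

```python
import numpy as np

def knn_role_agreement(emb, roles, k=5):
    """Average fraction of each node's k nearest neighbours (Euclidean
    distance in embedding space) that share its role label.
    Hypothetical evaluation sketch for illustration only."""
    emb = np.asarray(emb, dtype=float)
    roles = np.asarray(roles)
    # Pairwise Euclidean distances; set self-distance to +inf so a
    # node is never counted as its own neighbour.
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]      # indices of the k nearest nodes
    same = roles[nn] == roles[:, None]     # role match for each neighbour
    return same.mean()

# Toy check: two well-separated role clusters should score near 1.0.
rng = np.random.default_rng(0)
emb = np.vstack([rng.normal(0, 0.1, (10, 4)),
                 rng.normal(5, 0.1, (10, 4))])
roles = np.array([0] * 10 + [1] * 10)
print(knn_role_agreement(emb, roles, k=3))
```

A score near 1.0 indicates that nodes with the same role cluster together in the joint embedding space; the measure is stable because it depends only on local neighborhood structure, not on the global scale of the embedding.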
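The "relative degree" correction named in the title can be illustrated as normalizing each node's degree by the maximum degree of its own graph before any cross-graph comparison, so that degree sequences from graphs of very different scales become comparable. This is an assumed, illustrative normalization, not necessarily the exact modification made inside struc2vec.

```python
import networkx as nx  # assumed available for the illustration

def relative_degrees(G):
    """Map each node's degree to degree / max_degree within its own
    graph (hypothetical normalization sketch). Hubs in a small graph
    and hubs in a large graph both map to 1.0, removing the dominance
    of the network with the higher maximum degree."""
    dmax = max(d for _, d in G.degree())
    return {v: d / dmax for v, d in G.degree()}

# Hubs of two star graphs of very different sizes get the same value.
small = nx.star_graph(4)      # hub degree 4
large = nx.star_graph(400)    # hub degree 400
print(relative_degrees(small)[0], relative_degrees(large)[0])
```

Under this normalization, struc2vec's degree-sequence distances would compare nodes by their position within their own graph's degree distribution rather than by raw counts, which is one plausible way to remove the maximum-degree bias described above.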
