Presentation Information
[4Yin-B-23]An Experimental Evaluation of Graph Information for Node Activity Prediction
〇Ryotaro Hamada1, Takashi Hattori1 (1. Keio University)
Keywords:
GNN, Popularity Prediction
Temporal Graph Neural Networks such as TGN are widely used for dynamic graph tasks,
yet whether their graph-structural information genuinely improves node activity prediction remains unclear.
We fix the prediction model (Transformer) across all conditions
and vary only the graph-derived features:
(1)~aggregated neighbor activity,
(2)~TGN memory learned via link prediction, and
(3)~TGN memory fine-tuned for activity regression.
On tgbn-reddit with three random seeds,
neighbor aggregation yields negligible change in RMSLE,
while TGN memory degrades accuracy across all forecast horizons and seeds (12/12 conditions).
A PCA ablation largely rules out high-dimensional input as the cause:
degradation persists even with only five principal components (9/12 conditions).
These results suggest that graph information encoded in TGN memory
does not improve---and can hinder---node activity prediction in this setting.
Generalization to other datasets remains future work.
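The PCA ablation described above can be sketched in a few lines: the TGN memory matrix is centered and projected onto its top principal components before being fed to the Transformer predictor. The function name, the memory dimensionality, and the toy data below are illustrative assumptions; the abstract does not specify them.

```python
import numpy as np

def pca_reduce(memory: np.ndarray, n_components: int = 5) -> np.ndarray:
    """Project node memory vectors onto their top principal components.

    `memory` is a (num_nodes, mem_dim) matrix of TGN memory states
    (hypothetical shape; the abstract fixes only n_components = 5).
    """
    centered = memory - memory.mean(axis=0, keepdims=True)
    # SVD of the centered matrix: rows of vt are the principal directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T

# Toy example: 100 nodes with 172-dim memory reduced to 5 components,
# which would then be concatenated with each node's activity history.
rng = np.random.default_rng(0)
mem = rng.normal(size=(100, 172))
reduced = pca_reduce(mem, n_components=5)
assert reduced.shape == (100, 5)
```

Since the degradation persists even with this five-dimensional input, input dimensionality alone is unlikely to explain the accuracy loss.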
