TR2020-055
Pruned Graph Scattering Transforms
- "Pruned Graph Scattering Transforms", International Conference on Learning Representations (ICLR), April 2020.BibTeX TR2020-055 PDF
@inproceedings{Ioannidis2020apr,
  author = {Ioannidis, Vassilis and Chen, Siheng and Giannakis, Georgios},
  title = {Pruned Graph Scattering Transforms},
  booktitle = {International Conference on Learning Representations (ICLR)},
  year = 2020,
  month = apr,
  url = {https://www.merl.com/publications/TR2020-055}
}
Abstract:
Graph convolutional networks (GCNs) have achieved remarkable performance in a variety of network science learning tasks. However, theoretical analysis of such approaches is still in its infancy. Graph scattering transforms (GSTs) are non-trainable deep GCN models that are amenable to generalization and stability analyses. The present work addresses some limitations of GSTs by introducing a novel so-termed pruned (p)GST approach. The resultant pruning algorithm is guided by a graph-spectrum-inspired criterion, and retains informative scattering features on the fly while bypassing the exponential complexity associated with GSTs. It is further established that pGSTs are stable to perturbations of the input graph signals with bounded energy. Experiments showcase that i) pGST performs comparably to the baseline GST that uses all scattering features, while achieving significant computational savings; ii) pGST achieves comparable performance to state-of-the-art GCNs; and iii) graph data from various domains lead to different scattering patterns, suggesting domain-adaptive pGST network architectures.
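
To make the pruning idea concrete, the following is a minimal NumPy sketch of a graph scattering tree that discards low-energy branches on the fly. It is an illustrative stand-in, not the paper's algorithm: the dyadic diffusion-wavelet bank, the relative-energy threshold tau, and all function names (diffusion_wavelets, pruned_gst) are assumptions made for this example; the paper's actual criterion is graph-spectrum-inspired and its stability guarantees are not reproduced here.

import numpy as np


def diffusion_wavelets(A, num_scales=3):
    """Build a simple dyadic diffusion-wavelet filter bank from adjacency A (illustrative choice)."""
    d = A.sum(axis=1)
    P = A / np.maximum(d[:, None], 1e-12)      # row-normalized random-walk operator
    T = 0.5 * (np.eye(A.shape[0]) + P)         # lazy diffusion operator
    filters, prev = [], np.eye(A.shape[0])
    for j in range(num_scales):
        nxt = np.linalg.matrix_power(T, 2 ** j)
        filters.append(prev - nxt)             # band-pass wavelet at scale j
        prev = nxt
    return filters, prev                       # band-pass bank and low-pass operator


def pruned_gst(x, A, depth=3, num_scales=3, tau=0.1):
    """Scattering features of signal x, pruning branches by a simple relative-energy test."""
    filters, lowpass = diffusion_wavelets(A, num_scales)
    features = [lowpass @ x]                   # layer-0 (root) feature
    frontier = [x]
    for _ in range(depth):
        next_frontier = []
        for u in frontier:
            for H in filters:
                z = np.abs(H @ u)              # wavelet filtering + modulus nonlinearity
                # keep the branch only if it retains at least a tau fraction of its parent's energy
                if np.linalg.norm(z) ** 2 >= tau * np.linalg.norm(u) ** 2:
                    features.append(lowpass @ z)
                    next_frontier.append(z)
        frontier = next_frontier
    return np.concatenate(features)


# Usage on a small random graph: the pruned feature vector is typically much
# shorter than the full (exponential-in-depth) scattering feature vector.
rng = np.random.default_rng(0)
A = (rng.random((20, 20)) < 0.2).astype(float)
A = np.triu(A, 1); A = A + A.T
x = rng.standard_normal(20)
print(pruned_gst(x, A).shape)

The full scattering tree grows as num_scales**depth branches; the energy test above cuts uninformative branches as soon as they appear, which is the computational-savings behavior the abstract refers to, albeit with a simplified criterion.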