13 D. K. Hammond, P. Vandergheynst, and R. Gribonval, “Wavelets on graphs via spectral graph theory,” Appl. Comput. Harmonic Anal., vol. 30, no. 2, pp. 129–150, 2011.
14 T. N. Kipf and M. Welling, “Semi-supervised classification with graph convolutional networks,” in Proc. of ICLR, 2017.
15 D. K. Duvenaud, D. Maclaurin, J. Aguilera-Iparraguirre, R. Gómez-Bombarelli, T. D. Hirzel, A. Aspuru-Guzik, and R. P. Adams, “Convolutional networks on graphs for learning molecular fingerprints,” in Proc. of NIPS, 2015, pp. 2224–2232.
16 J. Atwood and D. Towsley, “Diffusion-convolutional neural networks,” in Proc. of NIPS, 2016, pp. 1993–2001.
17 C. Zhuang and Q. Ma, “Dual graph convolutional networks for graph-based semi-supervised classification,” in Proc. of WWW, 2018.
18 S. Cao, W. Lu, and Q. Xu, “GraRep: Learning graph representations with global structural information,” in Proc. of KDD, 2015.
19 A. Grover and J. Leskovec, “node2vec: Scalable feature learning for networks,” in Proc. of KDD, 2016.
20 B. Perozzi, R. Al-Rfou, and S. Skiena, “DeepWalk: Online learning of social representations,” in Proc. of KDD, 2014.
21 J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei, “LINE: Large-scale information network embedding,” in Proc. of WWW, 2015.
22 D. Wang, P. Cui, and W. Zhu, “Structural deep network embedding,” in Proc. of KDD, 2016.
23 W. L. Hamilton, Z. Ying, and J. Leskovec, “Inductive representation learning on large graphs,” in Proc. of NIPS, 2017, pp. 1024–1034. https://arxiv.org/pdf/1706.02216.pdf
24 K. He, X. Zhang, S. Ren, and J. Sun, “Identity mappings in deep residual networks,” in Proc. of ECCV. Springer, 2016, pp. 630–645.
25 K. Cho, B. Van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using RNN encoder–decoder for statistical machine translation,” in Proc. of EMNLP, 2014, pp. 1724–1734.
26 S. Hochreiter and J. Schmidhuber, “Long short-term memory,” Neural Comput., vol. 9, no. 8, pp. 1735–1780, 1997.
27 Y. Li, D. Tarlow, M. Brockschmidt, and R. S. Zemel, “Gated graph sequence neural networks,” arXiv preprint, 2016.
28 K. S. Tai, R. Socher, and C. D. Manning, “Improved semantic representations from tree-structured long short-term memory networks,” in Proc. of IJCNLP, 2015, pp. 1556–1566.
29 V. Zayats and M. Ostendorf, “Conversation modeling on Reddit using a graph-structured LSTM,” TACL, vol. 6, pp. 121–132, 2018.
30 N. Peng, H. Poon, C. Quirk, K. Toutanova, and W.-t. Yih, “Cross-sentence n-ary relation extraction with graph LSTMs,” arXiv preprint arXiv:1708.03743, 2017.
31 D. Bahdanau, K. Cho, and Y. Bengio, “Neural machine translation by jointly learning to align and translate,” in Proc. of ICLR, 2015.
32 J. Gehring, M. Auli, D. Grangier, and Y. N. Dauphin, “A convolutional encoder model for neural machine translation,” in Proc. of ACL, vol. 1, 2017, pp. 123–135.
33 A. Vaswani, N. Shazeer, N. Parmar, L. Jones, J. Uszkoreit, A. N. Gomez, and L. Kaiser, “Attention is all you need,” in Proc. of NIPS, 2017, pp. 5998–6008.
34 J. Cheng, L. Dong, and M. Lapata, “Long short-term memory-networks for machine reading,” in Proc. of EMNLP, 2016, pp. 551–561.
35 P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, “Graph attention networks,” in Proc. of ICLR, 2018.
36 J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, “Neural message passing for quantum chemistry,” arXiv preprint arXiv:1704.01212, 2017.
37 P. W. Battaglia, J. B. Hamrick, V. Bapst, A. Sanchez-Gonzalez, V. Zambaldi, M. Malinowski, A. Tacchetti, D. Raposo, A. Santoro, R. Faulkner et al., “Relational inductive biases, deep learning, and graph networks,” arXiv preprint arXiv:1806.01261, 2018.
38 Z. Wu, S. Pan, F. Chen, G. Long, C. Zhang, and P. S. Yu, “A comprehensive survey on graph neural networks,” arXiv:1901.00596v4 [cs.LG], Dec. 2019; also in IEEE Transactions on Neural Networks and Learning Systems, doi: 10.1109/TNNLS.2020.2978386.
39 D. V. Tran, N. Navarin, and A. Sperduti, “On filter size in graph convolutional networks,” in Proc. of SSCI. IEEE, 2018, pp. 1534–1541.
40 C. Gallicchio and A. Micheli, “Graph echo state networks,” in Proc. of IJCNN. IEEE, 2010, pp. 1–8.
41 Y. Li, D. Tarlow, M. Brockschmidt, and R. Zemel, “Gated graph sequence neural networks,” in Proc. of ICLR, 2016.
42 K. Cho, B. Van Merriënboer, C. Gulcehre, D. Bahdanau, F. Bougares, H. Schwenk, and Y. Bengio, “Learning phrase representations using RNN encoder–decoder for statistical machine translation,” in Proc. of EMNLP, 2014, pp. 1724–1734.
43 H. Dai, Z. Kozareva, B. Dai, A. Smola, and L. Song, “Learning steady-states of iterative algorithms over graphs,” in Proc. of ICML, 2018, pp. 1114–1122.
44 D. I. Shuman, S. K. Narang, P. Frossard et al., “The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains,” IEEE Signal Process. Mag., vol. 30, no. 3, pp. 83–98, 2013.
45 M. Defferrard, X. Bresson, and P. Vandergheynst, “Convolutional neural networks on graphs with fast localized spectral filtering,” in Proc. of NIPS, 2016, pp. 3844–3852.
46 M. Henaff, J. Bruna, and Y. LeCun, “Deep convolutional networks on graph-structured data,” arXiv preprint arXiv:1506.05163, 2015.
47 R. Levie, F. Monti, X. Bresson, and M. M. Bronstein, “CayleyNets: Graph convolutional neural networks with complex rational spectral filters,” IEEE Trans. Signal Process., vol. 67, no. 1, pp. 97–109, 2017.
48 A. Micheli, “Neural network for graphs: A contextual constructive approach,” IEEE Trans. Neural Netw., vol. 20, no. 3, pp. 498–511, 2009.
49 Y. Li, R. Yu, C. Shahabi, and Y. Liu, “Diffusion convolutional recurrent neural network: Data-driven traffic forecasting,” in Proc. of ICLR, 2018.
50 S. Yan, Y. Xiong, and D. Lin, “Spatial temporal graph convolutional networks for skeleton-based action recognition,” in Proc. of AAAI, 2018.
51 J. Gilmer, S. S. Schoenholz, P. F. Riley, O. Vinyals, and G. E. Dahl, “Neural message passing for quantum chemistry,” in Proc. of ICML, 2017, pp. 1263–1272.
52 K. Xu, W. Hu, J. Leskovec, and S. Jegelka, “How powerful are graph neural networks?,” in Proc. of ICLR, 2019.
53 P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio, “Graph attention networks,” in Proc. of ICLR, 2018.
54 S. Cao, W. Lu, and Q. Xu, “Deep neural networks for learning graph representations,” in Proc. of AAAI, 2016, pp. 1145–1152.
55 D. Wang, P. Cui, and W. Zhu, “Structural deep network embedding,” in Proc. of KDD. ACM, 2016, pp. 1225–1234.
56 T. N. Kipf and M. Welling, “Variational graph auto-encoders,” in NIPS Workshop on Bayesian Deep Learning, 2016.
57 K. Tu, P. Cui, X. Wang, P. S. Yu, and W. Zhu, “Deep recursive network embedding with regular equivalence,” in Proc. of KDD. ACM, 2018, pp. 2357–2366.
58 W. Yu, C. Zheng, W. Cheng, C. C. Aggarwal, D. Song, B. Zong, H. Chen, and W. Wang, “Learning deep network representations with adversarially regularized autoencoders,” in Proc. of AAAI. ACM, 2018, pp. 2663–2671.
59 Y. Li, O. Vinyals, C. Dyer, R. Pascanu, and P. Battaglia, “Learning deep generative models of graphs,” in Proc. of ICML, 2018.
60 M. Simonovsky and N. Komodakis, “GraphVAE: Towards generation of small graphs using variational autoencoders,”