$(H^{l+1} W^{l+1} D^{l+1}) \times (H^{l} W^{l} D^{l})$.
One triplet of indexes $(i^{l+1}, j^{l+1}, d^{l+1})$ specifies a row in $S$, while $(i^{l}, j^{l}, d^{l})$ specifies a column. These two triplets together pinpoint one element in $S(x^{l})$. We set that element to 1 if $d^{l+1} = d^{l}$ and $x^{l}_{i^{l}, j^{l}, d^{l}}$ is the maximal element of the pooling region that produces the output entry $(i^{l+1}, j^{l+1}, d^{l+1})$, resulting in

$$\frac{\partial \operatorname{vec}(y)}{\partial \left(\operatorname{vec}(x^{l})\right)^{T}} = S(x^{l}) \qquad (3.104)$$

Figure 3.27 Illustration of preprocessing in a convolutional neural network (CoNN)‐based image classifier.
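To make the construction concrete, here is a minimal NumPy sketch (an illustration, not the book's code; the function name pooling_indicator, the 2 × 2 non-overlapping pooling geometry, and the row-major flattening convention are all assumptions). It builds $S(x^{l})$ explicitly for a small input, applies it to $\operatorname{vec}(x^{l})$ to reproduce the pooled output, and confirms that every row contains a single nonzero entry:

```python
import numpy as np

def pooling_indicator(x, ph=2, pw=2):
    """Build the 0/1 indicator matrix S(x) of non-overlapping max pooling.

    S has size (H' W' D) x (H W D): a row corresponds to an output triplet
    (i', j', d'), a column to an input triplet (i, j, d). An entry is 1 iff
    d' == d and (i, j) is the argmax of the pooling region yielding (i', j').
    Row-major flattening is assumed; the text's vec(.) ordering may differ.
    """
    H, W, D = x.shape
    Ho, Wo = H // ph, W // pw
    S = np.zeros((Ho * Wo * D, H * W * D))
    for d in range(D):
        for io in range(Ho):
            for jo in range(Wo):
                region = x[io * ph:(io + 1) * ph, jo * pw:(jo + 1) * pw, d]
                r, c = np.unravel_index(np.argmax(region), region.shape)
                row = (io * Wo + jo) * D + d                        # (i', j', d')
                col = ((io * ph + r) * W + (jo * pw + c)) * D + d   # (i, j, d)
                S[row, col] = 1.0
    return S

x = np.random.randn(4, 4, 3)
S = pooling_indicator(x)
y = S @ x.reshape(-1)                  # vec of the max-pooled output
assert np.all(S.sum(axis=1) == 1)      # exactly one nonzero per row
```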
$S(x^{l})$ is very sparse, since it has only one nonzero entry in every row. Thus, we do not need to use the entire matrix in the computation. Instead, we just need to record the locations of those nonzero entries; there are only $H^{l+1} W^{l+1} D^{l+1}$ such entries in $S(x^{l})$. Figure 3.27 illustrates preprocessing in a CoNN‐based image classifier.
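Exploiting that structure, a practical implementation stores only the column index of each row's nonzero entry, i.e. the argmax location of every pooling region. The sketch below (same assumptions and flattening convention as above; the names maxpool_forward and maxpool_backward are illustrative) records those indices in the forward pass and reuses them to apply $S(x^{l})^{T}$ in the backward pass without ever materializing the sparse matrix:

```python
import numpy as np

def maxpool_forward(x, ph=2, pw=2):
    """Max pooling that records, per output entry, the flat input index of
    the winning element (the column of the 1 in that row of S(x))."""
    H, W, D = x.shape
    Ho, Wo = H // ph, W // pw
    y = np.empty((Ho, Wo, D))
    argmax_cols = np.empty(Ho * Wo * D, dtype=np.int64)
    for d in range(D):
        for io in range(Ho):
            for jo in range(Wo):
                region = x[io * ph:(io + 1) * ph, jo * pw:(jo + 1) * pw, d]
                r, c = np.unravel_index(np.argmax(region), region.shape)
                y[io, jo, d] = region[r, c]
                row = (io * Wo + jo) * D + d
                argmax_cols[row] = ((io * ph + r) * W + (jo * pw + c)) * D + d
    return y, argmax_cols

def maxpool_backward(grad_y, argmax_cols, x_shape):
    """Scatter each output gradient back to the input position that won the
    max; this applies S(x)^T without forming S."""
    grad_x = np.zeros(int(np.prod(x_shape)))
    np.add.at(grad_x, argmax_cols, grad_y.reshape(-1))
    return grad_x.reshape(x_shape)
```

Storing only these $H^{l+1} W^{l+1} D^{l+1}$ indices is exactly the bookkeeping the text describes, and it is how deep learning libraries typically implement the pooling backward pass.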
For further reading on CoNNs, the reader is referred to [30].
References
1 CS231n Convolutional Neural Networks for Visual Recognition. Stanford University. https://cs231n.github.io/neural-networks-1
2 Haykin, S. (1996). Adaptive Filter Theory, 3e. Upper Saddle River, NJ: Prentice‐Hall.
3 Haykin, S. (1996). Neural networks expand SP's horizons. IEEE Signal Process. Mag. 13 (2): 24–49.
4 Haykin, S. (1999). Neural Networks: A Comprehensive Foundation, 2e. Upper Saddle River, NJ: Prentice‐Hall.
5 Wan, E.A. (1993). Finite impulse response neural networks with applications in time series prediction. Ph.D. dissertation. Department of Electrical Engineering, Stanford University, Stanford, CA.
6 Box, G. and Jenkins, G.M. (1976). Time Series Analysis: Forecasting and Control. San Francisco, CA: Holden‐Day.
7 Weigend, A.S. and Gershenfeld, N.A. (1994). Time Series Prediction: Forecasting the Future and Understanding the Past. Reading, MA: Addison‐Wesley.
8 Hochreiter, S. and Schmidhuber, J. (1997). Long short‐term memory. Neural Comput. 9 (8): 1735–1780.
9 Schuster, M. and Paliwal, K.K. (1997). Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 45 (11): 2673–2681.
10 Graves, A. and Schmidhuber, J. (2005). Framewise phoneme classification with bidirectional LSTM and other neural network architectures. Neural Netw. 18 (5): 602–610.
11 Graves, A., Liwicki, M., Fernandez, S. et al. (2009). A novel connectionist system for unconstrained handwriting recognition. IEEE Trans. Pattern Anal. Mach. Intell. 31 (5): 855–868.
12 Graves, A., Wayne, G., and Danihelka, I. (2014). Neural Turing machines. arXiv preprint arXiv:1410.5401.
13 Baddeley, A., Sala, S.D., and Robbins, T.W. (1996). Working memory and executive control [and discussion]. Philos. Trans. R. Soc. B: Biol. Sci. 351 (1346): 1397–1404.
14 Chua, L.O. and Roska, T. (2002). Cellular Neural Networks and Visual Computing: Foundations and Applications. New York, NY: Cambridge University Press.
15 Chua, L.O. and Yang, L. (1988). Cellular neural networks: theory. IEEE Trans. Circuits Syst. 35 (10): 1257–1272.
16 Molinar‐Solis, J.E., Gomez‐Castaneda, F., Moreno, J. et al. (2007). Programmable CMOS CNN cell based on floating‐gate inverter unit. J. VLSI Signal Process. Syst. Signal, Image, Video Technol. 49: 207–216.
17 Pan, C. and Naeemi, A. (2016). A proposal for energy‐efficient cellular neural network based on spintronic devices. IEEE Trans. Nanotechnol. 15 (5): 820–827.
18 Wang, L. et al. (1998). Time multiplexed color image processing based on a CNN with cell‐state outputs. IEEE Trans. VLSI Syst. 6 (2): 314–322.
19 Roska, T. and Chua, L.O. (1993). The CNN universal machine: an analogic array computer. IEEE Trans. Circuits Syst. II: Analog Digital Signal Process. 40 (3): 163–173.
20 Pickett, M.D. et al. (2009). Switching dynamics in titanium dioxide memristive devices. J. Appl. Phys. 106 (7): 074508.
21 Hu, X., Duan, S., and Wang, L. (2012). A novel chaotic neural network using memristive synapse with applications in associative memory. Abstract Appl. Anal. 2012: 1–19. https://doi.org/10.1155/2012/405739.
22 Kim, H. et al. (2012). Memristor bridge synapses. Proc. IEEE 100 (6): 2061–2070.
23 Adhikari, S.P., Yang, C., Kim, H., and Chua, L.O. (2012). Memristor bridge synapse‐based neural network and its learning. IEEE Trans. Neural Netw. Learn. Syst. 23 (9): 1426–1435.
24 Corinto, F., Ascoli, A., Kim, Y.‐S., and Min, K.‐S. (2014). Cellular nonlinear networks with memristor synapses. In: Memristor Networks (ed. A. Adamatzky and L. Chua), 267–291. New York, NY: Springer‐Verlag.
25 Wang, L., Drakakis, E., Duan, S., and He, P. (2012). Memristor model and its application for chaos generation. Int. J. Bifurcation Chaos 22 (8): 1250205.
26 Liu, S., Wang, L., Duan, S. et al. (2012). Memristive device based filter and integration circuits with applications. Adv. Sci. Lett. 8 (1): 194–199.
27 Chua, L.O. and Kang, S.M. (1976). Memristive devices and systems. Proc. IEEE 64 (2): 209–223.
28