Other Ways of Comparison and Indexing in BCF
Identifying patterns to understand and manage events is one of the most essential applications of neural networks.
Two images can be similar, or appear similar to us, for many reasons. Most often, similarity arises from the common origin or manufacture of the objects, or from the fact that different objects change and develop according to general patterns such as the laws of nature. In painting or music, these patterns may be laws of composition; in machine design, the formulas of the strength of materials; in society, customs and legal laws; and so on.
An analogue of an object is another object with a high degree of similarity to it. The analogy (similarity) can be general or particular (with respect to a specific parameter), static or dynamic, complete or partial, etc. Any object can have a significant number of different analogues.
We have described the recognition of images and the formation of search indices using similarity coefficients obtained through the vector product of image matrices. But that is not the only option available with PANN. We have also tested other approaches, in particular recognition through the following (an illustrative sketch of two such comparisons is given after the list):
1. Matrix products of the input and compared arrays with the array representing the «comparison standard» [Xst], with CoS calculated from the difference of the resulting matrix sums.
2. Characteristic sums of the two arrays, with CoS calculated from the difference in the power spectra of the input and compared arrays.
3. Fourier transformation of the input and compared arrays into amplitude-frequency spectra, with CoS calculated from the difference or ratio of the corresponding harmonics in BCF format.
Different types of recognition can be used together to improve the accuracy and reliability of the conclusion.
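As a minimal illustrative sketch (not the PANN implementation), the snippet below shows two ways a similarity coefficient (CoS) could be estimated for a pair of arrays: through a normalized scalar product and through the difference of their amplitude spectra. The normalizations and the mapping to CoS are assumptions made for the example.

```python
# Illustrative sketch only: two ways to estimate a similarity coefficient (CoS)
# between an input array and a compared array. The exact PANN formulas differ;
# the normalizations below are assumptions for the example.
import numpy as np

def cos_dot(input_arr, compared_arr):
    """CoS as a normalized scalar product of the flattened arrays."""
    a = input_arr.ravel().astype(float)
    b = compared_arr.ravel().astype(float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def cos_spectrum(input_arr, compared_arr):
    """CoS from the difference of the amplitude spectra of the two arrays."""
    sa = np.abs(np.fft.rfft(input_arr.ravel().astype(float)))
    sb = np.abs(np.fft.rfft(compared_arr.ravel().astype(float)))
    diff = np.sum(np.abs(sa - sb)) / (np.sum(sa) + np.sum(sb) + 1e-12)
    return float(1.0 - diff)  # smaller spectral difference -> CoS closer to 1

rng = np.random.default_rng(0)
x = rng.integers(0, 256, (8, 8))
y = x + rng.integers(-5, 6, (8, 8))        # a slightly distorted copy of x
print(cos_dot(x, y), cos_spectrum(x, y))   # both close to 1 for similar arrays
```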
2.5. COMPARISON OF LIBRARIES AS A BASIS FOR RECOGNITION
Recognition in PANN networks is similar to recognition in the living brain.
Human memory is a vast library containing many objects and information related to these objects. At the same time, many objects are directly or indirectly connected by associative connections. When we see an object, we compare it with images in our memory and thus recognize it, for example, as a dog, a house, or a car. When we recognize an object and recall its closest analogues, we can transfer information from analogues to this object. In this way, we gain additional knowledge about the object, realize the possibilities of using this object or protecting ourselves from it, and so on.
PANN networks work similarly. Comparison libraries are formed in the computer’s memory, and recognition operates by comparing incoming information with the information in these libraries, using the degree of similarity given by the similarity coefficients. A minimal sketch of this lookup is shown below.
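In the sketch, each library entry is compared with the input by a CoS function and the entries are ranked by similarity. The entry layout and the CoS function (a normalized scalar product) are illustrative assumptions, not the PANN API:

```python
# Minimal sketch of recognition by library comparison; the entry layout and the
# CoS formula (a normalized scalar product) are assumptions for the example.
import numpy as np

def cos(a, b):
    a = np.ravel(a).astype(float)
    b = np.ravel(b).astype(float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def recognize(input_arr, library):
    """Return (CoS, label) pairs sorted by decreasing similarity to the input."""
    return sorted(((cos(input_arr, e["data"]), e["label"]) for e in library), reverse=True)

rng = np.random.default_rng(1)
library = [
    {"label": "dog",   "data": rng.random((16, 16))},
    {"label": "house", "data": rng.random((16, 16))},
    {"label": "car",   "data": rng.random((16, 16))},
]
query = library[0]["data"] + 0.05 * rng.random((16, 16))   # a noisy «dog» image
print(recognize(query, library)[0])   # the closest entry is expected to be «dog»
```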
PANN comparison libraries consist of «memory units» with the following properties (a structural sketch is given below the list):
1. Each «memory unit» is a numerical sequence that can be stored in graphical or text formats, or in the BCF format designed specifically for PANN.
2. Each «memory unit» can be provided with its own indices (public and private, at different levels of detail) so that the PANN network can quickly search the libraries for information.
3. Each «memory unit» has a complex structure and contains data on different parameters and properties of the object. For example, when one of us says «airplane,» many airplanes come to mind, seen in real life or pictures, as well as knowledge about their design and application and the problems we have solved for Boeing and other companies.
4. Each «memory unit» has associative, programmatic, hypertextual, etc., connections with many other «memory units.» For example, I associate an airplane with the rubber-motor model I built as a child, with the time I almost got into an air accident, with the terrorist attacks of September 11, 2001, etc.
5. A «memory unit» can also store important additional information, including information that helps in understanding the underlying process, an emotional attitude toward the object, assessments of its usefulness, harmfulness, risks, etc.
Fig. 10. Associative memory unit
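The structure described above can be sketched as a simple data type. The field names and layout are assumptions made for illustration, not the actual PANN or BCF format:

```python
# Conceptual sketch of a «memory unit»: the field names are assumptions made
# for illustration, not the actual PANN/BCF data layout.
from dataclasses import dataclass, field

@dataclass
class MemoryUnit:
    data: list                                   # the numerical sequence itself
    indices: dict = field(default_factory=dict)  # public/private indices at different levels of detail
    links: list = field(default_factory=list)    # associative links to other memory units
    extra: dict = field(default_factory=dict)    # emotional attitude, usefulness, risks, etc.

airplane = MemoryUnit(
    data=[0.12, 0.70, 0.31],
    indices={"public": "vehicle/air", "private": "projects/aviation"},
    links=["rubber-motor model", "air accident", "September 11, 2001"],
    extra={"usefulness": "fast transport", "risk": "accidents, misuse"},
)
```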
The memory library makes it possible to identify a particular object, to find its close analogues or antagonist objects, and to transfer information related to the found analogues to the identified object.
Each newly identified «memory unit» can be included in the comparison libraries, allowing the PANN to be continuously retrained; a small sketch of this loop is given below.
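In the sketch, the closest analogue is found by CoS, its associated information is transferred to the new object, and the new object is stored as one more memory unit. The dictionary layout and the CoS function are assumptions made for illustration:

```python
# Sketch of continuous retraining: identify the closest analogue, transfer its
# information to the new object, and add the new object to the library.
# The entry layout and the CoS formula are assumptions for the example.
import numpy as np

def cos(a, b):
    a = np.ravel(a).astype(float)
    b = np.ravel(b).astype(float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def identify_and_learn(library, new_data):
    best = max(library, key=lambda unit: cos(new_data, unit["data"]))
    new_unit = {"data": new_data,
                "info": dict(best["info"]),   # knowledge transferred from the analogue
                "analogue_of": best["label"]}
    library.append(new_unit)                  # the library (and thus PANN) keeps learning
    return best["label"], new_unit
```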
2.6. FORMATION OF A NEURAL NETWORK BASED ON PROGRESS NEURONS
PANN offers new, unique opportunities in the formation of a neural network.
In classical neural networks, the first step is to form a network structure of «empty,» untrained neurons and a random set of weights at synapses. Only then does the training of the prepared network begin.
In PANN, the situation is entirely different: you can train any number of neurons individually, train them in groups of five, ten, hundreds, or thousands, or prepare entire libraries in BCF format, and then combine whatever you need into a single network. A conceptual sketch of this workflow follows.
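The sketch below illustrates only the workflow of separate training followed by merging; the class and its single-image «training» are stand-ins, not the real progress-neuron training rule.

```python
# Conceptual sketch of the PANN workflow: neurons (or whole groups/libraries)
# are trained separately and then simply merged into one network. The class and
# its single-image "training" are assumptions standing in for real progress neurons.
import numpy as np

class TrainedNeuron:
    def __init__(self, label, image):
        self.label = label
        self.weights = np.asarray(image, dtype=float)  # learned from its own training image

def train_individually(labeled_images):
    """Train each neuron on its own example, independently of all others."""
    return [TrainedNeuron(label, img) for label, img in labeled_images]

rng = np.random.default_rng(2)
group_a = train_individually([("dog", rng.random((8, 8))), ("cat", rng.random((8, 8)))])
group_b = train_individually([("car", rng.random((8, 8)))])

network = group_a + group_b            # combining trained pieces into a single network
print([n.label for n in network])      # ['dog', 'cat', 'car']
```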