Intelligent Neural Network Machine with Thinking Functions
- Authors: Osipov V.Y.
- Affiliations:
- St. Petersburg Federal Research Center of the Russian Academy of Sciences (SPC RAS)
- Issue: Volume 23, No. 4 (2024)
- Pages: 1077-1109
- Section: Artificial intelligence, knowledge and data engineering
- URL: https://bakhtiniada.ru/2713-3192/article/view/265766
- DOI: https://doi.org/10.15622/ia.23.4.6
- ID: 265766
About the authors
V. Osipov
St. Petersburg Federal Research Center of the Russian Academy of Sciences (SPC RAS)
Email: osipov_vasiliy@mail.ru
14-th Line V.O. 39
Bibliography
- Turing A. Can the Machine Think? With an appendix of J. von Neumann's article "The General and Logical Theory of Automata". Translated from English by Yu.A. Danilov. Moscow: Fizmatlit, 1960. 112 p.
- Thinking (Myshlenie) – Great Encyclopedic Dictionary. URL: https://gufo.me/dict/bes/МЫШЛЕНИЕ (accessed 05.04.2024).
- Velankar M.R., Mahalle P.N., Shinde G.R. Machine Thinking: New Paradigm Shift. In: Cognitive Computing for Machine Thinking. Innovations in Sustainable Technologies and Computing. 2024. pp. 43–53.
- Malsburg C. Toward understanding the neural code of the brain. Biological Cybernetics. 2021. vol. 115. no. 5. pp. 439–449.
- Yamakawa H. The whole brain architecture approach: accelerating the development of artificial general intelligence by referring to the brain. Neural Networks. 2021. vol. 144. pp. 478–495.
- Haykin S. Neural Networks and Learning Machines, third ed., Prentice Hall, New York. 2008. URL: http://dai.fmph.uniba.sk/courses/NN/haykin.neural-networks.3ed.2009.pdf (accessed 24.04.2024).
- Kotseruba I., Tsotsos J. 40 years of cognitive architectures: core cognitive abilities and practical applications. Artificial Intelligence Review. 2020. vol. 53. no. 1. pp. 17–94.
- Dormehl L. Thinking machine: The Quest for Artificial Intelligence – and Where It's Taking Us Next. Penguin, 2017. 209 p.
- Takano S. Thinking Machines. Machine Learning and Its Hardware Implementation. Academic Press, 2021. 306 p.
- Hawkins J., Blakeslee S. On intelligence. Brown Walker, 2006. 174 p.
- Osipov V., Osipova M. Space-time signal binding in recurrent neural networks with controlled elements. Neurocomputing. 2018. vol. 308. pp. 194–204.
- Hawkins J., Ahmad S. Hierarchical temporal memory including HTM cortical learning algorithms. Hosted at Numenta.org. 2011. 68 p.
- Spoerer C.J., McClure P., Kriegeskorte N. Recurrent convolutional neural networks: a better model of biological object recognition. Frontiers in Psychology. 2017. vol. 8. doi: 10.3389/fpsyg.2017.01551.
- Patrick M., Adekoya A., Mighty A., Edward B. Capsule networks – a survey. Journal of King Saud University – Computer and Information Sciences. 2022. vol. 34. no. 1. pp. 1295–1310.
- Yang G., Ding F. Associative memory optimized method on deep neural networks for image classification. Information Sciences. 2020. vol. 533. pp. 108–119.
- Yang J., Zhang L., Chen C., Li Y., Li R., Wang G., Jiang S., Zeng Z. A hierarchical deep convolutional neural network and gated recurrent unit framework for structural damage detection. Information Sciences. 2020. vol. 540. pp. 117–130.
- Ma T., Lv S., Huang L., Hu S. HiAM: A hierarchical attention based model for knowledge graph multi-hop reasoning. Neural Networks. 2021. vol. 143. pp. 261–270.
- Grossberg S. Adaptive resonance theory: how a brain learns to consciously attend, learn, and recognize a changing world. Neural Networks. 2013. vol. 37. pp. 1–47.
- Khowaja S., Lee S.L. Hybrid and hierarchical fusion networks: a deep cross-modal learning architecture for action recognition. Neural Computing and Applications. 2020. vol. 32. no. 14. pp. 10423–10434.
- Saha S., Gan Z., Cheng L., Gao J., Kafka O., Xie X., Li H., Tajdari M., Kim H., Liu W. Hierarchical deep learning neural network (HiDeNN): an artificial intelligence (AI) framework for computational science and engineering. Computer Methods in Applied Mechanics and Engineering. 2021. vol. 373. doi: 10.1016/j.cma.2020.113452.
- Yang M., Chen L., Lyu Z., Liu J., Shen Y., Wu Q. Hierarchical fusion of common sense knowledge and classifier decisions for answer selection in community question answering. Neural Networks. 2020. vol. 132. pp. 53–65.
- Wolfrum P., Wolff C., Lucke J., Malsburg C. A recurrent dynamic model for correspondence-based face recognition. Journal of Vision. 2008. vol. 8. no. 7(34). pp. 1–18. doi: 10.1167/8.7.34.
- Han Y., Huang G., Song S., Yang L., Wang H., Wang Y. Dynamic neural networks: a survey. arXiv:2102.04906v4. 2021. pp. 1–20.
- Osipov V., Nikiforov V., Zhukova N., Miloserdov D. Urban traffic flows forecasting by recurrent neural networks with spiral structures of layers. Neural Computing and Applications. 2020. vol. 32. no. 18. pp. 14885–14897.
- Osipov V., Kuleshov S., Zaytseva A., Levonevskiy D., Miloserdov D. Neural network forecasting of news feeds. Expert systems with applications. 2021. vol. 169. doi: 10.1016/j.eswa.2020.114521.
- Osipov V., Kuleshov S., Miloserdov D., Zaytseva A., Aksenov A. Recurrent Neural Networks with Continuous Learning in Problems of News Streams Multifunctional Processing. Informatics and Automation. 2022. vol. 21. no. 6. pp. 1145–1168.
- Osipov V., Osipova M. Method and device of intellectual processing of information in neural network, Patent RU2413304. 2011.
- Osipov V. Method for intelligent multi-level information processing in neural network, Patent RU2737227. 2020.
- He J., Yang H., He L., Zhao L. Neural networks based on vectorized neurons. Neurocomputing. 2021. vol. 465. pp. 63–70.
- Deng C., Litany O., Duan Y., Poulenard A., Tagliasacchi A., Guibas L. Vector neurons: a general framework for SO(3)-Equivariant networks. arXiv:2104.12229v1. 2021. pp. 1–12.
- Kryzhanovsky B., Litinskii L., Mikaelian A. Vector-Neuron Model of Associative Memory. IEEE International Joint Conference on Neural Networks. 2004. vol. 2. pp. 909–914.
- Tuszynski J.A., Friesen D.E., Freedman H., Sbitnev V.I., Kim H., Santelices L., Kalra A., Patel S., Shankar K., Chua L.O. Microtubules as Sub-Cellular Memristors. Scientific Reports. 2020. vol. 10(1). doi: 10.1038/s41598-020-58820-y.
- Bicanski A., Burgess N. Neural vector coding in spatial cognition. Nature Reviews Neuroscience. 2020. vol. 21. pp. 453–470.
- Rvachev M. V. Neuron as a reward-modulated combinatorial switch and a model of learning behavior. Neural Networks. 2013. vol. 46. pp. 62–74.
- Osipov V.Yu. Vector properties and memory of neurons. Book of abstracts of the XXIV Congress of the I.P. Pavlov Physiological Society. 2023. pp. 586–587.
- Sardi S., Vardi R., Sheinin A., Goldental A., Kanter I. New types of experiments reveal that a neuron functions as multiple independent threshold units. Scientific Reports. 2017. vol. 7(1). doi: 10.1038/s41598-017-18363-1.
