Standard

Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. / Demin, V. A.; Nekhaev, D. V.; Surazhevsky, I. A.; Nikiruy, K. E.; Emelyanov, A. V.; Nikolaev, S. N.; Rylkov, V. V.; Kovalchuk, M. V.

In: Neural Networks, Vol. 134, 01.02.2021, p. 64-75.

Research output: Contribution to journal › Article › peer-review

Harvard

Demin, VA, Nekhaev, DV, Surazhevsky, IA, Nikiruy, KE, Emelyanov, AV, Nikolaev, SN, Rylkov, VV & Kovalchuk, MV 2021, 'Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network', Neural Networks, vol. 134, pp. 64-75. https://doi.org/10.1016/j.neunet.2020.11.005

APA

Demin, V. A., Nekhaev, D. V., Surazhevsky, I. A., Nikiruy, K. E., Emelyanov, A. V., Nikolaev, S. N., Rylkov, V. V., & Kovalchuk, M. V. (2021). Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. Neural Networks, 134, 64-75. https://doi.org/10.1016/j.neunet.2020.11.005

Vancouver

Demin VA, Nekhaev DV, Surazhevsky IA, Nikiruy KE, Emelyanov AV, Nikolaev SN et al. Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. Neural Networks. 2021 Feb 1;134:64-75. https://doi.org/10.1016/j.neunet.2020.11.005

Author

Demin, V. A. ; Nekhaev, D. V. ; Surazhevsky, I. A. ; Nikiruy, K. E. ; Emelyanov, A. V. ; Nikolaev, S. N. ; Rylkov, V. V. ; Kovalchuk, M. V. / Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network. In: Neural Networks. 2021 ; Vol. 134. pp. 64-75.

BibTeX

@article{ac3e1d715e084ded849a60f892eb539c,
title = "Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network",
abstract = "This work studies experimental and theoretical approaches to finding effective local training rules for unsupervised pattern recognition by high-performance memristor-based Spiking Neural Networks (SNNs). First, the possibility of weight change using Spike-Timing-Dependent Plasticity (STDP) is demonstrated with a pair of hardware analog neurons connected through a (CoFeB)x(LiNbO3)1−x nanocomposite memristor. Next, the learning convergence to a solution of the binary clusterization task is analyzed over a wide range of memristive STDP parameters for a single-layer, fully connected, feedforward SNN. The memristive STDP behavior that ensures convergence in this simple task is shown also to ensure it in the handwritten digit recognition domain for a more complex SNN architecture with Winner-Take-All competition between neurons. To investigate the basic conditions necessary for training convergence, an original probabilistic generative model of a rate-based single-layer network with independent or competing neurons is built and thoroughly analyzed. The main result is the statement of a “correlation growth-anticorrelation decay” principle, which suggests a near-optimal policy for configuring model parameters. This principle is consistent with requiring binary clusterization convergence, which can be defined as the necessary condition for optimal learning and used as a simple benchmark for tuning the parameters of various neural network realizations with population-rate information coding. Finally, a heuristic algorithm is described for experimentally determining the convergence conditions in a memristive SNN, including robustness to device variability. Owing to the generality of the proposed approach, it can be applied to a wide range of memristors and neurons in software- or hardware-based rate-coding single-layer SNNs when searching for local rules that ensure unsupervised learning convergence in the pattern recognition task domain.",
keywords = "Hardware analog neuron, Memristive STDP, Memristor, Probabilistic generative model, Spiking neural network, Unsupervised learning, Neural Networks, Computer, Neurons/physiology, Pattern Recognition, Automated/methods, Algorithms, Neuronal Plasticity/physiology, Models, Neurological, DEVICE, MODEL, TIMING-DEPENDENT PLASTICITY",
author = "Demin, {V. A.} and Nekhaev, {D. V.} and Surazhevsky, {I. A.} and Nikiruy, {K. E.} and Emelyanov, {A. V.} and Nikolaev, {S. N.} and Rylkov, {V. V.} and Kovalchuk, {M. V.}",
note = "Publisher Copyright: {\textcopyright} 2020 Elsevier Ltd",
year = "2021",
month = feb,
day = "1",
doi = "10.1016/j.neunet.2020.11.005",
language = "English",
volume = "134",
pages = "64--75",
journal = "Neural Networks",
issn = "0893-6080",
publisher = "Elsevier",
}

RIS

TY - JOUR

T1 - Necessary conditions for STDP-based pattern recognition learning in a memristive spiking neural network

AU - Demin, V. A.

AU - Nekhaev, D. V.

AU - Surazhevsky, I. A.

AU - Nikiruy, K. E.

AU - Emelyanov, A. V.

AU - Nikolaev, S. N.

AU - Rylkov, V. V.

AU - Kovalchuk, M. V.

N1 - Publisher Copyright: © 2020 Elsevier Ltd

PY - 2021/2/1

Y1 - 2021/2/1

N2 - This work studies experimental and theoretical approaches to finding effective local training rules for unsupervised pattern recognition by high-performance memristor-based Spiking Neural Networks (SNNs). First, the possibility of weight change using Spike-Timing-Dependent Plasticity (STDP) is demonstrated with a pair of hardware analog neurons connected through a (CoFeB)x(LiNbO3)1−x nanocomposite memristor. Next, the learning convergence to a solution of the binary clusterization task is analyzed over a wide range of memristive STDP parameters for a single-layer, fully connected, feedforward SNN. The memristive STDP behavior that ensures convergence in this simple task is shown also to ensure it in the handwritten digit recognition domain for a more complex SNN architecture with Winner-Take-All competition between neurons. To investigate the basic conditions necessary for training convergence, an original probabilistic generative model of a rate-based single-layer network with independent or competing neurons is built and thoroughly analyzed. The main result is the statement of a “correlation growth-anticorrelation decay” principle, which suggests a near-optimal policy for configuring model parameters. This principle is consistent with requiring binary clusterization convergence, which can be defined as the necessary condition for optimal learning and used as a simple benchmark for tuning the parameters of various neural network realizations with population-rate information coding. Finally, a heuristic algorithm is described for experimentally determining the convergence conditions in a memristive SNN, including robustness to device variability. Owing to the generality of the proposed approach, it can be applied to a wide range of memristors and neurons in software- or hardware-based rate-coding single-layer SNNs when searching for local rules that ensure unsupervised learning convergence in the pattern recognition task domain.

AB - This work studies experimental and theoretical approaches to finding effective local training rules for unsupervised pattern recognition by high-performance memristor-based Spiking Neural Networks (SNNs). First, the possibility of weight change using Spike-Timing-Dependent Plasticity (STDP) is demonstrated with a pair of hardware analog neurons connected through a (CoFeB)x(LiNbO3)1−x nanocomposite memristor. Next, the learning convergence to a solution of the binary clusterization task is analyzed over a wide range of memristive STDP parameters for a single-layer, fully connected, feedforward SNN. The memristive STDP behavior that ensures convergence in this simple task is shown also to ensure it in the handwritten digit recognition domain for a more complex SNN architecture with Winner-Take-All competition between neurons. To investigate the basic conditions necessary for training convergence, an original probabilistic generative model of a rate-based single-layer network with independent or competing neurons is built and thoroughly analyzed. The main result is the statement of a “correlation growth-anticorrelation decay” principle, which suggests a near-optimal policy for configuring model parameters. This principle is consistent with requiring binary clusterization convergence, which can be defined as the necessary condition for optimal learning and used as a simple benchmark for tuning the parameters of various neural network realizations with population-rate information coding. Finally, a heuristic algorithm is described for experimentally determining the convergence conditions in a memristive SNN, including robustness to device variability. Owing to the generality of the proposed approach, it can be applied to a wide range of memristors and neurons in software- or hardware-based rate-coding single-layer SNNs when searching for local rules that ensure unsupervised learning convergence in the pattern recognition task domain.

KW - Hardware analog neuron

KW - Memristive STDP

KW - Memristor

KW - Probabilistic generative model

KW - Spiking neural network

KW - Unsupervised learning

KW - Neural Networks, Computer

KW - Neurons/physiology

KW - Pattern Recognition, Automated/methods

KW - Algorithms

KW - Neuronal Plasticity/physiology

KW - Models, Neurological

KW - DEVICE

KW - MODEL

KW - TIMING-DEPENDENT PLASTICITY

UR - http://www.scopus.com/inward/record.url?scp=85097344390&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/1a39d286-10a6-3467-a909-1d252470fecf/

U2 - 10.1016/j.neunet.2020.11.005

DO - 10.1016/j.neunet.2020.11.005

M3 - Article

C2 - 33291017

AN - SCOPUS:85097344390

VL - 134

SP - 64

EP - 75

JO - Neural Networks

JF - Neural Networks

SN - 0893-6080

ER -

ID: 88196232
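
For readers unfamiliar with the STDP weight updates discussed in the abstract, the mechanism can be illustrated with a generic pair-based rule. This is a minimal textbook sketch, not the specific memristive plasticity window measured in the cited paper; the parameter names and values (a_plus, a_minus, tau_plus, tau_minus) are illustrative assumptions.

```python
import math

def stdp_dw(dt, a_plus=0.05, a_minus=0.055, tau_plus=20.0, tau_minus=20.0):
    """Weight change for one pre/post spike pair under pair-based STDP.

    dt = t_post - t_pre (ms). Positive dt (pre fires before post)
    potentiates the synapse; negative dt depresses it. Both branches
    decay exponentially with the spike-timing difference.
    """
    if dt >= 0:
        return a_plus * math.exp(-dt / tau_plus)    # potentiation branch
    return -a_minus * math.exp(dt / tau_minus)      # depression branch

def apply_update(w, dt, w_min=0.0, w_max=1.0):
    """Apply the update and clip to a bounded range, mimicking the finite
    conductance window of a memristive synapse."""
    return min(w_max, max(w_min, w + stdp_dw(dt)))
```

A hardware memristive implementation would replace the exponential window with the device's measured conductance-change curve, but the clipping to a finite weight range carries over directly.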