Title: SLAM: Spatiotemporal Learning with Analog Memristive Circuits for HTM
Author: Krestinskaya, Olga
Type: Master's thesis
Date issued: 2018-05
Date available: 2020-02-14
URI: http://nur.nu.edu.kz/handle/123456789/4497
Language: en
Keywords: deep neural network (DNN); binary neural network (BNN); multiple neural network (MNN); Research Subject Categories::TECHNOLOGY; Research Subject Categories::TECHNOLOGY::Electrical engineering, electronics and photonics

Abstract: The on-chip implementation of learning algorithms would accelerate the training of neural networks in crossbar arrays. The circuit-level design and implementation of the backpropagation algorithm using gradient descent for neural network architectures remains an open problem, and a learning architecture for Hierarchical Temporal Memory (HTM) has not yet been proposed. In this work, the HTM learning process and its learning stages are investigated. An analog hardware implementation of a backpropagation learning circuit based on memristive crossbar arrays is proposed, along with a learning circuit for HTM Temporal Memory. The integration of the HTM Spatial Pooler with the backpropagation learning stage is illustrated, and a rule-based HTM Spatial Pooler without learning is studied. Analog backpropagation learning circuits are proposed for various memristive learning architectures, such as the Deep Neural Network (DNN), Binary Neural Network (BNN), Multiple Neural Network (MNN), Hierarchical Temporal Memory (HTM), and Long Short-Term Memory (LSTM). The implementation of additional circuits and activation functions that can be used in the construction of various biologically inspired learning architectures is shown. The circuits are simulated in SPICE using TSMC 180 nm CMOS process models and HP memristor models. The proposed learning methods are tested on visual data processing applications such as face recognition and handwritten digit recognition.
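For orientation only (a generic sketch, not equations taken from the thesis), the gradient descent update that a backpropagation learning circuit realizes on a memristive crossbar can be summarized as follows; the symbols eta (learning rate), E (error), x_i, delta_j, and G_ij are standard textbook notation assumed here, not drawn from the source:

% Hedged sketch of the standard backpropagation update and an assumed crossbar mapping.
\[
  w_{ij} \leftarrow w_{ij} - \eta \frac{\partial E}{\partial w_{ij}},
  \qquad
  \frac{\partial E}{\partial w_{ij}} = \delta_j \, x_i .
\]
% Error terms (delta rule) for an output neuron and a hidden neuron, respectively:
\[
  \delta_j = f'(\mathrm{net}_j)\,(y_j - t_j),
  \qquad
  \delta_j = f'(\mathrm{net}_j) \sum_k \delta_k \, w_{jk} .
\]
% In a crossbar realization, each weight w_{ij} is assumed to be stored as a memristor
% conductance G_{ij}, so the update is applied as a programming-pulse-induced change
% \Delta G_{ij} \propto -\eta \, \delta_j x_i at the device in row i, column j.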