Abstract:
Analog neural networks are a promising option for the hardware implementation of artificial neural networks. Their advantages, such as small size, low power consumption, and high speed, are seriously undermined, however, by the difficulty of training analog neural network circuitry. In particular, hardware training by software, i.e., model-based training of the circuitry in software that avoids on-chip and chip-in-the-loop training methods, is threatened by circuit nonidealities and by variations at the outputs of identically designed blocks. The performance of an analog neural network is severely degraded in the presence of these unwanted effects, which are caused mainly by statistical variations in the production process. We propose a new paradigm for the backpropagation algorithm in the hardware training of multilayer-perceptron-type analog neural networks. The variations at the outputs of the analog neural network circuitry are modeled from the transistor-level mismatches that occur between identically designed transistors. These variations are used as additive noise during training, and we show that this drastically increases the fault tolerance of the trained neural network. The method is compared with injecting purely random noise, which proves inadequate for establishing a satisfactory level of fault tolerance in the presence of mismatch-based variations, and our method outperforms it. The concept of mismatch-based variations has been verified by measurements on our test chip.
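To illustrate the general idea of training with mismatch-like additive noise, the following minimal sketch (not the hardware-training code from this work) trains a small multilayer perceptron with backpropagation while adding per-neuron noise to every block output during the forward pass. The noise level sigma_mismatch is a hypothetical placeholder for variation statistics that would be extracted from a transistor-level mismatch model; the network, targets, and hyperparameters are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp_with_mismatch_noise(X, Y, n_hidden=8, sigma_mismatch=0.05,
                                  lr=0.5, epochs=2000):
    """Backpropagation training with additive noise on every neuron output.

    sigma_mismatch is a hypothetical stand-in for the output-variation
    statistics obtained from a transistor-level mismatch model.
    """
    n_in, n_out = X.shape[1], Y.shape[1]
    W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
    W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
    for _ in range(epochs):
        # Forward pass: additive noise models mismatch-induced variations
        # at the outputs of identically designed blocks.
        h = np.tanh(X @ W1) + rng.normal(0.0, sigma_mismatch, (len(X), n_hidden))
        y = np.tanh(h @ W2) + rng.normal(0.0, sigma_mismatch, (len(X), n_out))
        # Backpropagate the squared error through the noisy activations
        # (a simplification; gradients use the noisy values directly).
        d_out = (y - Y) * (1.0 - y ** 2)
        d_hid = (d_out @ W2.T) * (1.0 - h ** 2)
        W2 -= lr * (h.T @ d_out) / len(X)
        W1 -= lr * (X.T @ d_hid) / len(X)
    return W1, W2

# Toy usage: learn XOR, then evaluate the trained weights under a "mismatched"
# forward pass by injecting the same kind of noise at test time.
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
Y = np.array([[0.], [1.], [1.], [0.]])
W1, W2 = train_mlp_with_mismatch_noise(X, Y)
h_eval = np.tanh(X @ W1) + rng.normal(0.0, 0.05, (4, 8))
print(np.round(np.tanh(h_eval @ W2), 2))
```

In the sketch, Gaussian noise stands in for the mismatch-derived variation model; the point is only that the noise is applied at the block outputs during training, so the learned weights retain their accuracy when the same kind of perturbation appears in the deployed circuit.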