Level-Tuned Synaptic Output for Analog Neural Network

Thread Starter

CSystems

Joined Dec 30, 2016
1
I am working on an open-source VLSI implementation of an unsupervised analog neural network. While most neural networks compute the synaptic output as the product of the input and the weight, my synapse outputs the supply voltage minus the absolute difference between the weight and the input, so the synapse produces its strongest output when the weight equals the input. I already have a very efficient circuit to store and tune the weight (similar to a sample-and-hold, but compare-and-nudge instead). The weight can be stored on either a capacitor or a memristor, but either way, the output portion of the circuit must draw very little current so that the stored weight is not significantly disturbed.

Output = VoltageSupply - ABS(VoltageWeight - VoltageInput)
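
For reference, here is a quick behavioral model in Python of what each synapse should compute. This is just the math, not the circuit, and the supply and weight voltages are placeholder values for illustration:

```python
# Behavioral model of the synapse transfer function (not a circuit simulation):
# output = Vsupply - |Vweight - Vinput|.  Voltage values below are placeholders.

def synapse_output(v_input, v_weight, v_supply=3.3):
    """Output peaks at v_supply when the input voltage matches the stored weight."""
    return v_supply - abs(v_weight - v_input)

if __name__ == "__main__":
    v_weight = 1.2
    for v_in in (0.0, 0.6, 1.2, 1.8, 2.4):
        print(f"Vin={v_in:.1f} V -> Vout={synapse_output(v_in, v_weight):.2f} V")
```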

This is fairly straightforward with op-amps, but since there will be thousands of synapses in the system, I need the circuit that computes the above formula to be as simple and efficient as possible.
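
As a sanity check at that scale, the same transfer function can be evaluated across a whole array of synapses at once before committing anything to silicon. The array size and random voltages below are just placeholders:

```python
import numpy as np

# Vectorized behavioral check for a layer of many synapses (sizes/values are illustrative).
V_SUPPLY = 3.3
rng = np.random.default_rng(0)

weights = rng.uniform(0.0, V_SUPPLY, size=4096)   # stored weight voltages
inputs = rng.uniform(0.0, V_SUPPLY, size=4096)    # input voltages

outputs = V_SUPPLY - np.abs(weights - inputs)     # per-synapse output
print(outputs.min(), outputs.max())               # max approaches V_SUPPLY where weight is close to input
```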

Any ideas would be greatly appreciated.