Simple neural network tested with XOR and back propagation.

- Array blocks for the neural network data: layer outputs, weights, errors.
- Scalable solution: specify the network dimensions, e.g. `[2, 3, 1]`.
- Be warned: large networks with lots of layers are really, really slow to train.
- Some asserts are scattered around to check basic array size alignment.

```js
// Load Plotly by fetching the source once and inlining it as a script tag
let fp = await fetch("https://cdn.plot.ly/plotly-2.1.0.min.js")
let ft = await fp.text()
var script = document.createElement("script")
script.innerHTML = ft
document.head.appendChild(script)

const VARIANCE_W = 0.5
const randomUniform = (min, max) => Math.random() * (max - min) + min
const ru = () => randomUniform(-VARIANCE_W, VARIANCE_W)

// Network dimensions: 2 inputs, 3 hidden neurons, 1 output
const layers = [2, 3, 1]
// Sort a copy so the layer order itself is not destroyed
const maxLayerSize = [...layers].sort((a, b) => b - a)[0]

const xordataset = [
  { inputs: [0, 0], outputs: [0] },
  { inputs: [0, 1], outputs: [1] },
  { inputs: [1, 0], outputs: [1] },
  { inputs: [1, 1], outputs: [0] }
]
console.assert(xordataset[0].outputs.length == layers[layers.length - 1])

// Flat array blocks: outputs, biases and errors are indexed by (layer, neuron),
// weights by (layer, fromNeuron, toNeuron)
const weights = new Array(layers.length * maxLayerSize * maxLayerSize).fill(0)
const biases = new Array(layers.length * maxLayerSize).fill(0)
const loutputs = new Array(layers.length * maxLayerSize).fill(0)
const errors = new Array(layers.length * maxLayerSize).fill(0)
const MAX_NEURONS_PER_LAYER = maxLayerSize
const NUM_LAYERS = layers.length

function getBias(layer, neuron) { return biases[layer * MAX_NEURONS_PER_LAYER + neuron] }
function setBias(layer, neuron, value) { biases[layer * MAX_NEURONS_PER_LAYER + neuron] = value }
function setOutput(layer, neuron, value) { loutputs[layer * MAX_NEURONS_PER_LAYER + neuron] = value }
function getOutput(layer, neuron) { return loutputs[layer * MAX_NEURONS_PER_LAYER + neuron] }
function getWeight(layer, fromNeuron, toNeuron) {
  return weights[layer * MAX_NEURONS_PER_LAYER * MAX_NEURONS_PER_LAYER + fromNeuron * MAX_NEURONS_PER_LAYER + toNeuron]
}
function setWeight(layer, fromNeuron, toNeuron, value) {
  weights[layer * MAX_NEURONS_PER_LAYER * MAX_NEURONS_PER_LAYER + fromNeuron * MAX_NEURONS_PER_LAYER + toNeuron] = value
}
function setError(layer, neuron, value) { errors[layer * MAX_NEURONS_PER_LAYER + neuron] = value }
function getError(layer, neuron) { return errors[layer * MAX_NEURONS_PER_LAYER + neuron] }

// Random weight initialisation
for (let i = 0; i < layers.length - 1; i++)
  for (let k = 0; k < layers[i]; k++)
    for (let g = 0; g < layers[i + 1]; g++)
      setWeight(i, k, g, ru())

// Activation functions (sigmoid is used below; relu/leakyRelu are alternatives)
const sigmoid = (x) => 1.0 / (1.0 + Math.exp(-x))
const sigmoidDerivative = (x) => x * (1 - x) // x is already a sigmoid output
const relu = (x) => Math.max(0.0, x)
const reluDerivative = (x) => (x > 0.0 ? 1.0 : 0.0)
const leakyRelu = (x, alpha = 0.01) => (x > 0 ? x : alpha * x)
const leakyReluDerivative = (x, alpha = 0.01) => (x > 0 ? 1 : alpha)

// Forward pass
const activate = (iin) => {
  // sanity checks on the flat array block sizes
  console.assert(iin.length == layers[0])
  console.assert(weights.length == NUM_LAYERS * MAX_NEURONS_PER_LAYER * MAX_NEURONS_PER_LAYER)
  for (let i = 0; i < NUM_LAYERS; i++) {
    if (i == 0) {
      for (let k = 0; k < iin.length; k++) setOutput(0, k, iin[k])
    } else {
      for (let k = 0; k < layers[i]; k++) {
        let sum = 0.0
        for (let b = 0; b < layers[i - 1]; b++)
          sum += getOutput(i - 1, b) * getWeight(i - 1, b, k)
        setOutput(i, k, sigmoid(sum + getBias(i, k)))
      }
    }
  }
  return [getOutput(NUM_LAYERS - 1, 0)]
}

// Backward pass: compute errors, then update weights and biases
const propagate = (target, alpha = 0.2) => {
  for (let i = NUM_LAYERS - 1; i > 0; i--) {
    for (let k = 0; k < layers[i]; k++) {
      if (i == NUM_LAYERS - 1) {
        let error = (target[k] - getOutput(i, k)) * sigmoidDerivative(getOutput(i, k))
        setError(i, k, error)
      } else {
        setError(i, k, 0.0)
        for (let g = 0; g < layers[i + 1]; g++) {
          let error = getError(i + 1, g) * getWeight(i, k, g) * sigmoidDerivative(getOutput(i, k))
          setError(i, k, getError(i, k) + error) // accumulate over all downstream neurons
        }
      }
    }
  }
  for (let i = 0; i < NUM_LAYERS; i++) {
    for (let k = 0; k < layers[i]; k++) {
      if (i < NUM_LAYERS - 1) {
        for (let g = 0; g < layers[i + 1]; g++) {
          let weight = getWeight(i, k, g)
          weight += alpha * getOutput(i, k) * getError(i + 1, g)
          setWeight(i, k, g, weight)
        }
      }
      let bias = getBias(i, k)
      bias += alpha * getError(i, k)
      setBias(i, k, bias)
    }
  }
}
```
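To make the flat layout concrete, here is a small standalone sketch of the same indexing scheme. The `MAX` constant and `wIndex` helper are illustrative names used only in this sketch, not part of the code above.

```js
// Illustrative only: how a (layer, fromNeuron, toNeuron) triple maps into one flat weights block.
const MAX = 3 // maxLayerSize for a [2, 3, 1] network
const wIndex = (layer, from, to) => layer * MAX * MAX + from * MAX + to

// The weight from hidden neuron 2 (layer 1) to output neuron 0 (layer 2)
// lives at slot 1*9 + 2*3 + 0 = 15 of the weights block.
console.log(wIndex(1, 2, 0)) // 15
```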
Test the neural network with an iteration loop over the XOR dataset:

```js
console.log(new Date())
for (let epoch = 0; epoch < 10000; epoch++) {
  // Visit the samples in a shuffled order each epoch
  let indexes = Array.from(Array(xordataset.length).keys())
  indexes.sort(() => Math.random() - 0.5)
  for (let j of indexes) {
    activate(xordataset[j].inputs)
    propagate(xordataset[j].outputs, 0.2)
  }
  // Report the mean squared error every 1000 epochs
  if (epoch % 1000 == 0) {
    let cost = 0
    for (let j = 0; j < xordataset.length; j++) {
      let o = activate(xordataset[j].inputs)
      for (let b = 0; b < xordataset[j].outputs.length; b++)
        cost += Math.pow(xordataset[j].outputs[b] - o[b], 2)
    }
    cost /= xordataset.length
    console.log("epoch", epoch, "mean squared error", cost)
  }
}

// Final check: run every XOR input through the trained network
for (let i = 0; i < xordataset.length; i++) {
  const result = activate(xordataset[i].inputs)
  console.log(
    "for input", xordataset[i].inputs,
    "expected", xordataset[i].outputs,
    "predicted", result[0].toFixed(4),
    "which is", Math.round(result[0]) == xordataset[i].outputs[0] ? "correct" : "incorrect"
  )
}
console.log("done")
```
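Plotly is loaded at the top but not used in this section. Below is a minimal sketch of how the logged error could be charted, assuming the epoch numbers and costs are collected into two arrays (`epochLog` and `costLog`, hypothetical names) next to the `console.log` inside the `epoch % 1000 == 0` block.

```js
// Minimal sketch, assuming epochLog and costLog were filled during training,
// e.g. epochLog.push(epoch); costLog.push(cost); alongside the epoch report above.
const epochLog = [0, 1000, 2000]   // placeholder values for illustration only
const costLog = [0.25, 0.12, 0.05] // placeholder values for illustration only

const div = document.createElement("div")
document.body.appendChild(div)
Plotly.newPlot(
  div,
  [{ x: epochLog, y: costLog, type: "scatter", mode: "lines+markers" }],
  { title: "Training mean squared error", xaxis: { title: "epoch" }, yaxis: { title: "MSE" } }
)
```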