Switch Net


Switch Net is a neural network that uses the fast Walsh-Hadamard transform (WHT) as a fixed set of layer weights, combined with a parametric switching activation function.
The WHT costs n·log2(n) add/subtract operations, as opposed to the n squared fused multiply-adds required for a conventional dense neural layer's weight calculations.
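The n·log2(n) cost comes from the in-place butterfly structure of the fast WHT. A minimal sketch (the class name `WHT` is illustrative):

```java
public class WHT {
    // In-place fast Walsh-Hadamard transform.
    // x.length must be a power of 2.
    // Performs n*log2(n) add/subtract operations and no multiplies.
    static void wht(float[] x) {
        int n = x.length;
        for (int h = 1; h < n; h <<= 1) {        // log2(n) passes
            for (int i = 0; i < n; i += h << 1) {
                for (int j = i; j < i + h; j++) { // n/2 butterflies per pass
                    float a = x[j], b = x[j + h];
                    x[j] = a + b;
                    x[j + h] = a - b;
                }
            }
        }
    }
}
```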

A fixed random pattern of sign flips is applied to the input data so that the first WHT distributes information fairly across its outputs, with an approximately Gaussian distribution.
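The sign-flip pattern must be fixed, not resampled per forward pass. One way to do that is to derive it from a fixed seed (a sketch; the class name `SignFlip` and the use of `java.util.Random` are assumptions):

```java
import java.util.Random;

public class SignFlip {
    // Apply a fixed random pattern of sign flips to x.
    // The same seed always yields the same pattern, so applying
    // the flips twice restores the original vector.
    static void signFlip(float[] x, long seed) {
        Random r = new Random(seed);
        for (int i = 0; i < x.length; i++) {
            if (r.nextBoolean()) x[i] = -x[i];
        }
    }
}
```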

A switching decision is then applied to each output of the WHT: depending on the decision, the output is multiplied by one of two trainable weights.
The section between the red lines in the diagram is essentially one layer and can be repeated a number of times.
A final WHT is done to combine the last set of switched-weight outputs.
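The full forward pass described above can be sketched as follows. The class name, the sign-based switching decision, and the Gaussian weight initialization are illustrative assumptions, not taken from the archived code:

```java
import java.util.Random;

public class SwitchNet {
    // Sketch of a Switch Net forward pass: fixed sign flips, then
    // repeated (WHT -> switching activation) layers, then a final WHT.
    final boolean[] flips;
    final float[][] wPos, wNeg; // two weights per output, per layer

    SwitchNet(int n, int layers, long seed) {
        Random r = new Random(seed);
        flips = new boolean[n];
        for (int i = 0; i < n; i++) flips[i] = r.nextBoolean();
        wPos = new float[layers][n];
        wNeg = new float[layers][n];
        for (int l = 0; l < layers; l++)
            for (int i = 0; i < n; i++) {
                wPos[l][i] = (float) r.nextGaussian();
                wNeg[l][i] = (float) r.nextGaussian();
            }
    }

    // In-place fast Walsh-Hadamard transform; n must be a power of 2.
    static void wht(float[] x) {
        int n = x.length;
        for (int h = 1; h < n; h <<= 1)
            for (int i = 0; i < n; i += h << 1)
                for (int j = i; j < i + h; j++) {
                    float a = x[j], b = x[j + h];
                    x[j] = a + b;
                    x[j + h] = a - b;
                }
    }

    float[] forward(float[] in) {
        float[] x = in.clone();
        // Fixed sign flips so the first WHT spreads information evenly.
        for (int i = 0; i < x.length; i++) if (flips[i]) x[i] = -x[i];
        for (int l = 0; l < wPos.length; l++) {
            wht(x);
            // Switching activation: pick one of two weights per output,
            // here switched on the sign of the output.
            for (int i = 0; i < x.length; i++)
                x[i] *= (x[i] >= 0f) ? wPos[l][i] : wNeg[l][i];
        }
        wht(x); // final WHT combines the last switched-weight outputs
        return x;
    }
}
```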
Some example code is here:

One slight issue: if the two switching weights for a particular output have opposite signs, the resultant output can only ever take one sign, + or -, which is a loss of information. (For example, with weights +2 and -3 switched on the sign of the input, both positive and negative inputs map to positive outputs.) However, ReLU in a conventional net also loses information, when its input x < 0.
Nevertheless, it would seem better, technically, to make the Switch Net 2 or 4 times wider than the input to ameliorate such information-loss issues.
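One simple way to widen the net, assuming zero padding is acceptable (the helper name `widen` is hypothetical), is to zero pad the input up to 2 or 4 times its length before the first sign flip, keeping the total a power of 2 for the WHT:

```java
public class Widen {
    // Zero pad x to factor times its length (factor a power of 2),
    // giving the WHT layers more outputs than inputs.
    static float[] widen(float[] x, int factor) {
        float[] y = new float[x.length * factor];
        System.arraycopy(x, 0, y, 0, x.length);
        return y;
    }
}
```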
Reference:
Also similar to Switch Net is Switch Net 4:




Comments

  1. Linux AMD 64 code with backpropagation:
    https://archive.org/details/switchnetbp

  2. Processing (Java like) code with backpropagation:
    https://archive.org/details/swnetprocessing

  3. JavaScript online version: https://editor.p5js.org/siobhan.491/sketches/RvqZfikaE

  4. JavaScript stochastic gradient descent online version with random initialization:
    https://editor.p5js.org/siobhan.491/sketches/RvqZfikaE

  5. Current versions of SwitchNet: https://archive.org/details/swnet2

  6. Java code, achieving about 70% of the performance of handwritten assembly language on a CPU.
    https://archive.org/details/switch-net

  7. Java code with some Float.floatToRawIntBits() type bit-hacks: https://archive.org/details/switch-net-java-2

