2 Siding ReLU via Forward Projections
The ReLU activation function in neural networks has special properties that allow it to be considered in a different way from other activation functions.

The Switching Viewpoint

You can view ReLU as non-conducting when x <= 0 and fully conducting when x > 0. It is a switch that is automatically turned on when x > 0 and automatically turned off when x <= 0, which makes it an ideal rectifier in electrical engineering terms. Hence the term Rectified Linear Unit.

From the point of view of the flowing electricity (or whatever is analogous to it), a switch that is on isn't there at all: electricity flows through pushed-together switch contacts just as it flows through the wires leading to the switch. All the ReLUs in a neural network that are conducting wire together various sub-components of the network. Once the wiring is complete, the ReLUs become essentially invisible until one or more of them change state. Since neural networks are computed in a ...
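As a minimal sketch of this viewpoint (the function names below are illustrative, not taken from any particular library), ReLU can be written as the identity function gated by a binary switch whose state depends only on the sign of the input. When the switch is on, the unit passes its input through unchanged, exactly like a closed contact:

import numpy as np

def relu(x):
    """Standard ReLU: zero for x <= 0, identity for x > 0."""
    return np.maximum(x, 0.0)

def switch_state(x):
    """The binary switch decided by the sign of x: 1 = conducting, 0 = off."""
    return (x > 0).astype(x.dtype)

def relu_as_switch(x):
    """Equivalent view: ReLU is the identity multiplied by a binary switch."""
    return switch_state(x) * x

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
assert np.allclose(relu(x), relu_as_switch(x))
print(switch_state(x))  # [0. 0. 0. 1. 1.] -- which "contacts" are closed

Note that for a fixed switch pattern the network is purely linear in its input; the nonlinearity lives entirely in when the switches flip.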