Neural Network v0.2.0 NeuralNetwork.Network
Contains the layers which make up a matrix of neurons.
Summary
Functions
Activate the network given a list of input values
Return the network by pid
Pass in layer sizes, which will generate the layers for the network. The first number represents the number of neurons in the input layer. The last number represents the number of neurons in the output layer. Optionally, the middle numbers represent the number of neurons in each hidden layer
Set the network error and the output layer's deltas, then propagate them backward through the network. (Backpropagation!)
Update the network layers
Activation function for the hidden layer
Functions
Activate the network given a list of input values.
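The library's process-based internals are not shown here; as a rough, illustrative sketch (not the library's own code), feeding a list of input values forward through layers of sigmoid neurons could look like this:

    # Illustrative forward-pass sketch; a neuron is a {weights, bias} tuple
    # and a layer is a list of neurons. Not the library's implementation.
    defmodule ForwardPassSketch do
      def activate(layers, inputs) do
        Enum.reduce(layers, inputs, fn layer, values ->
          Enum.map(layer, fn {weights, bias} ->
            sum =
              weights
              |> Enum.zip(values)
              |> Enum.map(fn {w, v} -> w * v end)
              |> Enum.sum()

            sigmoid(sum + bias)
          end)
        end)
      end

      defp sigmoid(x), do: 1.0 / (1.0 + :math.exp(-x))
    end

    # ForwardPassSketch.activate([[{[0.5, -0.3], 0.1}]], [1.0, 2.0])
    # => a one-element list: the output layer's activations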
Return the network by pid.
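The docstring implies the network state lives in a process and is looked up by its pid. A minimal sketch, assuming an Agent-backed process (the library may use a different process abstraction):

    # Minimal sketch only: store some network state in a process and
    # read it back by pid.
    {:ok, pid} = Agent.start_link(fn -> %{layers: []} end)
    network = Agent.get(pid, & &1)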
Pass in layer sizes, which will generate the layers for the network. The first number represents the number of neurons in the input layer. The last number represents the number of neurons in the output layer. Optionally, the middle numbers represent the number of neurons in each hidden layer.
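For example, a size list of [2, 3, 1] describes a 2-neuron input layer, one 3-neuron hidden layer, and a 1-neuron output layer. The split can be sketched like this (variable names are illustrative):

    sizes = [2, 3, 1]

    [input_size | rest] = sizes
    {hidden_sizes, [output_size]} = Enum.split(rest, length(rest) - 1)

    input_size   # => 2   neurons in the input layer
    hidden_sizes # => [3] neurons per hidden layer (may be empty)
    output_size  # => 1   neurons in the output layer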
Set the network error and the output layer's deltas, then propagate them backward through the network. (Backpropagation!)
The input layer is skipped (no use for deltas).
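As an illustrative sketch of the delta rule for sigmoid units (not the library's code): the output layer's delta is the error scaled by the sigmoid derivative, and each hidden delta is the weighted sum of the next layer's deltas scaled the same way; the input layer gets no deltas.

    defmodule BackpropSketch do
      # Delta for an output neuron: error times the sigmoid derivative,
      # with the derivative expressed in terms of the neuron's output.
      def output_delta(output, target) do
        (target - output) * sigmoid_prime(output)
      end

      # Delta for a hidden neuron: its weights to the next layer combined
      # with that layer's already-computed deltas.
      def hidden_delta(output, weights_to_next, next_deltas) do
        weighted_error =
          weights_to_next
          |> Enum.zip(next_deltas)
          |> Enum.map(fn {w, d} -> w * d end)
          |> Enum.sum()

        weighted_error * sigmoid_prime(output)
      end

      defp sigmoid_prime(out), do: out * (1.0 - out)
    end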
Update the network layers.
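Updating typically means nudging each weight by the learning rate times the neuron's delta times its input; a small sketch under that assumption (the learning rate and numbers are arbitrary):

    learning_rate = 0.3

    update_weights = fn weights, inputs, delta ->
      weights
      |> Enum.zip(inputs)
      |> Enum.map(fn {w, input} -> w + learning_rate * delta * input end)
    end

    update_weights.([0.5, -0.2], [1.0, 0.0], 0.1)
    # => [0.53, -0.2] (up to floating-point rounding)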
Activation function for the hidden layer.
'identity': no-op activation, useful to implement a linear bottleneck; returns f(x) = x.
'sigmoid': the logistic sigmoid function; returns f(x) = 1 / (1 + exp(-x)).
'tanh': the hyperbolic tan function; returns f(x) = tanh(x).
'relu': the rectified linear unit function; returns f(x) = max(0, x).
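Written out as plain Elixir one-liners for reference (an illustrative sketch, not the library's code):

    defmodule ActivationSketch do
      def identity(x), do: x
      def sigmoid(x), do: 1.0 / (1.0 + :math.exp(-x))
      def tanh(x), do: :math.tanh(x)
      def relu(x), do: max(0.0, x)
    end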