Neural Network v0.2.0 NeuralNetwork.Network View Source

Contains layers which make up a matrix of neurons.

Link to this section Summary

Functions

Activate the network given a list of input values

Return the network by pid

Pass in layer sizes, which will generate the layers for the network. The first number is the number of neurons in the input layer, and the last number is the number of neurons in the output layer. Any middle numbers (optional) are the sizes of the hidden layers

Set the network error and the output layer’s deltas, then propagate them backward through the network. (Backpropagation!)

Update the network layers

Set the activation function for the hidden layers

Link to this section Functions

Link to this function activate(network, input_values) View Source

Activate the network given a list of input values.

Return the network by pid.
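A minimal usage sketch, assuming the package is installed and that a `get/1` function returns the network state for a pid, as the summary above suggests:

```elixir
# Start a 2-3-1 network (see start_link/2 below), then activate it.
{:ok, network_pid} = NeuralNetwork.Network.start_link([2, 3, 1])

# Feed one input value per input-layer neuron.
NeuralNetwork.Network.activate(network_pid, [0.1, 0.9])

# Fetch the network by pid; output values live in its output layer.
network = NeuralNetwork.Network.get(network_pid)
```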

Link to this function start_link(layer_sizes \\ [], options \\ %{activation: :relu}) View Source

Pass in layer sizes, which will generate the layers for the network. The first number is the number of neurons in the input layer, and the last number is the number of neurons in the output layer. Any middle numbers (optional) are the sizes of the hidden layers.
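For example (a sketch, assuming the package is installed):

```elixir
# [3, 2, 1] builds a network with a 3-neuron input layer,
# one 2-neuron hidden layer, and a 1-neuron output layer.
{:ok, pid} = NeuralNetwork.Network.start_link([3, 2, 1])

# Hidden layers are optional: [4, 2] builds only input and output layers.
{:ok, small} = NeuralNetwork.Network.start_link([4, 2])

# The activation function defaults to :relu but can be overridden.
{:ok, pid2} = NeuralNetwork.Network.start_link([3, 2, 1], %{activation: :sigmoid})
```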

Link to this function train(network, target_outputs) View Source

Set the network error and the output layer’s deltas, then propagate them backward through the network. (Backpropagation!)

The input layer is skipped (no use for deltas).
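A sketch of one training step, assuming the pid-based API shown for start_link/2 and activate/2: run a forward pass, then pass the expected outputs so the error propagates backward through the output and hidden layers.

```elixir
{:ok, pid} = NeuralNetwork.Network.start_link([2, 3, 1])

# Forward pass with one input per input-layer neuron,
# then backpropagate against one target per output-layer neuron.
NeuralNetwork.Network.activate(pid, [0.0, 1.0])
NeuralNetwork.Network.train(pid, [1.0])
```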

Update the network layers.

Link to this function update_activation(pid, activation) View Source

Set the activation function for the hidden layers.

:identity - no-op activation, useful to implement a linear bottleneck; returns f(x) = x
:sigmoid - the logistic sigmoid function; returns f(x) = 1 / (1 + exp(-x))
:tanh - the hyperbolic tangent function; returns f(x) = tanh(x)
:relu - the rectified linear unit function; returns f(x) = max(0, x)
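The four activations can be sketched as plain Elixir one-liners matching the formulas above (illustrative only; the package's actual implementation may differ):

```elixir
defmodule ActivationSketch do
  # f(x) = x
  def identity(x), do: x

  # f(x) = 1 / (1 + exp(-x)), via Erlang's :math module
  def sigmoid(x), do: 1 / (1 + :math.exp(-x))

  # f(x) = tanh(x)
  def tanh(x), do: :math.tanh(x)

  # f(x) = max(0, x)
  def relu(x), do: max(0, x)
end
```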