Annex v0.2.0 Annex.Layer.Activation

The Activation layer is the Annex.Layer responsible for applying an activation function to the data during the feedforward and for supplying the activation function's gradient (derivative) to the Backprops during backpropagation.
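
A minimal sketch of the intended flow, assuming from_name/1 (listed below) builds the layer struct from a func_name() and generate_outputs/2 applies the activation element-wise:

    # Hedged sketch: build a sigmoid activation layer and feed data forward.
    layer = Annex.Layer.Activation.from_name(:sigmoid)
    outputs = Annex.Layer.Activation.generate_outputs(layer, [0.0, 2.0, -2.0])
    # outputs ≈ [0.5, 0.881, 0.119] under the standard sigmoid definition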

Summary

Types

data()
data() :: Annex.Data.data()

func_name()
func_name() :: :relu | :sigmoid | :tanh | {:relu, number()}
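
Each variant names a built-in activation. The number in {:relu, number()} is presumably the threshold accepted by relu_deriv/2 below; a hedged sketch:

    # Any func_name() value should be accepted by from_name/1 (assumed):
    Annex.Layer.Activation.from_name(:tanh)
    Annex.Layer.Activation.from_name({:relu, 0.1})  # ReLU with threshold 0.1 (assumed)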

func_type()
func_type() :: :float | :list

Functions

from_name(name)

generate_outputs(layer, inputs)
generate_outputs(t(), Annex.Data.data()) :: Annex.Data.data()
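
Per the spec, takes the layer struct and input data and returns data of the same shape. A hedged example with a ReLU layer:

    layer = Annex.Layer.Activation.from_name(:relu)
    Annex.Layer.Activation.generate_outputs(layer, [-1.0, 0.0, 2.5])
    # => [0.0, 0.0, 2.5], assuming element-wise max(0.0, n)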

get_activator(activation)

get_derivative(activation)

get_inputs(activation)
get_inputs(t()) :: Annex.Data.data()

get_name(activation)

relu(n)
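
Assuming the standard rectifier definition relu(n) = max(0.0, n):

    Annex.Layer.Activation.relu(-2.0)  # => 0.0
    Annex.Layer.Activation.relu(3.0)   # => 3.0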

relu_deriv(x)

relu_deriv(x, threshold)
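
The ReLU derivative is a step function: 1.0 above the cutoff, 0.0 elsewhere. The one-arity form presumably steps at 0.0 and the two-arity form at threshold; a sketch of that assumed behaviour:

    Annex.Layer.Activation.relu_deriv(3.0)        # => 1.0 (above 0.0, assumed)
    Annex.Layer.Activation.relu_deriv(0.05, 0.1)  # => 0.0 (below the 0.1 threshold, assumed)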

sigmoid(n)

sigmoid_deriv(x)
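
Assuming the standard logistic function, sigmoid(n) = 1 / (1 + :math.exp(-n)), whose derivative is sigmoid(x) * (1 - sigmoid(x)):

    Annex.Layer.Activation.sigmoid(0.0)        # => 0.5
    Annex.Layer.Activation.sigmoid_deriv(0.0)  # => 0.25, i.e. 0.5 * (1 - 0.5)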

softmax(values)
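
Assuming the usual definition, softmax exponentiates each value and normalizes by the sum, returning a list of probabilities that sums to 1.0:

    probs = Annex.Layer.Activation.softmax([1.0, 2.0, 3.0])
    # probs ≈ [0.0900, 0.2447, 0.6652]; Enum.sum(probs) ≈ 1.0 (assumed definition)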

tanh(n)
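
Presumably a thin wrapper around Erlang's :math.tanh/1, squashing input into (-1.0, 1.0):

    Annex.Layer.Activation.tanh(0.0)   # => 0.0
    Annex.Layer.Activation.tanh(10.0)  # => ≈ 1.0 (tanh saturates for large |n|)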