Annex v0.2.1
Annex.Layer.Activation

The Activation layer is the Annex.Layer responsible for applying an activation function to the data during the feedforward pass and for supplying the activation function's gradient (derivative) to the Backprops during backpropagation.
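
A minimal usage sketch, assuming :sigmoid maps to the standard logistic function (the example values below follow from that assumption, not from the source):

    activation = Annex.Layer.Activation.from_name(:sigmoid)

    # get_activator/1 returns the (number() -> number()) function that is
    # applied element-wise during the feedforward pass.
    activator = Annex.Layer.Activation.get_activator(activation)
    activator.(0.0)
    #=> 0.5

    # get_derivative/1 returns the gradient function used during backpropagation.
    derivative = Annex.Layer.Activation.get_derivative(activation)
    derivative.(0.0)
    #=> 0.25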

Types

func_name()
func_name() :: :relu | :sigmoid | :tanh | {:relu, number()}

func_type()
func_type() :: :float | :list

t()
t() :: %Annex.Layer.Activation{
  activator: (number() -> number()),
  derivative: (number() -> number()),
  func_type: func_type(),
  inputs: term(),
  name: atom(),
  outputs: term()
}

Functions

from_name(name)
from_name(func_name()) :: t() | no_return()
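
A hedged example; the accepted names come from func_name(), and the no_return() in the spec implies that an unrecognized name raises:

    # Known names build an %Annex.Layer.Activation{} struct.
    %Annex.Layer.Activation{} = Annex.Layer.Activation.from_name(:tanh)

    # {:relu, number()} is also listed in func_name(); presumably a ReLU
    # variant parameterized by the given number (the 0.01 here is illustrative).
    Annex.Layer.Activation.from_name({:relu, 0.01})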

generate_outputs(layer, inputs)
generate_outputs(t(), Annex.Data.data()) :: Annex.Data.data()
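
A hedged sketch, assuming a flat list of floats is a valid Annex.Data.data() and that the activation is applied element-wise:

    relu = Annex.Layer.Activation.from_name(:relu)
    Annex.Layer.Activation.generate_outputs(relu, [-1.0, 0.0, 2.0])
    #=> [0.0, 0.0, 2.0] (assuming the standard ReLU, max(0, x), per element)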

get_activator(activation)
get_activator(t()) :: (number() -> number())

get_derivative(activation)
get_derivative(t()) :: any()

get_inputs(activation)
get_inputs(t()) :: Annex.Data.data()

get_name(activation)
get_name(t()) :: any()

relu_deriv(x)
relu_deriv(float()) :: float()
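
Assuming the standard ReLU derivative, which is 1.0 for positive inputs and 0.0 otherwise:

    Annex.Layer.Activation.relu_deriv(2.0)
    #=> 1.0
    Annex.Layer.Activation.relu_deriv(-3.0)
    #=> 0.0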

relu_deriv(x, threshold)
relu_deriv(float(), float()) :: float()
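
Presumably the same derivative with the cutoff moved from 0 to threshold; this is an assumption from the argument names, not confirmed by the source:

    Annex.Layer.Activation.relu_deriv(2.0, 1.0)
    #=> 1.0
    Annex.Layer.Activation.relu_deriv(0.5, 1.0)
    #=> 0.0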

sigmoid_deriv(x)
sigmoid_deriv(float()) :: float()
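
Presumably the standard identity sigmoid'(x) = sigmoid(x) * (1 - sigmoid(x)), which gives 0.25 at x = 0:

    Annex.Layer.Activation.sigmoid_deriv(0.0)
    #=> 0.25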

softmax(values)
softmax(data()) :: data()
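
A sketch of the standard softmax this function presumably computes: each exp(x) is divided by the sum of all exp(x), so the outputs are positive and sum to 1.0.

    values = [1.0, 2.0, 3.0]
    exps = Enum.map(values, &:math.exp/1)
    total = Enum.sum(exps)
    Enum.map(exps, &(&1 / total))
    #=> approximately [0.0900, 0.2447, 0.6652]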

tanh_deriv(x)
tanh_deriv(float()) :: float()
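
Presumably the standard identity tanh'(x) = 1 - tanh(x)^2, which gives 1.0 at x = 0:

    Annex.Layer.Activation.tanh_deriv(0.0)
    #=> 1.0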