ExTorch.NN.Module behaviour
(extorch v0.3.0)
DSL for defining neural network modules in Elixir.
Use this module to declaratively define neural network architectures
with a PyTorch-inspired syntax. Each module defines layers and a
forward/2 function.
Example
```elixir
defmodule MyMLP do
  use ExTorch.NN.Module

  deflayer :fc1, ExTorch.NN.Linear, in_features: 784, out_features: 128
  deflayer :relu, ExTorch.NN.ReLU
  deflayer :fc2, ExTorch.NN.Linear, in_features: 128, out_features: 10

  def forward(model, x) do
    x
    |> layer(model, :fc1)
    |> layer(model, :relu)
    |> layer(model, :fc2)
  end
end
```
Usage
```elixir
# Fresh model with random weights
model = MyMLP.new()
output = MyMLP.forward(model, input)

# Load pre-trained weights from a TorchScript file
model = MyMLP.from_jit("model.pt")
output = MyMLP.forward(model, input)
```
When loaded via from_jit/1, the JIT model's forward method is called
directly, using the pre-trained weights. The DSL definition serves as a
structural contract that is validated against the .pt file's submodules.
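The "structural contract" check described above can be sketched as a plain name comparison. Everything below is illustrative, not ExTorch's actual validation code: the variable names and data are assumptions, and a real check would also compare layer types.

```elixir
# Hypothetical sketch: compare the DSL-declared layer names against the
# submodule names found in a loaded .pt file. Data here is made up.
declared = [:fc1, :relu, :fc2]
jit_submodules = ["fc1", "fc2"]

# Layers declared in the DSL but absent from the JIT file.
missing =
  declared
  |> Enum.map(&Atom.to_string/1)
  |> Enum.reject(&(&1 in jit_submodules))
```

A stateless layer such as ReLU may legitimately have no submodule in the TorchScript file, so a real validator would likely exempt parameterless layers from this check.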
deflayer declares a layer at compile time. layer/3 is a runtime
function that looks up and applies a named layer during forward.
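The compile-time/runtime split can be pictured with a self-contained toy: a "model" as a map from layer name to an applicable function, and a layer/3 that looks the name up and applies it. The module and function names below are illustrative assumptions, not ExTorch's internals.

```elixir
# Hypothetical sketch of a deflayer-style registry and layer/3 lookup.
defmodule SketchModel do
  # A "model" here is just a map from layer name to a 1-arity function.
  def new(layer_specs), do: Map.new(layer_specs)

  # layer/3 fetches the named layer from the model and applies it,
  # mirroring the pipeline style used in forward/2 above.
  def layer(x, model, name) do
    Map.fetch!(model, name).(x)
  end
end

model =
  SketchModel.new(
    scale: fn x -> x * 2 end,
    shift: fn x -> x + 1 end
  )

result =
  3
  |> SketchModel.layer(model, :scale)
  |> SketchModel.layer(model, :shift)
```

Taking `x` as the first argument is what lets `layer/3` sit naturally in a `|>` pipeline, as in the `forward/2` example.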
Summary
Callbacks
Layer behaviour that nn layer modules must implement.
Functions
Declare a layer in the module definition.
Callbacks
@callback create(keyword()) :: ExTorch.NN.Layer.t()
Layer behaviour that nn layer modules must implement.
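To illustrate the shape of the create/1 callback, here is a toy behaviour and a toy layer implementing it. The real ExTorch.NN.Layer.t() struct is not shown in this doc, so the map return type and all module names below are assumptions for illustration only.

```elixir
# Illustrative analogue of the create/1 callback contract.
defmodule ToyLayerBehaviour do
  @callback create(keyword()) :: map()
end

defmodule ToyLinear do
  @behaviour ToyLayerBehaviour

  # Build the layer's state from the keyword options that deflayer
  # forwards to create/1.
  @impl true
  def create(opts) do
    %{
      in_features: Keyword.fetch!(opts, :in_features),
      out_features: Keyword.fetch!(opts, :out_features)
    }
  end
end

layer = ToyLinear.create(in_features: 784, out_features: 128)
```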
Functions
Declare a layer in the module definition.
Arguments
- `name` - Atom name for the layer.
- `layer_module` - The layer type module (e.g., `ExTorch.NN.Linear`).
- `opts` - Keyword options passed to the layer's `create/1` callback.