API Reference extorch v0.2.0


Modules

The ExTorch namespace contains data structures for multi-dimensional tensors and defines mathematical operations over them. It also provides utilities for efficient serialization of tensors and arbitrary types, among other helpers.

An ExTorch.Complex is a struct that represents a complex number with real and imaginary parts.

A torch.dtype is an object that represents the data type of a torch.Tensor. ExTorch has twelve different data types.

Public API documentation for DelegateWithDocs. This module is based on https://github.com/danielberkompas/delegate_with_docs.

A torch.device is an object representing the device on which a torch.Tensor is or will be allocated. The torch.device contains a device type ('cpu' or 'cuda') and optional device ordinal for the device type. If the device ordinal is not present, this object will always represent the current device for the device type, even after torch.cuda.set_device() is called; e.g., a torch.Tensor constructed with device 'cuda' is equivalent to 'cuda:X' where X is the result of torch.cuda.current_device().
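
As a sketch of how a device might be selected at tensor creation under the semantics above (the `device:` option, the `:cpu`/`:cuda` atoms, and `ExTorch.empty/2` with options are assumptions about ExTorch's API, mirroring the PyTorch behavior described):

```elixir
# Hypothetical sketch: place a tensor on an explicit device.
# Option names are assumptions, not confirmed API.
a = ExTorch.empty({2, 3}, device: :cpu)

# Without an ordinal, :cuda would always mean the *current* CUDA
# device, i.e. equivalent to {:cuda, current_device} per the text above.
b = ExTorch.empty({2, 3}, device: :cuda)
```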

An index is an object that can act as an accessor to an ExTorch.Tensor. ExTorch has five kinds of indices.

Slice index definition.
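
A minimal sketch of how integer and slice indices might be applied to a tensor; `ExTorch.index/2`, the range-as-slice convention, and `ExTorch.rand/1` are assumptions about the API surface, not confirmed calls:

```elixir
# Hypothetical sketch of the index kinds named above.
t = ExTorch.rand({4, 4})

# Integer index: select the first row.
row = ExTorch.index(t, 0)

# Slice index: select rows 1 and 2 via a range, assuming
# ranges map onto slice indices.
sub = ExTorch.index(t, 1..2)
```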

TorchScript model loading, inference, and management.

Represents a loaded TorchScript model.

A GenServer that wraps a loaded TorchScript model for concurrent serving.
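
The loading/serving split described above could look like the following sketch; every function name here (`load/1`, `start_link/1`, `predict/2`) is an assumption about the module's API, shown only to illustrate wrapping a loaded model in a process for concurrent use:

```elixir
# Hypothetical sketch: serve a TorchScript model behind a GenServer.
{:ok, model} = ExTorch.Model.load("model.pt")
{:ok, pid} = ExTorch.Model.Server.start_link(model: model)

# Callers share the server process; it serializes access to the model.
input = ExTorch.rand({1, 3, 224, 224})
output = ExTorch.Model.Server.predict(pid, input)
```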

A torch.layout is an object that represents the memory layout of a torch.Tensor. Currently, we support torch.strided (dense Tensors) and have beta support for torch.sparse_coo (sparse COO Tensors).

A torch.memory_format is an object representing the memory format on which a torch.Tensor is or will be allocated.

ETS-backed metrics collection for ExTorch model serving.

Utilities used to define a module mixin that inherits documentation and specs.

Neural network layer creation and operations.

1D adaptive average pooling.

2D adaptive average pooling.

1D average pooling.

2D average pooling.

1D batch normalization layer.

2D batch normalization layer.

1D convolution layer.

2D convolution layer.

3D convolution layer.

1D transposed convolution.

2D transposed convolution.

Dropout layer.

ELU activation.

Embedding layer.

Flatten layer.

GELU activation.

GRU recurrent layer.

Group normalization.

1D instance normalization.

2D instance normalization.

Introspect the structure of TorchScript (JIT) models.

Structured representation of a JIT model's architecture.

A model instance backed by a loaded TorchScript (.pt) file.

LSTM recurrent layer.

Represents an instantiated neural network layer (nn.Module).

Layer normalization.

LeakyReLU activation.

Linear (fully connected) layer.

LogSoftmax.

1D max pooling.

2D max pooling.

Mish activation.

DSL for defining neural network modules in Elixir.
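
As a sketch of what defining a module through such a DSL might look like (the `use ExTorch.NN.Module` mixin, the `init/1`/`forward/2` callbacks, and the layer constructors are all assumptions, not confirmed API):

```elixir
# Hypothetical sketch of an nn.Module-style definition in Elixir.
defmodule MyNet do
  use ExTorch.NN.Module

  # Build the layers this module owns.
  def init(_opts) do
    %{
      fc1: ExTorch.NN.Linear.new(784, 128),
      fc2: ExTorch.NN.Linear.new(128, 10)
    }
  end

  # Thread the input through the layers.
  def forward(layers, x) do
    x
    |> ExTorch.NN.Linear.forward(layers.fc1)
    |> ExTorch.relu()
    |> ExTorch.NN.Linear.forward(layers.fc2)
  end
end
```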

Multi-head attention.

PReLU activation.

ReLU activation.

SiLU (Swish) activation.

Sigmoid activation.

Softmax activation.

Tanh activation.

Unflatten layer.

The ExTorch.Native module contains all NIF declarations to call libtorch in C++.

Conveniences for declaring native calls to a library in Rustler.

General purpose macros to automatically generate binding declarations and calls for both ExTorch callable functions and Rustler signature calls to the NIF library.

A Phoenix LiveDashboard page for monitoring ExTorch model serving.

An ExTorch.Scalar is any singular value that can be stored in an ExTorch.Tensor.

An ExTorch.Tensor is a multi-dimensional matrix containing elements of a single data type.

Zero-copy tensor exchange between ExTorch and other tensor frameworks.

A tensor view backed by foreign memory.

The ExTorch.Tensor.Options struct defines the creation parameters of a tensor.
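
A sketch of how creation parameters might be supplied; whether they are passed as a keyword list that is folded into an `%ExTorch.Tensor.Options{}` struct, and the exact option names, are assumptions here:

```elixir
# Hypothetical sketch: tensor creation with explicit options.
t =
  ExTorch.full({2, 2}, 3.14,
    dtype: :float64,
    device: :cpu,
    requires_grad: false
  )
```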

Struct used to represent a list of elements or of nested lists of elements.

Tensor printing options.

General type-hierarchy comparison utilities.