Extensor v0.1.3 Extensor.Session
The session module provides functions for loading tensorflow graphs into a session and executing operations within that session. Graphs are represented as protocol buffers, and named tensors are used for all inputs and outputs.
There are two primary methods for serializing models in tensorflow: frozen graph_defs and saved_models. A frozen graph_def is a protocol buffer containing a compute graph with no variables (all variables have been frozen as consts). A saved_model is a directory containing metadata used for tensorflow serving (TFS) as well as weights for graph variables. For more information on these formats, see the tensorflow documentation.
This module can be used to load either type of model. parse/load_frozen_graph loads a graph_def protocol buffer and imports it using TF_GraphImportGraphDef, and load_saved_model loads a saved_model using TF_LoadSessionFromSavedModel. Both functions create and return a reference to a tensorflow session that can be used to run operations, like model inference. A tensorflow ConfigProto can be passed to either function in order to configure the session (GPU options, etc.).
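For example, a frozen graph can be loaded from a file path or parsed from an in-memory protocol buffer binary. This is a minimal sketch, assuming a frozen graph_def file exists at the hypothetical path below and that the default session config is acceptable:

# load a frozen graph_def from disk (hypothetical path)
{:ok, session} = Extensor.Session.load_frozen_graph("path/to/frozen_graph.pb")

# or parse the same protocol buffer from a binary already in memory
graph_pb = File.read!("path/to/frozen_graph.pb")
{:ok, session} = Extensor.Session.parse_frozen_graph(graph_pb)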
Once the session has been created, it can be executed any number of times using the run function. Tensorflow sessions are also thread-safe and maintain graph state per call, so they can be executed in parallel (a short concurrency sketch follows the example below). The run function accepts a map of named input tensors and a list of output tensor names to evaluate.
Example (Pythagorean Triple):
iex> session = Extensor.Session.load_saved_model!("test/data/pythagoras")
iex> input = %{
...>   "a" => Extensor.Tensor.from_list([5]),
...>   "b" => Extensor.Tensor.from_list([12])
...> }
iex> output = Extensor.Session.run!(session, input, ["c"])
iex> Extensor.Tensor.to_list(output["c"])
[13.0]
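Because the session is thread-safe, the same session reference can be reused from multiple processes. Below is a minimal concurrency sketch, reusing the session and the pythagoras model from the example above; the batch values and the use of Task.async_stream are illustrative:

inputs = [
  %{"a" => Extensor.Tensor.from_list([3]), "b" => Extensor.Tensor.from_list([4])},
  %{"a" => Extensor.Tensor.from_list([8]), "b" => Extensor.Tensor.from_list([15])}
]

inputs
|> Task.async_stream(fn input -> Extensor.Session.run!(session, input, ["c"]) end)
|> Enum.map(fn {:ok, output} -> Extensor.Tensor.to_list(output["c"]) end)
# => [[5.0], [17.0]] for this model, with results in input order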
See the Tensorflow.ConfigProto module and documentation for more information on how to pass configuration when creating a new session. The tensorflow protocol buffer modules were generated with the protobuf-elixir library.
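As a sketch of session configuration, options such as soft device placement or GPU memory growth can be set by building a Tensorflow.ConfigProto struct and passing it when the session is created. The field values below are illustrative, and the Tensorflow.GPUOptions module is assumed to be among the generated protocol buffer modules:

config = %Tensorflow.ConfigProto{
  allow_soft_placement: true,
  gpu_options: %Tensorflow.GPUOptions{allow_growth: true}
}

{:ok, session} = Extensor.Session.load_saved_model("test/data/pythagoras", config)

The same config value can be passed to load_frozen_graph or parse_frozen_graph.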
Summary
Types
Functions
load_frozen_graph( path :: String.t(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() } ) :: {:ok, t()} | {:error, any()}
loads a graph_def from a file path
load_frozen_graph!( path :: String.t(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() } ) :: t() | no_return()
loads a graph_def from a file path
load_saved_model( path :: String.t(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() }, tag :: String.t() ) :: {:ok, t()} | {:error, any()}
loads a saved_model from a directory path
load_saved_model!( path :: String.t(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() }, tag :: String.t() ) :: t() | no_return()
loads a saved_model from a directory path
parse_frozen_graph( graph_pb :: binary(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() } ) :: {:ok, t()} | {:error, any()}
loads a graph_def from a binary string
parse_frozen_graph!( graph_pb :: binary(), config :: %Tensorflow.ConfigProto{ allow_soft_placement: term(), cluster_def: term(), device_count: term(), device_filters: term(), experimental: term(), gpu_options: term(), graph_options: term(), inter_op_parallelism_threads: term(), intra_op_parallelism_threads: term(), isolate_session_state: term(), log_device_placement: term(), operation_timeout_in_ms: term(), placement_period: term(), rpc_options: term(), session_inter_op_thread_pool: term(), use_per_session_threads: term() } ) :: t() | no_return()
loads a graph_def from a binary string
run( session :: t(), input_tensors :: %{optional(String.t()) => Extensor.Tensor.t()}, output_names :: [String.t(), ...] ) :: {:ok, %{optional(String.t()) => Extensor.Tensor.t()}} | {:error, any()}
executes a tensorflow session
run!( session :: t(), input_tensors :: %{optional(String.t()) => Extensor.Tensor.t()}, output_names :: [String.t(), ...] ) :: %{optional(String.t()) => Extensor.Tensor.t()}
executes a tensorflow session
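The non-bang run/3 returns a tagged tuple, while run!/3 follows the usual bang convention and raises on failure. A brief usage sketch, with tensor names taken from the Pythagorean example above:

case Extensor.Session.run(session, input, ["c"]) do
  {:ok, outputs} ->
    Extensor.Tensor.to_list(outputs["c"])

  {:error, reason} ->
    # handle or propagate the tensorflow error
    {:error, reason}
end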