AxonInterp (axon_interp v0.1.0)
A thin wrapper for Axon inference. It is a Deep Learning inference interface that can be used in the same way as my other *Interp libraries.
Summary
Functions
Get the name of the backend NN framework.
Ensure that the back-end framework is as expected.
Get the flat binary from the output tensor on the interpreter.
Get the properties of the model.
Execute the inference session. In session mode, feeding input data to the DL model, executing inference, and retrieving the results are done all at once.
Put a flat binary into the input tensor on the interpreter.
Stop the Axon interpreter.
Ensure that the model matches the back-end framework.
Functions
Get the name of the backend NN framework.
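A minimal sketch, assuming this entry corresponds to a zero-arity framework/0 as in the sibling *Interp libraries (the function name and return value are assumptions, not shown on this page):
  AxonInterp.framework()   # assumption: returns the backend name, e.g. :axon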
Ensure that the back-end framework is as expected.
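Likewise only a sketch; the name validate_backend!/1 and its raise-on-mismatch behavior are assumptions:
  AxonInterp.validate_backend!(:axon)   # assumption: raises unless the backend is :axon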
Get the flat binary from the output tensor on the interpreter.
Parameters
- session - session record
- index - index of output tensor in the model
- opts:
- [:binary] - convert the output to binary
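For example, after invoking the session, the raw output binary can be pulled out and rebuilt into an Nx tensor; the index, dtype, and shape below are illustrative assumptions:
  output_bin = AxonInterp.get_output_tensor(session, 0)
  # dtype and shape depend on your model; :f32 and {1, 1000} are assumptions
  probs = output_bin |> Nx.from_binary(:f32) |> Nx.reshape({1, 1000})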
Get the properties of the model.
Parameters
- mod - the module name
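A usage sketch; the function name info/1 is an assumption (this page does not show it), and MyModel stands in for the module hosting the interpreter:
  AxonInterp.info(MyModel)   # assumption: returns the loaded model's properties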
Execute the inference session. In session mode, feeding input data to the DL model, executing inference, and retrieving the results are done all at once.
Parameters
- session - session record
Examples
output_bin0 =
  session()
  |> AxonInterp.set_input_tensor(0, input_bin0)
  |> AxonInterp.invoke()
  |> AxonInterp.get_output_tensor(0)
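Here session/0 is presumably a helper injected into the model module. A minimal sketch of such a module, assuming `use AxonInterp` accepts a :model option as in the sibling *Interp libraries (module name and model path are illustrative):
  defmodule MyModel do
    # assumption: `use AxonInterp` loads the model and defines session/0
    use AxonInterp, model: "./model/my_model.axon"
  end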
Put a flat binary into the input tensor on the interpreter.
Parameters
- session - session record
- index - index of input tensor in the model
- bin - input data as a flat binary (i.e. a serialized tensor)
- opts:
- [:binary] - data is binary
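For example, an Nx tensor can be serialized to the flat binary form and loaded into input slot 0 (the index is illustrative):
  # Nx.to_binary/1 flattens the tensor into the raw binary the interpreter expects
  input_bin0 = Nx.to_binary(input_tensor)
  session = AxonInterp.set_input_tensor(session, 0, input_bin0)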
Stop the Axon interpreter.
Parameters
- mod - the module name
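A sketch, assuming the function is named stop/1 (not shown on this page) and mod is the interpreter's host module:
  AxonInterp.stop(MyModel)   # assumption: shuts down the interpreter owned by MyModel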
Ensure that the model matches the back-end framework.
Parameters
- model - path of the model file
- url - URL of the download site for the model
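A sketch; the function name validate_model/2 and the fetch-if-missing behavior are assumptions, and the URL is purely illustrative:
  # assumption: downloads the file from `url` if `model` is missing, then checks it
  AxonInterp.validate_model("./model/my_model.axon", "https://example.com/my_model.axon")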