GenAI.Provider.LocalLLama (GenAI local model extension v0.1.0)

This module implements the GenAI provider for local LLaMA models (served via ExLLama).

Summary

Functions

Sends a chat completion request to the LocalLlama instance.

Low-level inference: pass in a model, messages, tools, and various settings to prepare the final provider-specific API request.

Retrieves a list of available Local models.

Functions


chat(messages, tools, settings)

Sends a chat completion request to the LocalLlama instance.

This function constructs the request body from the provided messages, tools, and settings, sends the request to the ExLLama instance, and returns a GenAI.ChatCompletion struct with the response.
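A minimal usage sketch, assuming a keyword-list settings argument and a `%GenAI.Message{}` message struct with `role` and `content` fields (the exact message shape, option names, and `{:ok, _}`/`{:error, _}` return convention are assumptions, not confirmed by this page):

```elixir
# Hypothetical example: message struct fields and the :model setting
# are assumptions about the library's API.
messages = [
  %GenAI.Message{role: :user, content: "Summarize Elixir in one sentence."}
]

case GenAI.Provider.LocalLLama.chat(messages, [], model: "llama-2-7b-chat") do
  {:ok, %GenAI.ChatCompletion{} = completion} ->
    # Inspect the returned completion struct.
    IO.inspect(completion, label: "completion")

  {:error, reason} ->
    IO.inspect(reason, label: "chat failed")
end
```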


chat(model, messages, tools, hyper_parameters, provider_settings \\ [])

Low-level inference: pass in a model, messages, tools, and various settings to prepare the final provider-specific API request.
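A sketch of calling this lower-level arity directly, positional arguments matching the signature above; the hyper-parameter names (`temperature`, `max_tokens`) are illustrative assumptions, not documented options:

```elixir
# Hypothetical example: option names in hyper_parameters are guesses;
# provider_settings is left at its default ([]).
{:ok, completion} =
  GenAI.Provider.LocalLLama.chat(
    "llama-2-7b-chat",                                        # model
    [%GenAI.Message{role: :user, content: "Hello!"}],         # messages
    [],                                                       # no tools
    [temperature: 0.7, max_tokens: 256],                      # hyper_parameters
    []                                                        # provider_settings
  )
```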


models(settings \\ [])

Retrieves a list of available Local models.

This function calls the local API to retrieve the available models and returns them as a list of GenAI.Model structs.
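A sketch of listing the available models, assuming an `{:ok, models}` return and a `model` field on `%GenAI.Model{}` holding the model name (both are assumptions about the struct's shape):

```elixir
# Hypothetical example: the return tuple and struct field names are assumptions.
with {:ok, models} <- GenAI.Provider.LocalLLama.models() do
  Enum.each(models, fn %GenAI.Model{model: name} ->
    IO.puts(name)
  end)
end
```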