HipcallOpenai (HipcallOpenai v0.2.0)

Documentation for HipcallOpenai.

Summary

Functions

Create chat completion

Same as chat_completions/1, but with a config override.

Retrieve model

Same as model/1, but with a config override.

Functions

Link to this function

chat_completions(params)

View Source
@spec chat_completions(params :: keyword()) ::
  {:ok, map()} | {:error, map()} | {:error, any()}

Create chat completion

For more information, see https://platform.openai.com/docs/api-reference/chat/create

Examples

iex> params = [
...>   model: "gpt-4-1106-preview",
...>   messages: [
...>     %{role: "system", content: "You are a helpful assistant."},
...>     %{role: "user", content: "Hello!"}
...>   ]
...> ]
iex> HipcallOpenai.chat_completions(params)
{:ok,
 %{
   "choices" => [
     %{
       "finish_reason" => "stop",
       "index" => 0,
       "logprobs" => nil,
       "message" => %{
         "content" => "Hello! How can I help you?",
         "role" => "assistant"
       }
     }
   ],
   "created" => 1705330002,
   "id" => "chatcmpl-8hIWYa9L3HBatp1Wyp5fJfneaBMUi",
   "model" => "gpt-4-1106-preview",
   "object" => "chat.completion",
   "system_fingerprint" => "fp_168383a679",
   "usage" => %{
     "completion_tokens" => 15,
     "prompt_tokens" => 27,
     "total_tokens" => 42
   }
 }}

Params

  • :model (String.t/0) - ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API. The default value is "gpt-3.5-turbo".

  • :messages (list of map/0) - A list of messages comprising the conversation so far. The default value is [%{role: "system", content: "You are a helpful assistant."}, %{role: "user", content: "Hello!"}].

  • :frequency_penalty (float/0) - Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.

  • :max_tokens (pos_integer/0) - The maximum number of tokens that can be generated in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length.

  • :stream (boolean/0) - If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.

  • :temperature (float/0) - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.

  • :user (String.t/0) - A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
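A minimal sketch of combining several of the optional parameters above in one call. The model name, message contents, and user identifier here are only illustrative, and the call assumes an API key has been configured:

```elixir
# Build the params keyword list with a few of the optional
# parameters documented above.
params = [
  model: "gpt-4-1106-preview",
  messages: [
    %{role: "system", content: "You are a helpful assistant."},
    %{role: "user", content: "Summarize this in one sentence."}
  ],
  temperature: 0.2,
  max_tokens: 256,
  user: "user-1234"
]

# The call returns {:ok, map} on success or {:error, reason} on failure.
case HipcallOpenai.chat_completions(params) do
  {:ok, %{"choices" => [%{"message" => %{"content" => content}} | _]}} ->
    IO.puts(content)

  {:error, reason} ->
    IO.inspect(reason, label: "chat_completions failed")
end
```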

Raises

This function does not raise exceptions.

Returns

  • {:ok, map()}
  • {:error, map()}
  • {:error, Exception.t()}
Link to this function

chat_completions(params, config)

View Source
@spec chat_completions(params :: keyword(), config :: struct()) ::
  {:ok, map()} | {:error, map()} | {:error, any()}

Same as chat_completions/1, but with a config override.
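A sketch of a per-call config override, following the %Config{api_key: ...} pattern used in the models/1 example in these docs. The environment variable name is a placeholder; supply your key however your application manages secrets:

```elixir
# Override the configured API key for this single request only.
# Only the api_key field is assumed here, matching its usage
# elsewhere in these docs.
config_override = %Config{api_key: System.get_env("OPENAI_API_KEY")}

params = [
  model: "gpt-3.5-turbo",
  messages: [%{role: "user", content: "Hello!"}]
]

{:ok, _response} = HipcallOpenai.chat_completions(params, config_override)
```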

Link to this function

model(model)

View Source
@spec model(model :: String.t()) :: {:ok, map()} | {:error, map()} | {:error, any()}

Retrieve model

For more information, see https://platform.openai.com/docs/api-reference/models/retrieve

Examples

iex> HipcallOpenai.model("gpt-3.5-turbo-instruct")
{:ok,
 %{
   "created" => 1692901427,
   "id" => "gpt-3.5-turbo-instruct",
   "object" => "model",
   "owned_by" => "system"
 }}

Arguments

  • model (String.t/0) - The ID of the model to retrieve.

Raises

This function does not raise exceptions.

Returns

  • {:ok, map()}
  • {:error, map()}
  • {:error, Exception.t()}
Link to this function

model(model, config)

View Source
@spec model(model :: String.t(), config :: struct()) ::
  {:ok, map()} | {:error, map()} | {:error, any()}

Same as model/1, but with a config override.
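A sketch of retrieving a model with a per-call config override, reusing the %Config{api_key: ...} pattern from the models/1 example below. The key source is a placeholder:

```elixir
# Retrieve one model while overriding the API key for this call only.
config_override = %Config{api_key: System.get_env("OPENAI_API_KEY")}

case HipcallOpenai.model("gpt-3.5-turbo-instruct", config_override) do
  {:ok, %{"id" => id, "owned_by" => owner}} ->
    IO.puts("#{id} is owned by #{owner}")

  {:error, reason} ->
    IO.inspect(reason, label: "model lookup failed")
end
```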

Link to this function

models(config \\ %Config{})

View Source
@spec models(config :: struct()) :: {:ok, map()} | {:error, map()} | {:error, any()}

List models

For more information, see https://platform.openai.com/docs/api-reference/models

Examples

iex> config_override = %Config{api_key: "asdf_api"}
iex> HipcallOpenai.models(config_override)
{:ok,
 %{
   "data" => [
     %{
       "created" => 1686588896,
       "id" => "gpt-4-0613",
       "object" => "model",
       "owned_by" => "openai"
     },
     %{
       "created" => 1651172509,
       "id" => "curie-search-query",
       "object" => "model",
       "owned_by" => "openai-dev"
     },
     %{
       "created" => 1687882411,
       "id" => "gpt-4",
       "object" => "model",
       "owned_by" => "openai"
     },
     %{
       "created" => 1651172509,
       "id" => "babbage-search-query",
       "object" => "model",
       "owned_by" => "openai-dev"
     },
     %{
       "created" => 1698785189,
       "id" => "dall-e-3",
       "object" => "model",
       "owned_by" => "system"
     }
   ],
   "object" => "list"
 }}

Arguments

  • config - A %Config{} struct used to override the default configuration.

Raises

  • This function does not raise exceptions.

Returns

  • {:ok, map()}
  • {:error, map()}
  • {:error, Exception.t()}