HipcallOpenai (HipcallOpenai v0.3.0)
Documentation for HipcallOpenai.
Summary
Functions
Create an audio from text.
The same as audio_create_speech/1, but overrides the config.
Create chat completion.
The same as chat_completions/1, but overrides the config.
Retrieve model.
The same as model/1, but overrides the config.
List models.
Functions
Create an audio from text.
For more information:
- https://platform.openai.com/docs/guides/text-to-speech
- https://platform.openai.com/docs/api-reference/audio/createSpeech
Examples
iex> config_override = %HipcallOpenai.Config{
...>   api_key: "YOUR_TOKEN_HERE",
...>   api_organization: "YOUR_ORG_KEY_HERE",
...>   api_url: nil
...> }
iex> params = [
...>   model: "tts-1",
...>   input: "Hello, I'm an Elixir wrapper for OpenAI.",
...>   voice: "nova"
...> ]
iex> HipcallOpenai.audio_create_speech(params, config_override)
{:ok, <<255, 243, 228, 196, 0, 103, 84, 58, 0, 5, 90, 208, 0, 1, 141, 82, 99, 56, 64,
  88, 0, 132, 9, 139, 34, 101, 75, 153, 147, 38, 100, 201, 155, 42, 104, 14,
  25, 227, 198, 120, 241, 158, 56, 102, 139, 25, 66, 6, ...>>}
You can easily create a new mp3 file. For example:
iex> {:ok, file_content} = HipcallOpenai.audio_create_speech(params, config_override)
iex> File.write!("test.mp3", file_content)
:ok
Arguments
- :model (String.t/0) - Required. One of the available TTS models: tts-1 or tts-1-hd. The default value is "tts-1".
- :input (String.t/0) - Required. The text to generate audio for. The maximum length is 4096 characters. The default value is "Hello world!".
- :voice (String.t/0) - Required. The voice to use when generating the audio. Supported voices are alloy, echo, fable, onyx, nova, and shimmer. Previews of the voices are available in the Text to speech guide. The default value is "nova".
- :response_format (String.t/0) - The format to generate the audio in. Supported formats are mp3, opus, aac, and flac. The default value is "mp3".
- :speed (float/0) - The speed of the generated audio. Select a value from 0.25 to 4.0. The default value is 1.0.
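Putting the optional arguments together, a request with a non-default format and speed might look like the following sketch. The model, voice, and output filename are illustrative choices, and the call still needs a valid API key configured to actually succeed:

```elixir
# Sketch: request slower, high-definition speech as an Opus file.
# The keyword keys mirror the arguments documented above.
params = [
  model: "tts-1-hd",
  input: "Testing one, two, three.",
  voice: "alloy",
  response_format: "opus",
  speed: 0.75
]

case HipcallOpenai.audio_create_speech(params) do
  {:ok, file_content} -> File.write!("speech.opus", file_content)
  {:error, reason} -> IO.inspect(reason, label: "speech request failed")
end
```

Matching the file extension to :response_format keeps the saved file playable by tools that infer the codec from the name.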
Raises
- Raises NimbleOptions.ValidationError if the params are not valid.
Returns
{:ok, file_content}
{:error, Exception.t()}
@spec audio_create_speech(params :: keyword(), config :: struct()) :: {:ok, map()} | {:error, map()} | {:error, any()}
The same as audio_create_speech/1, but overrides the config.
Create chat completion.
For more information: https://platform.openai.com/docs/api-reference/chat/create
Examples
iex> params = [
...>   model: "gpt-4-1106-preview",
...>   messages: [
...>     %{role: "system", content: "Sen yardımcı olan bir asistansın."},
...>     %{role: "user", content: "Merhaba!"}
...>   ]
...> ]
iex> HipcallOpenai.chat_completions(params)
{:ok,
 %{
   "choices" => [
     %{
       "finish_reason" => "stop",
       "index" => 0,
       "logprobs" => nil,
       "message" => %{
         "content" => "Merhaba! Size nasıl yardımcı olabilirim?",
         "role" => "assistant"
       }
     }
   ],
   "created" => 1705330002,
   "id" => "chatcmpl-8hIWYa9L3HBatp1Wyp5fJfneaBMUi",
   "model" => "gpt-4-1106-preview",
   "object" => "chat.completion",
   "system_fingerprint" => "fp_168383a679",
   "usage" => %{
     "completion_tokens" => 15,
     "prompt_tokens" => 27,
     "total_tokens" => 42
   }
 }}
Arguments
- :model (String.t/0) - ID of the model to use. See the model endpoint compatibility table for details on which models work with the Chat API. The default value is "gpt-3.5-turbo".
- :messages (list of map/0) - A list of messages comprising the conversation so far. The default value is [%{role: "system", content: "You are a helpful assistant."}, %{role: "user", content: "Hello!"}].
- :frequency_penalty (float/0) - Number between -2.0 and 2.0. Positive values penalize new tokens based on their existing frequency in the text so far, decreasing the model's likelihood to repeat the same line verbatim.
- :max_tokens (pos_integer/0) - The maximum number of tokens that can be generated in the chat completion. The total length of input tokens and generated tokens is limited by the model's context length.
- :stream (boolean/0) - If set, partial message deltas will be sent, like in ChatGPT. Tokens will be sent as data-only server-sent events as they become available, with the stream terminated by a data: [DONE] message.
- :temperature (float/0) - What sampling temperature to use, between 0 and 2. Higher values like 0.8 will make the output more random, while lower values like 0.2 will make it more focused and deterministic.
- :user (String.t/0) - A unique identifier representing your end-user, which can help OpenAI to monitor and detect abuse.
Raises
- Raises NimbleOptions.ValidationError if the params are not valid.
Returns
{:ok, map()}
{:error, Exception.t()}
@spec chat_completions(params :: keyword(), config :: struct()) :: {:ok, map()} | {:error, map()} | {:error, any()}
The same as chat_completions/1, but overrides the config.
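To use a different API key for a single call, pass a config struct as the second argument. A sketch with placeholder values (the struct fields match the config shown in the audio example above):

```elixir
# Sketch: per-call config override with placeholder credentials.
config_override = %HipcallOpenai.Config{
  api_key: "YOUR_TOKEN_HERE",
  api_organization: nil,
  api_url: nil
}

params = [
  model: "gpt-3.5-turbo",
  messages: [%{role: "user", content: "Hello!"}]
]

HipcallOpenai.chat_completions(params, config_override)
```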
Retrieve model.
For more information: https://platform.openai.com/docs/api-reference/models/retrieve
Examples
iex> HipcallOpenai.model("gpt-3.5-turbo-instruct")
{:ok,
 %{
   "created" => 1692901427,
   "id" => "gpt-3.5-turbo-instruct",
   "object" => "model",
   "owned_by" => "system"
 }}
Arguments
- model - The ID of the model to retrieve.
Raises
- There are no exceptions.
Returns
{:ok, map()}
{:error, Exception.t()}
@spec model(model :: String.t(), config :: struct()) :: {:ok, map()} | {:error, map()} | {:error, any()}
The same as model/1, but overrides the config.
List models.
For more information: https://platform.openai.com/docs/api-reference/models
Examples
iex> config_override = %HipcallOpenai.Config{api_key: "asdf_api"}
iex> HipcallOpenai.models(config_override)
{:ok,
 %{
   "data" => [
     %{
       "created" => 1686588896,
       "id" => "gpt-4-0613",
       "object" => "model",
       "owned_by" => "openai"
     },
     %{
       "created" => 1651172509,
       "id" => "curie-search-query",
       "object" => "model",
       "owned_by" => "openai-dev"
     },
     %{
       "created" => 1687882411,
       "id" => "gpt-4",
       "object" => "model",
       "owned_by" => "openai"
     },
     %{
       "created" => 1651172509,
       "id" => "babbage-search-query",
       "object" => "model",
       "owned_by" => "openai-dev"
     },
     %{
       "created" => 1698785189,
       "id" => "dall-e-3",
       "object" => "model",
       "owned_by" => "system"
     }
   ],
   "object" => "list"
 }}
Arguments
- config - A %HipcallOpenai.Config{} struct to use instead of the application config.
Raises
- There are no exceptions.
Returns
{:ok, map()}
{:error, Exception.t()}
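Because the success value is a plain decoded map, the usual Enum functions work on it directly. A sketch that pulls out just the model IDs, run here against a hand-built map shaped like the response above:

```elixir
# A map shaped like the {:ok, map} value returned by models/1.
response = %{
  "object" => "list",
  "data" => [
    %{"id" => "gpt-4-0613", "owned_by" => "openai"},
    %{"id" => "dall-e-3", "owned_by" => "system"}
  ]
}

# Extract the model IDs from the "data" list.
ids = Enum.map(response["data"], & &1["id"])
# ids == ["gpt-4-0613", "dall-e-3"]
```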