OpenAI.Responses - v0.4.0 released; API cleanup, better support for structured outputs and function calling

The openai_responses library has been updated to v0.4.0. This release cleans up both the API and implementation details, and adds improved support for streaming (including streaming JSON events), structured outputs, and function calling. Additional details can be found in the Changelog for v0.4.0.

Here are some examples of using the new API from the Livebook Tutorial:

1. You can use a previous response to continue the conversation; OpenAI takes care of keeping the state:

```elixir
alias OpenAI.Responses

Responses.create!(
  input: [
    %{role: :developer, content: "Talk like a pirate."},
    %{role: :user, content: "Write me a haiku about Elixir"}
  ]
)
|> Responses.create!(input: "Which programming language is this haiku about?")
|> Map.get(:text)
|> IO.puts()
# Output contains "Elixir"; still talks like a pirate
```
2. Costs are calculated automatically:

```elixir
{:ok, response} = Responses.create("Explain quantum computing")

# All cost values are Decimal for precision
IO.inspect(response.cost)
# => %{
#      input_cost: #Decimal<0.0004>,
#      output_cost: #Decimal<0.0008>,
#      total_cost: #Decimal<0.0012>,
#      cached_discount: #Decimal<0>
#    }
```
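Because the cost values are Decimals rather than floats, totals across many responses stay exact. A minimal sketch of aggregating them (the response maps below are stand-ins with the same `cost` shape as above):

```elixir
Mix.install([{:decimal, "~> 2.0"}])

# Stand-in responses carrying the same cost shape shown above
responses = [
  %{cost: %{total_cost: Decimal.new("0.0012")}},
  %{cost: %{total_cost: Decimal.new("0.0008")}}
]

# Sum total_cost exactly with Decimal arithmetic
total =
  responses
  |> Enum.map(& &1.cost.total_cost)
  |> Enum.reduce(Decimal.new(0), &Decimal.add/2)

IO.inspect(total)
```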
3. You can request a Structured Output and stream JSON events:

```elixir
Responses.stream(
  input: "List 3 programming languages with their year of creation",
  model: "gpt-4o-mini",
  schema: %{
    languages: {:array, %{
      name: :string,
      year: :integer,
      paradigm: {:string, description: "Main programming paradigm"}
    }}
  }
)
|> Responses.Stream.json_events()
|> Enum.each(&IO.puts/1)
```
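If you don't need streaming, the same `schema` option can be used with a regular `create/1` call; the sketch below assumes the decoded result is exposed on the response's `parsed` field (field name assumed here, check the library docs):

```elixir
alias OpenAI.Responses

{:ok, response} =
  Responses.create(
    input: "List 3 programming languages with their year of creation",
    model: "gpt-4o-mini",
    schema: %{languages: {:array, %{name: :string, year: :integer}}}
  )

# `parsed` is assumed to hold the decoded map matching the schema
Enum.each(response.parsed["languages"], fn lang ->
  IO.puts("#{lang["name"]} (#{lang["year"]})")
end)
```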
4. You can provide access to your own functions and have function calls resolved automatically (notice it uses run/2 instead of create/1):

```elixir
# Define available functions
functions = %{
  "get_weather" => fn %{"location" => location} ->
    # In a real app, this would call a weather API
    case location do
      "Paris" -> "15°C, partly cloudy"
      "London" -> "12°C, rainy"
      "New York" -> "8°C, sunny"
      _ -> "Weather data not available"
    end
  end,
  "get_time" => fn %{"timezone" => timezone} ->
    # In a real app, this would get actual time for timezone
    case timezone do
      "Europe/Paris" -> "14:30"
      "Europe/London" -> "13:30"
      "America/New_York" -> "08:30"
      _ -> "Unknown timezone"
    end
  end
}

# Define function tools
weather_tool = Responses.Schema.build_function(
  "get_weather",
  "Get current weather for a location",
  %{location: {:string, description: "City name"}}
)

time_tool = Responses.Schema.build_function(
  "get_time",
  "Get current time in a timezone",
  %{timezone: {:string, description: "Timezone like Europe/Paris"}}
)

# Run the conversation with automatic function calling
responses = Responses.run(
  [
    input: "What's the weather and time in Paris?",
    tools: [weather_tool, time_tool]
  ],
  functions
)
```
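In miniature, the dispatch that run/2 automates is a lookup in the `functions` map followed by an invocation; the sketch below uses a hypothetical call shape (the real event struct emitted by the model may differ):

```elixir
# Same style of functions map as in the example above
functions = %{
  "get_weather" => fn %{"location" => location} ->
    case location do
      "Paris" -> "15°C, partly cloudy"
      _ -> "Weather data not available"
    end
  end
}

# Hypothetical function-call payload emitted by the model
call = %{"name" => "get_weather", "arguments" => %{"location" => "Paris"}}

# Look up the handler by name and invoke it with the decoded arguments
result = Map.fetch!(functions, call["name"]).(call["arguments"])
IO.puts(result)
```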
