Responses - LLM library supporting OpenAI's and xAI's Responses API

Responses is a new client library that supports the Responses API from OpenAI and xAI. It’s a successor to my previous library openai_responses, which only supported OpenAI - hence the name change.

What is the Responses API?

Conveniently, OpenAI just published a blog post titled “Why we built Responses API”. In short, it’s a modern (launched this spring), stateful, multimodal, efficient API. It’s an upgrade to Chat Completions, the de facto industry standard that OpenAI designed in just a weekend back in 2023.

And xAI just launched its own support for the Responses API last week.

Why a dedicated library?

There is no shortage of Elixir libraries that support a broad range of LLM providers - from the battle-tested LangChain to ReqLLM, which was announced a couple of weeks ago. However, I think there is still space for a small, dedicated library that doesn’t need to bring many different providers to a common denominator. And now that xAI supports the same standard, users of Responses will get additional flexibility in avoiding vendor lock-in.

Some examples of usage

The simple usage is, well, simple:

# Explicit input and model are required
{:ok, response} = Responses.create(input: "Write me a haiku about Elixir", model: "gpt-4.1-mini")
# -> {:ok, %Responses.Response{text: ..., ...}}

IO.puts(response.text)

You also get automatic cost calculations,

response =
  Responses.create!(
    input: [
      %{role: :developer, content: "Talk like a pirate."},
      %{role: :user, content: "Write me a haiku about Elixir"}
    ],
    model: "grok-4-fast"
  )

IO.puts("#{response.text}\n\nCost: $#{response.cost.total_cost}")
# Arrr, Elixir's code,
# Functional waves crash concurrent,
# Phoenix rises strong.
#
# Cost: $0.00015560

streaming and structured output support,

Responses.stream(
  input: "Tell me about the first 2 U.S. Presidents",
  schema: %{presidents: {:array, %{name: :string, birth_year: :integer}}},
  model: "gpt-4.1-mini"
)
|> Responses.Stream.json_events()
|> Stream.each(&IO.inspect/1)
|> Stream.run()

function calls, web search with OpenAI, prompt helpers, and more.
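To give a flavor of function calling, here is a minimal sketch. The tools: option and the get_weather definition below are assumptions that mirror the standard Responses API tool format rather than a verbatim excerpt from the library’s docs, so check the README before copying:

# Hedged sketch: assumes the library forwards `tools:` to the Responses API
# in the standard tool format. Verify the exact option name in the docs.
weather_tool = %{
  type: "function",
  name: "get_weather",
  description: "Return the current weather for a given city",
  parameters: %{
    type: "object",
    properties: %{city: %{type: "string"}},
    required: ["city"]
  }
}

response =
  Responses.create!(
    input: "What is the weather in Paris right now?",
    model: "gpt-4.1-mini",
    tools: [weather_tool]
  )

# Inspect the response to see the function call the model requested;
# your code then runs the function and sends the result back in a follow-up call.
IO.inspect(response)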

Please try it out and tell me what you think!
