AgentObs - LLM Agent Observability for Elixir

I'm excited to share AgentObs, a new library that brings rich observability to
LLM agent applications in Elixir!

What is AgentObs?

AgentObs provides a simple, idiomatic interface for tracking agent loops, tool
calls, LLM requests, and prompts in your Elixir applications. It uses native
:telemetry events and translates them to OpenTelemetry spans with
OpenInference semantic conventions.
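Since everything flows through :telemetry, you can also attach your own handlers directly instead of (or alongside) the OpenTelemetry backend. A minimal sketch — note that the event name `[:agent_obs, :agent, :stop]` is an illustrative guess, not a documented AgentObs event:

```elixir
# Illustrative only: the exact telemetry event names AgentObs emits
# aren't listed in this post, so [:agent_obs, :agent, :stop] is assumed.
:telemetry.attach(
  "my-agent-logger",
  [:agent_obs, :agent, :stop],
  fn _event, measurements, metadata, _config ->
    # Log span duration and metadata whenever an agent loop finishes
    IO.inspect({measurements, metadata}, label: "agent span finished")
  end,
  nil
)
```

`:telemetry.attach/4` is the standard way to subscribe to a single event; use `:telemetry.attach_many/4` to cover start/stop/exception events in one handler.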

Key Features:

  • High-level instrumentation helpers - trace_agent/3, trace_tool/3,
    trace_llm/3, trace_prompt/3
  • Optional ReqLLM integration - automatic instrumentation with token tracking
    and streaming support
  • Pluggable backend architecture - Phoenix (OpenInference), generic
    OpenTelemetry, or custom handlers
  • Rich metadata tracking - token usage, costs, tool calls, and more
  • Built on OTP - supervised handlers with fault tolerance

Quick Example

defmodule MyApp.WeatherAgent do
  def get_forecast(city) do
    AgentObs.trace_agent("weather_forecast", %{input: "What's the weather in #{city}?"}, fn ->
      # Call LLM to determine tool to use
      {:ok, _response, _metadata} =
        AgentObs.trace_llm("gpt-4o", %{
          input_messages: [%{role: "user", content: "Get weather for #{city}"}]
        }, fn ->
          response = call_openai(...)
          {:ok, response, %{
            output_messages: [%{role: "assistant", content: response}],
            tokens: %{prompt: 50, completion: 25, total: 75}
          }}
        end)

      # Execute the tool
      {:ok, weather_data} =
        AgentObs.trace_tool("get_weather_api", %{arguments: %{city: city}}, fn ->
          {:ok, %{temp: 72, condition: "sunny"}}
        end)

      {:ok, "The weather in #{city} is #{weather_data.condition}", %{
        tools_used: ["get_weather_api"]
      }}
    end)
  end
end
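The feature list also mentions trace_prompt/3, which the example above doesn't show. Assuming it follows the same (name, metadata, fun) shape as the other helpers, tracing a prompt-rendering step might look like this — `render_template/2` and the metadata keys are hypothetical placeholders:

```elixir
# Hypothetical sketch: trace_prompt/3 is assumed to mirror the
# (name, metadata, fun) shape of trace_agent/3 and trace_llm/3.
{:ok, prompt, _metadata} =
  AgentObs.trace_prompt("weather_prompt", %{template: "Get weather for {{city}}"}, fn ->
    # render_template/2 stands in for your own templating code
    prompt = render_template("Get weather for {{city}}", city: city)
    {:ok, prompt, %{variables: %{city: city}}}
  end)
```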

ReqLLM Integration

For apps using ReqLLM, AgentObs provides helpers
that automatically extract tokens, tool calls, and more:

# Non-streaming
{:ok, response} =
  AgentObs.ReqLLM.trace_generate_text(
    "anthropic:claude-3-5-sonnet",
    [%{role: "user", content: "Hello!"}]
  )

# Streaming
{:ok, stream_response} =
  AgentObs.ReqLLM.trace_stream_text(
    "anthropic:claude-3-5-sonnet",
    [%{role: "user", content: "Tell me a story"}]
  )

Visualization

Use AgentObs with Arize Phoenix to get beautiful trace visualization with chat
messages, token usage, and nested spans.
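The post doesn't show the wiring, but a typical OpenTelemetry Elixir setup pointing at a locally running Phoenix instance (which accepts OTLP traces on port 6006 by default) would look something like this in `config/runtime.exs` — adjust the endpoint for your deployment:

```elixir
# Standard opentelemetry / opentelemetry_exporter configuration;
# Arize Phoenix listens for OTLP over HTTP on port 6006 by default.
config :opentelemetry,
  span_processor: :batch,
  traces_exporter: :otlp

config :opentelemetry_exporter,
  otlp_protocol: :http_protobuf,
  otlp_endpoint: "http://localhost:6006"
```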

Links
