Dialyzer error - trying to handle :error responses with multiple values in the error tuple

I’m having an issue getting Dialyzer to agree with my code, and can’t seem to figure it out. Essentially, the response is {:error, %Mint.HTTP2{}, %Mint.HTTPError{}} instead of just {:error, _}, and I can’t get the spec to reflect that.

This is the issue I’m facing: I keep getting an error from my Mint transport (which is a whole separate issue), and Dialyzer keeps warning that my handle_response can’t match the type. After looking into it more, the error tuple carries two values after :error, in the shape of {:error, %Mint.HTTP2{}, %Mint.HTTPError{}}.

Here is my handle_response:

  @spec handle_response(
          {:ok, [map()] | function()}
          | {:error, String.t()}
          | {:error, %Mint.HTTP2{}, %Mint.HTTPError{}},
          Keyword.t()
        ) ::
          {:ok, [map()] | function()} | {:error, term()}
  defp handle_response(resp, opts)

  defp handle_response({:ok, stream} = resp, stream: true) when is_function(stream), do: resp

  defp handle_response({:ok, resp}, opts) when is_binary(resp),
    do: handle_response({:ok, Jason.decode!(resp)}, opts)

  defp handle_response({:ok, resp}, _), do: {:ok, Map.get(resp, "choices", [])}

  defp handle_response({:error, _} = err, _), do: err

  defp handle_response({:error, http, http_error}, _)
       when is_map(http) and is_map(http_error),
       do: {:error, http_error}

But I keep getting an error from my dialyzer here:

lib/broadcast.ex:129:pattern_match
The pattern can never match the type.

Pattern:
{:error, _http, _http_error}, _

Type:
{:error, _} | {:ok, _}, [{:stream, _}, ...]

I’m not sure why; I’m fairly sure the spec is correct. Can anyone see what’s wrong with it, or does it look reasonable?

After searching some more, I found an open issue that “solves” the problem I’m facing by downgrading the protocol to HTTP/1: Mint adapter cannot upload more than 65535 bytes on HTTP/2 · Issue #394 · elixir-tesla/tesla · GitHub

That said, I’m still not sure why Dialyzer is unhappy.

Can you show the code that calls handle_response? The error message suggests that Dialyzer has inferred that the {:error, map(), map()} case “can’t happen” despite it being observed in production.


Definitely! The code is called here:

  defp do_generate(%{messages: _messages} = params, stream, opts) do
    opts
    |> OpenAI.Client.new()
    |> OpenAI.Chat.create_completion(params, opts)
    |> handle_response(stream: stream)
  end

And in the create_completion function:

  def create_completion(client, params, opts \\ []) do
    client
    |> Client.post("/v1/chat/completions", params, opts)
    |> Client.handle_response(opts)
  end

And the client’s handle_response:

  @doc false
  @spec handle_response(result(), Keyword.t()) :: {:ok, body()} | {:error, term()}
  def handle_response(response, opts \\ [])

  def handle_response({:ok, %Tesla.Env{status: status, body: body}}, _opts) when status >= 400,
    do: {:error, body}

  def handle_response({:ok, %Tesla.Env{body: body}}, _opts), do: {:ok, body}

  def handle_response({:error, _reason} = err, _opts), do: err

Writing this out, I’m realizing the problem might be coming from the client’s handle_response; maybe I need to update that one?

(assuming that do_generate and the handle_response from your initial post are in the same module)

Dialyzer is telling you that the {:error, _, _} branch in handle_response can’t be reached, because create_completion returns whatever Client.handle_response returns, and that can only be {:ok, body()} | {:error, term()}.

As shown, Client.handle_response will crash if Client.post ever returns {:error, _, _}, since it doesn’t have a matching clause. You’ll likely need to update one or both of the following (a rough sketch follows the list):

  • the definition of result in Client, if it doesn’t already include the 3-tuple case
  • the return type of Client.handle_response, if you want to handle the 3-tuple case in the caller instead of in Client
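Something along these lines could work in Client (just a rough sketch: I’m assuming the 3-tuple really does come straight out of Client.post, and that result() and body() are types you already define there):

  @type result ::
          {:ok, Tesla.Env.t()}
          | {:error, term()}
          | {:error, Mint.HTTP2.t(), Mint.HTTPError.t()}

  @spec handle_response(result(), Keyword.t()) :: {:ok, body()} | {:error, term()}
  def handle_response(response, opts \\ [])

  def handle_response({:ok, %Tesla.Env{status: status, body: body}}, _opts) when status >= 400,
    do: {:error, body}

  def handle_response({:ok, %Tesla.Env{body: body}}, _opts), do: {:ok, body}

  def handle_response({:error, _reason} = err, _opts), do: err

  # New clause: normalize the Mint 3-tuple so callers only ever see {:ok, _} | {:error, _}
  def handle_response({:error, _conn, reason}, _opts), do: {:error, reason}

With that in place, the private handle_response in your other module no longer needs the 3-tuple clause (or the 3-tuple in its spec), because Client.handle_response will already have flattened it.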

One thing to understand about Dialyzer is that it starts out not caring about your spec at all.

Dialyzer does type inference, and for the given functions it inferred that the input will never be a 3-tuple. Your function matches on such a tuple though, which is surfaced as an error.

The spec of the function is compared against the inferred types, and mismatches (i.e. where the spec has no overlap with the inferred type) are reported. IIRC, the overlap between the inferred return type and the return type in the spec is then used as the input for later inference.

So an incorrect typespec can mess up validation for function calls deeper down the call stack, but a typespec cannot “widen” what is considered a possible input to a function call beyond the inferred type.
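To see the mechanics in isolation, here’s a tiny self-contained example (not your code): because helper/1 is private, Dialyzer infers its input type purely from the single call site in run/0, so the 3-tuple clause gets the same “pattern can never match” warning no matter what the @spec claims:

  defmodule InferenceDemo do
    def run do
      resp =
        case :rand.uniform(2) do
          1 -> {:ok, "payload"}
          2 -> {:error, :boom}
        end

      helper(resp)
    end

    # The spec claims a 3-tuple is possible, but the inferred input type from run/0
    # is {:ok, binary()} | {:error, :boom}, so Dialyzer flags the last clause.
    @spec helper({:ok, term()} | {:error, term()} | {:error, term(), term()}) :: term()
    defp helper({:ok, body}), do: body
    defp helper({:error, reason}), do: reason
    defp helper({:error, _conn, reason}), do: reason
  end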
