Is there a way to retrieve the tokens used when calling an action via AshAi's prompt/2?
The only workaround I've found is enabling `verbose?`, but that only prints the usage in the logs. I'd like it returned to the caller so I can calculate the total across multiple calls.
I haven’t tried this yet, but my understanding is that prompt-backed actions take a LangChain model instance (e.g. LangChain.ChatModels.ChatOpenAI.new!/1) as the first argument to prompt/2.
If that's true, shouldn't you be able to attach a Telemetry handler to LangChain's built-in telemetry events? Or, failing that, wrap the call and emit your own Telemetry metrics?
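For what it's worth, a handler along these lines is what I had in mind. This is a rough sketch only — the event name and measurement key below are assumptions, not confirmed LangChain events, so check what the library actually emits before relying on it:

```elixir
defmodule MyApp.TokenUsage do
  @moduledoc """
  Accumulates token counts reported via Telemetry.
  The event name `[:langchain, :chat, :complete]` and the
  `:total_tokens` measurement key are ASSUMPTIONS for illustration.
  """

  def start_link do
    # Simple counter process; replace with ETS/your own store as needed.
    Agent.start_link(fn -> 0 end, name: __MODULE__)
  end

  def attach do
    :telemetry.attach(
      "my-app-token-usage",
      [:langchain, :chat, :complete], # assumed event name
      &__MODULE__.handle_event/4,
      nil
    )
  end

  def handle_event(_event, measurements, _metadata, _config) do
    tokens = Map.get(measurements, :total_tokens, 0)
    Agent.update(__MODULE__, &(&1 + tokens))
  end

  def total, do: Agent.get(__MODULE__, & &1)
end
```

You'd call `MyApp.TokenUsage.start_link/0` and `attach/0` at application start, then read `total/0` after your prompt calls.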
I would be pretty surprised if there’s no way to track token usage with ash_ai.
It's not possible currently, but it could be added pretty easily — I just haven't gotten around to it. We would add a custom type to AshAi, something like AshAi.Response, that carries additional metadata. So it would look like this:
action :whatever, AshAi.Response do
  constraints [type: <actual_type>, constraints: <actual_constraints>]
end
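If that existed, summing usage across calls would be straightforward on the caller's side. Everything below is hypothetical — the `usage` field and its keys are illustrative, not an existing AshAi API:

```elixir
# Hypothetical: assumes each prompt-backed action returns an
# %AshAi.Response{} with a `usage` map containing token counts.
responses = [
  # ...results of several prompt-backed action calls...
]

total_tokens =
  responses
  |> Enum.map(fn %AshAi.Response{usage: usage} -> usage[:total_tokens] || 0 end)
  |> Enum.sum()
```

The nice part of a wrapper type like this is that the actual action result and the metadata travel together, so no log scraping or Telemetry plumbing is needed just to get a number back.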