Instructor brought the very idea of structured LLM prompting to the Elixir ecosystem. InstructorLite aims to distill it into a composable set of simple tools that people can build upon.
What’s new in InstructorLite 0.1.0 compared to Instructor 0.0.5?
- More granular API for more flexibility
- Anthropic adapter
- Reduced source code complexity
- All streaming capabilities removed
- BYO JSON schema as an official policy. The schema converter is still there, but it's best-effort
- The Llamacpp adapter now uses JSON schema instead of GBNF grammar
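To sketch what the BYO policy means in practice: instead of relying on the best-effort converter, you hand-write the JSON schema yourself and pass it alongside your options. The schema below and the commented `InstructorLite.instruct/2` call are an illustrative assumption, not copied from the docs; check the hexdocs for the exact option names.

```elixir
# A hand-written JSON schema for a simple response model (hypothetical example).
json_schema = %{
  type: "object",
  required: ["name", "age"],
  additionalProperties: false,
  properties: %{
    name: %{type: "string"},
    age: %{type: "integer"}
  }
}

# You would then pass it to InstructorLite yourself rather than letting the
# converter derive it, along the lines of (not executed here; option names
# are an assumption):
#
#   InstructorLite.instruct(params,
#     response_model: %{name: :string, age: :integer},
#     json_schema: json_schema
#   )
```

The upside of this policy is that the library never has to guess how to map every Ecto type onto JSON schema; you stay in full control of what the model is constrained to emit.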
I didn’t test it with Ollama for 0.1.0, so I don’t think any of the built-in adapters will work with it out of the box. It should be pretty straightforward to write your own adapter, though! And I’ll definitely look into it in the future.
Appreciate your work! I just returned to a project using Instructor today and bumped into the compatibility issue with Ecto 3.12. Tried the fork that attempted to fix it but it didn’t work (bummer that instructor doesn’t seem to be maintained anymore!). But migrating to instructor_lite was a breeze and everything works great now.
I looked into Ollama, and the implementation would depend heavily on the underlying open-source model, so I don’t believe it makes sense to implement a universal adapter. However, I went ahead and created an example cookbook on how to implement your own Ollama adapter. It pretty much includes a functional adapter for the llama3.1 model.