feat: rust interpreter support for modelResponse, ollama-rs tool calling, and no-stream #883
For now, we'll have to ride my branch of ollama-rs until I've had a discussion with them about the small changes needed on their side to support tool calling. See pepperoni21/ollama-rs#185.
(ollama-rs does have a notion of tools, but its code currently assumes that all tools are statically defined as Rust functions, which is not compatible with PDL's approach. The required ollama-rs changes aren't big: really just making a few of their Tool* structs public and enabling the Deserialize derive on them.)
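For context, here is a minimal sketch of the kind of data-driven tool definition this would enable. The struct and field names below are purely illustrative, not ollama-rs's actual API; the point is only that once the tool-definition structs are public and derive Deserialize, PDL can build them from a runtime tool spec instead of from statically compiled Rust functions.

```rust
// Hypothetical sketch (not ollama-rs's real types): a tool definition that
// can be deserialized from data at runtime, e.g. from a PDL program.
use serde::Deserialize;
use serde_json::Value;

#[derive(Debug, Deserialize)]
struct ToolFunction {
    name: String,
    description: String,
    // JSON-Schema-like description of the tool's parameters
    parameters: Value,
}

#[derive(Debug, Deserialize)]
struct Tool {
    #[serde(rename = "type")]
    tool_type: String, // typically "function"
    function: ToolFunction,
}

fn main() -> Result<(), serde_json::Error> {
    // A tool spec arriving as data rather than as compiled Rust code.
    let spec = r#"{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": { "city": { "type": "string" } },
                "required": ["city"]
            }
        }
    }"#;
    let tool: Tool = serde_json::from_str(spec)?;
    println!("registered dynamic tool: {}", tool.function.name);
    Ok(())
}
```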