Replies: 4 comments 2 replies
-
Hi, @mfshaho. I will transfer this issue to a discussion. In any case, the problem is that the Ollama API doesn't support function calling yet, so right now it isn't possible to do this directly. It is possible to do it with prompt injection, and I'm making progress along that line. I hope to have a working approach soon.
-
Just a heads-up that Ollama now supports tool calling, but it seems only via the OpenAI-compatible endpoint; see e.g. the mistral-nemo model. I don't know whether the feature will soon follow on the Ollama-specific chat endpoint, but my hunch is maybe not; the OpenAI-compatible endpoint is probably the primary endpoint to use for Ollama moving forward.
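For readers landing here, the comment above refers to tool calling through Ollama's OpenAI-compatible endpoint. Below is a minimal sketch of what such a request body looks like, assuming a local Ollama server exposing `/v1/chat/completions` on the default port 11434; the `get_current_weather` tool and its schema are hypothetical examples, not part of any API.

```python
import json

def build_tool_call_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat request that offers one tool.

    The shape follows the OpenAI Chat Completions `tools` parameter,
    which is what Ollama's compatibility endpoint accepts. The weather
    tool here is a made-up example for illustration.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    "name": "get_current_weather",
                    "description": "Get the current weather for a city",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "city": {"type": "string"},
                        },
                        "required": ["city"],
                    },
                },
            }
        ],
    }

# Model name taken from the comment above; any tool-capable model works.
request = build_tool_call_request("mistral-nemo", "What's the weather in Paris?")
print(json.dumps(request, indent=2))
# POST this body to http://localhost:11434/v1/chat/completions; if the
# model decides to use the tool, the response's first choice carries a
# `tool_calls` array instead of plain text content.
```

This is only a request sketch; actually sending it requires a running Ollama instance with a tool-capable model pulled.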
-
Thanks @lebrunel. This provider made sense because Ollama's OpenAI API did not work well. If they eventually make Ollama 100% compatible with the OpenAI API, I will close the repository. Personally, I think it is absurd that Ollama maintains two different APIs; they should focus on being compatible with OpenAI.
-
They finally added tool-calling support to Ollama's own API. I hope to release a new version using it soon.
-
The AI SDK says that in order to call tools using the LLM, the model needs to support tool calling.
Using this Ollama provider, which Ollama models are capable of tool calling?