Support LocalAI #15
This could be as simple as allowing us to change the base URL for the ChatGPT endpoint, since some of our local tools are now ChatGPT API compatible.
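As a rough illustration, here is what that could look like with the openai Node package, assuming a local OpenAI-compatible server is running; the base URL, API key, and model name below are placeholders, not anything this project ships today:

```ts
import OpenAI from "openai";

// Point the existing OpenAI client at a local, OpenAI-compatible server.
// The base URL, key, and model name are placeholders for whatever runs locally.
const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // e.g. Ollama's OpenAI-compatible endpoint
  apiKey: "not-needed-locally",         // most local servers ignore the key
});

const completion = await client.chat.completions.create({
  model: "llama3.1",
  messages: [{ role: "user", content: "Summarize this note in one sentence." }],
});

console.log(completion.choices[0].message.content);
```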
It would be great if this project supported local AI such as Ollama, LM Studio, etc.
Hi @dxcore35, @creuzerm, @AndyDavis8,
I opened #30 for this. Will look into it.
@dxcore35
From what I can see, the issue here is that the models don't adhere to the requested format: in one case the model prefaces the JSON with extra text, in another it appends text after the JSON. It's impossible to account for that lack of prompt adherence. I am not sure whether the format parameter works with the OpenAI-node library. For now I can only recommend trying different models to see if one will do. Feel free to also send some debug information about the completions you received.
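For what it's worth, openai-node does expose a response_format option on chat completions that asks for JSON-only output; whether a given local backend honors it is uncertain. A minimal sketch, with the base URL, model, and prompts as placeholder assumptions:

```ts
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // placeholder local endpoint
  apiKey: "local",
});

// Ask for JSON-only output. OpenAI-compatible servers receive this field,
// but a local backend may ignore it, so the reply still needs validation.
const completion = await client.chat.completions.create({
  model: "llama3.1",
  response_format: { type: "json_object" },
  messages: [
    { role: "system", content: "Reply with a single JSON object and nothing else." },
    { role: "user", content: "Extract a title and tags from: Weekly review #planning" },
  ],
});

// Tolerate models that still wrap the JSON in extra text.
const raw = completion.choices[0].message.content ?? "";
const match = raw.match(/\{[\s\S]*\}/);
console.log(match ? JSON.parse(match[0]) : raw);
```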
I think this needs to be implemented for Ollama: they added structured outputs a few days back! It forces any model to adhere to JSON.
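Ollama's structured outputs work by passing a JSON schema in the format field of its native /api/chat endpoint, which constrains the model's output to that schema. A rough sketch assuming Ollama on its default port; the model name, schema, and prompt are illustrative only:

```ts
// Call Ollama's native chat endpoint with a JSON schema in `format`.
const response = await fetch("http://localhost:11434/api/chat", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3.1",
    stream: false,
    messages: [
      { role: "user", content: "Give me a title and three tags for a note about weekly planning." },
    ],
    format: {
      type: "object",
      properties: {
        title: { type: "string" },
        tags: { type: "array", items: { type: "string" } },
      },
      required: ["title", "tags"],
    },
  }),
});

const data = await response.json();
// The content should now parse as JSON matching the schema above.
console.log(JSON.parse(data.message.content));
```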
@dxcore35 I will be looking into this, though it might require some significant work and possibly an alternative to OpenAI-node.
Please, please!!!
Support LocalAI: https://www.youtube.com/watch?v=3yPVDI8wZcI
It is already implemented in the plugin https://github.com/logancyang/obsidian-copilot
I don't think it would take too much work to copy the relevant code from that project and implement fully local AI.
Thank you very much for your work!