I would like to ask whether this can be deployed purely locally, calling a local Ollama instance to process this information with a local LLM such as llama3 or mistral.
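For context, talking to a locally running Ollama server does not require any cloud service: Ollama exposes a REST API on `localhost:11434`, and the `/api/generate` endpoint accepts a model name and a prompt. The sketch below is a minimal, hypothetical illustration of that call using only the Python standard library; it assumes `ollama serve` is running and the model (e.g. `llama3`) has already been pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint (no third-party cloud service involved).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a stream.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send the prompt to a locally running Ollama server and return its text."""
    body = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example usage (requires a running Ollama server with the model pulled):
#   $ ollama pull llama3
#   text = generate("llama3", "Summarize this note: meeting moved to 3pm.")
```

The same pattern works with `mistral` or any other model Ollama has pulled; only the `model` string changes.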
Very much this, keeping data local, simple, and not relying on 3rd party cloud services is a must for many.
Was this closed as "completed" because it's not planned, or as completed in the sense that the feature already exists?
As of right now, this is fully possible, but only in a self-hosted version. Would you be interested in an option like that in the local version?
Thanks for the response @Dhravya
I may be slow on the uptake here: self-hosted and local, are these not the same thing?