
Can it be deployed locally and processed by local LLM? #256

Open
TickToTock opened this issue Aug 20, 2024 · 4 comments

@TickToTock

I would like to ask whether it can be deployed purely locally, calling a local Ollama instance to process this information with a local LLM such as llama3 or mistral.
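
For context, here is a minimal sketch of what "calling the local Ollama" could look like, using Ollama's public REST API on its default port (11434) with a model already pulled via `ollama pull llama3`. The `summarizeLocally` helper and the prompt are hypothetical illustrations, not this project's integration code:

```ts
// Minimal sketch: calling a locally running Ollama server via its REST API.
// Assumes Ollama is listening on its default port and `llama3` has been pulled.

interface OllamaGenerateResponse {
  model: string;
  response: string;
  done: boolean;
}

async function summarizeLocally(text: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",   // or "mistral" -- any model pulled into Ollama
      prompt: `Summarize the following content:\n\n${text}`,
      stream: false,     // return a single JSON object instead of a token stream
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = (await res.json()) as OllamaGenerateResponse;
  return data.response;
}
```

Nothing in this flow leaves the machine, which is the point of the request.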

@douglasg14b

Very much this; keeping data local and simple, without relying on 3rd-party cloud services, is a must for many.

Dhravya closed this as completed Jan 21, 2025
@douglasg14b

Was this closed as completed because it's not planned, or closed as completed because the feature now exists?

@Dhravya
Collaborator

Dhravya commented Jan 28, 2025

As of right now, this is fully possible, but only in the self-hosted version. Would you be interested in an option like that in the local version?

Dhravya reopened this Jan 28, 2025
@douglasg14b

Thanks for the response @Dhravya

I may be slow on the uptake here: self-hosted and local, are these not the same thing?
