-
Hm, definitely worth thinking about. I don't know much about running local LLMs yet. ;-)
-
This is being tracked in #13 now.
-
What do you think about using local / self-hosted LLMs, e.g. Ollama with a vision model? (For example: https://ollama.com/blog/vision-models)
Maybe it's slower, and maybe more expensive than a Pro subscription once you factor in your own hardware and electricity costs, but you don't have to upload your terabyte image collection to Google/....
But maybe I'm missing something with this idea.
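For anyone wondering what this could look like in practice, here is a minimal sketch, assuming Ollama is running locally on its default port with a vision model such as `llava` already pulled. The endpoint and payload fields follow Ollama's `/api/generate` REST API; the helper function names are just for illustration.

```python
import base64
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


def build_payload(image_bytes: bytes,
                  model: str = "llava",
                  prompt: str = "Describe this image.") -> dict:
    """Build a request body for Ollama's /api/generate endpoint.

    Vision models accept images as base64-encoded strings in the
    "images" field; "stream": False returns one complete JSON reply.
    """
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,
    }


def describe_image(path: str) -> str:
    """Send a local image to the locally running model and return its text reply."""
    with open(path, "rb") as f:
        payload = build_payload(f.read())
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Nothing here leaves the machine: the image bytes only travel over `localhost`, which is the whole appeal versus uploading a terabyte collection to a hosted API.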