So I ran across your project and wanted to try it out. I have a GPU, so I'm trying the beta-test curl installer. However, it fails because it assumes I don't have Ollama already installed. The reason I use a local Ollama is that I have a partition at /mnt/llm where I keep my downloaded LLMs. I also want a local Ollama for other things, such as aider-chat. What would be the best way to deal with this?
- Choice between using an existing Ollama or installing a new instance
- Improved flexibility for users with pre-existing Ollama setups
- Prevention of conflicts with custom Ollama configurations
This change addresses issues for users with existing Ollama setups, particularly those with custom LLM directories. It improves flexibility and prevents conflicts with pre-existing Ollama configurations.
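A minimal sketch of what that choice could look like in a shell installer, assuming an existing Ollama that answers on its default port 11434; the prompt text, variable names, and OLLAMA_HOST handling are illustrative assumptions, not the project's actual installer:

```bash
#!/usr/bin/env bash
# Illustrative sketch only, not the project's installer.
# Detect an existing Ollama: CLI on PATH, or API answering on the default port.
if command -v ollama >/dev/null 2>&1 || curl -sf http://localhost:11434/api/version >/dev/null 2>&1; then
  echo "Existing Ollama installation detected."
  read -rp "Use the existing Ollama instead of installing a bundled one? [Y/n] " answer
  case "${answer:-Y}" in
    [Nn]*) use_existing=0 ;;
    *)     use_existing=1 ;;
  esac
else
  use_existing=0
fi

if [ "$use_existing" -eq 1 ]; then
  # Point the app at the local instance instead of installing another one.
  export OLLAMA_HOST="${OLLAMA_HOST:-127.0.0.1:11434}"
  echo "Skipping bundled Ollama; using $OLLAMA_HOST"
else
  echo "Installing bundled (Docker) Ollama as before..."
  # ... existing install steps ...
fi
```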
Oh, very nice! Thank you. In the meantime, I have just gone into the beta folder and symlinked my ollama_data to my partition. Ollama works from the web page as well as from the CLI, using the Docker Ollama set up by Belullama. Just a workaround; looking forward to seeing your updates.
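For anyone hitting the same problem, that workaround looks roughly like this; only /mnt/llm and ollama_data come from this thread, and the beta install path is a placeholder since it isn't stated here:

```bash
# Rough reconstruction of the symlink workaround above.
BETA_DIR="$HOME/beta"                                    # placeholder: wherever the curl installer put the beta files
mv "$BETA_DIR/ollama_data" "$BETA_DIR/ollama_data.bak"   # keep the original data dir as a backup
ln -s /mnt/llm "$BETA_DIR/ollama_data"                   # reuse the existing model partition
```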