Issues: BerriAI/litellm
#7494 · Issue with Bolt.diy Installation on macOS M1 Using Zsh Shell · opened Jan 1, 2025 by hithamsmadi
#7485 · [Bug]: Default team's keys are not persisted in the database (label: bug) · opened Dec 31, 2024 by qte123
#7483 · Error in function/tool calling in Claude models accessed via Bedrock (labels: bug, mlops user request) · opened Dec 31, 2024 by ash-01xor
#7477 · [Bug]: acompletion logs duplicates to langfuse (labels: bug, mlops user request) · opened Dec 30, 2024 by N13T (see the sketch below)
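
For context on #7477, this is a minimal sketch of the usual Langfuse callback setup in litellm; the model name and prompt are placeholders, and it assumes the LANGFUSE_PUBLIC_KEY / LANGFUSE_SECRET_KEY environment variables are set:

    import asyncio
    import litellm

    # Route successful completions to the Langfuse logger.
    litellm.success_callback = ["langfuse"]

    async def main():
        # A single acompletion call; the report above is that one call
        # can show up as duplicate entries in Langfuse.
        response = await litellm.acompletion(
            model="gpt-4o-mini",  # placeholder model
            messages=[{"role": "user", "content": "Hello"}],
        )
        print(response.choices[0].message.content)

    asyncio.run(main())
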
#7474 · [Feature]: batch set limit (label: enhancement) · opened Dec 30, 2024 by chenjianb
#7471 · [info]: Models compatible with OpenAI-compatible endpoints, such as Qwen, etc. (label: bug) · opened Dec 30, 2024 by Silence-Well
#7470 · [Bug]: Cannot use tools through LiteLLM Proxy (labels: bug, mlops user request) · opened Dec 30, 2024 by eliorc
#7468 · [Feature]: Get Response Type based on what is specified in the Response_Model (label: enhancement) · opened Dec 29, 2024 by 345ishaan
#7467 · [Bug]: Cannot use redis cache without TTL (label: bug) · opened Dec 29, 2024 by eliorc (see the sketch below)
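
For context on #7467, this is a minimal sketch of a Redis-backed cache setup in litellm; the host/port are placeholders, and the per-request ttl control is an assumption that may vary by litellm version:

    import litellm
    from litellm import completion
    from litellm.caching import Cache

    # Back litellm's response cache with Redis; connection details
    # here are placeholders.
    litellm.cache = Cache(type="redis", host="localhost", port="6379")

    # Per-request cache controls; the report above concerns what
    # happens when no TTL is supplied at all.
    response = completion(
        model="gpt-4o-mini",  # placeholder model
        messages=[{"role": "user", "content": "Hello"}],
        caching=True,
        cache={"ttl": 300},  # assumed per-request TTL control
    )
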
#7454 · [Bug]: supports_vision Method Incorrectly Returns False for Vision-Capable Models (label: bug) · opened Dec 28, 2024 by githubuser16384 (see the sketch below)
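
For context on #7454, the capability check in question is typically called like this (a minimal sketch; the model name is a placeholder):

    import litellm

    # supports_vision consults litellm's model metadata; per the report
    # above, it can return False for models that do accept image input.
    print(litellm.supports_vision(model="gpt-4o"))
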
Add "ollama/llama3.3" in "model_prices_and_context_window.json"
awaiting: user response
#7431
opened Dec 26, 2024 by
Tejhing
#7372 · [Feature]: print alert log to console? (label: enhancement) · opened Dec 23, 2024 by elvis-cai
#7366 · [Bug]: Invalid double await in ollama embeddings in Proxy (fix in report) (label: bug) · opened Dec 22, 2024 by aguadoenzo
#7360 · [Bug]: Vertex AI - Code Gecko stream not working (label: bug) · opened Dec 22, 2024 by ishaan-jaff
#7355 · [Bug]: JSON mode with Ollama assumes Function Calling (label: bug) · opened Dec 21, 2024 by sidjha1
#7338 · [Feature]: Automatic Handling of Files Larger Than 20MB for Gemini API (label: enhancement) · opened Dec 21, 2024 by icefox57
#7332 · [Bug]: Ollama as a custom provider does not default to sync (label: bug) · opened Dec 20, 2024 by shanbady
#7327 · [Bug]: Auth issues when trying to run a Replicate model (label: bug) · opened Dec 20, 2024 by geekodour
#7320 · [Feature]: Add Retry Logic for Guardrails, Allow Skipping Post-Call Rules, or Add a JSON Response Format Validator (label: enhancement) · opened Dec 20, 2024 by aleksandrphilippov
#7318 · [Bug]: Docker-based build for UI_BASE_PATH fails (label: bug) · opened Dec 19, 2024 by Jflick58
#7317 · [Bug]: Small inconsistencies found in LiteLLM_SpendLogs -> api_base (label: bug) · opened Dec 19, 2024 by stronk7