Releases: BerriAI/litellm

v1.7.11

29 Nov 05:52

💥 LiteLLM Router + Proxy handles 500+ requests/second

💥 LiteLLM Proxy now handles 500+ requests/second, load balances Azure + OpenAI deployments, and tracks spend per user 💥
Try it here: https://docs.litellm.ai/docs/simple_proxy
🔑 Support for AZURE_OPENAI_API_KEY on Azure (h/t @solyarisoftware): https://docs.litellm.ai/docs/providers/azure
⚡️ LiteLLM Router can now handle 20% more throughput: https://docs.litellm.ai/docs/routing (see the load-balancing sketch below)
📖 Improvements to the LiteLLM debugging docs (h/t @solyarisoftware): https://docs.litellm.ai/docs/debugging/local_debugging
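
The highlights above come together in the LiteLLM Router. Below is a minimal sketch of load balancing an Azure deployment and an OpenAI deployment behind a single model name; the deployment name, api_base, and keys are placeholders, and the full option list is at https://docs.litellm.ai/docs/routing.

    import os
    from litellm import Router

    model_list = [
        {
            "model_name": "gpt-3.5-turbo",  # alias callers use
            "litellm_params": {
                "model": "azure/my-azure-deployment",  # placeholder deployment name
                "api_key": os.environ["AZURE_OPENAI_API_KEY"],  # newly supported key name
                "api_base": "https://my-endpoint.openai.azure.com",  # placeholder endpoint
            },
        },
        {
            "model_name": "gpt-3.5-turbo",
            "litellm_params": {
                "model": "gpt-3.5-turbo",
                "api_key": os.environ["OPENAI_API_KEY"],
            },
        },
    ]

    router = Router(model_list=model_list)

    # The router picks one of the two deployments above for each call
    response = router.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)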

Full Changelog: v1.7.1...v1.7.11

v1.7.1

25 Nov 23:21

What's Changed

  • 🚨 From this release onwards, the LiteLLM Proxy uses async completion/embedding calls, which led to 30x more throughput for completion and embedding calls
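
The speedup comes from switching the proxy to LiteLLM's async entry points. A minimal sketch of calling them directly, assuming litellm.acompletion and litellm.aembedding with placeholder model names:

    import asyncio
    import litellm

    async def main():
        # Async chat completion - the proxy now awaits calls like this instead of blocking
        response = await litellm.acompletion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Hello"}],
        )
        print(response.choices[0].message.content)

        # Async embedding
        embedding = await litellm.aembedding(
            model="text-embedding-ada-002",
            input=["Hello world"],
        )
        print(len(embedding.data))

    asyncio.run(main())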

New Contributors

Full Changelog: v1.1.0...v1.7.1

v1.1.0

18 Nov 15:23

What's Changed

🚨 Breaking Change: v1.1.0 is only compatible with the OpenAI Python SDK v1.0.0+ (openai>=1.0.0)
Migration Guide: https://docs.litellm.ai/docs/migration

Key changes in v1.1.0

  • Requires openai>=1.0.0
  • openai.InvalidRequestError → openai.BadRequestError
  • openai.ServiceUnavailableError → openai.APIStatusError (see the exception-handling sketch after this list)
  • NEW litellm client, allowing users to pass api_key
    • litellm.Litellm(api_key="sk-123")
  • response objects now inherit from BaseModel (prev. OpenAIObject)
  • NEW default exception - APIConnectionError (prev. APIError)
  • litellm.get_max_tokens() now returns an int, not a dict
    import litellm

    max_tokens = litellm.get_max_tokens("gpt-3.5-turbo")  # now returns an int, not a dict
    assert max_tokens == 4097
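
A short sketch of what the exception renames mean for calling code, assuming LiteLLM keeps mapping provider errors to the OpenAI v1 exception types listed above (model name is a placeholder):

    import openai
    import litellm

    try:
        litellm.completion(
            model="gpt-3.5-turbo",
            messages=[{"role": "user", "content": "Hello"}],
        )
    except openai.BadRequestError as e:      # previously openai.InvalidRequestError
        print(f"Bad request: {e}")
    except openai.APIConnectionError as e:   # new default exception (prev. APIError)
        print(f"Connection problem: {e}")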

Other updates

New Contributors

Full Changelog: v0.11.1...v1.1.0

v0.11.1

23 Oct 15:54

What's Changed

  • Update __init__.py model_list to include bedrock models by @canada4663 in #609
  • proxy /models endpoint with the results of get_valid_models() by @canada4663 in #611 (see the sketch after this list)
  • fix: llm_provider add openai finetune compatibility by @Undertone0809 in #618
  • Update README.md by @Shivam250702 in #620
  • Verbose warning by @toniengelhardt in #625
  • Update the Dockerfile of the LiteLLM Proxy server and some refactorings by @coconut49 in #628
  • fix: updates to traceloop docs by @nirga in #639
  • docs: fixed typo in Traceloop docs by @nirga in #640
  • fix: disabled batch by default for Traceloop by @nirga in #643
  • Create GitHub Action to automatically build docker images by @coconut49 in #634
  • Tutorial for using LiteLLM within Gradio Chatbot Application by @dcruiz01 in #645
  • proxy server: fix langroid part by @pchalasani in #652
  • Create GitHub Action to automatically build docker images by @coconut49 in #655
  • deepinfra: Add supported models by @ichernev in #638
  • Update index.md by @Pratikdate in #663
  • Add perplexity namespace to model pricing dict by @toniengelhardt in #665
  • Incorrect boto3 parameter name by @shrikant14 in #671
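
For the new proxy /models endpoint (#611), a minimal sketch of the helper it wraps; the import path is an assumption (get_valid_models may also be exported at the package top level), and the returned list depends on which provider keys are set in your environment:

    import os
    from litellm.utils import get_valid_models  # assumed import path

    # get_valid_models() inspects the environment for provider API keys and
    # returns the model names LiteLLM can currently route to.
    os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder key
    print(get_valid_models())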

New Contributors

Full Changelog: v0.8.4...v0.11.1

v0.8.4

14 Oct 17:13

🚨 IMPORTANT: v0.8.4 has one major breaking change

What's Changed

  • Add missing litellm_provider for gpt-3.5-16k-0613 by @mocy in #436
  • added feedback button from feedbackrocket.io by @NANDINI-star in #443
  • Fix: merge conflict by @bitsnaps in #495
  • Update boto3 dependency to version 1.28.57, refactor bedrock client initialization and remove troubleshooting guide from documentation. by @coconut49 in #497
  • added model openrouter/mistralai/mistral-7b-instruct with test by @lucashofer in #498
  • add bedrock.anthropic support for system prompt using tag by @canada4663 in #499
  • remove .DS_Store and update .gitignore by @linediconsine in #500
  • Update README.md by @eltociear in #518
  • Update utils.py by @vedant-z in #530
  • [docs] minor typo correction by @Akash190104 in #537
  • Readme Update by @AnderMendoza in #556
  • Add host option to run_server() by @Sir-Photch in #558
  • Add support for passing external bedrock clients to completion by @zhooda in #562 (see the sketch after this list)
  • Add custom_openai type in provider list by @kylehh in #560
  • shorter langroid example, update section title by @pchalasani in #581
  • Fix usage open in colab link by @biplobsd in #605
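
For the external Bedrock client change (#562), a hedged sketch; the aws_bedrock_client keyword and the region are assumptions taken from the PR title, so check the Bedrock provider docs for the exact parameter name:

    import boto3
    import litellm

    # Build your own bedrock-runtime client, e.g. with a custom session or IAM role
    bedrock_client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = litellm.completion(
        model="bedrock/anthropic.claude-v2",
        messages=[{"role": "user", "content": "Hello"}],
        aws_bedrock_client=bedrock_client,  # assumed keyword from PR #562
    )
    print(response.choices[0].message.content)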

New Contributors

Full Changelog: v0.1.738...v0.8.4

v0.1.738

24 Sep 00:05

What's Changed

New Contributors

Full Changelog: v0.1.574...v0.1.738

v0.1.574

09 Sep 19:17

What's Changed

New Contributors

Full Changelog: v0.1.492...v0.1.574

v0.1.492

27 Aug 05:33

What's Changed

New Contributors

Full Changelog: v0.1.387...v0.1.492

v0.1.387

14 Aug 17:50

What's Changed

New Contributors

Full Changelog: https://github.com/BerriAI/litellm/commits/v0.1.387