Releases: BerriAI/litellm

v1.53.7-stable

08 Dec 03:14

What's Changed

Full Changelog: v1.53.6...v1.53.7-stable

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.7-stable
```
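The proxy started above serves an OpenAI-compatible API; the load tests below exercise its /chat/completions route. As a minimal sketch of what such a request looks like (the model name and API key here are illustrative placeholders, not values from this release):

```python
import json


def build_chat_request(model: str, prompt: str, api_key: str = "sk-1234"):
    """Build an OpenAI-compatible /chat/completions request for the proxy.

    The URL, model name, and key are placeholders for illustration only.
    """
    url = "http://localhost:4000/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, headers, json.dumps(body)


url, headers, payload = build_chat_request("gpt-4o", "Hello!")
```

Once the container is up, the same payload can be sent with curl or any HTTP client against port 4000.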

Don't want to maintain your internal proxy? Get in touch 🎉

Hosted Proxy Alpha: https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 287.825675572594 | 6.147983179332712 | 0.0 | 1839 | 0 | 225.9885929999541 | 1840.4691450000428 |
| Aggregated | Passed ✅ | 250.0 | 287.825675572594 | 6.147983179332712 | 0.0 | 1839 | 0 | 225.9885929999541 | 1840.4691450000428 |

v1.53.7

05 Dec 08:44

What's Changed

Full Changelog: v1.53.6...v1.53.7

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.7
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 285.5731596761653 | 6.103319742596985 | 0.0 | 1825 | 0 | 229.3374330000688 | 1651.5534569999772 |
| Aggregated | Passed ✅ | 250.0 | 285.5731596761653 | 6.103319742596985 | 0.0 | 1825 | 0 | 229.3374330000688 | 1651.5534569999772 |

v1.53.6

05 Dec 07:06

What's Changed

  • UI - fix `Application error: a client-side exception has occurred (see the browser console for more information)` by @ishaan-jaff in #7027
  • (UI) Load time improvement - Sub 2s load time for Home Page ⚡️ by @ishaan-jaff in #7014
  • add cohere/rerank-v3.5 to model cost map by @ishaan-jaff in #7035
  • (feat) add Vertex Batches API support in OpenAI format by @ishaan-jaff in #7032

Full Changelog: v1.53.5...v1.53.6

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.6
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 255.31679997733426 | 6.192691301327917 | 0.0 | 1853 | 0 | 212.72625400001743 | 2747.1794109999905 |
| Aggregated | Passed ✅ | 230.0 | 255.31679997733426 | 6.192691301327917 | 0.0 | 1853 | 0 | 212.72625400001743 | 2747.1794109999905 |

v1.53.5

04 Dec 18:21

What's Changed

Full Changelog: v1.53.4...v1.53.5

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.5
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 280.0 | 319.61681502986755 | 6.043486566751137 | 0.0 | 1808 | 0 | 233.45962199999803 | 4589.378371999999 |
| Aggregated | Failed ❌ | 280.0 | 319.61681502986755 | 6.043486566751137 | 0.0 | 1808 | 0 | 233.45962199999803 | 4589.378371999999 |

v1.53.4

04 Dec 06:43

What's Changed

  • (QOL fix) - remove duplicate code from datadog logger by @ishaan-jaff in #7013
  • (UI) Sub 1s Internal User Tab load time by @ishaan-jaff in #7007
  • (fix) allow gracefully handling DB connection errors on proxy by @ishaan-jaff in #7017
  • (refactor) - migrate router.deployment_callback_on_success to use StandardLoggingPayload by @ishaan-jaff in #7015
  • (fix) 'utf-8' codec can't encode characters error on OpenAI by @ishaan-jaff in #7018

Full Changelog: v1.53.3...v1.53.4

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.4
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |
| Aggregated | Passed ✅ | 250.0 | 291.34812296252045 | 6.153959693113714 | 0.0 | 1841 | 0 | 223.70142199997645 | 2984.8669300000097 |

v1.53.3.dev2

04 Dec 02:02

What's Changed

Full Changelog: v1.53.3...v1.53.3.dev2

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.3.dev2
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 230.0 | 254.71533959127268 | 6.282620253609997 | 0.0 | 1879 | 0 | 197.70822599997473 | 3258.8738069999863 |
| Aggregated | Passed ✅ | 230.0 | 254.71533959127268 | 6.282620253609997 | 0.0 | 1879 | 0 | 197.70822599997473 | 3258.8738069999863 |

v1.53.3-dev1

03 Dec 19:26

What's Changed

Full Changelog: v1.53.2...v1.53.3-dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.3-dev1
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 270.0 | 322.8250497930263 | 5.940031623578464 | 0.0 | 1778 | 0 | 227.83484099994666 | 3640.05648899996 |
| Aggregated | Failed ❌ | 270.0 | 322.8250497930263 | 5.940031623578464 | 0.0 | 1778 | 0 | 227.83484099994666 | 3640.05648899996 |

v1.53.3

03 Dec 21:04

What's Changed

Full Changelog: v1.53.2...v1.53.3

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.3
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 260.0 | 295.3963784342538 | 6.049806369807933 | 0.0 | 1810 | 0 | 224.3657600000688 | 2447.638761999997 |
| Aggregated | Passed ✅ | 260.0 | 295.3963784342538 | 6.049806369807933 | 0.0 | 1810 | 0 | 224.3657600000688 | 2447.638761999997 |

v1.53.2

03 Dec 04:52

What's Changed

Full Changelog: v1.53.1...v1.53.2

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.2
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Failed ❌ | 250.0 | 318.2618058818948 | 6.0033656688808605 | 0.003344493408847276 | 1795 | 1 | 225.67902299999787 | 55505.375238 |
| Aggregated | Failed ❌ | 250.0 | 318.2618058818948 | 6.0033656688808605 | 0.003344493408847276 | 1795 | 1 | 225.67902299999787 | 55505.375238 |

v1.53.1.dev1

29 Nov 04:45

Full Changelog: v1.53.1...v1.53.1.dev1

Docker Run LiteLLM Proxy

```shell
docker run \
-e STORE_MODEL_IN_DB=True \
-p 4000:4000 \
ghcr.io/berriai/litellm:main-v1.53.1.dev1
```

Load Test LiteLLM Proxy Results

| Name | Status | Median Response Time (ms) | Average Response Time (ms) | Requests/s | Failures/s | Request Count | Failure Count | Min Response Time (ms) | Max Response Time (ms) |
|---|---|---|---|---|---|---|---|---|---|
| /chat/completions | Passed ✅ | 250.0 | 285.15501328346113 | 6.138794444114975 | 0.0 | 1838 | 0 | 223.90917799998533 | 2684.1706850000264 |
| Aggregated | Passed ✅ | 250.0 | 285.15501328346113 | 6.138794444114975 | 0.0 | 1838 | 0 | 223.90917799998533 | 2684.1706850000264 |