Releases · Zipstack/unstract-sdk
v0.40.0
What's Changed
- Vector DB secure connections by @gaya3-zipstack in #77 (see the connection sketch below)
Full Changelog: v0.39.0...v0.40.0
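#77 adds secure connections for the vector DB adapters. The notes don't show the adapter configuration itself, so as a rough illustration here is what a TLS-plus-API-key connection looks like with qdrant-client, one of the vector store clients commonly paired with Unstract; the URL and key are placeholders, and the SDK exposes the equivalent through its own adapter settings rather than a direct client.

```python
# Minimal sketch: a TLS + API-key ("secure") connection to a Qdrant
# vector DB, illustrating what the adapter-level setting enables.
# The URL and key are placeholders; unstract-sdk configures this via
# its vector DB adapter settings rather than a hand-built client.
from qdrant_client import QdrantClient

client = QdrantClient(
    url="https://qdrant.example.com:6333",  # https scheme -> TLS connection
    api_key="<QDRANT_API_KEY>",             # authenticated access
)

# Quick connectivity check against the secured endpoint.
print(client.get_collections())
```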
v0.39.0
What's Changed
- FEAT: Add Support for Public Indexing and Prompt Run Functionality by @tahierhussain in #74
- SDK & adapters merge by @gaya3-zipstack in #73
New Contributors
- @tahierhussain made their first contribution in #74
Full Changelog: v0.38.1...v0.39.0
v0.38.1
v0.38.0
What's Changed
- Update README.md with logo and Readme link by @ritwik-g in #72
- [FEATURE] Send provider in the usage callback by @Deepak-Kesavan in #75
Full Changelog: v0.37.0...v0.38.0
v0.37.0
What's Changed
- Support for Ollama embedding URL configuration & LLM whisperer pages_to_extract by @gaya3-zipstack in #70 (see the sketch below)
- Increment SDK versions for Ollama URL and LLM Whisperer pages_to_extract changes by @gaya3-zipstack in #71
Full Changelog: v0.36.0...v0.37.0
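The two settings from #70 can be pictured against the pieces the SDK builds on. This is a minimal sketch, assuming the llama-index OllamaEmbedding class and the public LLMWhisperer HTTP API; the endpoint URL, server URL, API key, and file name are placeholders, not the SDK's own adapter interface.

```python
# Minimal sketch of the two options this release exposes, shown against
# the underlying library/API rather than the SDK's adapter classes.
import requests
from llama_index.embeddings.ollama import OllamaEmbedding

# 1. Ollama embeddings pointed at a non-default server URL.
embedding = OllamaEmbedding(
    model_name="nomic-embed-text",
    base_url="http://ollama.internal:11434",  # configurable embedding URL (placeholder)
)
vector = embedding.get_text_embedding("hello world")

# 2. LLMWhisperer extraction limited to selected pages via pages_to_extract.
with open("invoice.pdf", "rb") as f:  # placeholder input file
    resp = requests.post(
        "https://llmwhisperer-api.unstract.com/v1/whisper",  # assumed endpoint
        params={"pages_to_extract": "1-3,7"},                 # only these pages
        headers={
            "unstract-key": "<LLMWHISPERER_API_KEY>",
            "Content-Type": "application/octet-stream",
        },
        data=f.read(),
    )
print(resp.status_code)
```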
v0.36.0
What's Changed
- Added support to pass params for prompt service API calls by @Deepak-Kesavan in #69
Full Changelog: v0.35.0...v0.36.0
v0.35.0
What's Changed
- Fix: add whisper hash in metadata by @johnyrahul in #64
- Roll SDK version by @gaya3-zipstack in #68
Full Changelog: v0.34.0...v0.35.0
v0.34.0
v0.33.1
What's Changed
- SDK and Adapters version bump by @Deepak-Kesavan in #66
Full Changelog: v0.33.0...v0.33.1
v0.33.0
What's Changed
- Update lock files after clearing cache by @gaya3-zipstack in #60
- Change top_k value by @gaya3-zipstack in #62
- Support llm token counting using llama index response by @gaya3-zipstack in #63 (see the sketch below)
- Added llm_usage_reason along with usage data by @Deepak-Kesavan in #61
- SDK version bump by @Deepak-Kesavan in #65
Full Changelog: v0.32.0...v0.33.0
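PR #63 derives LLM token counts from the llama-index response inside the SDK; that code isn't reproduced here. As a stand-in, the sketch below uses llama-index's documented TokenCountingHandler to take the same measurement; the model name and prompt are placeholders, and the SDK's own accounting hooks may differ.

```python
# Counting LLM tokens through llama-index, the library the SDK builds on.
# Requires llama-index-llms-openai, tiktoken, and an OPENAI_API_KEY env var.
import tiktoken
from llama_index.core.callbacks import CallbackManager, TokenCountingHandler
from llama_index.llms.openai import OpenAI

# Tokenizer matched to the model so counts line up with provider billing.
token_counter = TokenCountingHandler(
    tokenizer=tiktoken.encoding_for_model("gpt-3.5-turbo").encode
)

llm = OpenAI(
    model="gpt-3.5-turbo",
    callback_manager=CallbackManager([token_counter]),
)
response = llm.complete("Summarise this release in one line.")

print(token_counter.prompt_llm_token_count)      # tokens sent to the LLM
print(token_counter.completion_llm_token_count)  # tokens generated
print(token_counter.total_llm_token_count)       # prompt + completion
```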