chore: version 1.1.0
sgomez committed Dec 6, 2024
1 parent 7de9ec5 commit 1c6c05b
Showing 4 changed files with 16 additions and 7 deletions.
10 changes: 4 additions & 6 deletions README.md
@@ -1,13 +1,10 @@
# ollama-ai-provider

-Vercel AI Provider for running Large Language Models locally using Ollama
+Vercel AI Provider for running Large Language Models locally using Ollama.

-> **Note: This module is under development and may contain errors and frequent incompatible changes.**
->
-> All releases will be of type MAJOR following the 0.MAJOR.MINOR scheme. Only bugs and model updates will be released as MINOR.
-> Please read the [Tested models and capabilities](#tested-models-and-capabilities) section to know about the features
-> implemented in this provider.
+## Requirements

+This provider requires Ollama >= 0.5.0

## Installation

@@ -78,6 +75,7 @@ You need to use any model with visual understanding. These are tested:
* llava
* llava-llama3
* llava-phi3
+* llama3.2-vision
* moondream

Ollama does not support URLs, but the ai-sdk is able to download the file and send it to the model.
7 changes: 7 additions & 0 deletions packages/ollama/CHANGELOG.md
@@ -1,5 +1,12 @@
# ollama-ai-provider

+## 1.1.0
+
+### Minor Changes
+
+- Add support for structured output
+- Requires Ollama >= 0.5.0
+
## 1.0.0

### Major Changes
4 changes: 4 additions & 0 deletions packages/ollama/README.md
@@ -3,6 +3,10 @@
The **[Ollama Provider](https://github.com/sgomez/ollama-ai-provider)** for the [Vercel AI SDK](https://sdk.vercel.ai/docs)
contains language model support for the Ollama APIs and embedding model support for the Ollama embeddings API.

+## Requirements
+
+This provider requires Ollama >= 0.5.0
+
## Setup

The Ollama provider is available in the `ollama-ai-provider` module. You can install it with
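The install command itself is collapsed in this diff view; the standard npm form (pnpm and yarn work analogously) would be:

```shell
# Install the provider package into the current project (assumed standard form;
# the exact command is collapsed out of the diff above).
npm install ollama-ai-provider
```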
2 changes: 1 addition & 1 deletion packages/ollama/package.json
@@ -1,6 +1,6 @@
{
"name": "ollama-ai-provider",
"version": "1.0.0",
"version": "1.1.0",
"description": "Vercel AI Provider for running LLMs locally using Ollama",
"main": "./dist/index.js",
"module": "./dist/index.mjs",
