Merge pull request #97 from Dino-Kupinic/develop
Dino-Kupinic authored Feb 4, 2025
2 parents 1d2d188 + feec5ee commit f8e00f9
Showing 5 changed files with 45 additions and 12 deletions.
34 changes: 29 additions & 5 deletions README.md
@@ -11,6 +11,8 @@ ai-backend is a backend for AI-powered applications. It leverages FastAPI and Ollama

### Prerequisites

(Also available via `nix-shell`)

- Python 3.12
- pip
- git
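
If you take the `nix-shell` route mentioned above, entering the shell from the repository root provides these tools (git, python3, ollama, and curl, per the included `shell.nix`; this assumes Nix is installed on your system):

```bash
nix-shell
```
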
@@ -23,20 +25,42 @@ ai-backend is a backend for AI-powered applications. It leverages FastAPI and Ollama
git clone https://github.com/Dino-Kupinic/ai-backend.git
```

2. Install dependencies
2. [Optional] Use a Python virtual environment for a local installation
```bash
python3 -m venv venv
```
Activate the venv (example for *nix systems):
```bash
source ./venv/bin/activate
```

3. Install dependencies

```bash
pip install -r requirements.txt
```

3. Create a `.env` file in the root directory and copy over the fields from the `.env.example` file.
4. Create a `.env` file in the root directory and copy over the fields from the `.env.example` file.
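
One way to do this (a minimal sketch, assuming you want to start from every field in the example file):

```bash
cp .env.example .env
```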

4. Download ollama for your system from [here](https://ollama.com/download).
5. Download ollama for your system from [here](https://ollama.com/download).

> [!NOTE]
> This step can be skipped if you use `nix-shell`.
> [!NOTE]
> In the future, ollama will be downloaded from the command line automatically.
5. Run the server
6. Start Ollama and pull the model

```bash
ollama serve
```

```bash
ollama pull llama3
```
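
You can confirm the model is available locally by listing the installed models:

```bash
ollama list
```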

7. Run the server

```bash
fastapi dev src/main.py
@@ -55,7 +79,7 @@ The OpenAPI documentation is available at `/docs`. It is automatically generated
### Usage

```bash
curl -X POST "http://localhost:8000/message/" -H "Content-Type: application/json" -d '{"text": "Tell me something about Vienna, Austria"}' --no-buffer
curl -X POST "http://localhost:8000/message/" -H "Content-Type: application/json" -d '{"prompt": "Tell me something about Vienna, Austria", "model": "llama3"}' --no-buffer
```

> [!TIP]
2 changes: 1 addition & 1 deletion docs/package.json
@@ -1,6 +1,6 @@
{
"name": "ai-backend-docs",
"version": "0.3.3",
"version": "0.3.4",
"description": "Documentation for the AI Backend project",
"scripts": {
"docs:dev": "vitepress dev src",
10 changes: 5 additions & 5 deletions requirements.txt
@@ -1,9 +1,9 @@
fastapi~=0.115.6
ollama~=0.4.4
pydantic~=2.10.4
mypy~=1.13.0
fastapi~=0.115.8
ollama~=0.4.7
pydantic~=2.10.6
mypy~=1.14.1
mypy-extensions~=1.0.0
pytest~=8.3.4

# Optional dependencies
ruff~=0.8.4
ruff~=0.9.4
9 changes: 9 additions & 0 deletions shell.nix
@@ -0,0 +1,9 @@
let pkgs = import <nixpkgs> { };
in pkgs.mkShell {
packages = with pkgs; [
git
python3
ollama
curl
];
}
2 changes: 1 addition & 1 deletion src/main.py
@@ -6,7 +6,7 @@
app = FastAPI(
title="AI Backend",
description="ai backend for your app powered by llama3",
version="0.3.3",
version="0.3.4",
)

origins = ["*"]
