Asimov helps you build high performance LLM apps, written in Rust 🦀

License

Notifications You must be signed in to change notification settings

overmindai/asimov

Folders and files

NameName
Last commit message
Last commit date

Latest commit

 

History

25 Commits
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 

Repository files navigation

Asimov: Build blazingly fast LLM applications in Rust 🦀

Overview

asimov is a Rust library for building high-performance, LLM-based applications. The crate is divided into the following modules:

  1. io: structured input and output parsing, stream processing.
  2. db: abstractions to interact with vector databases.
  3. models: abstractions to interact with LLMs.
  4. tokenizers: tokenizer utilities.

Quickstart

Here's a simple example using asimov to generate a response from an LLM. See the examples directory for more examples.

use asimov::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");

    let gpt35 = OpenAiLlm::default();
    let response: RawString = gpt35.generate("How are you doing?").await?;

    println!("Response: {}", response.0);

    Ok(())
}

Run examples

  1. Structured parsing from streams:
cargo run --example structured_streaming --features openai -- --nocapture
  2. Basic streaming:
cargo run --example streaming --features openai -- --nocapture
  3. Chaining two LLMs:
cargo run --example simple_chain --features openai
  4. Few-shot generation:
cargo run --example simple_fewshot --features openai

Install

Add asimov as a dependency to your Cargo.toml:

cargo add asimov

or

[dependencies]
asimov = "0.1.0"
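
If you want one of the optional integrations listed under Features below, you can enable it directly in Cargo.toml using standard Cargo feature syntax (the feature names openai and qdrant come from the Features section; a minimal sketch):

```toml
[dependencies]
# Enable the OpenAI integration; add "qdrant" to the list for vector DB support.
asimov = { version = "0.1.0", features = ["openai"] }
```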

Build from source

Prerequisites

  • Rust 1.54.0 or later
  • Cargo (comes with Rust)

Clone the repository:

git clone https://github.com/overmindai/asimov.git
cd asimov

Build the project:

cargo build

Run tests

  1. I/O module:
cargo test io
  2. Vector DB module:
cargo test db
  3. Tokenizers module:
cargo test tokenizers
  4. Models module:
cargo test models --features openai

Features

The following optional features can be enabled:

  • openai: enables use of the async-openai crate.
  • qdrant: enables use of the qdrant-client crate.

To enable a feature, use the --features flag when building or running:

cargo build --features "openai qdrant"
