asimov

asimov is a Rust library for building high-performance, LLM-based applications. The crate is divided into the following modules:
- `io`: structured input and output parsing, stream processing.
- `db`: abstractions for interacting with vector databases.
- `models`: abstractions for interacting with LLMs.
- `tokenizers`: tokenizer utilities.
Here's a simple example using asimov to generate a response from an LLM. See the examples directory for more examples.
```rust
use asimov::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // The OpenAI backend reads the API key from the environment,
    // so fail fast if it is missing.
    std::env::var("OPENAI_API_KEY").expect("OPENAI_API_KEY must be set");

    let gpt35 = OpenAiLlm::default();
    let response: RawString = gpt35.generate("How are you doing").await?;
    println!("Response: {}", response.0);
    Ok(())
}
```
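To try it, export your key and build with the `openai` feature enabled (see the feature list below); the key value here is a placeholder:

```sh
export OPENAI_API_KEY=your-key-here
cargo run --features openai
```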
- Structured parsing from streams: `cargo run --example structured_streaming --features openai -- --nocapture`
- Basic streaming: `cargo run --example streaming --features openai -- --nocapture`
- Chaining two LLMs (see the sketch after this list): `cargo run --example simple_chain --features openai`
- Few-shot generation: `cargo run --example simple_fewshot --features openai`
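For a flavor of what chaining looks like, here is a minimal sketch that feeds one model's output into a second prompt. It uses only the `OpenAiLlm`, `generate`, and `RawString` names from the example above; the `simple_chain` example may well use a dedicated chaining abstraction instead, and the prompts are purely illustrative:

```rust
use asimov::prelude::*;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let llm = OpenAiLlm::default();

    // First call: ask the model for an outline.
    let outline: RawString = llm.generate("Outline a short post about Rust.").await?;

    // Second call: feed the first response back in as context.
    let prompt = format!("Expand this outline into one paragraph:\n{}", outline.0);
    let post: RawString = llm.generate(prompt.as_str()).await?;

    println!("{}", post.0);
    Ok(())
}
```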
Add asimov as a dependency in your Cargo.toml:

```sh
cargo add asimov
```

or add it manually:

```toml
[dependencies]
asimov = "0.1.0"
```
To build from source:

```sh
git clone https://github.com/yourusername/asimov.git
cd asimov
cargo build
```
Each module has its own test suite:

- I/O module: `cargo test io`
- Vector DB module: `cargo test db`
- Tokenizers module: `cargo test tokenizers`
- Models module: `cargo test models --features openai`
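To run everything at once, the standard Cargo flag for enabling every optional feature works, though it pulls in all optional dependencies:

```sh
cargo test --all-features
```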
The following optional features can be enabled:

- `openai`: enables use of the `async-openai` crate.
- `qdrant`: enables use of the `qdrant-client` crate.
To enable a feature, pass the `--features` flag when building or running:

```sh
cargo build --features "openai qdrant"
```
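Features can also be enabled in Cargo.toml using standard Cargo dependency syntax:

```toml
[dependencies]
asimov = { version = "0.1.0", features = ["openai", "qdrant"] }
```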