The following environment variables let you customize LLMLean.

Example: Together API

- `LLMLEAN_API`: `together` to use the Together API (or any similar supported API, or your own server)
- `LLMLEAN_API_KEY`: API key for the Together API
- `LLMLEAN_ENDPOINT`: API endpoint, e.g. `https://api.together.xyz/v1/completions` for the Together API
- `LLMLEAN_PROMPT`: `fewshot` for base models, `instruction` for instruction-tuned models
- `LLMLEAN_MODEL`: Example for the Together API: `mistralai/Mixtral-8x7B-Instruct-v0.1`
- `LLMLEAN_NUMSAMPLES`: Example: `10`
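As a sketch, the Together API settings above could be exported in a shell before launching your editor (the key and model values are placeholders; substitute your own):

```shell
# Hypothetical values for the Together API setup described above.
export LLMLEAN_API="together"
export LLMLEAN_API_KEY="your-api-key"   # replace with your real Together API key
export LLMLEAN_ENDPOINT="https://api.together.xyz/v1/completions"
export LLMLEAN_PROMPT="instruction"     # Mixtral is instruction-tuned
export LLMLEAN_MODEL="mistralai/Mixtral-8x7B-Instruct-v0.1"
export LLMLEAN_NUMSAMPLES="10"
```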
Example: Ollama (default)

- `LLMLEAN_API`: `ollama` to use Ollama (the default)
- `LLMLEAN_ENDPOINT`: with Ollama it is `http://localhost:11434/api/generate`
- `LLMLEAN_PROMPT`: `fewshot` for base models, `instruction` for instruction-tuned models
- `LLMLEAN_MODEL`: Example: `solobsd/llemma-7b`
- `LLMLEAN_NUMSAMPLES`: Example: `10`
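Similarly, a minimal shell sketch of the Ollama settings above (values taken from the examples in this section):

```shell
# Ollama runs locally, so no API key is needed.
export LLMLEAN_API="ollama"
export LLMLEAN_ENDPOINT="http://localhost:11434/api/generate"
export LLMLEAN_PROMPT="fewshot"   # llemma-7b is a base model
export LLMLEAN_MODEL="solobsd/llemma-7b"
export LLMLEAN_NUMSAMPLES="10"
```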
To set environment variables in VS Code, go to:
- Settings (`Command` + `,` on Mac) -> Extensions -> Lean 4
- Add the environment variables to Server Env.

Then restart the Lean Server (`Command` + `t`, then type `> Lean 4: Restart Server`).
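Alternatively, the Server Env setting can be edited directly in `settings.json`. The snippet below is a sketch assuming the Lean 4 extension's `lean4.serverEnv` setting accepts a map of environment variables (values here mirror the Ollama example above):

```json
{
  "lean4.serverEnv": {
    "LLMLEAN_API": "ollama",
    "LLMLEAN_ENDPOINT": "http://localhost:11434/api/generate",
    "LLMLEAN_PROMPT": "fewshot",
    "LLMLEAN_MODEL": "solobsd/llemma-7b",
    "LLMLEAN_NUMSAMPLES": "10"
  }
}
```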