This is a document summarization application that uses BunJS and an Ollama AI server to generate AI-powered summaries of documents stored in a Paperless service.

It is a research project exploring how AI can be put to practical use. Feel free to use it, but do so at your own risk.
- Fetches documents from a specified Paperless URL.
- Generates AI-based summaries using the Ollama server with the specified Llama model.
- Posts the generated summaries back to the Paperless service as comments.
- Supports saving the generated summaries locally as text files.
- Node.js and BunJS installed.
- Access to a Paperless service API.
- A running instance of the Ollama AI server.
- Clone the repository:

```sh
git clone <repository-url>
cd <project-directory>
```
- Install dependencies. Make sure you have BunJS installed, then run:

```sh
bun install
```
Set up the following environment variables:

- `PAPERLESS_TOKEN`: Your API token for authenticating with the Paperless service.
- `PAPERLESS_URL`: The base URL of your Paperless service.
- `OUTPUT_TXT`: Set to `1` if you want to save summaries as text files, otherwise `0`.
- `OUTPUT_PATH`: Directory path where text summaries should be saved (if `OUTPUT_TXT` is `1`).

Optionally, override the default summarization model settings by setting:

- `MODEL_NAME`: The model to use (default is `llama3.2`).
- `CONTEXT_LENGTH`: The maximum context length for the summarizer (default is `8096`).
- `SUMMARY_MARKER`: A custom marker for identifying AI-generated summaries (default is `AI_SUMMARY`).
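A minimal example configuration might look like this (all values are illustrative placeholders; substitute your own token, URL, and paths):

```
PAPERLESS_TOKEN=your-paperless-api-token
PAPERLESS_URL=http://localhost:8000
OUTPUT_TXT=1
OUTPUT_PATH=./summaries
MODEL_NAME=llama3.2
CONTEXT_LENGTH=8096
SUMMARY_MARKER=AI_SUMMARY
```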
To run the application and start processing documents, execute:

```sh
bun run src/index.ts
```
The script will:
- Fetch documents from the Paperless service that do not yet have an AI-generated summary.
- Generate summaries using the configured AI model.
- Post the generated summaries as comments on each document.
- Optionally save summaries to the local file system.
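The steps above can be sketched roughly as follows. This is a hedged sketch, not the actual implementation: the endpoint paths follow the Paperless-ngx REST API and Ollama's `/api/generate` API, but field names and URLs should be verified against your installed versions, and error handling is omitted.

```typescript
const PAPERLESS_URL = process.env.PAPERLESS_URL ?? "http://localhost:8000";
const TOKEN = process.env.PAPERLESS_TOKEN ?? "";
const MARKER = process.env.SUMMARY_MARKER ?? "AI_SUMMARY";
const MODEL = process.env.MODEL_NAME ?? "llama3.2";

// A note counts as an AI-generated summary when it starts with the marker.
export function isSummaryNote(note: { note: string }, marker = MARKER): boolean {
  return note.note.startsWith(marker);
}

// Prefix the summary with the marker so it can be found (and reset) later.
export function buildSummaryNote(summary: string, marker = MARKER): string {
  return `${marker}\n${summary}`;
}

// Ask a local Ollama server for a single (non-streamed) summary.
async function summarize(text: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    body: JSON.stringify({ model: MODEL, prompt: `Summarize:\n${text}`, stream: false }),
  });
  const body = (await res.json()) as { response: string };
  return body.response;
}

// Skip documents that already carry a marked note; otherwise summarize
// the content and post it back to Paperless as a new note.
async function processDocument(doc: { id: number; content: string; notes: { note: string }[] }) {
  if (doc.notes.some((n) => isSummaryNote(n))) return;
  const summary = await summarize(doc.content);
  await fetch(`${PAPERLESS_URL}/api/documents/${doc.id}/notes/`, {
    method: "POST",
    headers: { Authorization: `Token ${TOKEN}`, "Content-Type": "application/json" },
    body: JSON.stringify({ note: buildSummaryNote(summary) }),
  });
}
```

The marker prefix is what makes resetting possible: deleting every note that starts with `SUMMARY_MARKER` returns the document to the "unsummarized" state.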
To reset a summary, simply remove all notes on the document that contain your `SUMMARY_MARKER`, which is `AI_SUMMARY` by default.
- Ensure that your environment variables are correctly set up; otherwise, the script will throw an error.
- Ollama AI server should be running and configured with the required models to generate summaries.
This project is open-source and available under the MIT License.
This project was created using `bun init` in bun v1.1.34. Bun is a fast all-in-one JavaScript runtime.