This repo uses Streamlit to create a user-facing demo application that showcases various capabilities from LlamaCloud.
It uses the `llama-cloud` API Python client (`pip install llama-cloud`).
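As a rough illustration of how an app like this talks to LlamaCloud, the sketch below instantiates the `llama-cloud` client with an API key and lists the projects it can see. The exact methods this demo uses may differ, so treat the calls here as an assumption rather than a reference.

```python
# Minimal sketch (not this app's actual code): connect to LlamaCloud with the
# `llama-cloud` Python client and list the projects visible to your API key.
import os

from llama_cloud.client import LlamaCloud  # installed via `pip install llama-cloud`

# Assumes LLAMA_CLOUD_API_KEY is set in your environment.
client = LlamaCloud(token=os.environ["LLAMA_CLOUD_API_KEY"])

for project in client.projects.list_projects():
    print(project.id, project.name)
```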
You can use the Streamlit application now by visiting https://llamacloud.streamlit.app
You will first need access to LlamaCloud to create an API key to use within the app.
We wanted to create a demo that showcases some of the Agentic RAG capabilities that LlamaCloud enables through an interactive UX. Additionally, by open-sourcing the codebase for this, we hope that developers can use this code as a reference for setting up their own applications that rely on the LlamaCloud API.
Here are the steps for setting up your development environment to run this project locally:
- Clone this repo, e.g. `gh repo clone run-llama/llamacloud_streamlit`
- Install `poetry` if you haven't already
- Install the poetry dependencies by running `poetry shell` and then `poetry install` within this project's root directory
- Add a `secrets.toml` file in the `.streamlit` folder and add a value for `openai_key` to it (see the first sketch after this list for how the app can read it):
  - `touch .streamlit/secrets.toml`
  - Add a line within the newly created `secrets.toml` that reads `openai_key = "YOUR OPENAI API KEY"`
- (Optional) You can also set up a `.env` file to pre-populate some of the values used for global settings in `app/app_settings.py` (see the second sketch after this list). This may ease local use of the app so you don't need to continuously fill in the LlamaCloud API key in the UI.
- Run `make run` to run the Streamlit app locally. You can then visit the application at `http://localhost:8501`
  - Please note you will need to set up the LlamaCloud API key the app will use on the API Keys tab in the UI first.
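To make the `secrets.toml` step concrete, here is a minimal sketch of how a Streamlit app can read the `openai_key` value; the actual lookup in this repo may be structured differently, so treat this as illustrative only.

```python
# Minimal sketch: reading the OpenAI key from .streamlit/secrets.toml.
# Streamlit exposes the file's contents via st.secrets.
import streamlit as st

openai_key = st.secrets.get("openai_key", "")

if not openai_key:
    st.error("Please add openai_key to .streamlit/secrets.toml")
```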
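And here is a hedged sketch of how `.env` values could pre-populate settings such as the LlamaCloud API key. The variable name `LLAMA_CLOUD_API_KEY` and the use of `python-dotenv` are assumptions for illustration, not necessarily what `app/app_settings.py` actually does.

```python
# Minimal sketch (assumed, not the repo's actual app_settings.py): load a .env
# file so the LlamaCloud API key field in the UI can be pre-filled.
import os

from dotenv import load_dotenv  # pip install python-dotenv

load_dotenv()  # reads .env from the project root, if present

# Assumed variable name; check app/app_settings.py for the real one.
DEFAULT_LLAMA_CLOUD_API_KEY = os.getenv("LLAMA_CLOUD_API_KEY", "")
```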