Commit

More changes (shreyaskarnik#31)
shreyaskarnik authored Nov 15, 2023
1 parent 19f6944 commit 96db3e0
Showing 21 changed files with 1,262 additions and 81 deletions.
Binary file added Chat.gif
Binary file added ChatWithDocs.gif
Binary file added ChatWithPage.gif
Binary file added Overview.png
Binary file removed Q&A.gif
94 changes: 52 additions & 42 deletions README.md

## What is DistiLlama?

DistiLlama is a Chrome extension that leverages a locally running LLM to perform tasks such as summarizing the web page you are viewing, chatting with documents (PDF), and chatting with the page itself.

![Overview](./Overview.png)

One of the things I was experimenting with was how to use a locally running LLM instance for various tasks, and summarization (tl;dr) was at the top of my list. It was key that all calls to the LLM stay local and all data stay private.

This project utilizes [Ollama](https://ollama.ai/) as the locally running LLM instance. Ollama is a great project that is easy to set up and use. I highly recommend checking it out.

To generate the summary I am using the following approach:

- Grab the ID of the current active tab
- Use [Readability](https://github.com/mozilla/readability) to extract the text content from the page. In my experiments, the summary quality was noticeably better with Readability because it strips a lot of unnecessary content from the page.
- Use [LangChain (LangChain.js)](https://js.langchain.com/docs/get_started/introduction/) to summarize the text content.
- Display the summary in a popup window.
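Long pages usually have to be split into chunks before they are handed to a summarization chain. As a minimal sketch of that step (a hypothetical helper, not the extension's actual code), an overlapping character-based splitter looks like this:

```typescript
// Hypothetical chunker: split extracted page text into overlapping chunks
// so each one fits within the LLM's context window. The overlap keeps
// sentences that straddle a boundary visible to both chunks.
function chunkText(text: string, chunkSize = 1000, overlap = 100): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap;
  }
  return chunks;
}
```

In practice LangChain.js ships text splitters that do this (and more), so this is only meant to illustrate what happens between extraction and summarization.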

## How to use DistiLlama?

- Prerequisites:
  - Install [Ollama](https://ollama.ai/download); you can also choose to run Ollama in a [Docker container](https://ollama.ai/blog/ollama-is-now-available-as-an-official-docker-image).
- Start Ollama using the following command: `OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve`
- In another terminal you can run `ollama pull llama2:latest` or `ollama pull mistral:latest`
    - The choice of model depends on your use case; the models supported by Ollama are listed at <https://ollama.ai/library>
  - `OLLAMA_ORIGINS=*` is important so that requests from the extension are not blocked.

- Clone this repo
- Install pnpm: `npm install -g pnpm`
- Run `yarn install`
- Run `yarn dev`
- Open Chrome and navigate to `chrome://extensions/`
- Enable developer mode (if not already enabled)
- Click on `Load unpacked` and select the `dist` folder from the base of the cloned project.
- You should see DistiLlama added to your Chrome extensions.
- You may want to pin the extension to your Chrome toolbar for easy access.
- If you decide to use a different LLM you will need to change this line in `src/pages/sidePanel/Summarize.ts`

```typescript
const llm = new ChatOllama({
  // ...
});
```
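Since the option lines are elided above, a complete configuration might look like the following (hypothetical values — match `baseUrl` to the `OLLAMA_HOST` used in the serve command and `model` to a model you pulled):

```typescript
import { ChatOllama } from "langchain/chat_models/ollama";

// Hypothetical values; adjust to your local setup.
const llm = new ChatOllama({
  baseUrl: "http://127.0.0.1:11435", // matches OLLAMA_HOST from `ollama serve`
  model: "llama2:latest",            // or "mistral:latest", etc.
});
```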

- If you would like to tweak the summarization chain, change these lines in `src/pages/sidePanel/Summarize.ts`
- If you would like to tweak the summarization chain change these lines in `src/pages/sidePanel/Summarize.ts`

```typescript
const chain = loadSummarizationChain(llm, {
  // ...
});
```

## Demo

### Chat with LLM

![Chat](./Chat.gif)

### Chat with Documents (PDF)

![ChatWithDocs](./ChatWithDocs.gif)

### Chat with Web Page

![ChatWithPage](./ChatWithPage.gif)

### Summarization

![Summary](./Summary.gif)

## TODOS

- [ ] Make the summarization chain configurable
- [x] Make LLM model configurable
- [ ] Save summary in local storage
- [ ] Improve the UI (not an expert in this area but will try to learn)
- [ ] Add TTS support
- [ ] Check out performance with different tuned prompts
- [x] Extend to chat with the page (use embeddings and LLMs for RAG)
- [x] Use [transformers.js](https://github.com/xenova/transformers.js) for local in-browser embeddings and [Voy](https://github.com/tantaraio/voy) for storage, similar to [Building LLM-Powered Web Apps with Client-Side Technology](https://ollama.ai/blog/building-llm-powered-web-apps)
- [ ] Focus on improving quality of the summarization and chat
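The chat-with-page RAG flow mentioned above boils down to embedding page chunks, embedding the question, and retrieving the nearest chunks by cosine similarity. A minimal, library-free sketch of the retrieval metric (a hypothetical helper — the extension uses transformers.js and Voy for the real thing):

```typescript
// Cosine similarity between two embedding vectors; in a RAG setup the
// chunks whose embeddings score highest against the question embedding
// are fed to the LLM as context.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```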

## References and Inspiration

- [LangChain](https://github.com/langchain-ai/langchainjs)
- [Ollama](https://ollama.ai/)
- [Building LLM-Powered Web Apps with Client-Side Technology](https://ollama.ai/blog/building-llm-powered-web-apps)
- [Chrome Extension Template](https://github.com/Jonghakseo/chrome-extension-boilerplate-react-vite)
- Artwork generated using [DALL·E 3](https://openai.com/dall-e-3)
Binary file modified Summary.gif
2 changes: 2 additions & 0 deletions package.json
"langchain": "^0.0.186",
"pdf-parse": "^1.1.1",
"pdfjs-dist": "^4.0.189",
"prop-types": "^15.8.1",
"react": "18.2.0",
"react-dom": "18.2.0",
"react-icons": "^4.11.0",
"react-markdown": "^9.0.1",
"react-toastify": "^9.1.3",
"styled-components": "^6.1.0",
"voy-search": "^0.6.3"
