refactor to work without starting new server instance (shreyaskarnik#123)

* fixing build instructions

* fixing build instructions

* fixing build instructions

* Update import paths and OLLAMA_BASE_URL

* Update Ollama instructions in README and Instructions.tsx
shreyaskarnik authored Mar 1, 2024
1 parent b62971f commit 34c760c
Showing 6 changed files with 7 additions and 8 deletions.
README.md: 2 changes (1 addition, 1 deletion)
@@ -26,7 +26,7 @@ To generate the summary I am using the following approach:
 - Start Ollama using the following command: `OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve`
 - In another terminal you can run `ollama pull llama2:latest` or `ollama pull mistral:latest`
 - Choice of model depends on your use case. Here are the models supported by Ollama <https://ollama.ai/library>
-- `OLLAMA_ORIGINS=*` is important as it will not block traffic from the extension.
+- Make sure you set OLLAMA_ORIGINS=* for the Ollama environment by following instructions [here](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server)

 - Clone this repo
 - Install pnpm `npm install -g pnpm`
src/pages/sidePanel/Instructions.tsx: 5 changes (1 addition, 4 deletions)
@@ -3,10 +3,7 @@ export default function Instructions() {
   const markdown = `
 # Ollama Not Running
 - Download Ollama from [here](https://ollama.ai)
-- To start the server, run the following command:
-\`\`\`bash
-OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve
-\`\`\`
+- Make sure you set OLLAMA_ORIGINS=* for the Ollama environment by following instructions [here](https://github.com/ollama/ollama/blob/main/docs/faq.md#how-do-i-configure-ollama-server)
 - Download models from [here](https://ollama.ai/library)
 `;

src/pages/sidePanel/QandA.ts: 2 changes (1 addition, 1 deletion)
@@ -1,3 +1,4 @@
+import { OLLAMA_BASE_URL } from '@src/pages/utils/constants';
 import { getPageContent } from '@src/pages/utils/getPageContent';
 import { ConversationChain } from 'langchain/chains';
 import { ChatOllama } from 'langchain/chat_models/ollama';
@@ -20,7 +21,6 @@ import { Voy as VoyClient } from 'voy-search';
 import * as pdfWorker from '../../../node_modules/pdfjs-dist/build/pdf.worker.mjs';
 PDFLib.GlobalWorkerOptions.workerSrc = pdfWorker;

-export const OLLAMA_BASE_URL = 'http://localhost:11435';
 export type ConversationalRetrievalQAChainInput = {
   question: string;
   chat_history: { question: string; answer: string }[];
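With the constant moved into `src/pages/utils/constants.ts`, QandA.ts can point its chat model at the default Ollama instance rather than a second server on port 11435. The actual construction in QandA.ts is not shown in this diff, so the block below is only a minimal sketch of how the shared constant is typically passed to LangChain's `ChatOllama`; the model name and the `demo` wrapper are placeholders.

```ts
import { ChatOllama } from 'langchain/chat_models/ollama';
import { OLLAMA_BASE_URL } from '@src/pages/utils/constants';

// Sketch only: point the chat model at the default Ollama instance
// instead of a separately started server on port 11435.
const chatModel = new ChatOllama({
  baseUrl: OLLAMA_BASE_URL, // http://localhost:11434 after this commit
  model: 'mistral', // placeholder; any model pulled with `ollama pull` works
});

// Example call; ChatOllama implements LangChain's standard chat-model interface.
async function demo() {
  const reply = await chatModel.invoke('Summarize this page in one sentence.');
  console.log(reply.content);
}
demo();
```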
src/pages/sidePanel/Summarize.ts: 2 changes (1 addition, 1 deletion)
@@ -2,7 +2,7 @@ import { getPageContent } from '@src/pages/utils/getPageContent';
 import { loadSummarizationChain } from 'langchain/chains';
 import { ChatOllama } from 'langchain/chat_models/ollama';
 import { RecursiveCharacterTextSplitter } from 'langchain/text_splitter';
-import { OLLAMA_BASE_URL } from '@src/pages/sidePanel/QandA';
+import { OLLAMA_BASE_URL } from '@src/pages/utils/constants';

 export type SummarizationResponse = {
   title?: string;
src/pages/utils/constants.ts: 1 change (1 addition, 0 deletions)
@@ -0,0 +1 @@
+export const OLLAMA_BASE_URL = 'http://localhost:11434';
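Note that the new constant targets 11434, Ollama's default port, instead of the 11435 instance the old instructions asked users to start. The helper below is hypothetical and not part of this commit; it only illustrates how the side panel could probe the constant's URL before rendering the "Ollama Not Running" instructions, and the function name is an assumption.

```ts
// Hypothetical helper (not in the diff): probe the default Ollama instance
// before showing the "Ollama Not Running" instructions screen.
import { OLLAMA_BASE_URL } from '@src/pages/utils/constants';

export async function isOllamaRunning(): Promise<boolean> {
  try {
    // /api/tags is the same endpoint getModels() uses; any 2xx means the server is up.
    const response = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
    return response.ok;
  } catch {
    // Network error: nothing is listening at the default port.
    return false;
  }
}
```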
src/pages/utils/processing.ts: 3 changes (2 additions, 1 deletion)
@@ -1,6 +1,7 @@
+import { OLLAMA_BASE_URL } from '../utils/constants';
 export async function getModels() {
   try {
-    const response = await fetch('http://localhost:11435/api/tags');
+    const response = await fetch(`${OLLAMA_BASE_URL}/api/tags`);
     const data = await response.json();
     // {"models": [{ "name": "llama2:latest","modified_at": "2023-10-28T17:51:44.867165975-07:00","size": 3825819519,"digest": "fe938a131f40e6f6d40083c9f0f430a515233eb2edaa6d72eb85c50d64f2300e"}]}
     return data.models;
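After this change `getModels()` resolves the tags endpoint from the shared constant instead of a hard-coded port. The usage sketch below is illustrative only and not from the repo; it assumes the `@src/pages/utils/processing` path alias resolves to this file and that `getModels()` returns the "models" array shown in the comment above.

```ts
// Illustrative usage: turn the Ollama tags response into picker labels.
import { getModels } from '@src/pages/utils/processing';

async function listModelChoices(): Promise<string[]> {
  const models = await getModels();
  if (!models) {
    // Assumption: getModels() yields nothing useful when Ollama is unreachable.
    return [];
  }
  return models.map((m: { name: string; size: number }) => {
    const sizeGb = (m.size / 1e9).toFixed(1); // size is reported in bytes
    return `${m.name} (${sizeGb} GB)`;
  });
}

listModelChoices().then((choices) => console.log(choices));
```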
