From 37d1da518ac72b8b6da4c5d45be2cd1a1cb73a3e Mon Sep 17 00:00:00 2001
From: Chandrasekharan M <117059509+chandrasekharan-zipstack@users.noreply.github.com>
Date: Fri, 19 Jul 2024 15:54:22 +0530
Subject: [PATCH] fix: Classifier fix to remove stop and max_tokens kwargs in
 llm call (#501)

Classifier fix to remove stop and max_tokens kwargs in llm call

Co-authored-by: Deepak K <89829542+Deepak-Kesavan@users.noreply.github.com>
---
 tools/classifier/src/config/properties.json | 2 +-
 tools/classifier/src/helper.py              | 2 +-
 2 files changed, 2 insertions(+), 2 deletions(-)

diff --git a/tools/classifier/src/config/properties.json b/tools/classifier/src/config/properties.json
index b29db5c35..b095314dd 100644
--- a/tools/classifier/src/config/properties.json
+++ b/tools/classifier/src/config/properties.json
@@ -2,7 +2,7 @@
   "schemaVersion": "0.0.1",
   "displayName": "File Classifier",
   "functionName": "classify",
-  "toolVersion": "0.0.25",
+  "toolVersion": "0.0.26",
   "description": "Classifies a file into a bin based on its contents",
   "input": {
     "description": "File to be classified"
diff --git a/tools/classifier/src/helper.py b/tools/classifier/src/helper.py
index dd7e4ac65..488b67b8f 100644
--- a/tools/classifier/src/helper.py
+++ b/tools/classifier/src/helper.py
@@ -132,7 +132,7 @@ def call_llm(self, prompt: str, llm: LLM) -> Optional[str]:
             str: Classification
         """
         try:
-            completion = llm.complete(prompt, max_tokens=50, stop=["\n"])[LLM.RESPONSE]
+            completion = llm.complete(prompt)[LLM.RESPONSE]
             classification: str = completion.text.strip()
             self.tool.stream_log(f"LLM response: {completion}")
             return classification