
Commit 37d1da5

fix: Classifier fix to remove stop and max_tokens kwargs in llm call (#501)

Classifier fix to remove stop and max_tokens kwargs in llm call

Co-authored-by: Deepak K <[email protected]>
chandrasekharan-zipstack and Deepak-Kesavan authored Jul 19, 2024
1 parent 5d72022 commit 37d1da5
Showing 2 changed files with 2 additions and 2 deletions.
tools/classifier/src/config/properties.json (1 addition, 1 deletion)

@@ -2,7 +2,7 @@
   "schemaVersion": "0.0.1",
   "displayName": "File Classifier",
   "functionName": "classify",
-  "toolVersion": "0.0.25",
+  "toolVersion": "0.0.26",
   "description": "Classifies a file into a bin based on its contents",
   "input": {
     "description": "File to be classified"
tools/classifier/src/helper.py (1 addition, 1 deletion)

@@ -132,7 +132,7 @@ def call_llm(self, prompt: str, llm: LLM) -> Optional[str]:
             str: Classification
         """
         try:
-            completion = llm.complete(prompt, max_tokens=50, stop=["\n"])[LLM.RESPONSE]
+            completion = llm.complete(prompt)[LLM.RESPONSE]
             classification: str = completion.text.strip()
             self.tool.stream_log(f"LLM response: {completion}")
             return classification
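The patched flow can be sketched as follows. This is a minimal, self-contained illustration, not the repository's actual code: `StubLLM` and `_Completion` are hypothetical stand-ins for the platform's LLM adapter (only `complete()`, `LLM.RESPONSE`, and the `.text` attribute are taken from the diff). The point of the fix is that `complete()` is now called with the prompt alone, since some backing adapters reject extra `stop` and `max_tokens` keyword arguments.

```python
from typing import Optional


class _Completion:
    """Hypothetical stand-in for the completion object the adapter returns."""

    def __init__(self, text: str) -> None:
        self.text = text


class StubLLM:
    """Hypothetical adapter; mirrors only the surface used in the diff."""

    RESPONSE = "response"

    def complete(self, prompt: str) -> dict:
        # Real adapters would call a model here; we return canned text.
        return {self.RESPONSE: _Completion("  invoice  ")}


def call_llm(prompt: str, llm: StubLLM) -> Optional[str]:
    """Classify via the LLM; returns None on failure, as in the original."""
    try:
        # After the fix: no max_tokens=50, no stop=["\n"] kwargs.
        completion = llm.complete(prompt)[llm.RESPONSE]
        classification: str = completion.text.strip()
        return classification
    except Exception:
        return None
```

With the stub above, `call_llm("Classify this file", StubLLM())` returns the stripped classification string `"invoice"`; passing unsupported kwargs was what previously caused certain adapters to raise inside this `try` block.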
