Support open-source models with OpenAI-compliant API interface #865
Comments
@imoneoi can you give a code snippet example of how you are doing this?
I've created a fork here and temporarily added some open-source models; see main...imoneoi:openchat-ui:main. I think a better solution would be fetching models from the /models API instead of hardcoding them.

From ChatInput.tsx:

```typescript
if (maxLength && value.length > maxLength) {
  alert(
    t(
      `Message limit is {{maxLength}} characters. You have entered {{valueLength}} characters.`,
      { maxLength, valueLength: value.length },
    ),
  );
  return;
}
```
Another solution could be not cutting at all, or approximating the number of tokens from the number of words when the tokenizer is unknown.

From chat.ts:

```typescript
for (let i = messages.length - 1; i >= 0; i--) {
  const message = messages[i];
  const tokens = encoding.encode(message.content);
  if (tokenCount + tokens.length + 1000 > model.tokenLimit) {
    break;
  }
  tokenCount += tokens.length;
  messagesToSend = [message, ...messagesToSend];
}
```
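The word-count approximation mentioned above could be sketched like this. This is a hypothetical helper, not part of the codebase; the ~4/3 tokens-per-word ratio is a common rule of thumb for English text, not a property of any particular tokenizer:

```typescript
// Rough token estimate for when the model's tokenizer is unknown.
// Assumes the common heuristic that one English word averages ~1.33 tokens.
function estimateTokens(text: string): number {
  const words = text.trim().split(/\s+/).filter(Boolean).length;
  return Math.ceil((words * 4) / 3);
}
```

A fallback like this would let the truncation loop in chat.ts run even for models whose tokenizer is not bundled, at the cost of a looser limit.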
I've been trying to get it working via https://github.com/lm-sys/FastChat/blob/main/docs/openai_api.md, which is an OpenAI API-compatible endpoint. There is also this: https://github.com/go-skynet/LocalAI/tree/master/examples/chatbot-ui. I have tried both with no luck, so I was curious how you did it.
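For what it's worth, talking to an OpenAI-compatible local server is mostly a matter of swapping the base URL and model name; the request body keeps the standard chat-completions shape. A hypothetical sketch (the base URL, port, and model name below are assumptions for a locally running server, not values from this project):

```typescript
// Build a standard OpenAI-style chat completion request targeting a local,
// OpenAI-compatible server (e.g. one started from FastChat's openai_api.md).
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  baseUrl: string,
  model: string,
  messages: ChatMessage[],
) {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    body: { model, messages, stream: true },
  };
}
```

If requests built this way still fail, the problem is more likely on the model-listing side (the hardcoded OpenAIModels map) than in the request shape itself.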
Here is another example that would be great to get support for: chenhunghan/ialacol#7. It seems that for now it fails to read the models and also fails to set the model, so I am adding both manually. Outside of that it seems to work relatively well, but manually stopping fails. It would be great to see something like ialacol and this project merged into one. :)
Could you support open-source LLMs with the same API interface as OpenAI's, like FastChat?

I think one viable approach would be parsing model information via the /models API instead of hardcoding it into the OpenAIModels class. Currently, I have temporarily added the fields of several open-source models into OpenAIModels to use the UI with open-source models.
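The /models approach proposed above could be sketched as follows. This is a hypothetical parser, not code from the repository; it assumes the server returns the OpenAI-style list-models shape (an object with a `data` array of `{ id }` entries), and the id-as-display-name mapping is an assumption:

```typescript
// Turn an OpenAI-style GET /v1/models response into entries the UI could
// use in place of a hardcoded OpenAIModels map.
interface ModelsResponse {
  data: { id: string }[];
}

interface UIModel {
  id: string;
  name: string;
}

function parseModels(json: ModelsResponse): UIModel[] {
  // Use the model id as the display name, since open-source servers
  // generally do not return a separate human-readable label.
  return json.data.map((m) => ({ id: m.id, name: m.id }));
}
```

With something like this, whatever models the backend advertises would show up in the model picker automatically, whether it is the real OpenAI API or an open-source replacement.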