
Add more Cloud LLM's #156

Open
stephanj opened this issue Jul 5, 2024 · 1 comment
Labels
enhancement New feature or request

Comments

@stephanj
Contributor

stephanj commented Jul 5, 2024

(screenshot attached)

@stephanj stephanj added the enhancement New feature or request label Jul 5, 2024
@pczekaj

pczekaj commented Jul 31, 2024

Since many providers are compatible with the OpenAI API, it would be nice to have the possibility to configure any OpenAI-compatible LLM by providing a base URL, API key, and model name.
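The configuration described above could be sketched as a small value object. This is a hypothetical illustration, not DevoxxGenie's actual API: the class and method names (`LlmConfig`, `chatCompletionsUrl`) are made up, and the Ollama base URL in the example assumes its default OpenAI-compatible endpoint on port 11434.

```java
// Hypothetical sketch of an OpenAI-compatible provider configuration.
// Names are illustrative only; they do not exist in DevoxxGenie.
public class LlmConfig {
    private final String baseUrl;
    private final String apiKey;
    private final String modelName;

    public LlmConfig(String baseUrl, String apiKey, String modelName) {
        // Normalize a trailing slash so endpoint building is consistent
        this.baseUrl = baseUrl.endsWith("/")
                ? baseUrl.substring(0, baseUrl.length() - 1)
                : baseUrl;
        this.apiKey = apiKey;
        this.modelName = modelName;
    }

    // Every OpenAI-compatible server exposes the same chat completions path,
    // so only the base URL needs to vary per provider.
    public String chatCompletionsUrl() {
        return baseUrl + "/v1/chat/completions";
    }

    public String authorizationHeader() {
        return "Bearer " + apiKey;
    }

    public String modelName() {
        return modelName;
    }

    public static void main(String[] args) {
        // Example: a local Ollama instance serving an OpenAI-compatible API
        LlmConfig cfg = new LlmConfig("http://localhost:11434/", "sk-local", "llama3");
        System.out.println(cfg.chatCompletionsUrl());
        System.out.println(cfg.modelName());
    }
}
```

With just these three fields, the same client code could target OpenAI itself, a local Ollama or LM Studio server, or any other provider that speaks the OpenAI wire protocol.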

@stephanj stephanj mentioned this issue Aug 1, 2024
kofemann added a commit to kofemann/DevoxxGenieIDEAPlugin that referenced this issue Dec 16, 2024
Many people run local instances of custom OpenAI-API-compatible LLMs with non-standard model names. Although DevoxxGenie allows specifying a custom endpoint, the model names are not configurable.

issue: devoxx#156
2 participants