Since many providers are compatible with the OpenAI API, it would be nice to be able to configure any OpenAI-compatible LLM by providing a base URL, an API key, and a model name.
Many people run local instances of custom OpenAI-API-compatible LLMs with
non-standard model names. Although DevoxxGenie allows specifying a custom
endpoint, the model names are not configurable.
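To illustrate, the three settings requested above (base URL, API key, model name) are all that is needed to target any OpenAI-compatible server. A minimal sketch (not DevoxxGenie code; the function name, the local port, and the model name are made up for illustration) showing how such a request would be assembled:

```python
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str,
                       prompt: str) -> urllib.request.Request:
    """Build a request for any OpenAI-compatible /v1/chat/completions endpoint.

    base_url, api_key, and model are exactly the three settings this issue
    asks to make configurable.
    """
    payload = {
        "model": model,  # non-standard model names pass through unchanged
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url=base_url.rstrip("/") + "/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


# Hypothetical local server with a non-standard model name
req = build_chat_request("http://localhost:11434", "sk-local",
                         "my-custom-llm", "Hello")
```

Because only the base URL and model string vary between providers, exposing them in the plugin settings would cover local llama.cpp/Ollama-style servers as well as hosted OpenAI-compatible services.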
issue: devoxx#156