
Add "ollama/llama3.3" in "model_prices_and_context_window.json" #7431

Open
Tejhing opened this issue Dec 26, 2024 · 1 comment

Tejhing commented Dec 26, 2024

Source: <SOURCE_URL>

We need to update both model_prices_and_context_window.json and model_prices_and_context_window_backup.json to reflect the new model.
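A hedged sketch of what the new entry might look like, following the shape of existing entries in model_prices_and_context_window.json (the specific `max_tokens` value and `supports_function_calling` flag for llama3.3 are assumptions here, not confirmed values):

```json
"ollama/llama3.3": {
    "max_tokens": 131072,
    "max_input_tokens": 131072,
    "max_output_tokens": 131072,
    "input_cost_per_token": 0.0,
    "output_cost_per_token": 0.0,
    "litellm_provider": "ollama",
    "mode": "chat",
    "supports_function_calling": true
}
```

The costs are zero since Ollama models run locally; the same entry would go into model_prices_and_context_window_backup.json.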


krrishdholakia commented Dec 27, 2024

Why do we need to add it?

We already have a way to get Ollama model-specific info (e.g. max tokens, function-calling support) via an API call to the hosted endpoint when you run litellm.get_model_info(..):

def get_model_info(self, model: str) -> ModelInfoBase:

and the pricing is always 0.
