
[Bug]: acompletion logs duplicates to langfuse #7477

Open
N13T opened this issue Dec 30, 2024 · 3 comments
Labels
bug Something isn't working mlops user request


N13T commented Dec 30, 2024

What happened?

When using `acompletion` and logging to Langfuse, duplicate traces are created. Here's the code I'm running:

import os
import asyncio

os.environ["OPENAI_API_KEY"] = "openai_api_key"
os.environ["ANTHROPIC_API_KEY"] = "anthropic_api_key"

os.environ["LITELLM_LOG"] = "DEBUG"
os.environ["LANGFUSE_DEBUG"] = "True"

os.environ["LANGFUSE_PUBLIC_KEY"] = "pk"
os.environ["LANGFUSE_SECRET_KEY"] = "sk"

import litellm

litellm.set_verbose = True
litellm.success_callback = ["langfuse"]
litellm.failure_callback = ["langfuse"]  # logs errors to langfuse

async def main():
    models = ["gpt-4o-mini", "claude-3-5-haiku-20241022"]

    messages = [
        {"role": "user", "content": "Hello, how are you?"},
    ]

    resp = await litellm.acompletion(
        model=models[0],
        messages=messages,
        temperature=0.0,
        fallbacks=models[1:],
        metadata={
            "generation_name": "test-gen",
            "project": "litellm-test",
        },
    )
    return resp

if __name__ == "__main__":
    asyncio.run(main())

And here are the traces in Langfuse: you can see the `completion` call produced a single trace, while the `acompletion` calls are duplicated.

[screenshot of Langfuse traces]

Relevant log output

No response

Are you a ML Ops Team?

Yes

What LiteLLM version are you on ?

v1.56.4

Twitter / LinkedIn details

No response

@N13T N13T added the bug Something isn't working label Dec 30, 2024
@krrishdholakia krrishdholakia self-assigned this Jan 2, 2025
@krrishdholakia
Contributor

Able to repro with the script. Investigating.

[screenshot of duplicate traces]

@krrishdholakia
Contributor

Seems to be caused by the `fallbacks` param.

investigating
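For readers following along, the kind of double logging described here can be sketched in plain Python. This is an illustrative toy, not LiteLLM's actual internals: if both the inner completion call and the outer fallback wrapper fire the success callback, a single logical request produces two log entries (i.e., two traces).

```python
# Illustrative sketch of duplicate logging with a fallback wrapper.
# All names here are hypothetical; this is NOT LiteLLM's real code path.
logged = []

def success_callback(trace_name):
    # Stand-in for a Langfuse trace being created.
    logged.append(trace_name)

def completion(model):
    # The inner call logs its own success...
    success_callback(f"completion:{model}")
    return f"response from {model}"

def completion_with_fallbacks(models):
    # ...and the naive outer wrapper logs again on success,
    # so one logical request yields two trace entries.
    for model in models:
        try:
            resp = completion(model)
            success_callback(f"fallback-wrapper:{model}")
            return resp
        except Exception:
            continue

completion_with_fallbacks(["gpt-4o-mini", "claude-3-5-haiku-20241022"])
print(len(logged))  # 2 entries for a single request
```

The fix for such a bug is typically to make only one layer responsible for invoking the success callbacks.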

@krrishdholakia
Contributor

Fixed in 94fa390.

Will be live in today's release.


@N13T Can we do a 10min call sometime this/next week?

Want to learn how you're using litellm sdk, so we can improve for your team.

Attaching my calendly, for your convenience - https://calendly.com/d/4mp-gd3-k5k/litellm-1-1-onboarding-chat
