
.Net: Using ChatResponseFormat.JsonObject when using function calling fails with gpt-4o-mini #8692

Closed
Simcon opened this issue Sep 12, 2024 · 4 comments
Labels: bug (Something isn't working), .NET (Issue or Pull requests regarding .NET code)

Simcon commented Sep 12, 2024

Describe the bug
Using ChatResponseFormat.JsonObject when using function calling works with gpt-4-turbo, but fails with gpt-4o-mini-2024-07-18.

To Reproduce
Steps to reproduce the behavior:

  1. Copy the sample code below into a new console project.
  2. Add Microsoft.SemanticKernel 1.19.0
  3. Add your API key to code and run it. It works.
  4. Change the modelId to gpt-4o-mini-2024-07-18 and run again. It fails.

Expected behavior
It should work with either model. I have confirmed function calling is supported by the gpt-4o-mini model here and here.

Platform

  • OS: Win 11
  • IDE: Rider
  • Language: C#
  • Source: Microsoft.SemanticKernel Version=1.19.0

Additional context

using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using OpenAI.Chat;

//var modelId = "gpt-4-turbo"; // WORKS
var modelId = "gpt-4o-mini-2024-07-18"; // FAILS
var apiKey = "sk-proj-XXXXXXXXX";

var builder = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId, apiKey);

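// Register the NewsPlugin so its get_news function can be exposed to the model as a tool.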
builder.Plugins.AddFromType<NewsPlugin>();
var kernel1 = builder.Build();

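// Advertise every registered plugin function to the model, auto-invoke tool calls,
// and at the same time request a JSON-only response (the combination under test).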
OpenAIPromptExecutionSettings executionSettings = new()
{
    ToolCallBehavior = ToolCallBehavior.EnableFunctions(
        kernel1.Plugins
            .SelectMany(p => p)
            .Select(f => f.Metadata.ToOpenAIFunction()),
        autoInvoke: true),
    ChatSystemPrompt =
        "You are a smart AI assistant that have access to latest news and are able to help user with summaries. Reply in JSON.",
#pragma warning disable SKEXP0010
    ResponseFormat = ChatResponseFormat.JsonObject
#pragma warning restore SKEXP0010
};

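// Stream the reply; with autoInvoke enabled, get_news is called automatically during the request.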
var streamingFunc = kernel1.CreateFunctionFromPrompt("What happened today? Reply in JSON.", executionSettings);
var streamingEnumerable = streamingFunc.InvokeStreamingAsync(kernel1, new KernelArguments());
await foreach (var content in streamingEnumerable)
{
    var contentToWrite = content.ToString();
    Console.WriteLine(contentToWrite);
}

public class NewsPlugin
{
    [KernelFunction("get_news")]
    [Description("Gets the news")]
    public string[] GetNews()
    {
        return
        [
            "Trump and Kamala are in the election race.",
            "Oasis have reformed.",
            "SpaceX launch another rocket."
        ];
    }
}

@Simcon Simcon added the bug Something isn't working label Sep 12, 2024
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Sep 12, 2024
@github-actions github-actions bot changed the title Using ChatResponseFormat.JsonObject when using function calling fails with gpt-4o-mini .Net: Using ChatResponseFormat.JsonObject when using function calling fails with gpt-4o-mini Sep 12, 2024
@markwallace-microsoft (Member)

Hi @Simcon,
you are using OpenAI directly, so you need to refer to the OpenAI documentation, see https://openai.com/index/introducing-structured-outputs-in-the-api/. That document says Structured Outputs with response formats "is available on gpt-4o-mini and gpt-4o-2024-08-06 and any fine tunes based on these models." Can you try gpt-4o-mini instead of gpt-4o-mini-2024-07-18?

@Simcon (Author) commented Sep 12, 2024

Hi @markwallace-microsoft

I did some further testing based on your suggestion. It looks like the failure affects gpt-4o-mini and the older gpt-4o snapshots.

//var modelId = "gpt-4-turbo"; // WORKS
//var modelId = "gpt-4o-mini-2024-07-18"; // FAILS
//var modelId = "gpt-4o-mini"; // FAILS (Points to 2024-07-18)
//var modelId = "gpt-4o-2024-08-06"; // WORKS
//var modelId = "gpt-4o-2024-05-13"; // FAILS
//var modelId = "gpt-4o"; // FAILS (Points to 2024-05-13)

However, the OpenAI documentation states that these versions support function calling: https://platform.openai.com/docs/guides/function-calling/which-models-support-function-calling.

The fix for me is to just use gpt-4o-2024-08-06, but it would be good to get the gpt-4o-mini model working since it is much cheaper.
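For reference, a minimal sketch of that workaround, assuming the repro code above is otherwise unchanged (only the model id differs):

//var modelId = "gpt-4o-mini-2024-07-18"; // FAILS with ChatResponseFormat.JsonObject + function calling
var modelId = "gpt-4o-2024-08-06";        // WORKS with ChatResponseFormat.JsonObject + function calling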

@markwallace-microsoft (Member)

Thanks for the update @Simcon. I will look to see whether there is somewhere to post issues against the Azure OpenAI models.

@markwallace-microsoft (Member)

@Simcon, you can go to the following site to find Azure AI services support and help options.

@github-project-automation github-project-automation bot moved this from Sprint: In Review to Sprint: Done in Semantic Kernel Sep 20, 2024