feat: OpenAI: JSON mode #14
Conversation
Hi @Butch78! Thanks again! I've been a bit busy but will review this when I have time sometime this week!
All tests seem to fail for the same reason. Did the API change? Did they previously send it? I assume we have a struct to represent OpenAI's response body; we may need to remove the field called `id`.
I suggested a change to a test and consistent naming to increase type safety, and pointed to a possible bug fix.
orca/src/llm/openai.rs (Outdated)

```diff
@@ -382,6 +465,8 @@ mod test {
     let prompt = prompt.render_context("my template", &context).unwrap();
     let response = client.generate(prompt).await.unwrap();
     assert!(response.to_string().to_lowercase().contains("berlin"));
+    // Assert response is a JSON object
+    assert!(response.to_string().starts_with("{"));
```
I like how concrete this `"{"` is. It is easy to read.
There is a comment above; is something not obvious from the test name? I would consider an assert instead of a comment. Something like:

assert!(response.is_valid_json())

I would leave the first assert too: assert!(response.to_string().starts_with("{"));
Note that `[]` is also valid JSON. May OpenAI return a list of dictionaries in JSON? If so, the comment is not accurate and better removed.
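To make the point concrete, here is a minimal sketch of what a helper like the suggested `is_valid_json()` could check. This is a hypothetical, std-only structural check (a real implementation would parse the string, e.g. with serde_json); it also illustrates why a top-level array counts as JSON:

```rust
// Hypothetical helper mirroring the suggested `response.is_valid_json()`.
// This is only a cheap structural check; a real implementation should
// actually parse the string (e.g. with serde_json::from_str).
fn looks_like_json(s: &str) -> bool {
    let t = s.trim();
    (t.starts_with('{') && t.ends_with('}')) || (t.starts_with('[') && t.ends_with(']'))
}

fn main() {
    assert!(looks_like_json("{ \"country\": \"Germany\" }"));
    // A top-level array is also valid JSON, which is why the
    // "JSON object" comment in the test could be misleading.
    assert!(looks_like_json("[1, 2, 3]"));
    assert!(!looks_like_json("berlin"));
}
```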
Second thought: I remember Uncle Bob's suggestion that source code should be general and tests should be concrete.
Maybe assert_eq!(response, "{ \"country\": \"berlin\" }")
orca/src/llm/openai.rs (Outdated)

```rust
/// The format of the returned data. With the new update, the response can be set to a JSON object.
/// https://platform.openai.com/docs/guides/text-generation/json-mode
response_format: Option<ResponseFormat>,
```
Nitpick:
Hmm, I am wondering if it would be better without Option...

```diff
- response_format: Option<ResponseFormat>,
+ response_format: ResponseFormat,
```

Maybe it is possible to override the default zero value, so users don't have to set it, and at the same time they can read more about the default format.
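One way to read the suggestion above is to drop the `Option` and give the enum a documented default. A minimal, std-only sketch (the variant names and the assumed default are illustrative, not orca's actual definitions):

```rust
// Hypothetical sketch of a non-Option field with an overridable default.
#[derive(Debug, PartialEq)]
enum ResponseFormat {
    Text,
    JsonObject,
}

impl Default for ResponseFormat {
    // Assumed default: plain text output when the caller sets nothing,
    // which matches the API's behavior when the field is omitted.
    fn default() -> Self {
        ResponseFormat::Text
    }
}

fn main() {
    // Users who don't care about the format get the documented default for free.
    assert_eq!(ResponseFormat::default(), ResponseFormat::Text);
}
```

With this shape, `Payload` could derive or construct the field via `Default` instead of threading `None` through every call site.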
@Butch78 I was pretty sick last week and then it was Thanksgiving, so I didn't have time to review. I will try and review it this week. Sorry for the delay!
I hope you are feeling better and had a great Thanksgiving 🦃 @santiagomed. I've been busy myself, but I tried to simplify the chat-completions functionality :) I still wasn't able to get my tests to pass though :/ It's something to do with the ResponseFormat, but I can't figure out what breaks the chat completions.
Hi @Butch78. I found the issue. The raw response body returns this:

```json
{
  "error": {
    "message": "'json_object' is not of type 'object' - 'response_format'",
    "type": "invalid_request_error",
    "param": null,
    "code": null
  }
}
```

This is what the payload currently looks like:

```rust
Payload {
    model: "gpt-3.5-turbo-1106",
    prompt: None,
    max_tokens: 1024,
    temperature: 1.0,
    stop: None,
    messages: [
        Message {
            role: User,
            content: "What is the capital of France?",
        },
        Message {
            role: Assistant,
            content: "Paris",
        },
        Message {
            role: User,
            content: "What is the capital of Germany?",
        },
    ],
    stream: false,
    response_format: JsonObject, // serializes to "json_object"
}
```

You are just passing the enum to the payload, which serializes to the bare string "json_object", but the error above says the API expects `response_format` to be an object.
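To make the wire-format mismatch from the error above concrete, here is a std-only sketch. The helper is hypothetical (not orca's serializer); in practice one would wrap the enum in a small struct with a `type` field so serde emits the object shape:

```rust
// Hypothetical helper (not orca's actual API): OpenAI rejects a bare string
// for response_format; the request body needs an object with a "type" key.
fn response_format_object(kind: &str) -> String {
    format!("{{\"type\":\"{}\"}}", kind)
}

fn main() {
    // What the enum was serializing to: a bare JSON string, rejected by the API.
    let bare = "\"json_object\"".to_string();
    // What the API expects instead: {"type":"json_object"}
    let object = response_format_object("json_object");
    assert_eq!(object, "{\"type\":\"json_object\"}");
    assert_ne!(bare, object);
}
```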
Thank you @santiagomed for your help! I just modified the prompt to include it, but maybe in the future when this is implemented:
Hi @santiagomed,
Congrats again on 100 Stars ⭐⭐
I tried to implement OpenAI's new JSON mode. Unfortunately, I couldn't get my test to work; I'm not sure why, but I think I'm using the
response_format: Option<ResponseFormat>
field on the Payload struct incorrectly. Any help would be greatly appreciated!