Merge pull request #114 from jamesrochabrun/jroch-reasoning-object
DeepSeek reasoning content support.
jamesrochabrun authored Feb 2, 2025
2 parents c581d02 + 328ad63 commit d72e7a7
Showing 3 changed files with 67 additions and 5 deletions.
66 changes: 61 additions & 5 deletions README.md
@@ -134,7 +134,6 @@ let service = OpenAIServiceFactory.service(apiKey: apiKey, organizationID: ogani

That's all you need to begin accessing the full range of OpenAI endpoints.


### How to get the status code of network errors

You may want to build UI around the type of error that the API returns.
@@ -3289,19 +3288,76 @@ For more information about the `OpenRouter` API visit its [documentation](https:

The [DeepSeek](https://api-docs.deepseek.com/) API uses an API format compatible with OpenAI. By modifying the configuration, you can use SwiftOpenAI to access the DeepSeek API.

Creating the service

```swift
let apiKey = "your_api_key"
let service = OpenAIServiceFactory.service(
apiKey: apiKey,
overrideBaseURL: "https://api.deepseek.com")
```

Non-Streaming Example

```swift
let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(
messages: [.init(role: .user, content: .text(prompt))],
model: .custom("deepseek-reasoner")
)

do {
let result = try await service.chat(parameters: parameters)

// Access the response content
if let content = result.choices.first?.message.content {
print("Response: \(content)")
}

// Access reasoning content if available
if let reasoning = result.choices.first?.message.reasoningContent {
print("Reasoning: \(reasoning)")
}
} catch {
print("Error: \(error)")
}
```

Streaming Example

```swift
let prompt = "What is the Manhattan project?"
let parameters = ChatCompletionParameters(
messages: [.init(role: .user, content: .text(prompt))],
model: .custom("deepseek-reasoner")
)

// Start the stream
do {
let stream = try await service.startStreamedChat(parameters: parameters)
for try await result in stream {
let content = result.choices.first?.delta.content ?? ""
self.message += content

// Optional: Handle reasoning content if available
if let reasoning = result.choices.first?.delta.reasoningContent {
self.reasoningMessage += reasoning
}
}
} catch APIError.responseUnsuccessful(let description, let statusCode) {
self.errorMessage = "Network error with status code: \(statusCode) and description: \(description)"
} catch {
self.errorMessage = error.localizedDescription
}
```

Notes

- The DeepSeek API is compatible with OpenAI's format but uses different model names.
- Use `.custom("deepseek-reasoner")` to specify the DeepSeek model.
- The `reasoningContent` field is optional and specific to DeepSeek's API.
- Error handling follows the same pattern as standard OpenAI requests.

For more information about the `DeepSeek` API visit its [documentation](https://api-docs.deepseek.com).
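
The source diffs below add a `reasoningContent` property mapped from the snake_case `reasoning_content` key. The mapping can be illustrated with a minimal, self-contained sketch; `DeltaSketch` is a hypothetical stand-in for the library's actual types, and the JSON fragment is illustrative only, not a verbatim DeepSeek payload.

```swift
import Foundation

// Hypothetical stand-in for the library's delta/message type, showing how
// the snake_case `reasoning_content` key maps onto a camelCase property.
struct DeltaSketch: Decodable {
    let content: String?
    let reasoningContent: String?

    enum CodingKeys: String, CodingKey {
        case content
        case reasoningContent = "reasoning_content"
    }
}

// An illustrative fragment shaped like a DeepSeek delta payload.
let json = """
{"content": "The Manhattan Project was...", "reasoning_content": "The user asks about history..."}
""".data(using: .utf8)!

let delta = try! JSONDecoder().decode(DeltaSketch.self, from: json)

// `reasoningContent` is optional: absent for non-reasoning models,
// so callers should unwrap it rather than assume it is present.
print(delta.reasoningContent ?? "no reasoning")
```

Because the key is optional in the `Decodable` conformance, the same type decodes responses from models that omit `reasoning_content` entirely.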
@@ -45,6 +45,8 @@ public struct ChatCompletionChunkObject: Decodable {

/// The contents of the chunk message.
public let content: String?
/// The reasoning content generated by the model, if available.
public let reasoningContent: String?
/// The tool calls generated by the model, such as function calls.
public let toolCalls: [ToolCall]?
/// The name and arguments of a function that should be called, as generated by the model.
@@ -57,6 +59,7 @@

enum CodingKeys: String, CodingKey {
case content
case reasoningContent = "reasoning_content"
case toolCalls = "tool_calls"
case functionCall = "function_call"
case role
@@ -50,6 +50,8 @@ public struct ChatCompletionObject: Decodable {
public let functionCall: FunctionCall?
/// The role of the author of this message.
public let role: String
/// The reasoning content generated by the model, if available.
public let reasoningContent: String?
/// Provided by the Vision API.
public let finishDetails: FinishDetails?
/// The refusal message generated by the model.
@@ -86,6 +88,7 @@
case functionCall = "function_call"
case role
case finishDetails = "finish_details"
case reasoningContent = "reasoning_content"
case refusal
case audio
}
