
Releases: jamesrochabrun/SwiftOpenAI

Making debug prints optional.

20 Jul 04:15

Now consumers of the library can opt in to print events on DEBUG builds.

For example:

let service = OpenAIServiceFactory.service(apiKey: YOUR_API_KEY, debugEnabled: true)

Support for GPT-4o mini: advancing cost-efficient intelligence

19 Jul 20:02
e6bb9a3

Introducing our most cost-efficient small model


AIProxy updates.

26 Jun 22:04
6bc5fe0
  • The factory method OpenAIServiceFactory.ollama has been renamed to OpenAIServiceFactory.service, which takes the URL of any OpenAI-API-compatible service. To specify the URL and API key (for Bearer authentication), use:
OpenAIServiceFactory.service(apiKey: "YOUR_API_KEY", baseURL: "http://<DOMAIN>:<PORT>")
  • The AIProxy integration now uses certificate pinning to prevent threat actors from snooping on your traffic. No changes to your client code are necessary to take advantage of this security improvement.

  • The AIProxy integration changes how the DeviceCheck bypass token is hidden. This token is only intended for use on iOS simulators, and the previous hiding method was too easy to leak into production builds of the app. Please change your integration code from:

#if DEBUG && targetEnvironment(simulator)
	OpenAIServiceFactory.service(
		aiproxyPartialKey: "hardcode-partial-key-here",
		aiproxyDeviceCheckBypass: "hardcode-device-check-bypass-here"
	)
#else
	OpenAIServiceFactory.service(aiproxyPartialKey: "hardcode-partial-key-here")
#endif

To this:

OpenAIServiceFactory.service(
   aiproxyPartialKey: "hardcode-partial-key-here"
)

And use the method described in the README for adding the bypass token as an env variable to your Xcode project.
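As a generic illustration of that environment-variable approach, the token can be read at runtime via ProcessInfo. The variable name below is illustrative only; use whatever key the README specifies.

```swift
import Foundation

// Read a simulator-only bypass token from the scheme's environment variables.
// "AIPROXY_DEVICE_CHECK_BYPASS" is a placeholder name, not necessarily
// the key the README documents.
#if DEBUG && targetEnvironment(simulator)
let bypassToken = ProcessInfo.processInfo.environment["AIPROXY_DEVICE_CHECK_BYPASS"]
#endif
```

Because the read is wrapped in `#if DEBUG && targetEnvironment(simulator)`, the token never appears in release builds.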

Ollama OpenAI compatibility.

25 Jun 07:17

Ollama

Ollama now has built-in compatibility with the OpenAI Chat Completions API, making it possible to use more tooling and applications with Ollama locally.


⚠️ Important

Remember that these models run locally, so you need to download them. If you want to use llama3, you can open the terminal and run the following command:

ollama pull llama3

You can follow the Ollama documentation for more details.

How to use these models locally with SwiftOpenAI

To use local models with an OpenAIService in your application, you need to provide a URL.

let service = OpenAIServiceFactory.ollama(baseURL: "http://localhost:11434")

Then you can use the completions API as follows:

let prompt = "Tell me a joke"
let parameters = ChatCompletionParameters(messages: [.init(role: .user, content: .text(prompt))], model: .custom("llama3"))
let chatCompletionObject = try await service.startStreamedChat(parameters: parameters)
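startStreamedChat returns an async stream of chat chunks. A minimal sketch of consuming it follows; the choices/delta.content property path is an assumption about the chunk type, not a verified signature.

```swift
// Print tokens as they arrive from the streamed completion.
for try await chunk in chatCompletionObject {
   if let token = chunk.choices.first?.delta.content {
      print(token, terminator: "")
   }
}
```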

Changelog Jun6th, 2024

21 Jun 05:42

Streaming chat completions now support usage detail.

10 Jun 18:39
e8b912b

On June 6th OpenAI announced that streaming chat completions now support usage details. Previously, usage details were only available on non-streaming chat completions.
This patch makes streaming chat completions include usage details by default. The final chunk of each streaming response looks like this (note the prompt_tokens, completion_tokens, and total_tokens fields):
data: {"id":"chatcmpl-9YM1lpTbJLDBnrawPqt2CjT3gnoVA","object":"chat.completion.chunk","created":1717974853,"model":"gpt-4o-2024-05-13","system_fingerprint":"fp_319be4768e","choices":[],"usage":{"prompt_tokens":11,"completion_tokens":20,"total_tokens":31}}
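From Swift, those usage values arrive on that final chunk. A hedged sketch of capturing them while streaming; the `usage` property and its camelCase field names are assumptions mirroring the JSON payload above.

```swift
// Accumulate streamed text and read token counts from the final chunk.
var reply = ""
for try await chunk in try await service.startStreamedChat(parameters: parameters) {
   reply += chunk.choices.first?.delta.content ?? ""
   if let usage = chunk.usage {
      // Only the final chunk carries non-nil usage details.
      print("prompt: \(usage.promptTokens ?? 0), completion: \(usage.completionTokens ?? 0), total: \(usage.totalTokens ?? 0)")
   }
}
```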

Run Status Decoded

05 Jun 05:37
cca9a9e

Added support for streaming run status events.

let decoded = try self.decoder.decode(RunObject.self, from: data)
switch RunObject.Status(rawValue: decoded.status) {
case .queued:
   continuation.yield(.threadRunQueued(decoded))
case .inProgress:
   continuation.yield(.threadRunInProgress(decoded))
case .requiresAction:
   continuation.yield(.threadRunRequiresAction(decoded))
case .cancelling:
   continuation.yield(.threadRunCancelling(decoded))
case .cancelled:
   continuation.yield(.threadRunCancelled(decoded))
case .failed:
   continuation.yield(.threadRunFailed(decoded))
case .completed:
   continuation.yield(.threadRunCompleted(decoded))
case .expired:
   continuation.yield(.threadRunExpired(decoded))
default:
   break // unrecognized status value; the optional init makes the switch non-exhaustive without this
}
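On the consuming side, the yielded events can be handled with a similar switch. A hedged sketch follows; `runStream` and the event case names are assumptions based on the cases yielded above, not verified API.

```swift
// Hypothetical consumer: react to run status events as they arrive.
for try await event in runStream {
   switch event {
   case .threadRunQueued, .threadRunInProgress:
      print("Run is pending…")
   case .threadRunCompleted(let run):
      print("Run \(run.id) completed")
   case .threadRunFailed(let run):
      print("Run \(run.id) failed")
   default:
      break // other lifecycle states
   }
}
```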

Vision support for GPT4o

01 Jun 05:56
3be54cb


This release adapts the message content type to the latest Vision API changes.

Previous payload

"image_url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg"

Current payload

          "image_url": {
            "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg",
            "detail": "high"
          }
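In Swift terms, the new shape means the image URL and detail level travel together inside the message content. A sketch of building such a message follows; the content-case and parameter names are assumptions about the library's chat API, not verified signatures.

```swift
// Hypothetical sketch: a user message pairing text with a high-detail image URL.
let imageURL = URL(string: "https://upload.wikimedia.org/wikipedia/commons/thumb/d/dd/Gfp-wisconsin-madison-the-nature-boardwalk.jpg/2560px-Gfp-wisconsin-madison-the-nature-boardwalk.jpg")!
let message = ChatCompletionParameters.Message(
   role: .user,
   content: .contentArray([
      .text("What is in this image?"),
      .imageUrl(.init(url: imageURL, detail: "high"))
   ]))
let parameters = ChatCompletionParameters(messages: [message], model: .gpt4o)
```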

Assistants API support for Azure

29 May 19:59
7821552

Getting started with Azure OpenAI Assistants (Preview)


https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/assistant

enum AzureOpenAIAPI {
   
   static var azureOpenAIResource: String = ""
   
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference?tabs=python
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/assistant
   case assistant(AssistantCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/reference#chat-completions
   case chat(deploymentID: String)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-messages?tabs=python
   case message(MessageCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-runs?tabs=python
   case run(RunCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-runs?tabs=python#list-run-steps
   case runStep(RunStepCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/assistants-reference-threads?tabs=python#create-a-thread
   case thread(ThreadCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#vector-stores
   case vectorStore(VectorStoreCategory)
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#vector-stores
   case vectorStoreFile(VectorStoreFileCategory)
   
   enum AssistantCategory {
      case create
      case list
      case retrieve(assistantID: String)
      case modify(assistantID: String)
      case delete(assistantID: String)
   }

   enum MessageCategory {
      case create(threadID: String)
      case retrieve(threadID: String, messageID: String)
      case modify(threadID: String, messageID: String)
      case list(threadID: String)
   }
   
   enum RunCategory {
      case create(threadID: String)
      case retrieve(threadID: String, runID: String)
      case modify(threadID: String, runID: String)
      case list(threadID: String)
      case cancel(threadID: String, runID: String)
      case submitToolOutput(threadID: String, runID: String)
      case createThreadAndRun
   }
   
   enum RunStepCategory {
      case retrieve(threadID: String, runID: String, stepID: String)
      case list(threadID: String, runID: String)
   }
   
   enum ThreadCategory {
      case create
      case retrieve(threadID: String)
      case modify(threadID: String)
      case delete(threadID: String)
   }
   
   enum VectorStoreCategory {
      case create
      case list
      case retrieve(vectorStoreID: String)
      case modify(vectorStoreID: String)
      case delete(vectorStoreID: String)
   }
   
   /// https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/file-search?tabs=python#file-search-support
   enum VectorStoreFileCategory {
      case create(vectorStoreID: String)
      case list(vectorStoreID: String)
      case retrieve(vectorStoreID: String, fileID: String)
      case delete(vectorStoreID: String, fileID: String)
   }
}

Support for gpt-4o

14 May 04:18
f0edddc

Support for gpt-4o.