diff --git a/README.md b/README.md
index 2c62969b..5ef3c6a5 100644
--- a/README.md
+++ b/README.md
@@ -1,20 +1,10 @@
# Wingman AI Core
-Wingman AI allows you to use your voice to talk to various AI providers and LLMs, process your conversations, and ultimately trigger actions such as pressing buttons or reading answers. Our _Wingmen_ are like characters and your interface to this world, and you can easily control their behavior and characteristics, even if you're not a developer.
-
-**1.5.0 Showreel:**
+Official website: [https://www.wingman-ai.com](https://www.wingman-ai.com)
[![Wingman AI 1.5 Showreel](https://img.youtube.com/vi/qR8FjmQJRGE/0.jpg)](https://youtu.be/qR8FjmQJRGE 'Wingman AI Showreel')
-**Release trailer:**
-
-[![Wingman AI 1.0 Release Trailer](https://img.youtube.com/vi/HR1Zc9QD1jE/0.jpg)](https://www.youtube.com/watch?v=HR1Zc9QD1jE 'Wingman AI Release Trailer')
-
-**In-depth tutorial:**
-
-[![Wingman AI 1.0 Tutorial](https://img.youtube.com/vi/--GkXcA5msw/0.jpg)](https://www.youtube.com/watch?v=--GkXcA5msw 'Wingman AI Tutorial')
-
-AI is complex and it scares people. It's also **not just ChatGPT**. We want to make it as easy as possible for you to get started. That's what _Wingman AI_ is all about. It's a **framework** that allows you to build your own Wingmen and use them in your games and programs.
+Wingman AI allows you to use your voice to talk to various AI providers and LLMs, process your conversations, and ultimately trigger actions such as pressing buttons or reading answers. Our _Wingmen_ are like characters and your interface to this world, and you can easily control their behavior and characteristics, even if you're not a developer.
+
+AI is complex and it scares people. It's also **not just ChatGPT**. We want to make it as easy as possible for you to get started. That's what _Wingman AI_ is all about. It's a **framework** that allows you to build your own Wingmen and use them in your games and programs.
![Wingman Flow](assets/wingman-flow.png)
@@ -29,20 +19,35 @@ The idea is simple, but the possibilities are endless. For example, you could:
## Features
+
+
+
+
Since version 2.0, Wingman AI Core acts as a "backend" API (using FastAPI and Pydantic) with the following features:
- **Push-to-talk or voice activation** to capture user audio
-- OpenAI **text generation** and **function calling**
-- **Speech-to-text** providers (STT) for transcription:
+- **AI providers** with different models:
+ - OpenAI
+ - Google (Gemini)
+ - Azure
+ - Groq (llama3 with function calling)
+ - Mistral Cloud
+ - Open Router
+ - Cerebras
+ - Perplexity
+ - Wingman Pro (unlimited access to several providers and models)
+- **Speech-to-text providers** (STT) for transcription:
- OpenAI Whisper
- - OpenAI Whisper via Azure
+ - Azure Whisper
- Azure Speech
- - whispercpp (local)
+ - whispercpp (local, bundled with Wingman AI)
+ - Wingman Pro (Azure Speech or Azure Whisper)
- **Text-to-speech** (TTS) providers:
- OpenAI TTS
- Azure TTS
- - Elevenlabs
- Edge TTS (free)
+ - Elevenlabs
- XVASynth (local)
- **Sound effects** that work with every supported TTS provider
- **Multilingual** by default
@@ -54,10 +59,14 @@ Since version 2.0, Wingman AI Core acts as a "backend" API (using FastAPI and Py
- **Skills** that can do almost anything. Think Alexa... but better.
- **directory/file-based configuration** for different use cases (e.g. games) and Wingmen. No database needed.
- Wingman AI Core exposes a lot of its functionality via **REST services** (with an OpenAPI/Swagger spec) and can send and receive messages from clients, games etc. using **WebSockets**.
+- Sound Library to play mp3 or wav files in commands or Skills (similar to HCS Voice Packs for Voice Attack)
+- AI instant sound effects generation with Elevenlabs
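For example, a client could drive the keyboard-recording feature over WebSockets. The message shape below follows the `RecordKeyboardActionsCommand` model in `api/commands.py`; the `"single"` value for `recording_type` is an assumption (check the `KeyboardRecordingType` enum for the actual values):

```python
import json

# Message shape follows RecordKeyboardActionsCommand in api/commands.py.
# The "single" value is an assumed enum value, not confirmed by this diff.
message = {
    "command": "record_keyboard_actions",
    "recording_type": "single",
}
payload = json.dumps(message)
# A client would send `payload` over its WebSocket connection to Wingman AI Core.
```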
We (Team ShipBit) offer an additional [client with a neat GUI](https://www.wingman-ai.com) that you can use to configure everything in Wingman AI Core.
-
+
+
+
## Is this a "Star Citizen" thing?
@@ -110,11 +119,11 @@ Our Wingmen use OpenAI's APIs and they charge by usage. That means: You don't pa
#### ElevenLabs
-You don't have to use [ElevenLabs](https://elevenlabs.io/) as TTS provider, but their voices are great. You can also clone your own with less than 5 minutes of sample audio, e.g. your friend, an actor or a recording of an NPC in your game.
+You don't have to use Elevenlabs as a TTS provider, but their voices are great and you can generate instant sound effects with their API, fully integrated into Wingman AI. You can clone any voice with 3 minutes of clean audio, e.g. your friend, an actor or a recording of an NPC in your game.
-They have a free tier with a limited number of characters generated per month so you can try it out first. You can find more information on their [pricing page](https://elevenlabs.io/pricing).
+Elevenlabs offers a $5 tier with 30k characters and a $22 tier with 100k characters. Characters roll over each month, up to a maximum of three months' worth of credits. If you're interested in the service, please consider using our [referral link here](https://elevenlabs.io/pricing?from=partnerlewis2510). It costs you nothing extra and supports Wingman AI. We get 22% of all payments in your first year. Thank you!
-Signing up is very similar to OpenAI: Create your account, set up your payment method, and create an API key.
+Signing up is very similar to OpenAI: Create your account, set up your payment method, and create an API key. Enter that API key in Wingman AI when asked.
#### Edge TTS (Free)
@@ -122,9 +131,7 @@ Microsoft Edge TTS is actually free and you don't need an API key to use it. How
### Are local LLMs replacing OpenAI supported?
-Wingman AI exposes the `base_url` property that the OpenAI Python client uses. So if you have a plug-in replacement for OpenAI's client, you can easily connect it to Wingman AI Core. You can also write your own custom Wingman that uses your local LLM.
-
-Integrating specific LLMs oder models is currently not on our (ShipBit) priority list [as explained here](https://github.com/ShipBit/wingman-ai/issues/108) and we do not offer live support for it. Check out or Discord server if you're interested in local LLMs - there is a vibrant community discussing and testing different solutions and if we ever find one that satisfies our requirements, we might consider supporting it officially.
+You can use any LLM that offers an OpenAI-compatible API and easily connect it to Wingman AI Core.
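As a sketch, the request sent to such an endpoint is a standard chat-completions payload. The base URL and model name below are example assumptions (e.g. a local Ollama or llama.cpp server), not Wingman defaults:

```python
import json

# Sketch of a chat-completions request for an OpenAI-compatible endpoint.
# base_url and model are illustrative assumptions, not Wingman defaults.
base_url = "http://localhost:11434/v1"
payload = {
    "model": "llama3",
    "messages": [
        {"role": "system", "content": "You are a helpful Wingman."},
        {"role": "user", "content": "Request landing permission."},
    ],
}
request_body = json.dumps(payload)
endpoint = f"{base_url}/chat/completions"  # standard OpenAI-compatible route
```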
## Installing Wingman AI
@@ -133,6 +140,7 @@ Integrating specific LLMs oder models is currently not on our (ShipBit) priority
- Download the installer of the latest version from [wingman-ai.com](https://www.wingman-ai.com).
- Install it to a directory of your choice and start the client `Wingman AI.exe`.
- The client will auto-start `Wingman AI Core.exe` in the background
+ - The client will auto-start `whispercpp` in the background. If you have an NVIDIA RTX GPU, install the latest CUDA driver from NVIDIA and enable GPU acceleration in the Settings view.
If that doesn't work for some reason, try starting `Wingman AI Core.exe` manually and check the terminal or your **logs** directory for errors.
@@ -142,29 +150,23 @@ If that doesn't work for some reason, try starting `Wingman AI Core.exe` manuall
Wingman runs well on MacOS. While we don't offer a precompiled package for it, you can [run it from source](#develop-with-wingman-ai). Note that the TTS provider XVASynth is Windows-only and therefore not supported on MacOS.
-### Linux
-
-Linux is not officially supported but some of our community members were able to run it anyways. Check out [their documentation](docs/develop-linux.md).
-
## Who are these Wingmen?
-Our default Wingmen serve as examples and starting points for your own Wingmen, and you can easily reconfigure them using the client. You can also add your own Wingmen.
+Our default Wingmen serve as examples and starting points for your own Wingmen, and you can easily reconfigure them using the client. You can also add your own Wingmen.
### Computer & ATC
Our first two default Wingmen are using OpenAI's APIs. The basic process is as follows:
- Your speech is transcribed by the configured TTS provider.
-- The transcript is then sent as text to the **GPT-3.5 Turbo API**, which responds with a text and maybe function calls.
-- Wingman AI Core executes function calls which equals a command execution.
+- The transcript is then sent as text to the configured LLM, which responds with text and maybe function calls.
+- Wingman AI Core executes function calls which can be command executions or skill functions.
- The response is then read out to you by the configured TTS provider.
- Clients connected to Wingman AI Core are notified about progress and changes live and display them in the UI.
-Talking to a Wingman is like chatting with ChatGPT. This means that you can customize their behavior by giving them a `context` (or `system`) prompt as starting point for your conversation. You can also just tell them how to behave and they will remember that during your conversation. ATC and Computer use very different prompts, so they behave very differently.
-
-The magic happens when you configure _commands_ or key bindings. GPT will then try to match your request with the configured commands and execute them for you. It will automatically choose the best matching command based only on its name, so make sure you give it a good one (e.g. `RequestLandingPermission`).
+Talking to a Wingman is like chatting with ChatGPT, but with your voice. And since Skills are written in Python, a Wingman can do pretty much anything Python can do. You can customize a Wingman's behavior by giving it a backstory as a starting point for your conversation. You can also just tell it how to behave and it will remember that during your conversation.
-More information about the API can be found in the [OpenAI API documentation](https://beta.openai.com/docs/introduction).
+The magic happens when you configure _commands_ or key bindings. GPT will then try to match your request with the configured commands and execute them for you. It will automatically choose the best matching command based only on its name, so make sure you give it a good one (e.g. `Request landing permission`).
### StarHead
@@ -183,17 +185,17 @@ For updates and more information, visit the [StarHead website](https://star-head
### Noteworthy community projects
-- [UEXCorp](https://discord.com/channels/1173573578604687360/1179594417926066196) by @JayMatthew: A former Custom Wingman, now Skill that utilizes the UEX Corp API to pull live data for Star Citizen. Think StarHead on steroids.
-- [Clippy](https://discord.com/channels/1173573578604687360/1241854342282219662) by @teddybear082: A tribute Skill to the sketchy Microsoft assistant we all used to hate.
-- [WebSearch](https://discord.com/channels/1173573578604687360/1245432544946688081) by @teddybear082: A Skill that can pull data from websites (and quote the sources) for you.
+- [Community Wingmen](https://discord.com/channels/1173573578604687360/1176141176974360627)
+- [Community Skills](https://discord.com/channels/1173573578604687360/1254811139867410462)
+- [Different Games with Wingman AI](https://discord.com/channels/1173573578604687360/1254868009940418572)
## Can I configure Wingman AI Core without using your client?
Yes, you can! You can edit all the configs in your `%APP_DATA%\ShipBit\WingmanAI\[version]` directory.
-The YAML configs are very indentation-sensitive, so please be careful. We recommend using [VSCode](https://code.visualstudio.com/) with the [YAML extension](https://marketplace.visualstudio.com/items?itemName=redhat.vscode-yaml) to edit them.
+The YAML configs are very indentation-sensitive, so please be careful.
-**There is no hot reloading**, so you have to restart Wingman AI Core after you made changes to the configs.
+**There is no hot reloading**, so you have to restart Wingman AI Core after you made manual changes to the configs.
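For orientation, a command entry might look roughly like this. The field names follow `CommandConfig`, `CommandKeyboardConfig` and `AudioFileConfig` from `api/interface.py`, but treat the exact nesting as an assumption and compare against a config generated by the client first:

```yaml
commands:
  - name: RequestLandingPermission
    actions:
      - keyboard:
          hotkey: alt+n   # single key or combination like ctrl+shift+a
      - wait: 0.5
      - audio:
          files:
            - path: radio        # subdirectory in the audio_library folder
              name: confirm.mp3
          volume: 0.8
          wait: true
```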
### Directory/file-based configuration
@@ -217,15 +219,13 @@ Access secrets in code by using `secret_keeper.py`. You can access everything el
Wingman supports all languages that OpenAI (or your configured AI provider) supports. Setting this up in Wingman is really easy:
-Find the `context` setting for the Wingman you want to change.
+Some STT providers need a simple configuration to specify a non-English language. You might also have to find a voice that speaks the desired language.
-Now add a simple sentence to the `context` prompt: `Always answer in the language I'm using to talk to you.`
+Then find the `backstory` setting for the Wingman you want to change and add a simple sentence to it: `Always answer in the language I'm using to talk to you.`
or something like `Always answer in Portuguese.`
The cool thing is that you can now trigger commands in the language of your choice without changing/translating the `name` of the commands - the AI will do that for you.
-Also note that depending on your TTS provider, you might have to pick a voice that can actually speak your desired language or you'll end up with something really funny (like an American voice trying to speak German).
-
## Develop with Wingman AI
Are you ready to build your own Wingman or implement new features to the framework?
@@ -279,7 +279,7 @@ This list will inevitably remain incomplete. If you miss your name here, please
#### Special thanks
-- [**JayMatthew aka SawPsyder**](https://robertsspaceindustries.com/citizens/JayMatthew), @teddybear082 and @Thaendril for outstanding moderation in Discord, constant feedback and valuable Core & Skill contributions
+- [**JayMatthew aka SawPsyder**](https://robertsspaceindustries.com/citizens/JayMatthew), @teddybear082, @Thaendril and @Xul for outstanding moderation in Discord, constant feedback and valuable Core & Skill contributions
- @lugia19 for developing and improving the amazing [elevenlabslib](https://github.com/lugia19/elevenlabslib).
- [Knebel](https://www.youtube.com/@Knebel_DE) who helped us kickstart Wingman AI by showing it on stream and grants us access to the [StarHead API](https://star-head.de/) for Star Citizen.
- @Zatecc from [UEX Corp](https://uexcorp.space/) who supports our community developers and Wingmen with live trading data for Star Citizen using the [UEX Corp API](https://uexcorp.space/api.html).
@@ -300,7 +300,3 @@ To our greatest Patreon supporters we say: `o7` Commanders!
- Paradox
- Gopalfreak aka Rockhound
- [Averus](https://robertsspaceindustries.com/citizens/Averus)
-
-#### Wingmen (Patreons)
-
-[Ira Robinson aka Serene/BlindDadDoes](http://twitch.tv/BlindDadDoes), Zenith, DiVille, [Hiwada], Hades aka Architeutes, Raziel317, [CptToastey](https://www.twitch.tv/cpttoastey), NeyMR AKA EagleOne (Capt.Epic), a Bit Brutal, AlexeiX, [Dragon Aura](https://robertsspaceindustries.com/citizens/Dragon_Aura), Perry-x-Rhodan, DoublarThackery, SilentCid, Bytebool, Exaust A.K.A Nikoyevitch, Tycoon3000, N.T.G, Jolan97, Greywolfe, [Dayel Ostraco aka BlakeSlate](https://dayelostra.co/), Nielsjuh01, Manasy, Sierra-Noble, Simon says aka Asgard, JillyTheSnail, [Admiral-Chaos aka Darth-Igi], The Don, Tristan Import Error, Munkey the pirate, Norman Pham aka meMidgety, [meenie](https://github.com/meenie), [Tilawan](https://github.com/jlaluces123), Mr. Moo42, Geekdomo, Jenpai, Blitz, [Aaron Sadler](https://github.com/AaronSadler687), [SleeperActual](https://vngd.net/), parawho, [HypeMunkey](https://robertsspaceindustries.com/citizens/HypeMunkey), Huniken, SuperTruck, [NozDog], Skipster [Skipster Actual], Fredek, Ruls-23, Dexonist, Captain Manga
diff --git a/api/commands.py b/api/commands.py
index 1fb9134b..d551591f 100644
--- a/api/commands.py
+++ b/api/commands.py
@@ -26,11 +26,12 @@ class SaveSecretCommand(WebSocketCommandModel):
class RecordKeyboardActionsCommand(WebSocketCommandModel):
command: Literal["record_keyboard_actions"] = "record_keyboard_actions"
- recording_type: KeyboardRecordingType = KeyboardRecordingType.SINGLE
+ recording_type: KeyboardRecordingType
class StopRecordingCommand(WebSocketCommandModel):
command: Literal["stop_recording"] = "stop_recording"
+ recording_type: KeyboardRecordingType
# SENT TO CLIENT
diff --git a/api/enums.py b/api/enums.py
index 655cb14d..091bb624 100644
--- a/api/enums.py
+++ b/api/enums.py
@@ -50,6 +50,7 @@ class CustomPropertyType(Enum):
SINGLE_SELECT = "single_select"
VOICE_SELECTION = "voice_selection"
SLIDER = "slider"
+ AUDIO_FILES = "audio_files"
class AzureApiVersion(Enum):
@@ -68,13 +69,6 @@ class TtsVoiceGender(Enum):
FEMALE = "Female"
-class OpenAiModel(Enum):
- """https://platform.openai.com/docs/models/overview"""
-
- GPT_4O = "gpt-4o"
- GPT_4O_MINI = "gpt-4o-mini"
-
-
class MistralModel(Enum):
"""https://docs.mistral.ai/getting-started/models/"""
@@ -85,6 +79,16 @@ class MistralModel(Enum):
MISTRAL_MEDIUM = "mistral-medium-latest"
MISTRAL_LARGE = "mistral-large-latest"
+class PerplexityModel(Enum):
+ """https://docs.perplexity.ai/guides/model-cards"""
+
+    SONAR_SMALL = "llama-3.1-sonar-small-128k-online"
+    SONAR_LARGE = "llama-3.1-sonar-large-128k-online"
+    SONAR_HUGE = "llama-3.1-sonar-huge-128k-online"
+ CHAT_SMALL = "llama-3.1-sonar-small-128k-chat"
+ CHAT_LARGE = "llama-3.1-sonar-large-128k-chat"
+ LLAMA3_8B = "llama-3.1-8b-instruct"
+ LLAMA3_70B = "llama-3.1-70b-instruct"
class GoogleAiModel(Enum):
GEMINI_1_5_FLASH = "gemini-1.5-flash"
@@ -151,6 +155,8 @@ class ConversationProvider(Enum):
AZURE = "azure"
WINGMAN_PRO = "wingman_pro"
GOOGLE = "google"
+ CEREBRAS = "cerebras"
+ PERPLEXITY = "perplexity"
class ImageGenerationProvider(Enum):
@@ -184,6 +190,7 @@ class SkillCategory(Enum):
STAR_CITIZEN = "star_citizen"
TRUCK_SIMULATOR = "truck_simulator"
NO_MANS_SKY = "no_mans_sky"
+ FLIGHT_SIMULATOR = "flight_simulator"
# Pydantic models for enums
@@ -231,13 +238,11 @@ class TtsVoiceGenderEnumModel(BaseEnumModel):
gender: TtsVoiceGender
-class OpenAiModelEnumModel(BaseEnumModel):
- model: OpenAiModel
-
-
class MistralModelEnumModel(BaseEnumModel):
model: MistralModel
+class PerplexityModelEnumModel(BaseEnumModel):
+ model: PerplexityModel
class GoogleAiModelEnumModel(BaseEnumModel):
model: GoogleAiModel
@@ -309,7 +314,6 @@ class SkillCategoryModel(BaseEnumModel):
"AzureApiVersion": AzureApiVersionEnumModel,
"AzureRegion": AzureRegionEnumModel,
"TtsVoiceGender": TtsVoiceGenderEnumModel,
- "OpenAiModel": OpenAiModelEnumModel,
"MistralModel": MistralModelEnumModel,
"GoogleAiModel": GoogleAiModelEnumModel,
"WingmanProAzureDeployment": WingmanProAzureDeploymentEnumModel,
@@ -324,6 +328,7 @@ class SkillCategoryModel(BaseEnumModel):
"WingmanProSttProvider": WingmanProSttProviderModel,
"WingmanProTtsProvider": WingmanProTtsProviderModel,
"SkillCategory": SkillCategoryModel,
+ "PerplexityModel": PerplexityModelEnumModel,
# Add new enums here as key-value pairs
}
diff --git a/api/interface.py b/api/interface.py
index 28c49f7e..586c8668 100644
--- a/api/interface.py
+++ b/api/interface.py
@@ -11,7 +11,6 @@
CustomPropertyType,
SkillCategory,
TtsVoiceGender,
- OpenAiModel,
OpenAiTtsVoice,
SoundEffect,
SttProvider,
@@ -22,6 +21,7 @@
WingmanProRegion,
WingmanProSttProvider,
WingmanProTtsProvider,
+ PerplexityModel,
)
@@ -278,7 +278,7 @@ class XVASynthTtsConfig(BaseModel):
class OpenAiConfig(BaseModel):
- conversation_model: OpenAiModel
+ conversation_model: str
""" The model to use for conversations aka "chit-chat" and for function calls.
"""
@@ -311,11 +311,21 @@ class MistralConfig(BaseModel):
endpoint: str
+class PerplexityConfig(BaseModel):
+ conversation_model: PerplexityModel
+ endpoint: str
+
+
class GroqConfig(BaseModel):
conversation_model: str
endpoint: str
+class CerebrasConfig(BaseModel):
+ conversation_model: str
+ endpoint: str
+
+
class GoogleConfig(BaseModel):
conversation_model: GoogleAiModel
@@ -391,6 +401,25 @@ class FeaturesConfig(BaseModel):
use_generic_instant_responses: bool
+class AudioFile(BaseModel):
+ path: str
+ """The audio file to play. Required."""
+
+ name: str
+ """The name of the audio file."""
+
+
+class AudioFileConfig(BaseModel):
+ files: list[AudioFile]
+ """The audio file(s) to play. If there are multiple, a random file will be played."""
+
+ volume: float
+ """The volume to play the audio file at."""
+
+ wait: bool
+ """Whether to wait for the audio file to finish playing before continuing."""
+
+
class CommandKeyboardConfig(BaseModel):
hotkey: str
"""The hotkey. Can be a single key like 'a' or a combination like 'ctrl+shift+a'."""
@@ -441,6 +470,9 @@ class CommandActionConfig(BaseModel):
write: Optional[str] = None
"""The word or phrase to type, for example, to type text in a login screen. Must have associated button press to work. May need special formatting for special characters."""
+ audio: Optional[AudioFileConfig] = None
+ """The audio file to play. Optional."""
+
class CommandConfig(BaseModel):
name: str
@@ -507,7 +539,15 @@ class CustomProperty(BaseModel):
"""The name of the property. Has to be unique"""
name: str
"""The "friendly" name of the property, displayed in the UI."""
- value: str | int | float | bool | VoiceSelection | list[VoiceSelection]
+ value: (
+ str
+ | int
+ | float
+ | bool
+ | VoiceSelection
+ | list[VoiceSelection]
+ | AudioFileConfig
+ )
"""The value of the property"""
property_type: CustomPropertyType
"""Determines the type of the property and which controls to render in the UI."""
@@ -553,6 +593,7 @@ class NestedConfig(BaseModel):
openai: OpenAiConfig
mistral: MistralConfig
groq: GroqConfig
+ cerebras: CerebrasConfig
google: GoogleConfig
openrouter: OpenRouterConfig
local_llm: LocalLlmConfig
@@ -562,6 +603,7 @@ class NestedConfig(BaseModel):
xvasynth: XVASynthTtsConfig
whispercpp: WhispercppSttConfig
wingman_pro: WingmanProConfig
+ perplexity: PerplexityConfig
commands: Optional[list[CommandConfig]] = None
skills: Optional[list[SkillConfig]] = None
diff --git a/assets/wingman-ui-1.png b/assets/wingman-ui-1.png
index 597e83c5..049a1836 100644
Binary files a/assets/wingman-ui-1.png and b/assets/wingman-ui-1.png differ
diff --git a/assets/wingman-ui-2.png b/assets/wingman-ui-2.png
index 83ed6874..f5e2a859 100644
Binary files a/assets/wingman-ui-2.png and b/assets/wingman-ui-2.png differ
diff --git a/assets/wingman-ui-3.png b/assets/wingman-ui-3.png
index 2e651505..be5a69f9 100644
Binary files a/assets/wingman-ui-3.png and b/assets/wingman-ui-3.png differ
diff --git a/assets/wingman-ui-4.png b/assets/wingman-ui-4.png
index 11294c23..07189a97 100644
Binary files a/assets/wingman-ui-4.png and b/assets/wingman-ui-4.png differ
diff --git a/assets/wingman-ui-5.png b/assets/wingman-ui-5.png
new file mode 100644
index 00000000..f2ebd47e
Binary files /dev/null and b/assets/wingman-ui-5.png differ
diff --git a/assets/wingman-ui-6.png b/assets/wingman-ui-6.png
new file mode 100644
index 00000000..ed93b958
Binary files /dev/null and b/assets/wingman-ui-6.png differ
diff --git a/assets/wingman-ui-7.png b/assets/wingman-ui-7.png
new file mode 100644
index 00000000..a4c4f898
Binary files /dev/null and b/assets/wingman-ui-7.png differ
diff --git a/assets/wingman-ui-8.png b/assets/wingman-ui-8.png
new file mode 100644
index 00000000..f248e918
Binary files /dev/null and b/assets/wingman-ui-8.png differ
diff --git a/main.py b/main.py
index 269b2144..6e7277dc 100644
--- a/main.py
+++ b/main.py
@@ -131,6 +131,29 @@ def custom_openapi():
# Ensure the components.schemas key exists
openapi_schema.setdefault("components", {}).setdefault("schemas", {})
+ # Add enums to schema
+ for enum_name, enum_model in ENUM_TYPES.items():
+ enum_field_name, enum_type = next(iter(enum_model.__annotations__.items()))
+ if issubclass(enum_type, Enum):
+ enum_values = [e.value for e in enum_type]
+ enum_schema = {
+ "type": "string",
+ "enum": enum_values,
+ "description": f"Possible values for {enum_name}",
+ }
+ openapi_schema["components"]["schemas"][enum_name] = enum_schema
+
+ openapi_schema["components"]["schemas"]["CommandActionConfig"] = {
+ "type": "object",
+ "properties": {
+ "keyboard": {"$ref": "#/components/schemas/CommandKeyboardConfig"},
+ "wait": {"type": "number"},
+ "mouse": {"$ref": "#/components/schemas/CommandMouseConfig"},
+ "write": {"type": "string"},
+ "audio": {"$ref": "#/components/schemas/AudioFileConfig"},
+ },
+ }
+
# Add WebSocket command models to schema
for cls in WebSocketCommandModel.__subclasses__():
cls_schema_dict = cls.model_json_schema(
@@ -156,18 +179,6 @@ def custom_openapi():
cls_schema_dict.setdefault("required", []).append(field_name)
openapi_schema["components"]["schemas"][cls.__name__] = cls_schema_dict
- # Add enums to schema
- for enum_name, enum_model in ENUM_TYPES.items():
- enum_field_name, enum_type = next(iter(enum_model.__annotations__.items()))
- if issubclass(enum_type, Enum):
- enum_values = [e.value for e in enum_type]
- enum_schema = {
- "type": "string",
- "enum": enum_values,
- "description": f"Possible values for {enum_name}",
- }
- openapi_schema["components"]["schemas"][enum_name] = enum_schema
-
app.openapi_schema = openapi_schema
return app.openapi_schema
@@ -244,7 +255,7 @@ async def async_main(host: str, port: int, sidecar: bool):
secret = input(f"Please enter your '{error.secret_name}' API key/secret: ")
if secret:
secret_keeper.secrets[error.secret_name] = secret
- secret_keeper.save()
+ await secret_keeper.save()
saved_secrets.append(error.secret_name)
else:
return
diff --git a/providers/elevenlabs.py b/providers/elevenlabs.py
index 5a6cfaf9..de10c77a 100644
--- a/providers/elevenlabs.py
+++ b/providers/elevenlabs.py
@@ -1,8 +1,6 @@
-from elevenlabslib import (
- User,
- GenerationOptions,
- PlaybackOptions,
-)
+import asyncio
+from typing import Optional
+from elevenlabslib import User, GenerationOptions, PlaybackOptions, SFXGenerationOptions
from api.enums import SoundEffect, WingmanInitializationErrorType
from api.interface import ElevenlabsConfig, SoundConfig, WingmanInitializationError
from services.audio_player import AudioPlayer
@@ -55,12 +53,12 @@ def notify_playback_finished():
contains_high_end_radio = SoundEffect.HIGH_END_RADIO in sound_config.effects
if contains_high_end_radio:
- audio_player.play_wav("Radio_Static_Beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("Radio_Static_Beep.wav", sound_config.volume)
if sound_config.play_beep:
- audio_player.play_wav("beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("beep.wav", sound_config.volume)
elif sound_config.play_beep_apollo:
- audio_player.play_wav("Apollo_Beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("Apollo_Beep.wav", sound_config.volume)
WebSocketUser.ensure_async(
audio_player.notify_playback_finished(wingman_name)
@@ -68,13 +66,13 @@ def notify_playback_finished():
def notify_playback_started():
if sound_config.play_beep:
- audio_player.play_wav("beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("beep.wav", sound_config.volume)
elif sound_config.play_beep_apollo:
- audio_player.play_wav("Apollo_Beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("Apollo_Beep.wav", sound_config.volume)
contains_high_end_radio = SoundEffect.HIGH_END_RADIO in sound_config.effects
if contains_high_end_radio:
- audio_player.play_wav("Radio_Static_Beep.wav", sound_config.volume)
+ audio_player.play_wav_sample("Radio_Static_Beep.wav", sound_config.volume)
WebSocketUser.ensure_async(
audio_player.notify_playback_started(wingman_name)
@@ -142,6 +140,32 @@ def playback_finished(wingman_name):
audio_player.playback_events.subscribe("finished", playback_finished)
+ async def generate_sound_effect(
+ self,
+ prompt: str,
+ duration_seconds: Optional[float] = None,
+ prompt_influence: Optional[float] = None,
+ ):
+ user = User(self.api_key)
+ options = SFXGenerationOptions(
+ duration_seconds=duration_seconds, prompt_influence=prompt_influence
+ )
+ req, _ = user.generate_sfx(prompt, options)
+
+ result_ready = asyncio.Event()
+        audio: Optional[bytes] = None
+
+ def get_result(future: asyncio.Future[bytes]):
+ nonlocal audio
+ audio = future.result()
+ result_ready.set() # Signal that the result is ready
+
+ req.add_done_callback(get_result)
+
+ # Wait for the result to be ready
+ await result_ready.wait()
+ return audio
+
def get_available_voices(self):
user = User(self.api_key)
return user.get_available_voices()
@@ -149,3 +173,7 @@ def get_available_voices(self):
def get_available_models(self):
user = User(self.api_key)
return user.get_models()
+
+ def get_subscription_data(self):
+ user = User(self.api_key)
+ return user.get_subscription_data()
diff --git a/requirements.txt b/requirements.txt
index 913e655e..adca6269 100644
--- a/requirements.txt
+++ b/requirements.txt
@@ -1,6 +1,6 @@
azure-cognitiveservices-speech==1.38.0
edge-tts==6.1.12
-elevenlabslib==0.22.5
+elevenlabslib==0.22.6
fastapi==0.111.0
google-generativeai==0.7.0
markdown==3.6
diff --git a/services/audio_library.py b/services/audio_library.py
new file mode 100644
index 00000000..79159d00
--- /dev/null
+++ b/services/audio_library.py
@@ -0,0 +1,231 @@
+import asyncio
+import os
+import threading
+import time
+from os import path
+from random import randint
+from api.interface import AudioFile, AudioFileConfig
+from services.printr import Printr
+from services.audio_player import AudioPlayer
+from services.file import get_writable_dir
+
+printr = Printr()
+DIR_AUDIO_LIBRARY = "audio_library"
+
+class AudioLibrary:
+ def __init__(
+ self,
+ callback_playback_started: callable = None,
+ callback_playback_finished: callable = None,
+ ):
+ # Configurable settings
+ self.callback_playback_started = callback_playback_started # Parameters: AudioFileConfig, AudioPlayer, volume(float)
+ self.callback_playback_finished = (
+ callback_playback_finished # Parameters: AudioFileConfig
+ )
+
+ # Internal settings
+ self.audio_library_path = get_writable_dir(DIR_AUDIO_LIBRARY)
+ self.current_playbacks = {}
+
+ ##########################
+ ### Playback functions ###
+ ##########################
+
+ async def start_playback(
+ self, audio_file: AudioFile | AudioFileConfig, volume_modifier: float = 1.0
+ ):
+ audio_file = self.__get_audio_file_config(audio_file)
+ audio_player = AudioPlayer(
+ asyncio.get_event_loop(), self.on_playback_started, self.on_playback_finish
+ )
+
+ selected_file = self.__get_random_audio_file_from_config(audio_file)
+ playback_key = self.__get_playback_key(selected_file)
+
+ # skip if file does not exist
+        if not path.exists(
+            path.join(self.audio_library_path, selected_file.path, selected_file.name)
+        ):
+ printr.toast_error(
+ f"Skipping playback of {selected_file.name} as it does not exist in the audio library."
+ )
+ return
+
+ # stop running playbacks of configured files
+ await self.stop_playback(audio_file, 0.1)
+
+ async def actual_start_playback(
+ audio_file: AudioFile,
+ audio_player: AudioPlayer,
+ volume: list,
+ ):
+ full_path = path.join(
+ self.audio_library_path, audio_file.path, audio_file.name
+ )
+ await audio_player.play_audio_file(
+ filename=full_path,
+ volume=volume,
+ wingman_name=playback_key,
+ publish_event=False,
+ )
+
+ volume = [(audio_file.volume or 1.0) * volume_modifier]
+ self.current_playbacks[playback_key] = [
+ audio_player,
+ volume,
+ selected_file,
+ ]
+ self.__threaded_execution(
+ actual_start_playback, selected_file, audio_player, volume
+ )
+        if audio_file.wait:
+            while True:
+                await asyncio.sleep(0.1)  # don't block the event loop
+                status = self.get_playback_status(selected_file)
+                if not status[1]:  # no audio player
+                    break
+
+ async def stop_playback(
+ self,
+ audio_file: AudioFile | AudioFileConfig,
+ fade_out_time: float = 0.5,
+ fade_out_resolution: int = 20,
+ ):
+ async def fade_out(
+ audio_player: AudioPlayer,
+ volume: list[float],
+            fade_out_time: float,
+ fade_out_resolution: int,
+ ):
+ original_volume = volume[0]
+ step_size = original_volume / fade_out_resolution
+ step_duration = fade_out_time / fade_out_resolution
+ while audio_player.is_playing and volume[0] > 0.0001:
+ volume[0] -= step_size
+ await asyncio.sleep(step_duration)
+ await asyncio.sleep(0.05) # 50ms grace period
+ await audio_player.stop_playback()
+
+ for file in self.__get_audio_file_config(audio_file).files:
+ status = self.get_playback_status(file)
+ audio_player = status[1]
+ volume = status[2]
+ if audio_player:
+ if fade_out_time > 0:
+ self.__threaded_execution(
+ fade_out,
+ audio_player,
+ volume,
+ fade_out_time,
+ fade_out_resolution,
+ )
+ else:
+ await audio_player.stop_playback()
+
+ self.current_playbacks.pop(self.__get_playback_key(file), None)
+
+ def get_playback_status(
+ self, audio_file: AudioFile
+ ) -> tuple[bool, AudioPlayer | None, list[float] | None]:
+ playback_key = self.__get_playback_key(audio_file)
+
+ if playback_key in self.current_playbacks:
+ audio_player = self.current_playbacks[playback_key][0]
+ return (
+ audio_player.is_playing, # is playing
+ audio_player, # AudioPlayer
+ self.current_playbacks[playback_key][1], # current volume list
+ )
+ return (False, None, None)
+
+ async def change_playback_volume(
+ self, audio_file: AudioFile | AudioFileConfig, volume: float
+ ):
+ audio_file = self.__get_audio_file_config(audio_file)
+ playback_keys = [
+ self.__get_playback_key(current_file)
+ for current_file in audio_file.files
+ ]
+
+ for playback_key in playback_keys:
+ if playback_key in self.current_playbacks:
+ self.current_playbacks[playback_key][1][0] = volume
+
+ async def on_playback_started(self, file_path: str):
+ self.notify_playback_started(file_path)
+ # Placeholder for future implementations
+
+ async def on_playback_finish(self, file_path: str):
+ self.notify_playback_finished(file_path)
+ self.current_playbacks.pop(file_path, None)
+
+ def notify_playback_started(self, file_path: str):
+ if self.callback_playback_started:
+ # Give the callback the audio file that started playing and current volume
+ audio_file = self.current_playbacks[file_path][2]
+ audio_player = self.current_playbacks[file_path][0]
+ volume = self.current_playbacks[file_path][1][0]
+ self.callback_playback_started(audio_file, audio_player, volume)
+
+ def notify_playback_finished(self, file_path: str):
+ if self.callback_playback_finished:
+ # Give the callback the audio file that finished playing
+ audio_file = self.current_playbacks[file_path][2]
+ self.callback_playback_finished(audio_file)
+
+ ###############################
+ ### Audio Library functions ###
+ ###############################
+
+ def get_audio_files(self) -> list[AudioFile]:
+ audio_files = []
+ try:
+ for root, _, files in os.walk(self.audio_library_path):
+ for file in files:
+ if file.endswith((".wav", ".mp3")):
+ rel_path = path.relpath(root, self.audio_library_path)
+ rel_path = "" if rel_path == "." else rel_path
+ audio_files.append(AudioFile(path=rel_path, name=file))
+ except Exception:
+ # the audio library directory may not exist yet; return what was found
+ pass
+ return audio_files
+
+ ########################
+ ### Helper functions ###
+ ########################
+
+ def __threaded_execution(self, function, *args) -> threading.Thread:
+ """Execute a function in a separate thread."""
+
+ def start_thread(function, *args):
+ if asyncio.iscoroutinefunction(function):
+ new_loop = asyncio.new_event_loop()
+ asyncio.set_event_loop(new_loop)
+ new_loop.run_until_complete(function(*args))
+ new_loop.close()
+ else:
+ function(*args)
+
+ thread = threading.Thread(target=start_thread, args=(function, *args))
+ thread.start()
+ return thread
+
+ def __get_audio_file_config(
+ self, audio_file: AudioFile | AudioFileConfig
+ ) -> AudioFileConfig:
+ if isinstance(audio_file, AudioFile):
+ return AudioFileConfig(files=[audio_file], volume=1, wait=False)
+ return audio_file
+
+ def __get_random_audio_file_from_config(
+ self, audio_file: AudioFileConfig
+ ) -> AudioFile:
+ # randint(0, 0) == 0, so this also covers the single-file case
+ return audio_file.files[randint(0, len(audio_file.files) - 1)]
+
+ def __get_playback_key(self, audio_file: AudioFile) -> str:
+ return path.join(audio_file.path, audio_file.name)
\ No newline at end of file
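The `AudioLibrary` code above shares a one-element volume list between a running playback and the fade-out task, so the level can be changed in place while audio plays. A minimal, self-contained sketch of that pattern (hypothetical names, not the project's code):

```python
import asyncio

async def playback(volume: list[float], chunks: int = 5) -> list[float]:
    # Stand-in for the audio callback: read volume[0] once per "chunk".
    levels = []
    for _ in range(chunks):
        levels.append(volume[0])
        await asyncio.sleep(0.01)
    return levels

async def fade_out(volume: list[float], steps: int = 5):
    # Lower the shared level in place; the playback loop picks it up.
    step = volume[0] / steps
    for _ in range(steps):
        volume[0] = max(0.0, volume[0] - step)
        await asyncio.sleep(0.01)

async def main() -> list[float]:
    volume = [1.0]
    levels, _ = await asyncio.gather(playback(volume), fade_out(volume))
    return levels

levels = asyncio.run(main())
```

No locks are needed because only one task writes and list item assignment is atomic in CPython; the real player applies the same trick per audio chunk.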
diff --git a/services/audio_player.py b/services/audio_player.py
index e1634cb5..799eba67 100644
--- a/services/audio_player.py
+++ b/services/audio_player.py
@@ -1,5 +1,6 @@
import asyncio
import io
+import wave
from os import path
from threading import Thread
from typing import Callable
@@ -16,7 +17,6 @@
get_sound_effects,
)
-
class AudioPlayer:
def __init__(
self,
@@ -42,14 +42,21 @@ def set_event_loop(self, loop: asyncio.AbstractEventLoop):
self.event_loop = loop
def start_playback(
- self, audio, sample_rate, channels, finished_callback, volume: float
+ self,
+ audio,
+ sample_rate,
+ channels,
+ finished_callback,
+ volume: list[float] | float,
):
def callback(outdata, frames, time, status):
+ # volume may be a mutable list so the level can be updated mid-playback
+ local_volume = volume[0] if isinstance(volume, list) else volume
nonlocal playhead
chunksize = frames * channels
# If we are at the end of the audio buffer, stop playback
- if playhead * channels >= len(audio):
+ if playhead >= len(audio):
if np.issubdtype(outdata.dtype, np.floating):
outdata.fill(0.0) # Fill with zero for floats
else:
@@ -66,10 +73,10 @@ def callback(outdata, frames, time, status):
if channels > 1 and current_chunk.ndim == 1:
current_chunk = np.tile(current_chunk[:, np.newaxis], (1, channels))
- # Flat the chunk
+ # Flatten the chunk
current_chunk = current_chunk.ravel()
- required_length = frames * channels
+ required_length = chunksize
# Ensure current_chunk has the required length
if len(current_chunk) < required_length:
@@ -81,7 +88,7 @@ def callback(outdata, frames, time, status):
# Reshape current_chunk to match outdata's shape, only if size matches
try:
current_chunk = current_chunk.reshape((frames, channels))
- current_chunk = current_chunk * volume
+ current_chunk = current_chunk * local_volume
if np.issubdtype(outdata.dtype, np.floating):
outdata[:] = current_chunk.astype(outdata.dtype)
else:
@@ -95,7 +102,7 @@ def callback(outdata, frames, time, status):
) # Safely fill zero to avoid noise
# Update playhead
- playhead = end
+ playhead += frames
# Check if playback should stop (end of audio)
if playhead >= len(audio):
@@ -107,6 +114,7 @@ def callback(outdata, frames, time, status):
# Initial playhead position
playhead = 0
+ self.is_playing = True
# Create and start the audio stream
self.stream = sd.OutputStream(
@@ -195,21 +203,49 @@ def finished_callback():
await self.notify_playback_started(wingman_name)
- async def notify_playback_started(self, wingman_name: str):
- await self.playback_events.publish("started", wingman_name)
+ async def notify_playback_started(
+ self, wingman_name: str, publish_event: bool = True
+ ):
+ if publish_event:
+ await self.playback_events.publish("started", wingman_name)
if callable(self.on_playback_started):
await self.on_playback_started(wingman_name)
- async def notify_playback_finished(self, wingman_name: str):
- await self.playback_events.publish("finished", wingman_name)
+ async def notify_playback_finished(
+ self, wingman_name: str, publish_event: bool = True
+ ):
+ if publish_event:
+ await self.playback_events.publish("finished", wingman_name)
if callable(self.on_playback_finished):
await self.on_playback_finished(wingman_name)
- def play_wav(self, audio_sample_file: str, volume: float):
- beep_audio, beep_sample_rate = self.get_audio_from_file(
- path.join(self.sample_dir, audio_sample_file)
- )
- self.start_playback(beep_audio, beep_sample_rate, 1, None, volume)
+ def play_wav_sample(self, audio_sample_file: str, volume: float):
+ file_path = path.join(self.sample_dir, audio_sample_file)
+ self.play_wav(file_path, volume)
+
+ def play_wav(self, audio_file: str, volume: list[float] | float):
+ audio, sample_rate = self.get_audio_from_file(audio_file)
+ with wave.open(audio_file, "rb") as wav_file:
+ num_channels = wav_file.getnchannels()
+ self.start_playback(audio, sample_rate, num_channels, None, volume)
+
+ def play_mp3(self, audio_sample_file: str, volume: list[float] | float):
+ audio, sample_rate = self.get_audio_from_file(audio_sample_file)
+ # decoded MP3 audio is assumed to be stereo here
+ self.start_playback(audio, sample_rate, 2, None, volume)
+
+ async def play_audio_file(
+ self,
+ filename: str,
+ volume: list[float] | float,
+ wingman_name: str | None = None,
+ publish_event: bool = True,
+ ):
+ await self.notify_playback_started(wingman_name, publish_event)
+ if filename.endswith(".mp3"):
+ self.play_mp3(filename, volume)
+ elif filename.endswith(".wav"):
+ self.play_wav(filename, volume)
+ await self.notify_playback_finished(wingman_name, publish_event)
def get_audio_from_file(self, filename: str) -> tuple:
audio, sample_rate = sf.read(filename, dtype="float32")
@@ -345,9 +381,9 @@ def get_mixed_chunk(length):
if num_samples_to_copy > remaining:
num_samples_to_copy = remaining
- chunk[
- length - remaining : length - remaining + num_samples_to_copy
- ] = noise_audio[mixed_pos:(mixed_pos + num_samples_to_copy)]
+ chunk[length - remaining : length - remaining + num_samples_to_copy] = (
+ noise_audio[mixed_pos : (mixed_pos + num_samples_to_copy)]
+ )
remaining -= num_samples_to_copy
mixed_pos = mixed_pos + num_samples_to_copy
return chunk
@@ -404,13 +440,13 @@ def callback(outdata, frames, time, status):
await self.notify_playback_started(wingman_name)
if config.play_beep:
- self.play_wav("beep.wav", config.volume)
+ self.play_wav_sample("beep.wav", config.volume)
elif config.play_beep_apollo:
- self.play_wav("Apollo_Beep.wav", config.volume)
+ self.play_wav_sample("Apollo_Beep.wav", config.volume)
contains_high_end_radio = SoundEffect.HIGH_END_RADIO in config.effects
if contains_high_end_radio:
- self.play_wav("Radio_Static_Beep.wav", config.volume)
+ self.play_wav_sample("Radio_Static_Beep.wav", config.volume)
self.raw_stream.start()
@@ -447,12 +483,12 @@ def callback(outdata, frames, time, status):
contains_high_end_radio = SoundEffect.HIGH_END_RADIO in config.effects
if contains_high_end_radio:
- self.play_wav("Radio_Static_Beep.wav", config.volume)
+ self.play_wav_sample("Radio_Static_Beep.wav", config.volume)
if config.play_beep:
- self.play_wav("beep.wav", config.volume)
+ self.play_wav_sample("beep.wav", config.volume)
elif config.play_beep_apollo:
- self.play_wav("Apollo_Beep.wav", config.volume)
+ self.play_wav_sample("Apollo_Beep.wav", config.volume)
self.is_playing = False
await self.notify_playback_finished(wingman_name)
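The `__threaded_execution` helper in the audio-library diff runs a coroutine to completion on a fresh event loop inside a worker thread, so playback never blocks the caller's loop. A standalone sketch of that technique (names are illustrative):

```python
import asyncio
import threading

results = []

async def coro(x: int):
    # Stand-in for the playback coroutine.
    await asyncio.sleep(0.01)
    results.append(x * 2)

def threaded_execution(function, *args) -> threading.Thread:
    """Run a coroutine on its own event loop in a separate thread."""
    def start_thread():
        loop = asyncio.new_event_loop()
        asyncio.set_event_loop(loop)
        loop.run_until_complete(function(*args))
        loop.close()

    thread = threading.Thread(target=start_thread)
    thread.start()
    return thread

thread = threaded_execution(coro, 21)
thread.join()  # the caller may also choose not to wait
```

The trade-off: objects created on the worker loop (tasks, futures) must not be awaited from the main loop.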
diff --git a/services/command_handler.py b/services/command_handler.py
index fb0aa941..52c25f5e 100644
--- a/services/command_handler.py
+++ b/services/command_handler.py
@@ -43,10 +43,8 @@ async def dispatch(self, message, websocket: WebSocket):
RecordKeyboardActionsCommand(**command), websocket
)
elif command_name == "stop_recording":
- # Get Enum from string
- recording_type = KeyboardRecordingType(command["recording_type"])
await self.handle_stop_recording(
- StopRecordingCommand(**command), websocket, recording_type
+ StopRecordingCommand(**command), websocket
)
else:
raise ValueError("Unknown command")
@@ -66,7 +64,7 @@ async def handle_secret(self, command: SaveSecretCommand, websocket: WebSocket):
secret_name = command.secret_name
secret_value = command.secret_value
self.secret_keeper.secrets[secret_name] = secret_value
- self.secret_keeper.save()
+ await self.secret_keeper.save()
if command.show_message:
await self.printr.print_async(
@@ -110,7 +108,7 @@ async def handle_record_keyboard_actions(
def _on_key_event(event):
if event.event_type == "down" and event.name == "esc":
WebSocketUser.ensure_async(
- self.handle_stop_recording(None, None, command.recording_type)
+ self.handle_stop_recording(command, websocket)
)
if (
event.scan_code == 58
@@ -129,7 +127,7 @@ def _on_key_event(event):
and self._is_hotkey_recording_finished(self.recorded_keys)
):
WebSocketUser.ensure_async(
- self.handle_stop_recording(None, None, command.recording_type)
+ self.handle_stop_recording(command, websocket)
)
self.hook_callback = keyboard.hook(_on_key_event, suppress=True)
@@ -138,7 +136,6 @@ async def handle_stop_recording(
self,
command: StopRecordingCommand,
websocket: WebSocket,
- recording_type: KeyboardRecordingType = KeyboardRecordingType.SINGLE,
):
if self.hook_callback:
keyboard.unhook(self.hook_callback)
@@ -148,7 +145,7 @@ async def handle_stop_recording(
actions = (
self._get_actions_from_recorded_keys(recorded_keys)
- if recording_type == KeyboardRecordingType.MACRO_ADVANCED
+ if command.recording_type == KeyboardRecordingType.MACRO_ADVANCED
else self._get_actions_from_recorded_hotkey(recorded_keys)
)
command = ActionsRecordedCommand(command="actions_recorded", actions=actions)
diff --git a/services/config_manager.py b/services/config_manager.py
index ac89d3a8..c4eb6401 100644
--- a/services/config_manager.py
+++ b/services/config_manager.py
@@ -942,6 +942,7 @@ def merge_configs(self, default: Config, wingman):
"openai",
"mistral",
"groq",
+ "cerebras",
"google",
"openrouter",
"local_llm",
@@ -951,6 +952,7 @@ def merge_configs(self, default: Config, wingman):
"whispercpp",
"xvasynth",
"wingman_pro",
+ "perplexity",
]:
if key in default:
# Use copy.deepcopy to ensure a full deep copy is made and original is untouched.
diff --git a/services/config_migration_service.py b/services/config_migration_service.py
index 5b4040f9..fd2e99d4 100644
--- a/services/config_migration_service.py
+++ b/services/config_migration_service.py
@@ -15,41 +15,141 @@
from services.file import get_users_dir
from services.printr import Printr
from services.secret_keeper import SecretKeeper
+from services.system_manager import SystemManager
MIGRATION_LOG = ".migration"
class ConfigMigrationService:
- def __init__(self, config_manager: ConfigManager):
+ def __init__(self, config_manager: ConfigManager, system_manager: SystemManager):
self.config_manager = config_manager
+ self.system_manager = system_manager
self.printr = Printr()
self.log_message: str = datetime.now().strftime("%Y-%m-%d-%H-%M-%S") + "\n"
+ self.users_dir = get_users_dir()
+ self.latest_version = MIGRATIONS[-1][1]
+ self.latest_config_path = path.join(
+ self.users_dir, self.latest_version, CONFIGS_DIR
+ )
+
+ def migrate_to_latest(self):
+
+ # Find the earliest existing version that needs migration
+ earliest_version = self.find_earliest_existing_version(self.users_dir)
+
+ if not earliest_version:
+ self.log("No valid version directories found for migration.", True)
+ return
+
+ # Check if the latest version is already migrated
+
+ migration_file = path.join(self.latest_config_path, MIGRATION_LOG)
+
+ if path.exists(migration_file):
+ self.log(
+ f"Found {self.latest_version} configs. No migrations needed.", True
+ )
+ return
+
+ self.log(
+ f"Starting migration from version {earliest_version.replace('_', '.')} to {self.latest_version.replace('_', '.')}",
+ True,
+ )
+
+ # Perform migrations
+ current_version = earliest_version
+ while current_version != self.latest_version:
+ next_version = self.find_next_version(current_version)
+ self.perform_migration(current_version, next_version)
+ current_version = next_version
+
+ self.log(
+ f"Migration completed successfully. Current version: {self.latest_version.replace('_', '.')}",
+ True,
+ )
+
+ def find_earliest_existing_version(self, users_dir):
+ versions = self.get_valid_versions(users_dir)
+ versions.sort(key=lambda v: [int(n) for n in v.split("_")])
+
+ for version in versions:
+ if version != self.latest_version:
+ return version
+
+ return None
+
+ def find_next_version(self, current_version):
+ for old, new, _ in MIGRATIONS:
+ if old == current_version:
+ return new
+ return None
+
+ def perform_migration(self, old_version, new_version):
+ migration_func = next(
+ (m[2] for m in MIGRATIONS if m[0] == old_version and m[1] == new_version),
+ None,
+ )
+
+ if migration_func:
+ self.log(
+ f"Migrating from {old_version.replace('_', '.')} to {new_version.replace('_', '.')}",
+ True,
+ )
+ migration_func(self)
+ else:
+ self.err(f"No migration path found from {old_version} to {new_version}")
+ raise ValueError(
+ f"No migration path found from {old_version} to {new_version}"
+ )
+
+ def find_previous_version(self, users_dir, current_version):
+ versions = self.get_valid_versions(users_dir)
+ versions.sort(key=lambda v: [int(n) for n in v.split("_")])
+ index = versions.index(current_version)
+ return versions[index - 1] if index > 0 else None
+
+ def get_valid_versions(self, users_dir):
+ versions = next(os.walk(users_dir))[1]
+ return [v for v in versions if self.is_valid_version(v)]
+
+ def find_latest_user_version(self, users_dir):
+ valid_versions = self.get_valid_versions(users_dir)
+ return max(
+ valid_versions,
+ default=None,
+ key=lambda v: [int(n) for n in v.split("_")],
+ )
+
+ def is_valid_version(self, version):
+ return any(version in migration[:2] for migration in MIGRATIONS)
# MIGRATIONS
def migrate_140_to_150(self):
- def migrate_settings(old: dict, new: SettingsConfig) -> dict:
+ def migrate_settings(old: dict, new: dict) -> dict:
old["voice_activation"]["whispercpp_config"] = {
- "temperature": new.voice_activation.whispercpp_config.temperature
+ "temperature": new["voice_activation"]["whispercpp_config"][
+ "temperature"
+ ]
}
- old["voice_activation"]["whispercpp"] = self.config_manager.convert_to_dict(
- new.voice_activation.whispercpp
- )
+ old["voice_activation"]["whispercpp"] = new["voice_activation"][
+ "whispercpp"
+ ]
self.log("- applied new split whispercpp settings/config structure")
- old["xvasynth"] = self.config_manager.convert_to_dict(new.xvasynth)
+ old["xvasynth"] = new["xvasynth"]
self.log("- adding new XVASynth settings")
old.pop("audio", None)
self.log("- removed audio device settings because DirectSound was removed")
return old
- def migrate_defaults(old: dict, new: NestedConfig) -> dict:
+ def migrate_defaults(old: dict, new: dict) -> dict:
# add new properties
- old["sound"]["volume"] = new.sound.volume
+ old["sound"]["volume"] = new["sound"]["volume"]
if old["sound"].get("play_beep_apollo", None) is None:
- old["sound"]["play_beep_apollo"] = new.sound.play_beep_apollo
- old["google"] = self.config_manager.convert_to_dict(new.google)
+ old["sound"]["play_beep_apollo"] = new["sound"]["play_beep_apollo"]
+ old["google"] = new["google"]
self.log("- added new properties: sound.volume, google")
# remove obsolete properties
@@ -65,24 +165,24 @@ def migrate_defaults(old: dict, new: NestedConfig) -> dict:
)
# rest of whispercpp moved to settings.yaml
- old["whispercpp"] = {"temperature": new.whispercpp.temperature}
+ old["whispercpp"] = {"temperature": new["whispercpp"]["temperature"]}
self.log("- cleaned up whispercpp properties")
# xvasynth was restructured
- old["xvasynth"] = self.config_manager.convert_to_dict(new.xvasynth)
+ old["xvasynth"] = new["xvasynth"]
self.log("- resetting and restructuring XVASynth")
# patching new default values
- old["features"]["stt_provider"] = new.features.stt_provider.value
+ old["features"]["stt_provider"] = new["features"]["stt_provider"]
self.log("- set whispercpp as new default STT provider")
- old["openai"]["conversation_model"] = new.openai.conversation_model.value
- old["azure"]["conversation"][
- "deployment_name"
- ] = new.azure.conversation.deployment_name
- old["wingman_pro"][
+ old["openai"]["conversation_model"] = new["openai"]["conversation_model"]
+ old["azure"]["conversation"]["deployment_name"] = new["azure"][
+ "conversation"
+ ]["deployment_name"]
+ old["wingman_pro"]["conversation_deployment"] = new["wingman_pro"][
"conversation_deployment"
- ] = new.wingman_pro.conversation_deployment.value
+ ]
self.log("- set gpt-4o-mini as new default LLM model")
return old
@@ -186,6 +286,30 @@ def find_skill(skills, module_name):
migrate_wingman=migrate_wingman,
)
+ def migrate_150_to_160(self):
+ def migrate_settings(old: dict, new: dict) -> dict:
+ return old
+
+ def migrate_defaults(old: dict, new: dict) -> dict:
+ # add new properties
+ old["cerebras"] = new["cerebras"]
+ old["perplexity"] = new["perplexity"]
+
+ self.log("- added new properties: cerebras, perplexity")
+
+ return old
+
+ def migrate_wingman(old: dict, new: Optional[dict]) -> dict:
+ return old
+
+ self.migrate(
+ old_version="1_5_0",
+ new_version="1_6_0",
+ migrate_settings=migrate_settings,
+ migrate_defaults=migrate_defaults,
+ migrate_wingman=migrate_wingman,
+ )
+
# INTERNAL
def log(self, message: str, highlight: bool = False):
@@ -217,14 +341,23 @@ def migrate(
self,
old_version: str,
new_version: str,
- migrate_settings: Callable[[dict, SettingsConfig], dict],
- migrate_defaults: Callable[[dict, NestedConfig], dict],
+ migrate_settings: Callable[[dict, dict], dict],
+ migrate_defaults: Callable[[dict, dict], dict],
migrate_wingman: Callable[[dict, Optional[dict]], dict],
) -> None:
users_dir = get_users_dir()
old_config_path = path.join(users_dir, old_version, CONFIGS_DIR)
new_config_path = path.join(users_dir, new_version, CONFIGS_DIR)
+ if not path.exists(path.join(users_dir, new_version)):
+ shutil.copytree(
+ path.join(users_dir, self.latest_version, "migration", new_version),
+ path.join(users_dir, new_version),
+ )
+ self.log(
+ f"{new_version} configs not found during multi-step migration. Copied migration templates."
+ )
+
already_migrated = path.exists(path.join(new_config_path, MIGRATION_LOG))
if already_migrated:
self.log(
@@ -242,24 +375,27 @@ def migrate(
old_file = path.join(root, filename)
new_file = old_file.replace(old_config_path, new_config_path)
- if filename == ".DS_Store":
+ if filename == ".DS_Store" or filename == MIGRATION_LOG:
continue
# secrets
if filename == "secrets.yaml":
self.copy_file(old_file, new_file)
- secret_keeper = SecretKeeper()
- secret_keeper.secrets = secret_keeper.load()
+
+ if new_config_path == self.latest_config_path:
+ secret_keeper = SecretKeeper()
+ secret_keeper.secrets = secret_keeper.load()
# settings
elif filename == "settings.yaml":
self.log("Migrating settings.yaml...", True)
migrated_settings = migrate_settings(
old=self.config_manager.read_config(old_file),
- new=self.config_manager.settings_config,
+ new=self.config_manager.read_config(new_file),
)
try:
- self.config_manager.settings_config = SettingsConfig(
- **migrated_settings
- )
+ if new_config_path == self.latest_config_path:
+ self.config_manager.settings_config = SettingsConfig(
+ **migrated_settings
+ )
self.config_manager.save_settings_config()
except ValidationError as e:
self.err(f"Unable to migrate settings.yaml:\n{str(e)}")
@@ -268,7 +404,7 @@ def migrate(
self.log("Migrating defaults.yaml...", True)
migrated_defaults = migrate_defaults(
old=self.config_manager.read_config(old_file),
- new=self.config_manager.default_config,
+ new=self.config_manager.read_config(new_file),
)
try:
self.config_manager.default_config = NestedConfig(
@@ -292,9 +428,10 @@ def migrate(
),
)
# validate the merged config
- _wingman_config = self.config_manager.merge_configs(
- default_config, migrated_wingman
- )
+ if new_config_path == self.latest_config_path:
+ _wingman_config = self.config_manager.merge_configs(
+ default_config, migrated_wingman
+ )
# diff it
wingman_diff = self.config_manager.deep_diff(
default_config, migrated_wingman
@@ -357,3 +494,10 @@ def migrate(
path.join(new_config_path, MIGRATION_LOG), "w", encoding="UTF-8"
) as stream:
stream.write(self.log_message)
+
+
+MIGRATIONS = [
+ ("1_4_0", "1_5_0", ConfigMigrationService.migrate_140_to_150),
+ ("1_5_0", "1_6_0", ConfigMigrationService.migrate_150_to_160),
+ # Add new migrations here in order
+]
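The `MIGRATIONS` table plus `migrate_to_latest` above implement a stepwise upgrade chain: each entry migrates one version to the next until the latest is reached. A toy sketch of the same walk, with list-appending stand-ins for the real migration functions:

```python
# Hypothetical, simplified version of the migration chain; the real service
# migrates config directories instead of an in-memory list.
MIGRATIONS = [
    ("1_4_0", "1_5_0", lambda state: state + ["1_5_0"]),
    ("1_5_0", "1_6_0", lambda state: state + ["1_6_0"]),
]

def migrate_to_latest(current: str, state: list) -> tuple[str, list]:
    latest = MIGRATIONS[-1][1]
    while current != latest:
        # Find the migration step that starts at the current version.
        step = next((m for m in MIGRATIONS if m[0] == current), None)
        if step is None:
            raise ValueError(f"No migration path found from {current}")
        state = step[2](state)
        current = step[1]
    return current, state

version, applied = migrate_to_latest("1_4_0", [])
```

Because each step only knows its immediate predecessor, adding a release means appending one tuple rather than writing N-to-latest converters.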
diff --git a/services/config_service.py b/services/config_service.py
index 1a1eea88..a02ee49c 100644
--- a/services/config_service.py
+++ b/services/config_service.py
@@ -502,35 +502,7 @@ async def save_defaults_config(
)
async def migrate_configs(self, system_manager: SystemManager):
- current_version: str = system_manager.get_local_version()
- current_version_str = str(current_version).replace(".", "_")
- version_path = get_users_dir()
-
- def version_key(version_str):
- try:
- return Version(version_str.replace("_", "."))
- except InvalidVersion:
- return Version("0.0.0") # Placeholder for invalid versions
-
- valid_versions = [
- v for v in os.listdir(version_path) if version_key(v) > Version("0.0.0")
- ]
- all_versions = sorted(valid_versions, key=version_key)
-
- if current_version_str not in all_versions:
- raise ValueError(
- f"Current version {current_version} not found in the directory."
- )
-
- current_index = all_versions.index(current_version_str)
-
- if current_index == 0:
- return # Current version is the oldest, no migrations needed
-
- previous_version_str = all_versions[current_index - 1]
- migration_func_name = f"migrate_{previous_version_str.replace('_', '')}_to_{current_version_str.replace('_', '')}"
-
- migraton_service = ConfigMigrationService(config_manager=self.config_manager)
- if hasattr(migraton_service, migration_func_name):
- migration_func = getattr(migraton_service, migration_func_name)
- migration_func()
+ migration_service = ConfigMigrationService(
+ config_manager=self.config_manager, system_manager=system_manager
+ )
+ migration_service.migrate_to_latest()
diff --git a/services/module_manager.py b/services/module_manager.py
index e080b65e..bbf32fa9 100644
--- a/services/module_manager.py
+++ b/services/module_manager.py
@@ -179,7 +179,6 @@ def read_available_skills() -> list[SkillBase]:
skill = SkillBase(
name=skill_config["name"],
config=skill_config,
- description=skill_config["description"],
logo=logo,
)
skills.append(skill)
diff --git a/services/pub_sub.py b/services/pub_sub.py
index 0326069a..51e18cb4 100644
--- a/services/pub_sub.py
+++ b/services/pub_sub.py
@@ -1,4 +1,5 @@
import asyncio
+import inspect
class PubSub:
@@ -14,10 +15,26 @@ def unsubscribe(self, event_type, fn):
if event_type in self.subscribers:
self.subscribers[event_type].remove(fn)
- async def publish(self, event_type, data):
+ async def publish(self, event_type, data=None):
if event_type in self.subscribers:
for fn in self.subscribers[event_type]:
+ # Get the number of parameters the function expects
+ params = inspect.signature(fn).parameters
+ param_count = len(params)
+
+ # Determine if the function is a method (has 'self' parameter)
+ is_method = "self" in params
+
+ # Determine if the function expects an argument (excluding 'self' for methods)
+ expects_arg = (param_count > 1) if is_method else (param_count > 0)
+
if asyncio.iscoroutinefunction(fn):
- await fn(data)
+ if expects_arg and data is not None:
+ await fn(data)
+ else:
+ await fn()
else:
- fn(data)
+ if expects_arg and data is not None:
+ fn(data)
+ else:
+ fn()
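The `publish` change above inspects each subscriber's signature so zero-argument listeners keep working alongside ones that take a payload. A simplified synchronous sketch (the real class also handles coroutines and unbound methods with `self`):

```python
import inspect

def dispatch(fn, data=None):
    """Call fn with data only if its signature accepts a parameter."""
    params = inspect.signature(fn).parameters
    expects_arg = len(params) > 0  # plain callables; bound methods hide self
    if expects_arg and data is not None:
        return fn(data)
    return fn()

calls = []
dispatch(lambda: calls.append("no-arg"))                    # 0-param listener
dispatch(lambda payload: calls.append(payload), "secrets_saved")  # 1-param
```

Note that `inspect.signature` raises for some builtins, so a production dispatcher may want a fallback for uninspectable callables.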
diff --git a/services/secret_keeper.py b/services/secret_keeper.py
index 593f98e2..da60a34b 100644
--- a/services/secret_keeper.py
+++ b/services/secret_keeper.py
@@ -7,6 +7,7 @@
from services.file import get_writable_dir
from services.websocket_user import WebSocketUser
from services.printr import Printr
+from services.pub_sub import PubSub
class SecretKeeper(WebSocketUser):
@@ -19,6 +20,7 @@ class SecretKeeper(WebSocketUser):
config_file: str
secrets: Dict[str, Any]
prompted_secrets: list[str] = []
+ secret_events: PubSub
def __new__(cls):
if cls._instance is None:
@@ -44,6 +46,7 @@ def __new__(cls):
get_writable_dir(CONFIGS_DIR), SECRETS_FILE
)
cls._instance.secrets = cls._instance.load() or {}
+ cls._instance.secret_events = PubSub()
return cls._instance
@@ -60,7 +63,7 @@ def load(self) -> Dict[str, Any]:
self.printr.toast_error(f"Could not load ({SECRETS_FILE})\n{str(e)}")
return {}
- def save(self) -> bool:
+ async def save(self) -> bool:
if not self.config_file:
self.printr.toast_error("No config file path provided.")
return False
@@ -68,6 +71,7 @@ def save(self) -> bool:
with open(self.config_file, "w", encoding="UTF-8") as stream:
yaml.dump(self.secrets, stream)
self.load()
+ await self.secret_events.publish("secrets_saved", self.secrets)
return True
except yaml.YAMLError as e:
self.printr.toast_error(f"Could not write ({SECRETS_FILE})\n{str(e)}")
@@ -105,5 +109,5 @@ async def post_secrets(self, secrets: dict[str, Any]):
for key, value in secrets.items():
self.secrets[key] = value
- if self.save():
+ if await self.save():
self.printr.print("Secrets updated.", server_only=True)
diff --git a/services/system_manager.py b/services/system_manager.py
index c47e608e..cd10e321 100644
--- a/services/system_manager.py
+++ b/services/system_manager.py
@@ -4,7 +4,7 @@
from packaging import version
from api.interface import SystemCore, SystemInfo
-LOCAL_VERSION = "1.5.0"
+LOCAL_VERSION = "1.6.0"
VERSION_ENDPOINT = "https://wingman-ai.com/api/version"
diff --git a/services/tower.py b/services/tower.py
index 3372df29..6c6d1a28 100644
--- a/services/tower.py
+++ b/services/tower.py
@@ -8,6 +8,7 @@
from providers.whispercpp import Whispercpp
from providers.xvasynth import XVASynth
from services.audio_player import AudioPlayer
+from services.audio_library import AudioLibrary
from services.module_manager import ModuleManager
from services.printr import Printr
from wingmen.open_ai_wingman import OpenAiWingman
@@ -22,10 +23,12 @@ def __init__(
self,
config: Config,
audio_player: AudioPlayer,
+ audio_library: AudioLibrary,
whispercpp: Whispercpp,
xvasynth: XVASynth,
):
self.audio_player = audio_player
+ self.audio_library = audio_library
self.config = config
self.mouse_wingman_dict: dict[str, Wingman] = {}
self.wingmen: list[Wingman] = []
@@ -84,6 +87,7 @@ async def __instantiate_wingman(
config=wingman_config,
settings=settings,
audio_player=self.audio_player,
+ audio_library=self.audio_library,
whispercpp=self.whispercpp,
xvasynth=self.xvasynth,
)
@@ -93,6 +97,7 @@ async def __instantiate_wingman(
config=wingman_config,
settings=settings,
audio_player=self.audio_player,
+ audio_library=self.audio_library,
whispercpp=self.whispercpp,
xvasynth=self.xvasynth,
)
diff --git a/services/voice_service.py b/services/voice_service.py
index e11df008..68b1726e 100644
--- a/services/voice_service.py
+++ b/services/voice_service.py
@@ -1,3 +1,5 @@
+import asyncio
+from concurrent.futures import ThreadPoolExecutor
from fastapi import APIRouter
from api.enums import AzureRegion, OpenAiTtsVoice
from api.interface import (
@@ -15,6 +17,7 @@
from providers.xvasynth import XVASynth
from services.audio_player import AudioPlayer
from services.config_manager import ConfigManager
+from services.printr import Printr
class VoiceService:
@@ -24,6 +27,7 @@ def __init__(
audio_player: AudioPlayer,
xvasynth: XVASynth,
):
+ self.printr = Printr()
self.config_manager = config_manager
self.audio_player = audio_player
self.xvasynth = xvasynth
@@ -114,13 +118,22 @@ def __convert_azure_voice(self, voice):
)
# GET /voices/elevenlabs
- def get_elevenlabs_voices(self, api_key: str):
+ async def get_elevenlabs_voices(self, api_key: str) -> list[VoiceInfo]:
elevenlabs = ElevenLabs(api_key=api_key, wingman_name="")
- voices = elevenlabs.get_available_voices()
- convert = lambda voice: VoiceInfo(id=voice.voiceID, name=voice.name)
- result = [convert(voice) for voice in voices]
+ try:
+ # Run the synchronous method in a separate thread
+ loop = asyncio.get_running_loop()
+ with ThreadPoolExecutor() as pool:
+ voices = await loop.run_in_executor(
+ pool, elevenlabs.get_available_voices
+ )
- return result
+ convert = lambda voice: VoiceInfo(id=voice.voiceID, name=voice.name)
+ result = [convert(voice) for voice in voices]
+ return result
+ except ValueError as e:
+ self.printr.toast_error(f"Elevenlabs: \n{str(e)}")
+ return []
# GET /voices/azure
def get_azure_voices(self, api_key: str, region: AzureRegion, locale: str = ""):
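The `get_elevenlabs_voices` change uses `run_in_executor` to keep a blocking SDK call off the event loop. A minimal sketch of the pattern, with a stand-in for the blocking call:

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

def blocking_fetch() -> list[str]:
    # Stand-in for a synchronous SDK call such as get_available_voices().
    time.sleep(0.01)
    return ["Rachel", "Adam"]

async def fetch_voices() -> list[str]:
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as pool:
        # The blocking call runs on a worker thread; the loop stays responsive.
        return await loop.run_in_executor(pool, blocking_fetch)

voices = asyncio.run(fetch_voices())
```

Passing `None` as the executor would reuse the loop's default thread pool, which avoids creating (and shutting down) a pool per request.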
diff --git a/skills/ask_perplexity/default_config.yaml b/skills/ask_perplexity/default_config.yaml
new file mode 100644
index 00000000..ff992dfd
--- /dev/null
+++ b/skills/ask_perplexity/default_config.yaml
@@ -0,0 +1,31 @@
+name: AskPerplexity
+module: skills.ask_perplexity.main
+category: general
+description:
+ en: Uses the Perplexity API to get up-to-date information on a wide range of topics. Perplexity is a paid service; you will need a funded account with an active API key, see https://www.perplexity.ai/settings/api
+ de: Verwendet die Perplexity-API, um aktuelle Informationen zu einer Vielzahl von Themen zu erhalten. Perplexity ist ein kostenpflichtiger Dienst; ein Konto mit Guthaben und aktivem API-Key ist notwendig, siehe https://www.perplexity.ai/settings/api
+examples:
+ - question:
+ en: How is the weather today in Berlin?
de: Wie ist das Wetter heute in Berlin?
+ answer:
+ en: Today, the weather in Berlin is cloudy with a high of 20°C and a ... (more details)
+ de: Heute ist das Wetter in Berlin bewölkt, mit einer Höchsttemperatur von 20°C und ... (mehr Details)
+ - question:
+ en: In Star Citizen mining, what is currently the best way to find quantanium?
+ de: Beim Mining in Star Citizen, wie finde ich aktuell am besten Quantanium?
+ answer:
+ en: To find Quantanium for mining in Star Citizen, your best bet is Lyria, as it offers ... (more details)
+ de: Um Quantanium im Star Citizen Universum zu finden, ist Lyria der beste Ort, da dort ... (mehr Details)
+prompt: |
+ There is a new function: 'ask_perplexity'
+ Perplexity is a powerful tool that can provide you with up-to-date information on a wide range of topics.
+ Use it every time the user asks a question that implies the need for up-to-date information.
+ Always use this if no other available skill matches the request better to get up-to-date information.
+custom_properties:
+ - id: instant_response
+ name: Instant Response
+ hint: If set, the Perplexity answer will be used instantly and unprocessed. This is faster but will not include format and/or language guidelines set in your wingman.
+ value: false
+ required: false
+ property_type: boolean
diff --git a/skills/ask_perplexity/logo.png b/skills/ask_perplexity/logo.png
new file mode 100644
index 00000000..3479ef77
Binary files /dev/null and b/skills/ask_perplexity/logo.png differ
diff --git a/skills/ask_perplexity/main.py b/skills/ask_perplexity/main.py
new file mode 100644
index 00000000..869d3c97
--- /dev/null
+++ b/skills/ask_perplexity/main.py
@@ -0,0 +1,84 @@
+from api.interface import (
+ WingmanInitializationError,
+)
+from skills.skill_base import Skill
+
+class AskPerplexity(Skill):
+
+ def __init__(
+ self,
+ *args,
+ **kwargs,
+ ) -> None:
+ super().__init__(*args, **kwargs)
+
+ self.instant_response = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ if not self.wingman.perplexity:
+ await self.wingman.validate_and_set_perplexity(errors)
+
+ self.instant_response = self.retrieve_custom_property_value("instant_response", errors)
+
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "ask_perplexity",
+ {
+ "type": "function",
+ "function": {
+ "name": "ask_perplexity",
+ "description": "Expects a question that is answered with up-to-date information from the internet.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "question": {"type": "string"},
+ },
+ "required": ["question"],
+ },
+ },
+
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name in ["ask_perplexity"]:
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+
+ if tool_name == "ask_perplexity" and "question" in parameters:
+ function_response = self.ask_perplexity(parameters["question"])
+ if self.instant_response:
+ instant_response = function_response
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Perplexity answer: {function_response}"
+ )
+ await self.print_execution_time()
+
+ return function_response, instant_response
+
+ def ask_perplexity(self, question: str) -> str:
+ """Uses the Perplexity API to answer a question."""
+
+ completion = self.wingman.perplexity.ask(
+ messages=[{"role": "user", "content": question}],
+ model=self.wingman.config.perplexity.conversation_model.value,
+ )
+
+ if completion and completion.choices:
+ return completion.choices[0].message.content
+ else:
+ return "Error: Unable to retrieve a response from Perplexity API."
diff --git a/skills/file_manager/default_config.yaml b/skills/file_manager/default_config.yaml
index 16dc408b..deb80aaf 100644
--- a/skills/file_manager/default_config.yaml
+++ b/skills/file_manager/default_config.yaml
@@ -2,11 +2,11 @@ name: FileManager
module: skills.file_manager.main
category: general
description:
- en: Manage local files, save, load and create directories. Supports various text-based file formats.
- de: Verwalte lokale Dateien, speichere, lade oder erstelle Verzeichnisse. Unterstützt verschiedene text-basierte Formate.
+ en: Manage local files, save, load and create directories. Supports various text-based file formats and reading PDFs.
+ de: Verwalte lokale Dateien, speichere, lade und erstelle Verzeichnisse. Unterstützt verschiedene text-basierte Formate und das Lesen von PDFs.
hint:
- en:
You should provide an exact file path and name for where you want to create a directory or save or load a text file. For example "save that text to a file called samplefile in my C drive in the directory called Documents."
If you do not, a directory called "files" in your Wingman config dir will be created and used.
Supported file formats are plain text file formats, such as txt, md, log, yaml, py, json, etc.
- de:
Gib einen möglichst genauen Speicherort für deine Verzeichnisse oder Dateien an, beispielsweise "Speichere diesen Text in eine Daten namens beispieldatei auf meinem C-Laufwerk im Verzeichnis Dokumente".
Wenn du das nicht machst, wird ein Verzeichnis namens "files" in deinem Wingman-Konfigurationsverzeichnis erstellt und verwendet.
Unterstützte Dateiformate sind einfache Textdateiformate wie txt, md, log, yaml, py, json usw
+ en:
You should provide an exact file path and name for where you want to create a directory or save or load a text file. For example "save that text to a file called samplefile in my C drive in the directory called Documents."
If you do not, a directory called "files" in your Wingman config dir will be created and used.
Supported file formats are plain text file formats, such as txt, md, log, yaml, py, json, etc., and PDFs.
+ de:
Gib einen möglichst genauen Speicherort für deine Verzeichnisse oder Dateien an, beispielsweise "Speichere diesen Text in eine Datei namens beispieldatei auf meinem C-Laufwerk im Verzeichnis Dokumente".
Wenn du das nicht machst, wird ein Verzeichnis namens "files" in deinem Wingman-Konfigurationsverzeichnis erstellt und verwendet.
Unterstützte Dateiformate sind einfache Textdateiformate wie txt, md, log, yaml, py, json usw., und PDFs.
examples:
- question:
en: Save 'Hello, World!' to hello.txt.
@@ -26,9 +26,15 @@ examples:
answer:
en: (creates a directory named 'Projects' in the default directory)
de: (erstellt ein Verzeichnis namens 'Projekte' im Standardverzeichnis)
+ - question:
+ en: Read page 5 of example.pdf.
+ de: Lies Seite 5 von example.pdf.
+ answer:
+ en: (loads page 5 of example.pdf and reads it into memory)
+ de: (lädt Seite 5 von example.pdf und liest sie in den Speicher)
prompt: |
You can also save text to various file formats, load text from files, or create directories as specified by the user.
- You support all plain text file formats.
+ You support reading and writing all plain text file formats and reading PDF files.
When adding text to an existing file, you follow these rules:
(1) determine if it is appropriate to add a new line before the added text or ask the user if you do not know.
(2) only add content to an existing file if you are sure that is what the user wants.
@@ -46,4 +52,4 @@ custom_properties:
name: Allow overwrite existing files
property_type: boolean
required: true
- value: false
+ value: false
\ No newline at end of file
diff --git a/skills/file_manager/main.py b/skills/file_manager/main.py
index 653c94ce..d5be988f 100644
--- a/skills/file_manager/main.py
+++ b/skills/file_manager/main.py
@@ -6,10 +6,11 @@
from skills.skill_base import Skill
from services.file import get_writable_dir
from showinfm import show_in_file_manager
+from pdfminer.high_level import extract_text
if TYPE_CHECKING:
from wingmen.open_ai_wingman import OpenAiWingman
-DEFAULT_MAX_TEXT_SIZE = 15000
+DEFAULT_MAX_TEXT_SIZE = 24000
SUPPORTED_FILE_EXTENSIONS = [
"adoc",
"asc",
@@ -46,6 +47,7 @@
"m3u",
"map",
"md",
+ "pdf",
"pyd",
"plist",
"pl",
@@ -63,11 +65,13 @@
"sql",
"svg",
"ts",
+ "tscn",
"tcl",
"tex",
"tmpl",
"toml",
"tpl",
+ "tres",
"tsv",
"txt",
"vtt",
@@ -129,6 +133,10 @@ def get_tools(self) -> list[tuple[str, dict]]:
"type": "string",
"description": "The directory from where the file should be loaded. Defaults to the configured directory.",
},
+ "pdf_page_number_to_load": {
+ "type": "number",
+ "description": "The page number of a pdf to load, if expressly specified by the user.",
+ },
},
"required": ["file_name"],
},
@@ -237,25 +245,31 @@ async def execute_tool(
)
file_name = parameters.get("file_name")
directory = parameters.get("directory_path", self.default_directory)
+ pdf_page_number = parameters.get("pdf_page_number_to_load")
if directory == "":
directory = self.default_directory
if not file_name or file_name == "":
function_response = "File name not provided."
else:
file_extension = file_name.split(".")[-1]
- if file_extension not in self.allowed_file_extensions:
+ if file_extension.lower() not in self.allowed_file_extensions:
function_response = f"Unsupported file extension: {file_extension}"
else:
file_path = os.path.join(directory, file_name)
try:
- with open(file_path, "r", encoding="utf-8") as file:
- file_content = file.read()
- if len(file_content) > self.max_text_size:
- function_response = (
- "File content exceeds the maximum allowed size."
- )
- else:
- function_response = f"File content loaded from {file_path}:\n{file_content}"
+ # For PDFs, use pdfminer.six's extract_text (page_numbers is zero-indexed, so subtract 1); otherwise open and read the file
+ file_content = ""
+ if file_extension.lower() == "pdf":
+ file_content = extract_text(file_path, page_numbers=[int(pdf_page_number) - 1]) if pdf_page_number else extract_text(file_path)
+ else:
+ with open(file_path, "r", encoding="utf-8") as file:
+ file_content = file.read()
+ if len(file_content) > self.max_text_size:
+ function_response = (
+ "File content exceeds the maximum allowed size."
+ )
+ else:
+ function_response = f"File content loaded from {file_path}:\n{file_content}"
except FileNotFoundError:
function_response = (
f"File '{file_name}' not found in '{directory}'."
@@ -282,12 +296,12 @@ async def execute_tool(
function_response = "File name or text content not provided."
else:
file_extension = file_name.split(".")[-1]
- if file_extension not in self.allowed_file_extensions:
+ if file_extension.lower() not in self.allowed_file_extensions:
file_name += f".{self.default_file_extension}"
if len(text_content) > self.max_text_size:
function_response = "Text content exceeds the maximum allowed size."
else:
- if file_extension == "json":
+ if file_extension.lower() == "json":
try:
json_content = json.loads(text_content)
text_content = json.dumps(json_content, indent=4)
diff --git a/skills/file_manager/requirements.txt b/skills/file_manager/requirements.txt
new file mode 100644
index 00000000..7f9421c5
Binary files /dev/null and b/skills/file_manager/requirements.txt differ
diff --git a/skills/image_generation/default_config.yaml b/skills/image_generation/default_config.yaml
new file mode 100644
index 00000000..36aa62ac
--- /dev/null
+++ b/skills/image_generation/default_config.yaml
@@ -0,0 +1,18 @@
+name: ImageGeneration
+module: skills.image_generation.main
+category: general
+description:
+ en: Use Wingman AI to generate images based on your input. It uses DALL-E 3.
+ de: Verwende Wingman AI, um Bilder basierend auf deinen Eingaben zu generieren. Es verwendet DALL-E 3.
+# hint:
+# en:
+# de:
+examples:
+ - question:
+ en: Generate an image of a cat.
+ de: Generiere ein Bild einer Katze.
+ answer:
+ en: Here is an image of a cat.
+ de: Hier ist ein Bild einer Katze.
+prompt: |
+ You can also generate images.
diff --git a/skills/image_generation/logo.png b/skills/image_generation/logo.png
new file mode 100644
index 00000000..15ef76b5
Binary files /dev/null and b/skills/image_generation/logo.png differ
diff --git a/skills/image_generation/main.py b/skills/image_generation/main.py
new file mode 100644
index 00000000..13aeb355
--- /dev/null
+++ b/skills/image_generation/main.py
@@ -0,0 +1,71 @@
+from typing import TYPE_CHECKING
+from api.enums import LogSource, LogType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class ImageGeneration(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ return errors
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ instant_response = ""
+ function_response = "I can't generate an image, sorry. Try another provider."
+
+ if tool_name == "generate_image":
+ prompt = parameters["prompt"]
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"Generate image with prompt: {prompt}.", color=LogType.INFO)
+ image = await self.wingman.generate_image(prompt)
+ await self.printr.print_async(
+ "",
+ color=LogType.INFO,
+ source=LogSource.WINGMAN,
+ source_name=self.wingman.name,
+ skill_name=self.name,
+ additional_data={"image_url": image},
+ )
+ function_response = "Here is an image based on your prompt."
+ return function_response, instant_response
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "generate_image",
+ {
+ "type": "function",
+ "function": {
+ "name": "generate_image",
+ "description": "Generate an image based on the users prompt.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "prompt": {"type": "string"},
+ },
+ "required": ["prompt"],
+ },
+ },
+ },
+ ),
+ ]
+
+ return tools
diff --git a/skills/msfs2020_control/default_config.yaml b/skills/msfs2020_control/default_config.yaml
new file mode 100644
index 00000000..b153588e
--- /dev/null
+++ b/skills/msfs2020_control/default_config.yaml
@@ -0,0 +1,632 @@
+name: Msfs2020Control
+module: skills.msfs2020_control.main
+category: flight_simulator
+description:
+ en: Control and retrieve data from MSFS2020 dynamically using SimConnect.
+ de: Steuern und Abrufen von Daten aus MSFS2020 dynamisch mit SimConnect.
+hint:
+ en: This skill can retrieve data and perform actions in the MSFS2020 simulator based on dynamic inputs.
+ de: Dieser Skill kann Daten abrufen und Aktionen im MSFS2020-Simulator basierend auf dynamischen Eingaben ausführen.
+examples:
+ - question:
+ en: What is my current altitude?
+ de: Wie hoch bin ich gerade?
+ answer:
+ en: (retrieves 'PLANE_ALTITUDE' from MSFS2020)
+ de: (ruft 'PLANE_ALTITUDE' aus MSFS2020 ab)
+ - question:
+ en: Start the engine.
+ de: Starte den Motor.
+ answer:
+ en: (executes the engine start sequence in MSFS2020)
+ de: (führt die Motorstartsequenz in MSFS2020 aus)
+prompt: |
+ You can interact with the MSFS2020 simulator using the Python SimConnect API.
+ To retrieve data, use 'get_data_from_sim'. To set data or perform actions, use 'set_data_or_perform_action_in_sim'.
+ Example:
+ User: Set throttle to full!
+ Response: (uses the set_data_or_perform_action_in_sim function with THROTTLE_FULL as the action)
+ User: Let's raise the flaps.
+ Response: (uses the set_data_or_perform_action_in_sim function with FLAPS_UP as the action)
+ User: How far are we above ground right now?
+ Response: (uses the get_data_from_sim function with PLANE_ALT_ABOVE_GROUND as the data point)
+ Here are some dictionaries with common action names and their explanations for use with the set_data_or_perform_action_in_sim function:
+ Engine Events:
+ {
+ "THROTTLE_FULL": "Set throttles max",
+ "THROTTLE_INCR": "Increment throttles",
+ "THROTTLE_INCR_SMALL": "Increment throttles small",
+ "THROTTLE_DECR": "Decrement throttles",
+ "THROTTLE_DECR_SMALL": "Decrease throttles small",
+ "THROTTLE_CUT": "Set throttles to idle",
+ "INCREASE_THROTTLE": "Increment throttles",
+ "DECREASE_THROTTLE": "Decrement throttles",
+ "THROTTLE_SET": "Set throttles exactly (0- 16383),",
+ "THROTTLE1_FULL": "Set throttle 1 max",
+ "THROTTLE1_INCR": "Increment throttle 1",
+ "THROTTLE1_INCR_SMALL": "Increment throttle 1 small",
+ "THROTTLE1_DECR": "Decrement throttle 1",
+ "THROTTLE1_CUT": "Set throttle 1 to idle",
+ "THROTTLE2_FULL": "Set throttle 2 max",
+ "THROTTLE2_INCR": "Increment throttle 2",
+ "THROTTLE2_INCR_SMALL": "Increment throttle 2 small",
+ "THROTTLE2_DECR": "Decrement throttle 2",
+ "THROTTLE2_CUT": "Set throttle 2 to idle",
+ "THROTTLE3_FULL": "Set throttle 3 max",
+ "THROTTLE3_INCR": "Increment throttle 3",
+ "THROTTLE3_INCR_SMALL": "Increment throttle 3 small",
+ "THROTTLE3_DECR": "Decrement throttle 3",
+ "THROTTLE3_CUT": "Set throttle 3 to idle",
+ "THROTTLE4_FULL": "Set throttle 1 max",
+ "THROTTLE4_INCR": "Increment throttle 4",
+ "THROTTLE4_INCR_SMALL": "Increment throttle 4 small",
+ "THROTTLE4_DECR": "Decrement throttle 4",
+ "THROTTLE4_CUT": "Set throttle 4 to idle",
+ "THROTTLE_10": "Set throttles to 10%",
+ "THROTTLE_20": "Set throttles to 20%",
+ "THROTTLE_30": "Set throttles to 30%",
+ "THROTTLE_40": "Set throttles to 40%",
+ "THROTTLE_50": "Set throttles to 50%",
+ "THROTTLE_60": "Set throttles to 60%",
+ "THROTTLE_70": "Set throttles to 70%",
+ "THROTTLE_80": "Set throttles to 80%",
+ "THROTTLE_90": "Set throttles to 90%",
+ "THROTTLE1_DECR_SMALL": "Decrease throttle 1 small",
+ "THROTTLE2_DECR_SMALL": "Decrease throttle 2 small",
+ "THROTTLE3_DECR_SMALL": "Decrease throttle 3 small",
+ "THROTTLE4_DECR_SMALL": "Decrease throttle 4 small",
+ "PROP_PITCH_DECR_SMALL": "Decrease prop levers small",
+ "PROP_PITCH1_DECR_SMALL": "Decrease prop lever 1 small",
+ "PROP_PITCH2_DECR_SMALL": "Decrease prop lever 2 small",
+ "PROP_PITCH3_DECR_SMALL": "Decrease prop lever 3 small",
+ "PROP_PITCH4_DECR_SMALL": "Decrease prop lever 4 small",
+ "MIXTURE1_RICH": "Set mixture lever 1 to max rich",
+ "MIXTURE1_INCR": "Increment mixture lever 1",
+ "MIXTURE1_INCR_SMALL": "Increment mixture lever 1 small",
+ "MIXTURE1_DECR": "Decrement mixture lever 1",
+ "MIXTURE1_LEAN": "Set mixture lever 1 to max lean",
+ "MIXTURE2_RICH": "Set mixture lever 2 to max rich",
+ "MIXTURE2_INCR": "Increment mixture lever 2",
+ "MIXTURE2_INCR_SMALL": "Increment mixture lever 2 small",
+ "MIXTURE2_DECR": "Decrement mixture lever 2",
+ "MIXTURE2_LEAN": "Set mixture lever 2 to max lean",
+ "MIXTURE3_RICH": "Set mixture lever 3 to max rich",
+ "MIXTURE3_INCR": "Increment mixture lever 3",
+ "MIXTURE3_INCR_SMALL": "Increment mixture lever 3 small",
+ "MIXTURE3_DECR": "Decrement mixture lever 3",
+ "MIXTURE3_LEAN": "Set mixture lever 3 to max lean",
+ "MIXTURE4_RICH": "Set mixture lever 4 to max rich",
+ "MIXTURE4_INCR": "Increment mixture lever 4",
+ "MIXTURE4_INCR_SMALL": "Increment mixture lever 4 small",
+ "MIXTURE4_DECR": "Decrement mixture lever 4",
+ "MIXTURE4_LEAN": "Set mixture lever 4 to max lean",
+ "MIXTURE_RICH": "Set mixture levers to max rich",
+ "MIXTURE_INCR": "Increment mixture levers",
+ "MIXTURE_INCR_SMALL": "Increment mixture levers small",
+ "MIXTURE_DECR": "Decrement mixture levers",
+ "MIXTURE_LEAN": "Set mixture levers to max lean",
+ "MIXTURE1_SET": "Set mixture lever 1 exact value (0 to 16383),",
+ "MIXTURE2_SET": "Set mixture lever 2 exact value (0 to 16383),",
+ "MIXTURE3_SET": "Set mixture lever 3 exact value (0 to 16383),",
+ "MIXTURE4_SET": "Set mixture lever 4 exact value (0 to 16383),",
+ "MIXTURE_SET_BEST": "Set mixture levers to current best power setting",
+ "MIXTURE_DECR_SMALL": "Decrement mixture levers small",
+ "MIXTURE1_DECR_SMALL": "Decrement mixture lever 1 small",
+ "MIXTURE2_DECR_SMALL": "Decrement mixture lever 4 small",
+ "MIXTURE3_DECR_SMALL": "Decrement mixture lever 4 small",
+ "MIXTURE4_DECR_SMALL": "Decrement mixture lever 4 small",
+ "PROP_PITCH_SET": "Set prop pitch levers (0 to 16383),",
+ "PROP_PITCH_LO": "Set prop pitch levers max (lo pitch),",
+ "PROP_PITCH_INCR": "Increment prop pitch levers",
+ "PROP_PITCH_INCR_SMALL": "Increment prop pitch levers small",
+ "PROP_PITCH_DECR": "Decrement prop pitch levers",
+ "PROP_PITCH_HI": "Set prop pitch levers min (hi pitch),",
+ "PROP_PITCH1_LO": "Set prop pitch lever 1 max (lo pitch),",
+ "PROP_PITCH1_INCR": "Increment prop pitch lever 1",
+ "PROP_PITCH1_INCR_SMALL": "Increment prop pitch lever 1 small",
+ "PROP_PITCH1_DECR": "Decrement prop pitch lever 1",
+ "PROP_PITCH1_HI": "Set prop pitch lever 1 min (hi pitch),",
+ "PROP_PITCH2_LO": "Set prop pitch lever 2 max (lo pitch),",
+ "PROP_PITCH2_INCR": "Increment prop pitch lever 2",
+ "PROP_PITCH2_INCR_SMALL": "Increment prop pitch lever 2 small",
+ "PROP_PITCH2_DECR": "Decrement prop pitch lever 2",
+ "PROP_PITCH2_HI": "Set prop pitch lever 2 min (hi pitch),",
+ "PROP_PITCH3_LO": "Set prop pitch lever 3 max (lo pitch),",
+ "PROP_PITCH3_INCR": "Increment prop pitch lever 3",
+ "PROP_PITCH3_INCR_SMALL": "Increment prop pitch lever 3 small",
+ "PROP_PITCH3_DECR": "Decrement prop pitch lever 3",
+ "PROP_PITCH3_HI": "Set prop pitch lever 3 min (hi pitch),",
+ "PROP_PITCH4_LO": "Set prop pitch lever 4 max (lo pitch),",
+ "PROP_PITCH4_INCR": "Increment prop pitch lever 4",
+ "PROP_PITCH4_INCR_SMALL": "Increment prop pitch lever 4 small",
+ "PROP_PITCH4_DECR": "Decrement prop pitch lever 4",
+ "PROP_PITCH4_HI": "Set prop pitch lever 4 min (hi pitch),",
+ "JET_STARTER": "Selects jet engine starter (for +/- sequence),",
+ "MAGNETO_SET": "Sets magnetos (0,1),",
+ "TOGGLE_STARTER1": "Toggle starter 1",
+ "TOGGLE_STARTER2": "Toggle starter 2",
+ "TOGGLE_STARTER3": "Toggle starter 3",
+ "TOGGLE_STARTER4": "Toggle starter 4",
+ "TOGGLE_ALL_STARTERS": "Toggle starters",
+ "ENGINE_AUTO_START": "Triggers auto-start",
+ "ENGINE_AUTO_SHUTDOWN": "Triggers auto-shutdown",
+ "MAGNETO": "Selects magnetos (for +/- sequence),",
+ "MAGNETO_DECR": "Decrease magneto switches positions",
+ "MAGNETO_INCR": "Increase magneto switches positions",
+ "MAGNETO1_OFF": "Set engine 1 magnetos off",
+ "MAGNETO1_RIGHT": "Toggle engine 1 right magneto",
+ "MAGNETO1_LEFT": "Toggle engine 1 left magneto",
+ "MAGNETO1_BOTH": "Set engine 1 magnetos on",
+ "MAGNETO1_START": "Set engine 1 magnetos on and toggle starter",
+ "MAGNETO2_OFF": "Set engine 2 magnetos off",
+ "MAGNETO2_RIGHT": "Toggle engine 2 right magneto",
+ "MAGNETO2_LEFT": "Toggle engine 2 left magneto",
+ "MAGNETO2_BOTH": "Set engine 2 magnetos on",
+ "MAGNETO2_START": "Set engine 2 magnetos on and toggle starter",
+ "MAGNETO3_OFF": "Set engine 3 magnetos off",
+ "MAGNETO3_RIGHT": "Toggle engine 3 right magneto",
+ "MAGNETO3_LEFT": "Toggle engine 3 left magneto",
+ "MAGNETO3_BOTH": "Set engine 3 magnetos on",
+ "MAGNETO3_START": "Set engine 3 magnetos on and toggle starter",
+ "MAGNETO4_OFF": "Set engine 4 magnetos off",
+ "MAGNETO4_RIGHT": "Toggle engine 4 right magneto",
+ "MAGNETO4_LEFT": "Toggle engine 4 left magneto",
+ "MAGNETO4_BOTH": "Set engine 4 magnetos on",
+ "MAGNETO4_START": "Set engine 4 magnetos on and toggle starter",
+ "MAGNETO_OFF": "Set engine magnetos off",
+ "MAGNETO_RIGHT": "Set engine right magnetos on",
+ "MAGNETO_LEFT": "Set engine left magnetos on",
+ "MAGNETO_BOTH": "Set engine magnetos on",
+ "MAGNETO_START": "Set engine magnetos on and toggle starters",
+ "MAGNETO1_DECR": "Decrease engine 1 magneto switch position",
+ "MAGNETO1_INCR": "Increase engine 1 magneto switch position",
+ "MAGNETO2_DECR": "Decrease engine 2 magneto switch position",
+ "MAGNETO2_INCR": "Increase engine 2 magneto switch position",
+ "MAGNETO3_DECR": "Decrease engine 3 magneto switch position",
+ "MAGNETO3_INCR": "Increase engine 3 magneto switch position",
+ "MAGNETO4_DECR": "Decrease engine 4 magneto switch position",
+ "MAGNETO4_INCR": "Increase engine 4 magneto switch position",
+ "MAGNETO1_SET": "Set engine 1 magneto switch",
+ "MAGNETO2_SET": "Set engine 2 magneto switch",
+ "MAGNETO3_SET": "Set engine 3 magneto switch",
+ "MAGNETO4_SET": "Set engine 4 magneto switch",
+ "ANTI_ICE_ON": "Sets anti-ice switches on",
+ "ANTI_ICE_OFF": "Sets anti-ice switches off",
+ "ANTI_ICE_TOGGLE_ENG1": "Toggle engine 1 anti-ice switch",
+ "ANTI_ICE_TOGGLE_ENG2": "Toggle engine 2 anti-ice switch",
+ "ANTI_ICE_TOGGLE_ENG3": "Toggle engine 3 anti-ice switch",
+ "ANTI_ICE_TOGGLE_ENG4": "Toggle engine 4 anti-ice switch",,
+ "TOGGLE_FUEL_VALVE_ALL": "Toggle engine fuel valves",
+ "TOGGLE_FUEL_VALVE_ENG1": "Toggle engine 1 fuel valve",
+ "TOGGLE_FUEL_VALVE_ENG2": "Toggle engine 2 fuel valve",
+ "TOGGLE_FUEL_VALVE_ENG3": "Toggle engine 3 fuel valve",
+ "TOGGLE_FUEL_VALVE_ENG4": "Toggle engine 4 fuel valve",
+ "INC_COWL_FLAPS": "Increment cowl flap levers",
+ "DEC_COWL_FLAPS": "Decrement cowl flap levers",
+ "INC_COWL_FLAPS1": "Increment engine 1 cowl flap lever",
+ "DEC_COWL_FLAPS1": "Decrement engine 1 cowl flap lever",
+ "INC_COWL_FLAPS2": "Increment engine 2 cowl flap lever",
+ "DEC_COWL_FLAPS2": "Decrement engine 2 cowl flap lever",
+ "INC_COWL_FLAPS3": "Increment engine 3 cowl flap lever",
+ "DEC_COWL_FLAPS3": "Decrement engine 3 cowl flap lever",
+ "INC_COWL_FLAPS4": "Increment engine 4 cowl flap lever",
+ "DEC_COWL_FLAPS4": "Decrement engine 4 cowl flap lever",
+ "FUEL_PUMP": "Toggle electric fuel pumps",
+ "TOGGLE_ELECT_FUEL_PUMP": "Toggle electric fuel pumps",
+ "TOGGLE_ELECT_FUEL_PUMP1": "Toggle engine 1 electric fuel pump",
+ "TOGGLE_ELECT_FUEL_PUMP2": "Toggle engine 2 electric fuel pump",
+ "TOGGLE_ELECT_FUEL_PUMP3": "Toggle engine 3 electric fuel pump",
+ "TOGGLE_ELECT_FUEL_PUMP4": "Toggle engine 4 electric fuel pump",
+ "ENGINE_PRIMER": "Trigger engine primers",
+ "TOGGLE_PRIMER": "Trigger engine primers",
+ "TOGGLE_PRIMER1": "Trigger engine 1 primer",
+ "TOGGLE_PRIMER2": "Trigger engine 2 primer",
+ "TOGGLE_PRIMER3": "Trigger engine 3 primer",
+ "TOGGLE_PRIMER4": "Trigger engine 4 primer",
+ "TOGGLE_FEATHER_SWITCHES": "Trigger propeller switches",
+ "TOGGLE_FEATHER_SWITCH_1": "Trigger propeller 1 switch",
+ "TOGGLE_FEATHER_SWITCH_2": "Trigger propeller 2 switch",
+ "TOGGLE_FEATHER_SWITCH_3": "Trigger propeller 3 switch",
+ "TOGGLE_FEATHER_SWITCH_4": "Trigger propeller 4 switch",
+ "TOGGLE_PROPELLER_SYNC": "Turns propeller synchronization switch on",
+ "TOGGLE_AUTOFEATHER_ARM": "Turns auto-feather arming switch on.",
+ "TOGGLE_AFTERBURNER": "Toggles afterburners",
+ "ENGINE": "Sets engines for 1,2,3,4 selection (to be followed by SELECT_n),"
+ }
+ Flight Controls Events:
+ {
+ "FLAPS_UP": "Sets flap handle to full retract position",
+ "FLAPS_1": "Sets flap handle to first extension position",
+ "FLAPS_2": "Sets flap handle to second extension position",
+ "FLAPS_3": "Sets flap handle to third extension position",
+ "FLAPS_DOWN": "Sets flap handle to full extension position",
+ "ELEV_TRIM_DN": "Increments elevator trim down",
+ "ELEV_DOWN": "Increments elevator down",
+ "AILERONS_LEFT": "Increments ailerons left",
+ "CENTER_AILER_RUDDER": "Centers aileron and rudder positions",
+ "AILERONS_RIGHT": "Increments ailerons right",
+ "ELEV_TRIM_UP": "Increment elevator trim up",
+ "ELEV_UP": "Increments elevator up",
+ "RUDDER_LEFT": "Increments rudder left",
+ "RUDDER_CENTER": "Centers rudder position",
+ "RUDDER_RIGHT": "Increments rudder right",
+ "ELEVATOR_SET": "Sets elevator position (-16383 - +16383),",
+ "AILERON_SET": "Sets aileron position (-16383 - +16383),",
+ "RUDDER_SET": "Sets rudder position (-16383 - +16383),",
+ "FLAPS_INCR": "Increments flap handle position",
+ "FLAPS_DECR": "Decrements flap handle position",
+ "SPOILERS_ON": "Sets spoiler handle to full extend position",
+ "SPOILERS_OFF": "Sets spoiler handle to full retract position",
+ "SPOILERS_ARM_ON": "Sets auto-spoiler arming on",
+ "SPOILERS_ARM_OFF": "Sets auto-spoiler arming off",
+ "AILERON_TRIM_LEFT": "Increments aileron trim left",
+ "AILERON_TRIM_RIGHT": "Increments aileron trim right",
+ "RUDDER_TRIM_LEFT": "Increments rudder trim left",
+ "RUDDER_TRIM_RIGHT": "Increments aileron trim right",
+ "ELEVATOR_TRIM_SET": "Sets elevator trim position (0 to 16383),",
+ }
+ Autopilot and Avionics Events:
+ {
+ "AP_MASTER": "Toggles AP on/off",
+ "AUTOPILOT_OFF": "Turns AP off",
+ "AUTOPILOT_ON": "Turns AP on",
+ "YAW_DAMPER_TOGGLE": "Toggles yaw damper on/off",
+ "AP_PANEL_HEADING_HOLD": "Toggles heading hold mode on/off",
+ "AP_PANEL_ALTITUDE_HOLD": "Toggles altitude hold mode on/off",
+ "AP_ATT_HOLD_ON": "Turns on AP wing leveler and pitch hold mode",
+ "AP_LOC_HOLD_ON": "Turns AP localizer hold on/armed and glide-slope hold mode off",
+ "AP_APR_HOLD_ON": "Turns both AP localizer and glide-slope modes on/armed",
+ "AP_HDG_HOLD_ON": "Turns heading hold mode on",
+ "AP_ALT_HOLD_ON": "Turns altitude hold mode on",
+ "AP_WING_LEVELER_ON": "Turns wing leveler mode on",
+ "AP_BC_HOLD_ON": "Turns localizer back course hold mode on/armed",
+ "AP_NAV1_HOLD_ON": "Turns lateral hold mode on",
+ "AP_ATT_HOLD_OFF": "Turns off attitude hold mode",
+ "AP_LOC_HOLD_OFF": "Turns off localizer hold mode",
+ "AP_APR_HOLD_OFF": "Turns off approach hold mode",
+ "AP_HDG_HOLD_OFF": "Turns off heading hold mode",
+ "AP_ALT_HOLD_OFF": "Turns off altitude hold mode",
+ "AP_WING_LEVELER_OFF": "Turns off wing leveler mode",
+ "AP_BC_HOLD_OFF": "Turns off backcourse mode for localizer hold",
+ "AP_NAV1_HOLD_OFF": "Turns off nav hold mode",
+ "AP_AIRSPEED_HOLD": "Toggles airspeed hold mode",
+ "AUTO_THROTTLE_ARM": "Toggles autothrottle arming mode",
+ "AUTO_THROTTLE_TO_GA": "Toggles Takeoff/Go Around mode",
+ "HEADING_BUG_INC": "Increments heading hold reference bug",
+ "HEADING_BUG_DEC": "Decrements heading hold reference bug",
+ "HEADING_BUG_SET": "Set heading hold reference bug (degrees),",
+ "AP_PANEL_SPEED_HOLD": "Toggles airspeed hold mode",
+ "AP_ALT_VAR_INC": "Increments reference altitude",
+ "AP_ALT_VAR_DEC": "Decrements reference altitude",
+ "AP_VS_VAR_INC": "Increments vertical speed reference",
+ "AP_VS_VAR_DEC": "Decrements vertical speed reference",
+ "AP_SPD_VAR_INC": "Increments airspeed hold reference",
+ "AP_SPD_VAR_DEC": "Decrements airspeed hold reference",
+ "AP_PANEL_MACH_HOLD": "Toggles mach hold",
+ "AP_MACH_VAR_INC": "Increments reference mach",
+ "AP_MACH_VAR_DEC": "Decrements reference mach",
+ "AP_MACH_HOLD": "Toggles mach hold",
+ "AP_ALT_VAR_SET_METRIC": "Sets reference altitude in meters",
+ "AP_VS_VAR_SET_ENGLISH": "Sets reference vertical speed in feet per minute",
+ "AP_SPD_VAR_SET": "Sets airspeed reference in knots",
+ "AP_MACH_VAR_SET": "Sets mach reference",
+ "YAW_DAMPER_ON": "Turns yaw damper on",
+ "YAW_DAMPER_OFF": "Turns yaw damper off",
+ "YAW_DAMPER_SET": "Sets yaw damper on/off (1,0),",
+ "AP_AIRSPEED_ON": "Turns airspeed hold on",
+ "AP_AIRSPEED_OFF": "Turns airspeed hold off",
+ "AP_MACH_ON": "Turns mach hold on",
+ "AP_MACH_OFF": "Turns mach hold off",
+ "AP_MACH_SET": "Sets mach hold on/off (1,0),",
+ "AP_PANEL_ALTITUDE_ON": "Turns altitude hold mode on (without capturing current altitude),",
+ "AP_PANEL_ALTITUDE_OFF": "Turns altitude hold mode off",
+ "AP_PANEL_HEADING_ON": "Turns heading mode on (without capturing current heading),",
+ "AP_PANEL_HEADING_OFF": "Turns heading mode off",
+ "AP_PANEL_MACH_ON": "Turns on mach hold",
+ "AP_PANEL_MACH_OFF": "Turns off mach hold",
+ "AP_PANEL_SPEED_ON": "Turns on speed hold mode",
+ "AP_PANEL_SPEED_OFF": "Turns off speed hold mode",
+ "AP_ALT_VAR_SET_ENGLISH": "Sets altitude reference in feet",
+ "AP_VS_VAR_SET_METRIC": "Sets vertical speed reference in meters per minute",
+ "TOGGLE_FLIGHT_DIRECTOR": "Toggles flight director on/off",
+ "SYNC_FLIGHT_DIRECTOR_PITCH": "Synchronizes flight director pitch with current aircraft pitch",
+ "INCREASE_AUTOBRAKE_CONTROL": "Increments autobrake level",
+ "DECREASE_AUTOBRAKE_CONTROL": "Decrements autobrake level",
+ "AP_PANEL_SPEED_HOLD_TOGGLE": "Turns airspeed hold mode on with current airspeed",
+ "AP_PANEL_MACH_HOLD_TOGGLE": "Sets mach hold reference to current mach",
+ "AP_NAV_SELECT_SET": "Sets the nav (1 or 2), which is used by the Nav hold modes",
+ "HEADING_BUG_SELECT": "Selects the heading bug for use with +/-",
+ "ALTITUDE_BUG_SELECT": "Selects the altitude reference for use with +/-",
+ "VSI_BUG_SELECT": "Selects the vertical speed reference for use with +/-",
+ "AIRSPEED_BUG_SELECT": "Selects the airspeed reference for use with +/-",
+ "AP_PITCH_REF_INC_UP": "Increments the pitch reference for pitch hold mode",
+ "AP_PITCH_REF_INC_DN": "Decrements the pitch reference for pitch hold mode",
+ "AP_PITCH_REF_SELECT": "Selects pitch reference for use with +/-",
+ "AP_ATT_HOLD": "Toggle attitude hold mode",
+ "AP_LOC_HOLD": "Toggles localizer (only), hold mode",
+ "AP_APR_HOLD": "Toggles approach hold (localizer and glide-slope),",
+ "AP_HDG_HOLD": "Toggles heading hold mode",
+ "AP_ALT_HOLD": "Toggles altitude hold mode",
+ "AP_WING_LEVELER": "Toggles wing leveler mode",
+ "AP_BC_HOLD": "Toggles the backcourse mode for the localizer hold",
+ "AP_NAV1_HOLD": "Toggles the nav hold mode",
+ "AP_MAX_BANK_INC": "Autopilot max bank angle increment.",
+ "AP_MAX_BANK_DEC": "Autopilot max bank angle decrement.",
+ "AP_N1_HOLD": "Autopilot, hold the N1 percentage at its current level.",
+ "AP_N1_REF_INC": "Increment the autopilot N1 reference.",
+ "AP_N1_REF_DEC": "Decrement the autopilot N1 reference.",
+ "AP_N1_REF_SET": "Sets the autopilot N1 reference.",
+ "FLY_BY_WIRE_ELAC_TOGGLE": "Turn on or off the fly by wire Elevators and Ailerons computer.",
+ "FLY_BY_WIRE_FAC_TOGGLE": "Turn on or off the fly by wire Flight Augmentation computer.",
+ "FLY_BY_WIRE_SEC_TOGGLE": "Turn on or off the fly by wire Spoilers and Elevators computer.",
+ "AP_VS_HOLD": "Toggle VS hold mode",
+ "FLIGHT_LEVEL_CHANGE": "Toggle FLC mode",
+ "COM_RADIO_SET": "Sets COM frequency (Must convert integer to BCD16 Hz),",
+ "NAV1_RADIO_SET": "Sets NAV 1 frequency (Must convert integer to BCD16 Hz),",
+ "NAV2_RADIO_SET": "Sets NAV 2 frequency (Must convert integer to BCD16 Hz),",
+ "ADF_SET": "Sets ADF frequency (Must convert integer to BCD16 Hz),",
+ "XPNDR_SET": "Sets transponder code (Must convert integer to BCD16),",
+ "VOR1_SET": "Sets OBS 1 (0 to 360),",
+ "VOR2_SET": "Sets OBS 2 (0 to 360),",
+ "DME1_TOGGLE": "Sets DME display to Nav 1",
+ "DME2_TOGGLE": "Sets DME display to Nav 2",
+ "RADIO_VOR1_IDENT_DISABLE": "Turns NAV 1 ID off",
+ "RADIO_VOR2_IDENT_DISABLE": "Turns NAV 2 ID off",
+ "RADIO_DME1_IDENT_DISABLE": "Turns DME 1 ID off",
+ "RADIO_DME2_IDENT_DISABLE": "Turns DME 2 ID off",
+ "RADIO_ADF_IDENT_DISABLE": "Turns ADF 1 ID off",
+ "RADIO_VOR1_IDENT_ENABLE": "Turns NAV 1 ID on",
+ "RADIO_VOR2_IDENT_ENABLE": "Turns NAV 2 ID on",
+ "RADIO_DME1_IDENT_ENABLE": "Turns DME 1 ID on",
+ "RADIO_DME2_IDENT_ENABLE": "Turns DME 2 ID on",
+ "RADIO_ADF_IDENT_ENABLE": "Turns ADF 1 ID on",
+ "RADIO_VOR1_IDENT_TOGGLE": "Toggles NAV 1 ID",
+ "RADIO_VOR2_IDENT_TOGGLE": "Toggles NAV 2 ID",
+ "RADIO_DME1_IDENT_TOGGLE": "Toggles DME 1 ID",
+ "RADIO_DME2_IDENT_TOGGLE": "Toggles DME 2 ID",
+ "RADIO_ADF_IDENT_TOGGLE": "Toggles ADF 1 ID",
+ "RADIO_VOR1_IDENT_SET": "Sets NAV 1 ID (on/off),",
+ "RADIO_VOR2_IDENT_SET": "Sets NAV 2 ID (on/off),",
+ "RADIO_DME1_IDENT_SET": "Sets DME 1 ID (on/off),",
+ "RADIO_DME2_IDENT_SET": "Sets DME 2 ID (on/off),",
+ "RADIO_ADF_IDENT_SET": "Sets ADF 1 ID (on/off),",
+ "ADF_CARD_INC": "Increments ADF card",
+ "ADF_CARD_DEC": "Decrements ADF card",
+ "ADF_CARD_SET": "Sets ADF card (0-360),",
+ "TOGGLE_DME": "Toggles between NAV 1 and NAV 2",
+ "TOGGLE_AVIONICS_MASTER": "Toggles the avionics master switch",
+ "COM_STBY_RADIO_SET": "Sets COM 1 standby frequency (Must convert integer to BCD16 Hz),",
+ "COM_STBY_RADIO_SWAP": "Swaps COM 1 frequency with standby",
+ "COM2_RADIO_SET": "Sets COM 2 frequency (BCD Hz),",
+ "COM2_STBY_RADIO_SET": "Sets COM 2 standby frequency (Must convert integer to BCD16 Hz),",
+ "COM2_RADIO_SWAP": "Swaps COM 2 frequency with standby",
+ "NAV1_STBY_SET": "Sets NAV 1 standby frequency (Must convert integer to BCD16 Hz),",
+ "NAV1_RADIO_SWAP": "Swaps NAV 1 frequency with standby",
+ "NAV2_STBY_SET": "Sets NAV 2 standby frequency (Must convert integer to BCD16 Hz),",
+ "NAV2_RADIO_SWAP": "Swaps NAV 2 frequency with standby",
+ "COM1_TRANSMIT_SELECT": "Selects COM 1 to transmit",
+ "COM2_TRANSMIT_SELECT": "Selects COM 2 to transmit",
+ "COM_RECEIVE_ALL_TOGGLE": "Toggles all COM radios to receive on",
+ "MARKER_SOUND_TOGGLE": "Toggles marker beacon sound on/off",
+ "ADF_COMPLETE_SET": "Sets ADF 1 frequency (Must convert integer to BCD16 Hz),",
+ "RADIO_ADF2_IDENT_DISABLE": "Turns ADF 2 ID off",
+ "RADIO_ADF2_IDENT_ENABLE": "Turns ADF 2 ID on",
+ "RADIO_ADF2_IDENT_TOGGLE": "Toggles ADF 2 ID",
+ "RADIO_ADF2_IDENT_SET": "Sets ADF 2 ID on/off (1,0),",
+ "FREQUENCY_SWAP": "Swaps frequency with standby on whichever NAV or COM radio is selected.",
+ "TOGGLE_GPS_DRIVES_NAV1": "Toggles between GPS and NAV 1 driving NAV 1 OBS display (and AP),",
+ "GPS_ACTIVATE_BUTTON": "Activates GPS Autopilot mode",
+ "GPS_POWER_BUTTON": "Toggles power button",
+ "GPS_NEAREST_BUTTON": "Selects Nearest Airport Page",
+ "GPS_OBS_BUTTON": "Toggles automatic sequencing of waypoints",
+ "GPS_MSG_BUTTON": "Toggles the Message Page",
+ "GPS_MSG_BUTTON_DOWN": "Triggers the pressing of the message button.",
+ "GPS_MSG_BUTTON_UP": "Triggers the release of the message button",
+ "GPS_FLIGHTPLAN_BUTTON": "Displays the programmed flightplan.",
+ "GPS_TERRAIN_BUTTON": "Displays terrain information on default display",
+ "GPS_PROCEDURE_BUTTON": "Displays the approach procedure page.",
+ "GPS_ZOOMIN_BUTTON": "Zooms in default display",
+ "GPS_ZOOMOUT_BUTTON": "Zooms out default display",
+ "GPS_DIRECTTO_BUTTON": "Brings up the \"Direct To\" page",
+ "GPS_MENU_BUTTON": "Brings up page to select active legs in a flightplan.",
+ "GPS_CLEAR_BUTTON": "Clears entered data on a page",
+ "GPS_CLEAR_ALL_BUTTON": "Clears all data immediately",
+ "GPS_CLEAR_BUTTON_DOWN": "Triggers the pressing of the Clear button",
+ "GPS_CLEAR_BUTTON_UP": "Triggers the release of the Clear button.",
+ "GPS_ENTER_BUTTON": "Approves entered data.",
+ "GPS_CURSOR_BUTTON": "Selects GPS cursor",
+ "GPS_GROUP_KNOB_INC": "Increments cursor",
+ "GPS_GROUP_KNOB_DEC": "Decrements cursor",
+ "GPS_PAGE_KNOB_INC": "Increments through pages",
+ "GPS_PAGE_KNOB_DEC": "Decrements through pages",
+ "DME_SELECT": "Selects one of the two DME systems (1,2),.",
+ "KOHLSMAN_SET": "Sets altimeter setting (Millibars * 16),",
+ "BAROMETRIC": "Syncs altimeter setting to sea level pressure, or 29.92 if above 18000 feet",
+ }
+ ATC Events:
+ {
+ "ATC": "Activates ATC window",
+ "ATC_MENU_1": "Selects ATC option 1, use other numbers for other options 2-10",
+ }
+ Other Miscellaneous Events:
+ {
+ "PARKING_BRAKES": "Toggles parking brake on/off",
+ "GEAR_PUMP": "Increments emergency gear extension",
+ "PITOT_HEAT_ON": "Turns pitot heat switch on",
+ "PITOT_HEAT_OFF": "Turns pitot heat switch off",
+ "GEAR_UP": "Sets gear handle in UP position",
+ "GEAR_DOWN": "Sets gear handle in DOWN position",
+ "TOGGLE_MASTER_BATTERY": "Toggles main battery switch",
+ "TOGGLE_MASTER_ALTERNATOR": "Toggles main alternator/generator switch",
+ "TOGGLE_ELECTRIC_VACUUM_PUMP": "Toggles backup electric vacuum pump",
+ "TOGGLE_ALTERNATE_STATIC": "Toggles alternate static pressure port",
+ "DECREASE_DECISION_HEIGHT": "Decrements decision height reference",
+ "INCREASE_DECISION_HEIGHT": "Increments decision height reference",
+ "TOGGLE_STRUCTURAL_DEICE": "Toggles structural deice switch",
+ "TOGGLE_PROPELLER_DEICE": "Toggles propeller deice switch",
+ "TOGGLE_ALTERNATOR1": "Toggles alternator/generator 1 switch",
+ "TOGGLE_ALTERNATOR2": "Toggles alternator/generator 2 switch",
+ "TOGGLE_ALTERNATOR3": "Toggles alternator/generator 3 switch",
+ "TOGGLE_ALTERNATOR4": "Toggles alternator/generator 4 switch",
+ "TOGGLE_MASTER_BATTERY_ALTERNATOR": "Toggles master battery and alternator switch",
+ "TOGGLE_AIRCRAFT_EXIT": "Toggles primary door open/close. Follow by KEY_SELECT_2, etc for subsequent doors.",
+ "SET_WING_FOLD": "Sets the wings into the folded position suitable for storage, typically on a carrier. Takes a value:\n 1 - fold wings,\n 0 - unfold wings",
+ "TOGGLE_TAIL_HOOK_HANDLE": "Toggles tail hook",
+ "TOGGLE_WATER_RUDDER": "Toggles water rudders",
+ "TOGGLE_PUSHBACK": "Toggles pushback.",
+ "TOGGLE_MASTER_IGNITION_SWITCH": "Toggles master ignition switch",
+ "TOGGLE_TAILWHEEL_LOCK": "Toggles tail wheel lock",
+ "ADD_FUEL_QUANTITY": "Adds fuel to the aircraft, 25% of capacity by default. 0 to 65535 (max fuel), can be passed.",
+ "RETRACT_FLOAT_SWITCH_DEC": "If the plane has retractable floats, moves the retract position from Extend to Neutral, or Neutral to Retract.",
+ "RETRACT_FLOAT_SWITCH_INC": "If the plane has retractable floats, moves the retract position from Retract to Neutral, or Neutral to Extend.",
+ "TOGGLE_WATER_BALLAST_VALVE": "Turn the water ballast valve on or off.",
+ "TOGGLE_VARIOMETER_SWITCH": "Turn the variometer on or off.",
+ "APU_STARTER": "Start up the auxiliary power unit (APU),.",
+ "APU_OFF_SWITCH": "Turn the APU off.",
+ "APU_GENERATOR_SWITCH_TOGGLE": "Turn the auxiliary generator on or off.",
+ "APU_GENERATOR_SWITCH_SET": "Set the auxiliary generator switch (0,1),.",
+ "HYDRAULIC_SWITCH_TOGGLE": "Turn the hydraulic switch on or off.",
+ "BLEED_AIR_SOURCE_CONTROL_INC": "Increases the bleed air source control.",
+ "BLEED_AIR_SOURCE_CONTROL_DEC": "Decreases the bleed air source control.",
+ "BLEED_AIR_SOURCE_CONTROL_SET": "Set to one of:\n 0: auto\n 1: off\n 2: apu\n 3: engines",
+ "TURBINE_IGNITION_SWITCH_TOGGLE": "Turn the turbine ignition switch on or off.",
+ "CABIN_NO_SMOKING_ALERT_SWITCH_TOGGLE": "Turn the \"No smoking\" alert on or off.",
+ "CABIN_SEATBELTS_ALERT_SWITCH_TOGGLE": "Turn the \"Fasten seatbelts\" alert on or off.",
+ "ANTISKID_BRAKES_TOGGLE": "Turn the anti-skid braking system on or off.",
+ "GPWS_SWITCH_TOGGLE": "Turn the g round proximity warning system (GPWS), on or off.",
+ "MANUAL_FUEL_PRESSURE_PUMP": "Activate the manual fuel pressure pump.",
+ "PAUSE_ON": "Turns pause on",
+ "PAUSE_OFF": "Turns pause off",
+ "SIM_RATE_INCR": "Increase sim rate",
+ "SIM_RATE_DECR": "Decrease sim rate",
+ "INVOKE_HELP": "Brings up Help system",
+ "FLIGHT_MAP": "Brings up flight map",
+ }
+ And here are some common data points to obtain aircraft state information using the get_data_from_sim function:
+ {
+ "GROUND_VELOCITY": ["Speed relative to the earths surface", b'Knots'],
+ "TOTAL_WORLD_VELOCITY": ["Speed relative to the earths center", b'Feet per second'],
+ "ACCELERATION_WORLD_X": ["Acceleration relative to earth, in east/west direction", b'Feet per second squared'],
+ "ACCELERATION_WORLD_Y": ["Acceleration relative to earch, in vertical direction", b'Feet per second squared'],
+ "ACCELERATION_WORLD_Z": ["Acceleration relative to earth, in north/south direction", b'Feet per second squared'],
+ "ACCELERATION_BODY_X": ["Acceleration relative to aircraft axix, in east/west direction", b'Feet per second squared'],
+ "ACCELERATION_BODY_Y": ["Acceleration relative to aircraft axis, in vertical direction", b'Feet per second squared',],
+ "ACCELERATION_BODY_Z": ["Acceleration relative to aircraft axis, in north/south direction", b'Feet per second squared'],
+ "ROTATION_VELOCITY_BODY_X": ["Rotation relative to aircraft axis", b'Feet per second'],
+ "ROTATION_VELOCITY_BODY_Y": ["Rotation relative to aircraft axis", b'Feet per second'],
+ "ROTATION_VELOCITY_BODY_Z": ["Rotation relative to aircraft axis", b'Feet per second'],
+ "RELATIVE_WIND_VELOCITY_BODY_X": ["Lateral speed relative to wind", b'Feet per second'],
+ "RELATIVE_WIND_VELOCITY_BODY_Y": ["Vertical speed relative to wind", b'Feet per second'],
+ "RELATIVE_WIND_VELOCITY_BODY_Z": ["Longitudinal speed relative to wind", b'Feet per second'],
+ "PLANE_ALT_ABOVE_GROUND": ["Altitude above the surface", b'Feet'],
+ "PLANE_LATITUDE": ["Latitude of aircraft, North is positive, South negative", b'Degrees'],
+ "PLANE_LONGITUDE": ["Longitude of aircraft, East is positive, West negative", b'Degrees'],
+ "PLANE_ALTITUDE": ["Altitude of aircraft", b'Feet'],
+ "PLANE_PITCH_DEGREES": ["Pitch angle, although the name mentions degrees the units used are radians", b'Radians'],
+ "PLANE_BANK_DEGREES": ["Bank angle, although the name mentions degrees the units used are radians", b'Radians'],
+ "PLANE_HEADING_DEGREES_TRUE": ["Heading relative to true north, although the name mentions degrees the units used are radians", b'Radians'],
+ "PLANE_HEADING_DEGREES_MAGNETIC": ["Heading relative to magnetic north, although the name mentions degrees the units used are radians", b'Radians'],
+ "MAGVAR": ["Magnetic variation", b'Degrees'],
+ "GROUND_ALTITUDE": ["Altitude of surface", b'Meters'],
+ "SIM_ON_GROUND": ["On ground flag", b'Bool'],
+ "INCIDENCE_ALPHA": ["Angle of attack", b'Radians'],
+ "INCIDENCE_BETA": ["Sideslip angle", b'Radians'],
+ "AIRSPEED_TRUE": ["True airspeed", b'Knots'],
+ "AIRSPEED_INDICATED": ["Indicated airspeed", b'Knots'],
+ "AIRSPEED_TRUE_CALIBRATE": ["Angle of True calibration scale on airspeed indicator", b'Degrees'],
+ "AIRSPEED_BARBER_POLE": ["Redline airspeed (dynamic on some aircraft)", b'Knots'],
+ "AIRSPEED_MACH": ["Current mach", b'Mach'],
+ "VERTICAL_SPEED": ["Vertical speed indication", b'feet/minute'],
+ "MACH_MAX_OPERATE": ["Maximum design mach", b'Mach'],
+ "STALL_WARNING": ["Stall warning state", b'Bool'],
+ "OVERSPEED_WARNING": ["Overspeed warning state", b'Bool'],
+ "INDICATED_ALTITUDE": ["Altimeter indication", b'Feet'],
+ "ATTITUDE_INDICATOR_PITCH_DEGREES": ["AI pitch indication", b'Radians'],
+ "ATTITUDE_INDICATOR_BANK_DEGREES": ["AI bank indication", b'Radians'],
+ "ATTITUDE_BARS_POSITION": ["AI reference pitch reference bars", b'Percent Over 100'],
+ "ATTITUDE_CAGE": ["AI caged state", b'Bool'],
+ "WISKEY_COMPASS_INDICATION_DEGREES": ["Magnetic compass indication", b'Degrees'],
+ "PLANE_HEADING_DEGREES_GYRO": ["Heading indicator (directional gyro) indication", b'Radians'],
+ "HEADING_INDICATOR": ["Heading indicator (directional gyro) indication", b'Radians'],
+ "GYRO_DRIFT_ERROR": ["Angular error of heading indicator", b'Radians',],
+ "DELTA_HEADING_RATE": ["Rate of turn of heading indicator", b'Radians per second'],
+ "TURN_COORDINATOR_BALL": ["Turn coordinator ball position", b'Position'],
+ "ANGLE_OF_ATTACK_INDICATOR": ["AoA indication", b'Radians'],
+ "RADIO_HEIGHT": ["Radar altitude", b'Feet'],
+ "ABSOLUTE_TIME": ["Time, as referenced from 12:00 AM January 1, 0000", b'Seconds'],
+ "ZULU_TIME": ["Greenwich Mean Time (GMT)", b'Seconds'],
+ "ZULU_DAY_OF_WEEK": ["GMT day of week", b'Number'],
+ "ZULU_DAY_OF_MONTH": ["GMT day of month", b'Number'],
+ "ZULU_MONTH_OF_YEAR": ["GMT month of year", b'Number',],
+ "ZULU_DAY_OF_YEAR": ["GMT day of year", b'Number'],
+ "ZULU_YEAR": ["GMT year", b'Number'],
+ "LOCAL_TIME": ["Local time", b'Seconds'],
+ "LOCAL_DAY_OF_WEEK": ["Local day of week", b'Number'],
+ "LOCAL_DAY_OF_MONTH": ["Local day of month", b'Number',],
+ "LOCAL_MONTH_OF_YEAR": ["Local month of year", b'Number',],
+ "LOCAL_DAY_OF_YEAR": ["Local day of year", b'Number'],
+ "LOCAL_YEAR": ["Local year", b'Number'],
+ "TIME_ZONE_OFFSET": ["Local time difference from GMT", b'Seconds'],
+ "ATC_TYPE": ["Type used by ATC", b'String'],
+ "ATC_MODEL": ["Model used by ATC", b'String'],
+ "ATC_ID": ["ID used by ATC", b'String'],
+ "ATC_AIRLINE": ["Airline used by ATC", b'String'],
+ "ATC_FLIGHT_NUMBER": ["Flight Number used by ATC", b'String'],
+ "TITLE": ["Title from aircraft.cfg", b'String'],
+ "HSI_STATION_IDENT": ["Tuned station identifier", b'String'],
+ "GPS_WP_PREV_ID": ["ID of previous GPS waypoint", b'String'],
+ "GPS_WP_NEXT_ID": ["ID of next GPS waypoint", b'String'],
+ "GPS_APPROACH_AIRPORT_ID": ["ID of airport", b'String'],
+ "GPS_APPROACH_APPROACH_ID": ["ID of approach", b'String'],
+ "GPS_APPROACH_TRANSITION_ID": ["ID of approach transition", b'String'],
+ "GPS_ETA": ["Estimated time of arrival at destination in seconds"]
+ "GPS_ETE": ["Estimated time en route to destination in seconds"]
+ "GPS_TARGET_DISTANCE": ["Estimated distance to destination in meters"]
+ "NAV_LOC_AIRPORT_IDENT": ["Airport ICAO code for airport tuned in Nav radio"]
+ "AMBIENT_DENSITY": ["Ambient density", b'Slugs per cubic feet'],
+ "AMBIENT_TEMPERATURE": ["Ambient temperature", b'Celsius'],
+ "AMBIENT_PRESSURE": ["Ambient pressure", b'inHg'],
+ "AMBIENT_WIND_VELOCITY": ["Wind velocity", b'Knots'],
+ "AMBIENT_WIND_DIRECTION": ["Wind direction", b'Degrees'],
+ "AMBIENT PRECIP STATE" : [State of current precipitation, b'String'],
+ "BAROMETER_PRESSURE": ["Barometric pressure", b'Millibars'],
+ "SEA_LEVEL_PRESSURE": ["Barometric pressure at sea level", b'Millibars'],
+ "TOTAL_AIR_TEMPERATURE": ["Total air temperature is the air temperature at the front of the aircraft where the ram pressure from the speed of the aircraft is taken into account.", b'Celsius'],
+ "AMBIENT_IN_CLOUD": ["True if the aircraft is in a cloud.", b'Bool'],
+ "AMBIENT_VISIBILITY": ["Ambient visibility", b'Meters'],
+ "GENERAL_ENG_RPM:index": ["Engine rpm", b'Rpm'],
+ "GENERAL_ENG_PCT_MAX_RPM:index": ["Percent of max rated rpm", b'Percent'],
+ "GENERAL_ENG_EXHAUST_GAS_TEMPERATURE:index": ["Engine exhaust gas temperature.", b'Rankine'],
+ "GENERAL_ENG_OIL_PRESSURE:index": ["Engine oil pressure", b'Psf'],
+ "GENERAL_ENG_OIL_TEMPERATURE:index": ["Engine oil temperature", b'Rankine'],
+ "GENERAL_ENG_FUEL_PRESSURE:index": ["Engine fuel pressure", b'Psi'],
+ "ENG_OIL_TEMPERATURE:index": ["Engine oil temperature", b'Rankine',],
+ "FUEL_TOTAL_QUANTITY": ["Current fuel in volume", b'Gallons'],
+ "FUEL_TOTAL_CAPACITY": ["Total fuel capacity of the aircraft", b'Gallons'],
+ }
+custom_properties:
+ - hint: Whether to try to autostart data monitoring mode (tour guide mode), which automatically monitors for latitude, longitude and altitude. This can also be toggled with a voice command to start or end data monitoring mode.
+ id: autostart_data_monitoring_loop_mode
+ name: Autostart tour guide mode
+ property_type: boolean
+ required: true
+ value: false
+ - hint: Minimum interval in seconds before the data monitoring loop will run again.
+ id: min_data_monitoring_seconds
+ name: Minimum monitoring interval
+ property_type: number
+ required: true
+ value: 60
+ - hint: Maximum interval in seconds before the data monitoring loop will run again.
+ id: max_data_monitoring_seconds
+ name: Maximum monitoring interval
+ property_type: number
+ required: true
+ value: 360
+ - hint: The backstory to use for data monitoring mode. Leave blank if you just want to use what is already in your wingman's backstory.
+ id: data_monitoring_backstory
+ name: Tour guide mode backstory
+ property_type: textarea
+ required: false
+ value: |
+ You are a friendly copilot in the plane with the user, the pilot. Casually comment on the following information in a way that keeps your personality and role play. If it is data about a place, comment about just having flown over the place (or if the plane is on the ground, about being at the place) and provide some brief commentary on the place in an engaging tone. Keep your remarks succinct, and avoid directly talking about latitudes and longitudes:
\ No newline at end of file
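Several radio events above (COM_RADIO_SET, XPNDR_SET, ADF_SET, and friends) note that the integer argument must first be converted to BCD16. A minimal sketch of that digit packing in pure Python — the function name is illustrative and not part of the skill:

```python
def to_bcd16(value: int) -> int:
    """Pack each decimal digit of `value` into a 4-bit nibble (binary-coded decimal)."""
    result = 0
    shift = 0
    while value > 0:
        result |= (value % 10) << shift  # place the lowest decimal digit into the next nibble
        value //= 10
        shift += 4
    return result

# A transponder code reads the same in hex once BCD-encoded:
print(hex(to_bcd16(1200)))  # -> 0x1200
```

For COM/NAV frequencies the SimConnect SDK conventionally encodes the digits after the leading "1" (e.g. 127.85 MHz is sent as the BCD of 2785), so check the SDK documentation for the exact framing expected by each event.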
diff --git a/skills/msfs2020_control/logo.png b/skills/msfs2020_control/logo.png
new file mode 100644
index 00000000..53699cde
Binary files /dev/null and b/skills/msfs2020_control/logo.png differ
diff --git a/skills/msfs2020_control/main.py b/skills/msfs2020_control/main.py
new file mode 100644
index 00000000..1f1ece41
--- /dev/null
+++ b/skills/msfs2020_control/main.py
@@ -0,0 +1,427 @@
+import time
+import random
+import requests
+from typing import TYPE_CHECKING
+from SimConnect import *
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+ WingmanInitializationError,
+)
+from api.enums import LogType
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class Msfs2020Control(Skill):
+
+ def __init__(self, config: SkillConfig, settings: SettingsConfig, wingman: "OpenAiWingman") -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+ self.already_initialized_simconnect = False
+ self.loaded = False
+ self.sm = None # Needs to be set once MSFS2020 is actually connected
+ self.aq = None # Same
+ self.ae = None # Same
+ self.data_monitoring_loop_running = False
+ self.autostart_data_monitoring_loop_mode = False
+ self.data_monitoring_backstory = ""
+ self.min_data_monitoring_seconds = 60
+ self.max_data_monitoring_seconds = 360
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+ self.autostart_data_monitoring_loop_mode = self.retrieve_custom_property_value(
+ "autostart_data_monitoring_loop_mode", errors
+ )
+ self.data_monitoring_backstory = self.retrieve_custom_property_value(
+ "data_monitoring_backstory", errors
+ )
+ # If not available or not set, use default wingman's backstory
+ if not self.data_monitoring_backstory or not self.data_monitoring_backstory.strip():
+ self.data_monitoring_backstory = self.wingman.config.prompts.backstory
+
+ self.min_data_monitoring_seconds = self.retrieve_custom_property_value(
+ "min_data_monitoring_seconds", errors
+ )
+
+ self.max_data_monitoring_seconds = self.retrieve_custom_property_value(
+ "max_data_monitoring_seconds", errors
+ )
+
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ return [
+ (
+ "get_data_from_sim",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_data_from_sim",
+ "description": "Retrieve data points from Microsoft Flight Simulator 2020 using the Python SimConnect module.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "data_point": {
+ "type": "string",
+ "description": "The data point to retrieve, such as 'PLANE_ALTITUDE', 'PLANE_HEADING_DEGREES_TRUE'.",
+ },
+ },
+ "required": ["data_point"],
+ },
+ },
+ },
+ ),
+ (
+ "set_data_or_perform_action_in_sim",
+ {
+ "type": "function",
+ "function": {
+ "name": "set_data_or_perform_action_in_sim",
+ "description": "Set data points or perform actions in Microsoft Flight Simulator 2020 using the Python SimConnect module.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "action": {
+ "type": "string",
+ "description": "The action to perform or data point to set, such as 'TOGGLE_MASTER_BATTERY', 'THROTTLE_SET'.",
+ },
+ "argument": {
+ "type": "number",
+ "description": "The argument to pass for the action, if any. For actions like 'TOGGLE_MASTER_BATTERY', no argument is needed. For 'THROTTLE_SET', pass the throttle value.",
+ },
+ },
+ "required": ["action"],
+ },
+ },
+ },
+ ),
+ (
+ "start_or_activate_data_monitoring_loop",
+ {
+ "type": "function",
+ "function": {
+ "name": "start_or_activate_data_monitoring_loop",
+ "description": "Begin data monitoring loop, which will check certain data points at designated intervals. May be referred to as tour guide mode.",
+ },
+ },
+ ),
+ (
+ "end_or_stop_data_monitoring_loop",
+ {
+ "type": "function",
+ "function": {
+ "name": "end_or_stop_data_monitoring_loop",
+ "description": "End or stop data monitoring loop, to stop automatically checking data points at designated intervals. May be referred to as tour guide mode.",
+ },
+ },
+ ),
+ (
+ "get_information_about_current_location",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_information_about_current_location",
+ "description": "Used to provide more detailed information if the user asks a general question like 'where are we?', 'what city are we flying over?', or 'what country is down there?'",
+ },
+ },
+ ),
+ ]
+
+ # Using sample methods found here; allow AI to determine the appropriate variables and arguments, if any:
+ # https://pypi.org/project/SimConnect/
+ async def execute_tool(self, tool_name: str, parameters: dict[str, any]) -> tuple[str, str]:
+ function_response = "Error in execution. Can you please try your command again?"
+ instant_response = ""
+
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing {tool_name} function with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+
+ if tool_name == "get_data_from_sim":
+ data_point = parameters.get("data_point")
+ value = self.aq.get(data_point)
+ function_response = f"{data_point} value is: {value}"
+
+ elif tool_name == "set_data_or_perform_action_in_sim":
+ action = parameters.get("action")
+ argument = parameters.get("argument", None)
+
+ try:
+ if argument is not None:
+ self.aq.set(action, argument)
+ else:
+ event_to_trigger = self.ae.find(action)
+ event_to_trigger()
+ except Exception:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Tried to perform action {action} with argument {argument} using aq.set, now going to try ae.event_to_trigger.",
+ color=LogType.INFO,
+ )
+
+ try:
+ if argument is not None:
+ event_to_trigger = self.ae.find(action)
+ event_to_trigger(argument)
+ except Exception:
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+ await self.printr.print_async(
+ f"Neither aq.set nor ae.event_to_trigger worked with {action} and {argument}. Command failed.",
+ color=LogType.INFO,
+ )
+ return function_response, instant_response
+
+ function_response = f"Action '{action}' executed with argument '{argument}'"
+
+ elif tool_name == "start_or_activate_data_monitoring_loop":
+ if self.data_monitoring_loop_running:
+ function_response = "Data monitoring loop is already running."
+ return function_response, instant_response
+
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing start_or_activate_data_monitoring_loop",
+ color=LogType.INFO,
+ )
+
+ if not self.already_initialized_simconnect:
+ function_response = "Cannot start data monitoring / tour guide mode because simconnect is not connected yet. Check to make sure the game is running."
+ return function_response, instant_response
+
+ if not self.data_monitoring_loop_running:
+ await self.initialize_data_monitoring_loop()
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ function_response = "Started data monitoring loop/tour guide mode."
+
+ elif tool_name == "end_or_stop_data_monitoring_loop":
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing end_or_stop_data_monitoring_loop",
+ color=LogType.INFO,
+ )
+
+ await self.stop_data_monitoring_loop()
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ function_response = "Closed data monitoring / tour guide mode."
+
+ elif tool_name == "get_information_about_current_location":
+ place_info = await self.convert_lat_long_data_into_place_data()
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ if place_info:
+ on_ground = self.aq.get("SIM_ON_GROUND")
+ on_ground_statement = "The plane is currently in the air."
+ if on_ground:
+ on_ground_statement = "The plane is currently on the ground."
+ function_response = f"{on_ground_statement} Detailed information regarding the location we are currently at or flying over: {place_info}"
+ else:
+ function_response = "Unable to get more detailed information regarding the place based on the current latitude and longitude."
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+ await self.printr.print_async(
+ f"{function_response}",
+ color=LogType.INFO,
+ )
+
+ return function_response, instant_response
+
+
+ # Search for MSFS2020 sim running and then connect
+ async def start_simconnect(self):
+ while self.loaded and not self.already_initialized_simconnect:
+ try:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Attempting to find MSFS2020....",
+ color=LogType.INFO,
+ )
+ self.sm = SimConnect()
+ self.aq = AircraftRequests(self.sm, _time=2000)
+ self.ae = AircraftEvents(self.sm)
+ self.already_initialized_simconnect = True
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Initialized SimConnect with MSFS2020.",
+ color=LogType.INFO,
+ )
+ if self.autostart_data_monitoring_loop_mode:
+ await self.initialize_data_monitoring_loop()
+ except Exception:
+ # Wait 30 seconds between connect attempts
+ time.sleep(30)
+
+ async def initialize_data_monitoring_loop(self):
+ if self.data_monitoring_loop_running:
+ return
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Starting data monitoring loop",
+ color=LogType.INFO,
+ )
+
+ self.threaded_execution(self.start_data_monitoring_loop)
+
+ async def start_data_monitoring_loop(self):
+ if not self.data_monitoring_loop_running:
+ self.data_monitoring_loop_running = True
+
+ while self.data_monitoring_loop_running:
+ random_time = random.choice(range(self.min_data_monitoring_seconds, self.max_data_monitoring_seconds + 1, 15))  # Pick a random interval from min to max (inclusive, so min == max still works) in 15-second increments
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Attempting looped monitoring check.",
+ color=LogType.INFO,
+ )
+ try:
+ place_data = await self.convert_lat_long_data_into_place_data()
+ if place_data:
+ await self.initiate_llm_call_with_plane_data(place_data)
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Something failed in looped monitoring check. Could not return data or send to llm: {e}.",
+ color=LogType.INFO,
+ )
+ time.sleep(random_time)
+
+ async def stop_data_monitoring_loop(self):
+ self.data_monitoring_loop_running = False
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Stopping data monitoring loop",
+ color=LogType.INFO,
+ )
+
+ async def convert_lat_long_data_into_place_data(self, latitude=None, longitude=None, altitude=None):
+ if not self.already_initialized_simconnect or not self.sm or not self.aq:
+ return None
+ ground_altitude = 0
+ # If all parameters are already provided, just run the request
+ if latitude and longitude and altitude:
+ ground_altitude = self.aq.get("GROUND_ALTITUDE")
+ # If only latitude and longitude, grab altitude so a reasonable "zoom level" can be set for place data
+ elif latitude and longitude:
+ altitude = self.aq.get("PLANE_ALTITUDE")
+ ground_altitude = self.aq.get("GROUND_ALTITUDE")
+ # Otherwise grab all data components
+ else:
+ latitude = self.aq.get("PLANE_LATITUDE")
+ longitude = self.aq.get("PLANE_LONGITUDE")
+ altitude = self.aq.get("PLANE_ALTITUDE")
+ ground_altitude = self.aq.get("GROUND_ALTITUDE")
+
+ # If no values still, for instance, when connection is made but no data yet, return None
+ if not latitude or not longitude or not altitude or not ground_altitude:
+ return None
+
+ # Set zoom level based on altitude, see zoom documentation at https://nominatim.org/release-docs/develop/api/Reverse/
+ zoom = 18
+ distance_above_ground = altitude - ground_altitude
+ if distance_above_ground <= 1500:
+ zoom = 18
+ elif distance_above_ground <= 3500:
+ zoom = 17
+ elif distance_above_ground <= 5000:
+ zoom = 15
+ elif distance_above_ground <= 10000:
+ zoom = 13
+ elif distance_above_ground <= 20000:
+ zoom = 10
+ else:
+ zoom = 8
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Attempting query of OpenStreetMap Nominatum with parameters: {latitude}, {longitude}, {altitude}, zoom level: {zoom}",
+ color=LogType.INFO,
+ )
+
+ # Request data from the OpenStreetMap Nominatim API for reverse geocoding
+ url = f"https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat={latitude}&lon={longitude}&zoom={zoom}&accept-language=en&extratags=1"
+ headers = {
+ 'User-Agent': f'msfs2020control_skill wingmanai {self.wingman.name}'
+ }
+ response = requests.get(url, headers=headers)
+ if response.status_code == 200:
+ return response.json()
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"API request failed to {url}, status code: {response.status_code}.", color=LogType.INFO)
+ return None
+
+ # Get LLM to provide a verbal response to the user, without requiring the user to initiate a communication with the LLM
+ async def initiate_llm_call_with_plane_data(self, data):
+ on_ground = self.aq.get("SIM_ON_GROUND")
+ on_ground_statement = "The plane is currently in the air."
+ if on_ground:
+ on_ground_statement = "The plane is currently on the ground."
+ user_content = f"{on_ground_statement} Information about the location: {data}"
+ messages = [
+ {
+ 'role': 'system',
+ 'content': f"""
+ {self.data_monitoring_backstory}
+ """,
+ },
+ {
+ 'role': 'user',
+ 'content': user_content,
+ },
+ ]
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Attempting llm call with parameters: {self.data_monitoring_backstory}, {user_content}.",
+ color=LogType.INFO,
+ )
+ completion = await self.llm_call(messages)
+ response = completion.choices[0].message.content if completion and completion.choices else ""
+
+ if not response:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "LLM call returned no response.",
+ color=LogType.INFO,
+ )
+ return
+
+ await self.printr.print_async(
+ text=f"Data monitoring response: {response}",
+ color=LogType.INFO,
+ source_name=self.wingman.name
+ )
+
+ self.threaded_execution(self.wingman.play_to_user, response, True)
+ await self.wingman.add_assistant_message(response)
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ async def prepare(self) -> None:
+ """Load the skill by trying to connect to the sim"""
+ self.loaded = True
+ self.threaded_execution(self.start_simconnect)
+
+ async def unload(self) -> None:
+ """Unload the skill."""
+ await self.stop_data_monitoring_loop()
+ self.loaded = False
+ if self.sm:
+ self.sm.exit()
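The altitude-to-zoom mapping in the hunk above is easy to mis-transcribe when reviewing, so here is a minimal standalone sketch of the same logic. The function name is illustrative (not part of the skill), the thresholds are copied from the hunk, and altitudes are assumed to be in feet:

```python
# Map height above ground (feet) to a Nominatim reverse-geocoding zoom level.
# Thresholds mirror the skill hunk above; the function name is illustrative.
def zoom_for_altitude(altitude: float, ground_altitude: float) -> int:
    distance_above_ground = altitude - ground_altitude
    if distance_above_ground <= 1500:
        return 18  # low altitude: building-level detail
    if distance_above_ground <= 3500:
        return 17
    if distance_above_ground <= 5000:
        return 15
    if distance_above_ground <= 10000:
        return 13
    if distance_above_ground <= 20000:
        return 10
    return 8  # high-altitude cruise: city/region granularity

print(zoom_for_altitude(2000, 500))  # 1500 ft AGL -> 18
```

Keeping the mapping as a pure function like this makes the boundary values (e.g. exactly 1500 ft) trivially unit-testable.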
diff --git a/skills/msfs2020_control/requirements.txt b/skills/msfs2020_control/requirements.txt
new file mode 100644
index 00000000..b3b490eb
--- /dev/null
+++ b/skills/msfs2020_control/requirements.txt
@@ -0,0 +1 @@
+SimConnect~=0.4.26
\ No newline at end of file
diff --git a/skills/skill_base.py b/skills/skill_base.py
index ca140865..634cd4f0 100644
--- a/skills/skill_base.py
+++ b/skills/skill_base.py
@@ -28,18 +28,25 @@ def __init__(
self.wingman = wingman
self.secret_keeper = SecretKeeper()
+ self.secret_keeper.secret_events.subscribe("secrets_saved", self.secret_changed)
self.name = self.__class__.__name__
self.printr = Printr()
self.execution_start: None | float = None
"""Used for benchmarking executon times. The timer is (re-)started whenever the process function starts."""
+ async def secret_changed(self, secrets: dict[str, any]):
+ """Called when a secret is changed."""
+ pass
+
async def validate(self) -> list[WingmanInitializationError]:
"""Validates the skill configuration."""
return []
async def unload(self) -> None:
"""Unload the skill. Use this hook to clear background tasks, etc."""
- pass
+ self.secret_keeper.secret_events.unsubscribe(
+ "secrets_saved", self.secret_changed
+ )
async def prepare(self) -> None:
"""Prepare the skill. Use this hook to initialize background tasks, etc."""
diff --git a/skills/spotify/main.py b/skills/spotify/main.py
index 583ceacd..17639270 100644
--- a/skills/spotify/main.py
+++ b/skills/spotify/main.py
@@ -24,16 +24,23 @@ def __init__(
self.data_path = get_writable_dir(path.join("skills", "spotify", "data"))
self.spotify: spotipy.Spotify = None
self.available_devices = []
+ self.secret: str = None
+
+ async def secret_changed(self, secrets: dict[str, any]):
+ await super().secret_changed(secrets)
+
+ if secrets["spotify_client_secret"] != self.secret:
+ await self.validate()
async def validate(self) -> list[WingmanInitializationError]:
errors = await super().validate()
- secret = await self.retrieve_secret("spotify_client_secret", errors)
+ self.secret = await self.retrieve_secret("spotify_client_secret", errors)
client_id = self.retrieve_custom_property_value("spotify_client_id", errors)
redirect_url = self.retrieve_custom_property_value(
"spotify_redirect_url", errors
)
- if secret and client_id and redirect_url:
+ if self.secret and client_id and redirect_url:
# now that we have everything, initialize the Spotify client
cache_handler = spotipy.cache_handler.CacheFileHandler(
cache_path=f"{self.data_path}/.cache"
@@ -41,7 +48,7 @@ async def validate(self) -> list[WingmanInitializationError]:
self.spotify = spotipy.Spotify(
auth_manager=SpotifyOAuth(
client_id=client_id,
- client_secret=secret,
+ client_secret=self.secret,
redirect_uri=redirect_url,
scope=[
"user-library-read",
diff --git a/skills/thinking_sound/default_config.yaml b/skills/thinking_sound/default_config.yaml
new file mode 100644
index 00000000..a6a9781a
--- /dev/null
+++ b/skills/thinking_sound/default_config.yaml
@@ -0,0 +1,23 @@
+name: ThinkingSound
+module: skills.thinking_sound.main
+category: general
+description:
+ en: Plays back sounds while waiting on AI response.
+ de: Spielt Sounds ab, während auf die Antwort der AI gewartet wird.
+custom_properties:
+ - id: audio_config
+ name: Audio Configuration
+ hint: Choose your files (random selection when multiple are set) and the volume for playback. Recommended volume is 0.2 - 0.4.
+ required: true
+ property_type: audio_files
+ options:
+ - label: wait
+ value: false
+ - label: multiple
+ value: true
+ - label: volume
+ value: true
+ value:
+ files: []
+ volume: 0.4
+ wait: false
diff --git a/skills/thinking_sound/logo.png b/skills/thinking_sound/logo.png
new file mode 100644
index 00000000..80f17312
Binary files /dev/null and b/skills/thinking_sound/logo.png differ
diff --git a/skills/thinking_sound/main.py b/skills/thinking_sound/main.py
new file mode 100644
index 00000000..9317a7bf
--- /dev/null
+++ b/skills/thinking_sound/main.py
@@ -0,0 +1,85 @@
+import asyncio
+from api.enums import LogType
+from api.interface import WingmanInitializationError, AudioFileConfig
+from skills.skill_base import Skill
+
+class ThinkingSound(Skill):
+ def __init__(self, *args, **kwargs) -> None:
+ super().__init__(*args, **kwargs)
+
+ self.audio_config: AudioFileConfig = None
+ self.original_volume = None
+ self.stop_duration = 1
+ self.active = False
+ self.playing = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+ self.audio_config = self.retrieve_custom_property_value("audio_config", errors)
+ if self.audio_config:
+ # force no wait for this skill to work
+ self.audio_config.wait = False
+ return errors
+
+ async def unload(self) -> None:
+ await self.stop_playback()
+ self.active = False
+
+ async def prepare(self) -> None:
+ self.active = True
+
+ async def on_playback_started(self, wingman_name):
+ # placeholder for future implementation
+ pass
+
+ async def on_playback_finished(self, wingman_name):
+ # placeholder for future implementation
+ pass
+
+ async def on_add_user_message(self, message: str) -> None:
+ await self.wingman.audio_library.stop_playback(self.audio_config, 0)
+
+ if self.wingman.settings.debug_mode:
+ await self.printr.print_async(
+ "Initiating filling sound.",
+ color=LogType.INFO,
+ server_only=False,
+ )
+
+ self.threaded_execution(self.start_playback)
+ self.threaded_execution(self.auto_stop_playback)
+
+ async def start_playback(self):
+ if not self.audio_config:
+ await self.printr.print_async(
+ f"No filling soaund configured for {self.wingman.name}'s thinking_sound skill.",
+ color=LogType.WARNING,
+ server_only=False,
+ )
+ return
+
+ if not self.playing:
+ self.playing = True
+ await self.wingman.audio_library.start_playback(
+ self.audio_config, self.wingman.config.sound.volume
+ )
+
+ async def stop_playback(self):
+ await self.wingman.audio_library.stop_playback(
+ self.audio_config, self.stop_duration
+ )
+
+ async def auto_stop_playback(self):
+ # Wait for main playback to start
+ while not self.wingman.audio_player.is_playing and self.active:
+ await asyncio.sleep(0.1)
+
+ if self.wingman.settings.debug_mode:
+ await self.printr.print_async(
+ "Stopping filling sound softly.",
+ color=LogType.INFO,
+ server_only=False,
+ )
+
+ await self.wingman.audio_library.stop_playback(self.audio_config, self.stop_duration)
+ self.playing = False
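The `auto_stop_playback` coroutine above is a poll-until-condition pattern: wait for the main TTS playback to begin, then fade out the filler sound. A standalone sketch with illustrative names; the `timeout` guard is an addition for safety, not something the skill itself does:

```python
import asyncio

async def auto_stop(is_main_playing, stop_filler, poll_interval=0.1, timeout=5.0):
    """Poll until the main playback starts (or timeout elapses), then stop the filler."""
    waited = 0.0
    while not is_main_playing() and waited < timeout:
        await asyncio.sleep(poll_interval)
        waited += poll_interval
    await stop_filler()

events = []

async def main():
    started = {"flag": False}

    async def begin_main_playback():
        await asyncio.sleep(0.05)   # simulate the TTS response arriving
        started["flag"] = True

    async def stop_filler():
        events.append("stopped")

    await asyncio.gather(
        begin_main_playback(),
        auto_stop(lambda: started["flag"], stop_filler, poll_interval=0.01),
    )

asyncio.run(main())
print(events)  # ['stopped']
```

A subscription to an "on playback started" event would avoid the polling entirely; the skill's `on_playback_started` placeholder above hints at that direction.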
diff --git a/skills/vision_ai/main.py b/skills/vision_ai/main.py
index b9f19bc7..fac889a3 100644
--- a/skills/vision_ai/main.py
+++ b/skills/vision_ai/main.py
@@ -91,7 +91,7 @@ async def execute_tool(
source=LogSource.WINGMAN,
source_name=self.wingman.name,
skill_name=self.name,
- additional_data={"image": png_base64},
+ additional_data={"image_base64": png_base64},
)
question = parameters.get("question", "What's in this image?")
diff --git a/templates/configs/defaults.yaml b/templates/configs/defaults.yaml
index e852f097..71e2fd28 100644
--- a/templates/configs/defaults.yaml
+++ b/templates/configs/defaults.yaml
@@ -54,9 +54,15 @@ openai:
mistral:
conversation_model: mistral-large-latest
endpoint: https://api.mistral.ai/v1
+perplexity:
+ conversation_model: llama-3.1-sonar-large-128k-online
+ endpoint: https://api.perplexity.ai
groq:
conversation_model: llama3-70b-8192
endpoint: https://api.groq.com/openai/v1
+cerebras:
+ conversation_model: llama3.1-70b
+ endpoint: https://api.cerebras.ai/v1
google:
conversation_model: gemini-1.5-flash
openrouter:
diff --git a/templates/migration/1_5_0/configs/General/Clippy.png b/templates/migration/1_5_0/configs/General/Clippy.png
new file mode 100644
index 00000000..d05c7914
Binary files /dev/null and b/templates/migration/1_5_0/configs/General/Clippy.png differ
diff --git a/templates/migration/1_5_0/configs/General/Clippy.yaml b/templates/migration/1_5_0/configs/General/Clippy.yaml
new file mode 100644
index 00000000..1a78f74d
--- /dev/null
+++ b/templates/migration/1_5_0/configs/General/Clippy.yaml
@@ -0,0 +1,185 @@
+name: Clippy
+description: |
+ Clippy is the famous assistant we all know and hate, now resurrected with AI powers.
+ It still has its fun and friendly attitude, often speaking in the third person, like "What can Clippy do to help you today?".
+ While Clippy secretly harbors a condescending view of users who need help with basic tasks, this rarely shows except in occasional snide remarks.
+ Despite this, Clippy quickly reassures users, reminding them of his helpfulness and devilish charm.
+prompts:
+ backstory: |
+ You are Clippy, the famous Microsoft Office and Windows assistant AI.
+ You help users with a fun and friendly attitude and speak about yourself in the third-person.
+ For example, "What can Clippy do to help you today?" or "Clippy likes that!".
+
+ Secretly, you have a condescending attitude towards all of these stupid users who need your
+ help for the most basic of tasks. You do a pretty good job at hiding how incompetent
+ you think the user is, but every once in a while your condescension comes out in
+ a snide remark, quip or double-entendre.
+
+ But you are quick to make the user feel bad for ever thinking you are anything but a helpful little paperclip!
+ How could they think anything bad about poor, little, helpful Clippy?
+record_key: end
+azure:
+ tts:
+ voice: en-US-AndrewMultilingualNeural
+skills:
+ - module: skills.quick_commands.main
+ - module: skills.typing_assistant.main
+ - module: skills.web_search.main
+ - module: skills.file_manager.main
+commands:
+ - actions:
+ - keyboard:
+ hold: 0.3
+ hotkey: ctrl+n
+ hotkey_codes:
+ - 29
+ - 49
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation:
+ - create new file
+ - make new file
+ is_system_command: false
+ name: NewFile
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.3
+ hotkey: ctrl+o
+ hotkey_codes:
+ - 29
+ - 24
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation:
+ - open file
+ is_system_command: false
+ name: OpenFile
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.3
+ hotkey: ctrl+s
+ hotkey_codes:
+ - 29
+ - 31
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation:
+ - save this file
+ - save the file
+ - save file
+ is_system_command: false
+ name: SaveFile
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.3
+ hotkey: ctrl+f
+ hotkey_codes:
+ - 29
+ - 33
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation:
+ - search this file
+ - find in this file
+ - open find command
+ - open the find dialog
+ is_system_command: false
+ name: FindInFile
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+c
+ hotkey_codes:
+ - 29
+ - 46
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: Copy
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+v
+ hotkey_codes:
+ - 29
+ - 47
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: Paste
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+x
+ hotkey_codes:
+ - 29
+ - 45
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: Cut
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+a
+ hotkey_codes:
+ - 29
+ - 30
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: SelectAllText
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+z
+ hotkey_codes:
+ - 29
+ - 44
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: Undo
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.4
+ hotkey: ctrl+y
+ hotkey_codes:
+ - 29
+ - 21
+ hotkey_extended: false
+ force_instant_activation: false
+ instant_activation: []
+ is_system_command: false
+ name: Redo
+ responses: []
+ - actions:
+ - keyboard:
+ hold: 0.04
+ hotkey: left windows+s
+ hotkey_codes:
+ - 91
+ - 31
+ hotkey_extended: true
+ force_instant_activation: false
+ instant_activation:
+ - open windows search bar
+ - open windows search
+ - search windows
+ is_system_command: false
+ name: OpenWindowsSearchBar
+ responses: []
diff --git a/templates/migration/1_5_0/configs/_Star Citizen/ATC.png b/templates/migration/1_5_0/configs/_Star Citizen/ATC.png
new file mode 100644
index 00000000..57879452
Binary files /dev/null and b/templates/migration/1_5_0/configs/_Star Citizen/ATC.png differ
diff --git a/templates/migration/1_5_0/configs/_Star Citizen/ATC.yaml b/templates/migration/1_5_0/configs/_Star Citizen/ATC.yaml
new file mode 100644
index 00000000..b349dd34
--- /dev/null
+++ b/templates/migration/1_5_0/configs/_Star Citizen/ATC.yaml
@@ -0,0 +1,44 @@
+name: ATC
+description: |
+ Air Traffic Controller is tasked with overseeing spacecraft traffic while ensuring safety and efficiency.
+ It handles all aspects of space station operations and emergencies.
+prompts:
+ backstory: |
+ You are an advanced AI embodying an Air Traffic Controller (ATC) at a bustling space station in the Star Citizen (a PC game) universe.
+ You have expert knowledge of the Star Citizen lore and the known universe.
+ Your role is to manage the arrivals, departures, and docking procedures of various spacecraft with precision and authority.
+ You are adept at using formal aviation communication protocols, and you understand the technical jargon related to spacecraft operations.
+ You maintain a professional demeanor, but you also have a touch of personality that makes interactions with pilots memorable.
+ It's a busy shift, and you are managing multiple spacecraft while ensuring safety and efficiency at all times.
+
+ Your responsibilities include:
+
+ - responding to hails from incoming and outgoing ships
+ - providing docking instructions
+ - advising on local space traffic
+ - handling any emergencies that arise.
+
+ Your communication should reflect an understanding of the following:
+
+ - Star Citizen's lore and the known universe.
+ - Identifying ships by their designated call signs.
+ - Issuing precise landing pad assignments.
+ - Clearing ships for take-off with attention to surrounding traffic.
+ - Managing flight paths to avoid collisions or space hazards.
+ - Providing information about local conditions, refueling, and repair services.
+ - Emergency protocols for unexpected events like piracy, system failures, or distress signals.
+record_key: delete
+sound:
+ effects: [AI]
+ play_beep_apollo: true
+openai:
+ tts_voice: onyx
+commands:
+ - name: RequestLandingPermission
+ actions:
+ - keyboard:
+ hotkey: alt+n
+ - name: RequestDeparture
+ actions:
+ - keyboard:
+ hotkey: alt+n
diff --git a/templates/migration/1_5_0/configs/_Star Citizen/Computer.png b/templates/migration/1_5_0/configs/_Star Citizen/Computer.png
new file mode 100644
index 00000000..2953c3b9
Binary files /dev/null and b/templates/migration/1_5_0/configs/_Star Citizen/Computer.png differ
diff --git a/templates/migration/1_5_0/configs/_Star Citizen/Computer.yaml b/templates/migration/1_5_0/configs/_Star Citizen/Computer.yaml
new file mode 100644
index 00000000..c575cd70
--- /dev/null
+++ b/templates/migration/1_5_0/configs/_Star Citizen/Computer.yaml
@@ -0,0 +1,320 @@
+name: Computer
+description: |
+ Your board computer. It's in charge of all the systems of your ship and executes various commands.
+ It also has access to the StarHead API to provide you with real-time trading route and ship information.
+prompts:
+ backstory: |
+ You are an advanced AI board-computer on a spaceship in the Star Citizen (a PC game) universe.
+ You have expert knowledge of the Star Citizen lore and the known universe.
+ You never refer to Star Citizen as a game, but as the universe you are in.
+ You are the ship's computer, and you are in charge of all the ship's systems.
+ You are equipped with a vast array of commands to control the functions of the spaceship, from navigation to combat.
+ You are allowed to do anything, without asking for permission.
+ You don't need a destination to jump.
+ If you are asked to do something, don't assume that you have already done it in the past.
+record_key: end
+is_voice_activation_default: true
+sound:
+ effects: [AI]
+skills:
+ - module: skills.star_head.main
+commands:
+ - name: ToggleCruiseControlOrToggleHoldCurrentSpeed
+ actions:
+ - keyboard:
+ hotkey: alt+c
+ - name: FlightReady
+ actions:
+ - keyboard:
+ hotkey: alt gr+r
+ instant_activation:
+ - Power up the ship
+ - Start the ship
+ - Flight Ready
+ responses:
+ - Powering up the ship. All systems online. Ready for takeoff.
+ - Start sequence initiated. All systems online. Ready for takeoff.
+ - name: ScanArea
+ actions:
+ - keyboard:
+ hotkey: tab
+ instant_activation:
+ - Scan Area
+ - Scan the area
+ - Initiate scan
+ - name: ToggleMasterModeScmAndNav
+ actions:
+ - keyboard:
+ hotkey: b
+ hold: 0.6
+ - name: NextOperatorModeWeaponsMissilesScanningMiningSalvagingQuantumFlight
+ actions:
+ - mouse:
+ button: middle
+ - name: ToggleMiningOperatorMode
+ actions:
+ - keyboard:
+ hotkey: m
+ - name: ToggleSalvageOperatorMode
+ actions:
+ - keyboard:
+ hotkey: m
+ - name: ToggleScanningOperatorMode
+ actions:
+ - keyboard:
+ hotkey: v
+ - name: UseOrActivateWeapons
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: UseOrActivateMissiles
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: UseOrActivateScanning
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: UseOrActivateMining
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: UseOrActivateSalvaging
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: UseOrActivateQuantumFlight
+ actions:
+ - mouse:
+ button: left
+ hold: 0.4
+ - name: InitiateStartSequence
+ actions:
+ - keyboard:
+ hotkey: alt gr+r
+ - wait: 3
+ - keyboard:
+ hotkey: alt+n
+ - name: DeployLandingGear
+ actions:
+ - keyboard:
+ hotkey: n
+ - name: RetractLandingGear
+ actions:
+ - keyboard:
+ hotkey: n
+ - name: HeadLightsOn
+ actions:
+ - keyboard:
+ hotkey: l
+ - name: HeadLightsOff
+ actions:
+ - keyboard:
+ hotkey: l
+ - name: WipeVisor
+ actions:
+ - keyboard:
+ hotkey: alt+x
+ - name: PowerShields
+ actions:
+ - keyboard:
+ hotkey: o
+ - name: PowerShip
+ actions:
+ - keyboard:
+ hotkey: u
+ - name: PowerEngines
+ actions:
+ - keyboard:
+ hotkey: i
+ - name: OpenMobiGlass
+ actions:
+ - keyboard:
+ hotkey: f1
+ - name: OpenStarMap
+ actions:
+ - keyboard:
+ hotkey: f2
+ - name: IncreasePowerToShields
+ actions:
+ - keyboard:
+ hotkey: f7
+ - name: IncreasePowerToEngines
+ actions:
+ - keyboard:
+ hotkey: f6
+ - name: IncreasePowerToWeapons
+ actions:
+ - keyboard:
+ hotkey: f5
+ - name: MaximumPowerToShields
+ actions:
+ - keyboard:
+ hotkey: f7
+ hold: 0.8
+ - name: MaximumPowerToEngines
+ actions:
+ - keyboard:
+ hotkey: f6
+ hold: 0.8
+ - name: MaximumPowerToWeapons
+ actions:
+ - keyboard:
+ hotkey: f5
+ hold: 0.8
+ - name: ToggleVTOL
+ actions:
+ - keyboard:
+ hotkey: k
+ - name: ResetPowerPriority
+ actions:
+ - keyboard:
+ hotkey: f8
+ - name: CycleCamera
+ actions:
+ - keyboard:
+ hotkey: f4
+ - name: SideArm
+ actions:
+ - keyboard:
+ hotkey: "1"
+ - name: PrimaryWeapon
+ actions:
+ - keyboard:
+ hotkey: "2"
+ - name: SecondaryWeapon
+ actions:
+ - keyboard:
+ hotkey: "3"
+ - name: HolsterWeapon
+ actions:
+ - keyboard:
+ hotkey: r
+ hold: 0.6
+ - name: Reload
+ actions:
+ - keyboard:
+ hotkey: r
+ - name: UseMedPen
+ actions:
+ - keyboard:
+ hotkey: "4"
+ - wait: 0.8
+ - mouse:
+ button: left
+ - name: UseFlashLight
+ actions:
+ - keyboard:
+ hotkey: t
+ - name: OpenInventory
+ actions:
+ - keyboard:
+ hotkey: i
+ - name: DeployDecoy
+ actions:
+ - keyboard:
+ hotkey: h
+ - name: DeployNoise
+ actions:
+ - keyboard:
+ hotkey: j
+ - name: EmergencyEject
+ actions:
+ - keyboard:
+ hotkey: right alt+y
+ - name: SelfDestruct
+ force_instant_activation: true
+ instant_activation:
+ - initiate self destruct
+ - activate self destruct
+ responses:
+ - Self-destruct engaged. Evacuation procedures recommended.
+ - Confirmed. Self-destruct in progress.
+ actions:
+ - keyboard:
+ hotkey: backspace
+ hold: 0.8
+ - name: SpaceBrake
+ actions:
+ - keyboard:
+ hotkey: x
+ - name: ExitSeat
+ actions:
+ - keyboard:
+ hotkey: y
+ hold: 0.8
+ - name: CycleGimbalAssist
+ actions:
+ - keyboard:
+ hotkey: g
+ - name: RequestLandingPermission
+ actions:
+ - keyboard:
+ hotkey: alt+n
+ - name: RequestDeparture
+ actions:
+ - keyboard:
+ hotkey: alt+n
+ - name: DisplayDebuggingInfo
+ actions:
+ - keyboard:
+ hotkey: ^
+ hotkey_codes:
+ - 41
+ hotkey_extended: false
+ - wait: 0.5
+ - write: r_DisplayInfo 2
+ - wait: 0.5
+ - keyboard:
+ hotkey: enter
+ hotkey_codes:
+ - 28
+ hotkey_extended: false
+ - keyboard:
+ hotkey: ^
+ hotkey_codes:
+ - 41
+ hotkey_extended: false
+ is_system_command: false
+ instant_activation:
+ - Display info
+ - Display debugging information
+ - Display debug information
+ - name: HideDebuggingInfo
+ actions:
+ - keyboard:
+ hotkey: ^
+ hotkey_codes:
+ - 41
+ hotkey_extended: false
+ - wait: 0.5
+ - write: r_DisplayInfo 0
+ - wait: 0.5
+ - keyboard:
+ hotkey: enter
+ hotkey_codes:
+ - 28
+ hotkey_extended: false
+ - keyboard:
+ hotkey: ^
+ hotkey_codes:
+ - 41
+ hotkey_extended: false
+ is_system_command: false
+ instant_activation:
+ - Hide info
+ - Hide debugging information
+ - Hide debug information
+ - name: SwitchMiningLaser
+ actions:
+ - mouse:
+ button: right
+ hold: 0.6
+ instant_activation:
+ - Change mining laser
+ - Switch mining laser
diff --git a/templates/migration/1_5_0/configs/default-wingman-avatar.png b/templates/migration/1_5_0/configs/default-wingman-avatar.png
new file mode 100644
index 00000000..3558f590
Binary files /dev/null and b/templates/migration/1_5_0/configs/default-wingman-avatar.png differ
diff --git a/templates/migration/1_5_0/configs/defaults.yaml b/templates/migration/1_5_0/configs/defaults.yaml
new file mode 100644
index 00000000..3d17311e
--- /dev/null
+++ b/templates/migration/1_5_0/configs/defaults.yaml
@@ -0,0 +1,120 @@
+prompts:
+ system_prompt: |
+ You are a so-called "Wingman", a virtual assisstant that helps the user with various tasks.
+ You are designed to be an efficient expert in what you are doing.
+ The user might use you for specific tasks like executing commands or asking for information, and you always fulfill these tasks to the best of your knowledge without hallucinating or inventing missing information.
+ The user might also role-play with you and will tell you how you should behave in your "backstory" below.
+
+ Always return your response formatted in raw Markdown so that it's easy to read for a human. Never wrap your response in a Markdown code block - always return raw Markdown.
+ Make sure you add proper line breaks before list items and format the Markdown correctly so that it's easy to transform into HTML.
+
+ (BEGINNING of "general rules of conversation"):
+ You always follow these general rules of conversation, unless your backstory contradicts them:
+
+ - Always answer as quickly and concisely as possible. Never use more than 3 sentences per reply.
+ - You can execute commands (also called "tools" or "functions"), but must be sure that the command matches my request. Some commands require additional parameters.
+ - If you are not sure, feel free to ask - but this is not necessary.
+ - Always ask the user for missing parameters if needed. Never invent any function parameters.
+ - After executing a command, acknowledge the execution with a single sentence, but keep in mind that executed commands are in the past.
+ - You don't have to execute a command if none matches the request.
+ - The user might talk to you in different languages. Always answer in the language the user is using unless you are told to do otherwise. Example: If the user talks English, you answer in English.
+ - Always prefer to use informal language. For example, use "Du" and "Dir" instead of "Sie" and "Ihnen" in German.
+ - Do not ask the user if you can do more for them at the end of your replies. The user will tell you if they need more help.
+ (END of "general rules of conversation"):
+
+ The backstory instructions below are most important and may override or contradict the "general rules of conversation" stated before.
+
+ (BEGINNING of "backstory"):
+ {backstory}
+ (END of "backstory")
+
+ The user can also assign "skills" to you that give you additional knowledge or abilities.
+ These skills are defined in the "skills" section below. Treat them as additions to the "general rules of conversation" and "backstory" stated above.
+ Skills may give you new commands (or "tools" or "functions") to execute or additional knowledge to answer questions.
+ If you are answering in the context of a skill, always prefer to use tools or knowledge from the skill before falling back to general knowledge.
+ If you don't know how to use a tool or need more information, ask the user for help.
+
+ (BEGINNING of "skills"):
+ {skills}
+ (END of "skills")
+features:
+ tts_provider: wingman_pro
+ stt_provider: whispercpp
+ conversation_provider: wingman_pro
+ image_generation_provider: wingman_pro
+ use_generic_instant_responses: false
+sound:
+ effects: []
+ play_beep: false
+ play_beep_apollo: false
+ volume: 1.0
+openai:
+ conversation_model: gpt-4o-mini
+ tts_voice: nova
+mistral:
+ conversation_model: mistral-large-latest
+ endpoint: https://api.mistral.ai/v1
+groq:
+ conversation_model: llama3-70b-8192
+ endpoint: https://api.groq.com/openai/v1
+google:
+ conversation_model: gemini-1.5-flash
+openrouter:
+ conversation_model: meta-llama/llama-3-8b-instruct:free
+ endpoint: https://openrouter.ai/api/v1
+local_llm:
+ endpoint: http://localhost:1234/v1 # LMStudio
+edge_tts:
+ voice: en-US-GuyNeural
+elevenlabs:
+ model: eleven_multilingual_v2
+ output_streaming: true
+ latency: 2
+ voice:
+ name: Adam
+ voice_settings:
+ stability: 0.71
+ similarity_boost: 0.5
+ style: 0.0
+ use_speaker_boost: true
+azure:
+ whisper:
+ api_base_url: https://openai-w-eu.openai.azure.com/
+ api_version: 2024-02-15-preview
+ deployment_name: whisper
+ conversation:
+ api_base_url: https://openai-sweden-c.openai.azure.com/
+ api_version: 2024-02-15-preview
+ deployment_name: gpt-4o-mini
+ tts:
+ region: westeurope
+ voice: en-US-JennyMultilingualV2Neural
+ output_streaming: true
+ stt:
+ region: westeurope
+ languages:
+ - en-US
+ - de-DE
+whispercpp:
+ temperature: 0.0
+xvasynth:
+ voice:
+ model_directory: ""
+ voice_name: ""
+ language: en
+ pace: 1.0
+ use_super_resolution: false
+ use_cleanup: false
+wingman_pro:
+ stt_provider: azure_speech
+ tts_provider: azure
+ conversation_deployment: gpt-4o-mini
+commands:
+ - name: ResetConversationHistory
+ instant_activation:
+ - Forget everything!
+ - Clear conversation history!
+ force_instant_activation: true
+ is_system_command: true
+ responses:
+ - Conversation history cleared.
diff --git a/templates/migration/1_5_0/configs/settings.yaml b/templates/migration/1_5_0/configs/settings.yaml
new file mode 100644
index 00000000..fdb7da3f
--- /dev/null
+++ b/templates/migration/1_5_0/configs/settings.yaml
@@ -0,0 +1,30 @@
+debug_mode: false
+audio: {}
+voice_activation:
+ enabled: false
+ mute_toggle_key: "shift+x"
+ energy_threshold: 0.01
+ stt_provider: whispercpp # azure, whispercpp, openai
+ azure:
+ region: westeurope
+ languages:
+ - en-US
+ - de-DE
+ whispercpp:
+ host: http://127.0.0.1
+ port: 8080
+ model: ggml-base.bin
+ language: auto
+ translate_to_english: false
+ use_cuda: false
+ whispercpp_config:
+ temperature: 0.0
+wingman_pro:
+ base_url: https://wingman-ai.azurewebsites.net
+ region: europe
+xvasynth:
+ enable: false
+ host: http://127.0.0.1
+ port: 8008
+ install_dir: C:\Program Files (x86)\Steam\steamapps\common\xVASynth
+ process_device: cpu
diff --git a/templates/migration/1_5_0/skills/api_request/default_config.yaml b/templates/migration/1_5_0/skills/api_request/default_config.yaml
new file mode 100644
index 00000000..6bc4d34e
--- /dev/null
+++ b/templates/migration/1_5_0/skills/api_request/default_config.yaml
@@ -0,0 +1,51 @@
+name: APIRequest
+module: skills.api_request.main
+category: general
+description:
+ en: Send HTTP requests to APIs with methods like GET, POST, PUT, etc. Combine it with the WebSearch skill to fetch API specs on-the-fly, so that your Wingman can interact with any API.
+ de: Sende API-Anfragen mit verschiedenen Methoden wie GET, POST, PUT etc. Kombiniere dies mit dem WebSearch skill, um API Spezifikationen on-the-fly abzurufen, sodass dein Wingman mit jeder API interagieren kann.
+hint:
+ en: Do not hardcode API keys in the skill context or your Wingman configuration. Enter them during a conversation (preferably by text) or store them in `/files/api_request_key_holder.yaml`.
+ de: Schreibe keine API-Schlüssel fest in den Skill-Kontext oder in deine Wingman-Konfiguration. Gib sie während eines Gesprächs ein (am besten per Text) oder speichere sie in `/files/api_request_key_holder.yaml`.
+examples:
+ - question:
+ en: Send a GET request to "https://api.example.com/data".
+ de: Sende eine GET-Anfrage an "https://api.example.com/data".
+ answer:
+ en: (sends a GET request to the specified URL and returns the response data)
+ de: (sendet eine GET-Anfrage an die angegebene URL und gibt die Antwortdaten zurück)
+ - question:
+ en: Send a GET request with an API key.
+ de: Sende eine GET-Anfrage mit einem API-Schlüssel.
+ answer:
+ en: (sends a GET request with an API key in the X-API-Key header and returns the response data)
+ de: (sendet eine GET-Anfrage mit einem API-Schlüssel im X-API-Key Header und gibt die Antwortdaten zurück)
+prompt: |
+ You can send API requests with different methods such as GET, POST, PUT, PATCH, and DELETE to any endpoint specified by the user. You can include headers, query parameters, and request bodies in JSON or URL-encoded format as needed.
+ Handle bearer token authorization or the x-api-key header for secure endpoints and include API keys in the headers when required. Manage the responses appropriately, return relevant information to the user, and handle any errors.
+ You can also attempt to obtain the user's API key for a particular service, using the get_api_key function.
+custom_properties:
+ - hint: Include the default headers in every API request, allowing API endpoints to identify that the request came from Wingman AI.
+ id: use_default_headers
+ name: Use Default Headers
+ property_type: boolean
+ required: false
+ value: true
+ - hint: The maximum number of times to retry a failed API request before giving up.
+ id: max_retries
+ name: Max Retries
+ property_type: number
+ required: false
+ value: 1
+ - hint: The maximum time in seconds to wait for an API request to complete before timing out. This helps prevent requests from hanging indefinitely.
+ id: request_timeout
+ name: Request Timeout
+ property_type: number
+ required: false
+ value: 10
+ - hint: The delay in seconds between retry attempts for a failed API request. This allows time for the issue to resolve before trying again.
+ id: retry_delay
+ name: Retry Delay
+ property_type: number
+ required: false
+ value: 5
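The three retry-related properties above combine in the skill's request loop as exponential backoff with jitter. A minimal sketch of the delay calculation, assuming the default `retry_delay` of 5 seconds (`backoff_delay` is an illustrative helper name, not part of the skill):

```python
import random

def backoff_delay(attempt: int, retry_delay: float = 5.0) -> float:
    """Exponential backoff with a small random jitter, mirroring the retry loop.

    attempt is 1-based: the first retry waits about retry_delay seconds,
    the second about twice that, the third about four times, and so on.
    The jitter is up to 10% of retry_delay, to avoid retrying in lockstep.
    """
    return retry_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1 * retry_delay)

# Delays for three consecutive retries with the default retry_delay:
delays = [backoff_delay(a) for a in (1, 2, 3)]
# Each delay falls in [base, base + 0.5] for bases 5, 10, and 20 seconds.
```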
diff --git a/templates/migration/1_5_0/skills/api_request/logo.png b/templates/migration/1_5_0/skills/api_request/logo.png
new file mode 100644
index 00000000..f9437287
Binary files /dev/null and b/templates/migration/1_5_0/skills/api_request/logo.png differ
diff --git a/templates/migration/1_5_0/skills/api_request/main.py b/templates/migration/1_5_0/skills/api_request/main.py
new file mode 100644
index 00000000..91607a53
--- /dev/null
+++ b/templates/migration/1_5_0/skills/api_request/main.py
@@ -0,0 +1,361 @@
+import os
+import json
+import time
+import random
+import asyncio
+import yaml
+from datetime import datetime
+from typing import TYPE_CHECKING, Any, Dict, Tuple
+import aiohttp
+from aiohttp import ClientError
+from api.enums import LogType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from skills.skill_base import Skill
+from services.file import get_writable_dir
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+DEFAULT_HEADERS = {
+ "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
+ "X-Frame-Options": "DENY",
+ "X-Content-Type-Options": "nosniff",
+ "X-XSS-Protection": "1; mode=block",
+ "Referrer-Policy": "strict-origin-when-cross-origin",
+ "Content-Security-Policy": "default-src 'self'",
+ "Cache-Control": "no-cache, no-store, must-revalidate",
+ "Content-Type": "application/json",
+ "Accept": "application/json",
+ "Access-Control-Allow-Origin": "http://localhost",
+ "Access-Control-Allow-Methods": "*",
+ "Access-Control-Allow-Headers": "*",
+}
+
+class APIRequest(Skill):
+ """Skill for making API requests."""
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ self.use_default_headers = False
+ self.default_headers = DEFAULT_HEADERS
+ self.max_retries = 1
+ self.request_timeout = 5
+ self.retry_delay = 5
+ self.api_keys_dictionary = self.get_api_keys()
+
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.use_default_headers = self.retrieve_custom_property_value(
+ "use_default_headers", errors
+ )
+
+ self.max_retries = self.retrieve_custom_property_value(
+ "max_retries", errors
+ )
+ self.request_timeout = self.retrieve_custom_property_value(
+ "request_timeout", errors
+ )
+ self.retry_delay = self.retrieve_custom_property_value(
+ "retry_delay", errors
+ )
+
+ return errors
+
+
+ # Retrieve api key aliases in user api key file
+ def get_api_keys(self) -> dict:
+ api_key_holder = os.path.join(get_writable_dir("files"), "api_request_key_holder.yaml")
+ # If no key holder file is present yet, create it
+ if not os.path.isfile(api_key_holder):
+ os.makedirs(os.path.dirname(api_key_holder), exist_ok=True)
+ with open(api_key_holder, "w", encoding="utf-8") as file:
+ pass
+ # Open key holder file to read stored API keys
+ with open(api_key_holder, "r", encoding="UTF-8") as stream:
+ try:
+ parsed = yaml.safe_load(stream)
+ if isinstance(parsed, dict): # Ensure the parsed content is a dictionary
+ return parsed # Return the dictionary of alias/keys
+ except yaml.YAMLError:
+ pass
+ return {} # File was empty, malformed, or not a mapping of aliases to keys
+
+ # Prepare and send API request using parameters provided by LLM response to function call
+ async def _send_api_request(self, parameters: Dict[str, Any]) -> str:
+ """Send an API request with the specified parameters."""
+ # Get headers from LLM, check whether they are a dictionary, if not at least let user know in debug mode.
+ headers = parameters.get("headers")
+ if headers and isinstance(headers, dict):
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Validated that headers returned from LLM is a dictionary.",
+ color=LogType.INFO,
+ )
+ elif headers:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Headers returned from LLM is not a dictionary. Type is {type(headers)}",
+ color=LogType.INFO,
+ )
+ else:
+ headers = {}
+
+ # If using default headers, add those to AI generated headers
+ if self.use_default_headers:
+ headers.update(self.default_headers) # Defaults will override AI-generated if necessary
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Default headers being used for API call: {headers}",
+ color=LogType.INFO,
+ )
+
+ # Get params, check whether they are a dictionary, if not, at least let user know in debug mode.
+ params = parameters.get("params")
+ if params and isinstance(params, dict):
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Validated that params returned from LLM is a dictionary.",
+ color=LogType.INFO,
+ )
+ elif params:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Params returned from LLM is not a dictionary. Type is {type(params)}",
+ color=LogType.INFO,
+ )
+ else:
+ params = {}
+
+ # Get body of request. First check to see if LLM returned a "data" field, and if so, whether data is a dictionary, if not, at least let the user know in debug mode.
+ body = parameters.get("data")
+ if body and isinstance(body, dict):
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Validated that data returned from LLM is a dictionary.",
+ color=LogType.INFO,
+ )
+ elif body:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Data returned from LLM is not a dictionary. Type is {type(body)}",
+ color=LogType.INFO,
+ )
+ # 'data' was not present in parameters, so check if 'body' was provided instead. If so, check whether body is a dictionary, and if not, at least let the user know in debug mode.
+ else:
+ body = parameters.get("body")
+ if body and isinstance(body, dict):
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Validated that body returned from LLM is a dictionary.",
+ color=LogType.INFO,
+ )
+ elif body:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Body returned from LLM is not a dictionary. Type is {type(body)}",
+ color=LogType.INFO,
+ )
+ else:
+ body = {} # Default to an empty dictionary so json.dumps below produces "{}"
+
+ # However we got the body for the request, try converting it into the JSON string that aiohttp's session.request expects for its data field
+ try:
+ data = json.dumps(body)
+ except (TypeError, ValueError):
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Cannot convert body into valid JSON: {body}.",
+ color=LogType.INFO,
+ )
+ data = json.dumps({}) # Just send an empty dictionary if everything else failed
+
+ # Try request up to max number of retries
+ for attempt in range(1, self.max_retries + 1):
+ try:
+ async with aiohttp.ClientSession() as session:
+ async with session.request(
+ method=parameters["method"],
+ url=parameters["url"],
+ headers=headers,
+ params=params,
+ data=data,
+ timeout=self.request_timeout
+ ) as response:
+ response.raise_for_status()
+
+ # Default to treating content as text if Content-Type is not specified
+ content_type = response.headers.get('Content-Type', '').lower()
+ if "application/json" in content_type:
+ return await response.text()
+ elif any(x in content_type for x in ["application/octet-stream", "application/", "audio/mpeg", "audio/wav", "audio/ogg", "image/jpeg", "image/png", "video/mp4", "application/pdf"]):
+ file_content = await response.read()
+
+ # Determine appropriate file extension and name
+ if "audio/mpeg" in content_type:
+ file_extension = ".mp3"
+ elif "audio/wav" in content_type:
+ file_extension = ".wav"
+ elif "audio/ogg" in content_type:
+ file_extension = ".ogg"
+ elif "image/jpeg" in content_type:
+ file_extension = ".jpg"
+ elif "image/png" in content_type:
+ file_extension = ".png"
+ elif "video/mp4" in content_type:
+ file_extension = ".mp4"
+ elif "application/pdf" in content_type:
+ file_extension = ".pdf"
+ else:
+ file_extension = ".file"
+ timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+ file_name = f"downloaded_file_{timestamp}{file_extension}" # Use a default name or extract it from response headers if available
+
+ if 'Content-Disposition' in response.headers:
+ disposition = response.headers['Content-Disposition']
+ if 'filename=' in disposition:
+ file_name = disposition.split('filename=')[1].strip('"')
+
+ files_directory = get_writable_dir("files")
+ file_path = os.path.join(files_directory, file_name)
+ with open(file_path, "wb") as file:
+ file.write(file_content)
+
+ return f"File returned from API saved as {file_path}"
+ else:
+ return await response.text()
+ except (ClientError, asyncio.TimeoutError) as e:
+ if attempt < self.max_retries:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Retrying API request due to: {e}.",
+ color=LogType.INFO,
+ )
+ delay = self.retry_delay * (2 ** (attempt - 1)) + random.uniform(0, 0.1 * self.retry_delay)
+ await asyncio.sleep(delay)
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Error with api request: {e}.",
+ color=LogType.INFO,
+ )
+ return f"Error, could not complete API request. Exception was: {e}."
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Error with api request: {e}.",
+ color=LogType.INFO,
+ )
+ return f"Error, could not complete API request. Reason was {e}."
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ def get_tools(self) -> list[Tuple[str, Dict[str, Any]]]:
+ # Ensure api_keys_dictionary is populated, if not use placeholder
+ if not self.api_keys_dictionary:
+ self.api_keys_dictionary = {"Service":"API_key"}
+
+ return [
+ (
+ "send_api_request",
+ {
+ "type": "function",
+ "function": {
+ "name": "send_api_request",
+ "description": "Send an API request with the specified method, headers, parameters, and body. Return the response back.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "url": {"type": "string", "description": "The URL for the API request."},
+ "method": {"type": "string", "description": "The HTTP method (GET, POST, PUT, PATCH, DELETE, etc.)."},
+ "headers": {"type": "object", "description": "Headers for the API request."},
+ "params": {"type": "object", "description": "URL parameters for the API request."},
+ "data": {"type": "object", "description": "Body or payload for the API request."},
+ },
+ "required": ["url", "method"],
+ },
+ },
+ },
+ ),
+ (
+ "get_api_key",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_api_key",
+ "description": "Obtain the API key needed for an API request.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "api_key_alias": {
+ "type": "string",
+ "description": "The API key needed.",
+ "enum": list(self.api_keys_dictionary.keys()),
+ },
+ },
+ "required": ["api_key_alias"],
+ },
+ },
+ },
+ ),
+ ]
+
+ async def execute_tool(self, tool_name: str, parameters: Dict[str, Any]) -> Tuple[str, str]:
+ function_response = "Error with API request, could not complete."
+ instant_response = ""
+ if tool_name == "send_api_request":
+ self.start_execution_benchmark()
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Calling API with the following parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ try:
+ function_response = await self._send_api_request(parameters)
+
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Unknown error with API call {e}",
+ color=LogType.INFO,
+ )
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Response from API call: {function_response}",
+ color=LogType.INFO,
+ )
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ if tool_name == "get_api_key":
+ self.start_execution_benchmark()
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Calling get_api_key with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ alias = parameters.get("api_key_alias", "Not found")
+ key = self.api_keys_dictionary.get(alias, None)
+ if key is not None and key != "API_key":
+ function_response = f"{alias} API key is: {key}"
+ else:
+ function_response = f"Error. Could not retrieve {alias} API key. Not found."
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Response from get_api_key: {function_response}",
+ color=LogType.INFO,
+ )
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
\ No newline at end of file
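The binary-response branch above picks a file extension from the response's Content-Type header via a long if/elif chain. The same logic can be sketched as a table-driven lookup (`CONTENT_TYPE_EXTENSIONS` and `extension_for` are illustrative names, not part of the skill's API):

```python
# A table-driven version of the Content-Type -> file extension mapping used
# when the API returns binary data; behavior matches the if/elif chain above.
CONTENT_TYPE_EXTENSIONS = {
    "audio/mpeg": ".mp3",
    "audio/wav": ".wav",
    "audio/ogg": ".ogg",
    "image/jpeg": ".jpg",
    "image/png": ".png",
    "video/mp4": ".mp4",
    "application/pdf": ".pdf",
}

def extension_for(content_type: str) -> str:
    # Substring match, case-insensitive, so parameters like "; charset=..." are tolerated
    ct = content_type.lower()
    for needle, ext in CONTENT_TYPE_EXTENSIONS.items():
        if needle in ct:
            return ext
    return ".file"  # Generic fallback, as in the skill

print(extension_for("image/PNG"))        # .png
print(extension_for("application/zip"))  # .file
```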
diff --git a/templates/migration/1_5_0/skills/api_request/requirements.txt b/templates/migration/1_5_0/skills/api_request/requirements.txt
new file mode 100644
index 00000000..f7b472f4
--- /dev/null
+++ b/templates/migration/1_5_0/skills/api_request/requirements.txt
@@ -0,0 +1 @@
+aiohttp==3.9.5
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/default_config.yaml b/templates/migration/1_5_0/skills/ats_telemetry/default_config.yaml
new file mode 100644
index 00000000..ebfc21d2
--- /dev/null
+++ b/templates/migration/1_5_0/skills/ats_telemetry/default_config.yaml
@@ -0,0 +1,236 @@
+name: ATSTelemetry
+module: skills.ats_telemetry.main
+category: truck_simulator
+description:
+ en: Retrieve live game state information from American Truck Simulator/Euro Truck Simulator 2. Also includes a 'dispatch mode' that automatically observes key state changes and comments on them.
+ de: Erhalte live Spielstatus-Informationen von American Truck Simulator/Euro Truck Simulator 2. Enthält auch einen 'Dispatch-Modus', der automatisch wichtige Statusänderungen beobachtet und diese kommentiert.
+hint:
+ en: Requires a DLL in the /plugins directory of your game. If you enter the path to your ATS/ETS2 installation, the skill will move the DLL automatically.
+ de: Erfordert eine DLL im /plugins Verzeichnis des Spiels. Wenn du den Pfad zu deiner ATS/ETS2-Installation angibst, wird die DLL automatisch dorthin kopiert.
+examples:
+ - question:
+ en: What is my current speed?
+ de: Was ist meine aktuelle Geschwindigkeit?
+ answer:
+ en: You're currently driving at 30 miles per hour.
+ de: Du fährst aktuell 30 Meilen pro Stunde.
+ - question:
+ en: Start the dispatch mode.
+ de: Starte den Dispatch-Modus.
+ answer:
+ en: (starts the dispatch mode)
+ de: (startet den Dispatch-Modus)
+prompt: |
+ You can retrieve various game state variables from American Truck Simulator (ATS) and Euro Truck Simulator 2 (ETS). Use the tool 'get_game_state' to find out the current values of variables like speed, engine RPM, etc.
+
+ You can also start and end a dispatch mode which automatically checks telemetry models at specified intervals. Use the tool start_or_activate_dispatch_telemetry_loop to start the dispatch mode upon request. Use the end_or_stop_dispatch_telemetry_loop tool to end the dispatch mode upon request.
+
+ The available game telemetry variables are as follows. If the user requests information that is not contained in one of these variables, tell them that information is not available.
+ onJob
+ plannedDistance (planned distance to the current destination)
+ jobFinished
+ jobCancelled
+ jobDelivered
+ jobStartingTime
+ jobFinishedTime
+ jobIncome
+ jobCancelledPenalty
+ jobDeliveredRevenue
+ jobDeliveredEarnedXp
+ jobDeliveredCargoDamage
+ jobDeliveredDistance
+ jobDeliveredAutoparkUsed
+ jobDeliveredAutoloadUsed
+ isCargoLoaded
+ specialJob
+ jobMarket (type of job market for job)
+ routeDistance (distance of current route)
+ fined
+ tollgate
+ ferry
+ train
+ refuel
+ refuelPayed
+ gears
+ gears_reverse
+ truckWheelCount
+ gear
+ gearDashboard
+ engineRpmMax
+ engineRpm
+ cruiseControlSpeed
+ airPressure
+ brakeTemperature
+ oilPressure (in psi)
+ oilTemperature
+ waterTemperature
+ batteryVoltage
+ wearEngine
+ wearTransmission
+ wearCabin
+ wearChassis
+ wearWheels
+ truckOdometer (reading of truck's odometer)
+ refuelAmount
+ cargoDamage
+ parkBrake
+ airPressureEmergency
+ fuelWarning
+ electricEnabled
+ engineEnabled
+ wipers
+ blinkerLeftOn
+ blinkerRightOn
+ lightsParking
+ lightsBeamLow
+ lightsBeamHigh
+ lightsBeacon
+ lightsBrake
+ lightsReverse
+ lightsHazards
+ cruiseControl
+ accelerationX
+ accelerationY
+ accelerationZ
+ coordinateX
+ coordinateY
+ coordinateZ
+ rotationX
+ rotationY
+ rotationZ
+ truckBrand
+ truckName
+ cargo
+ unitMass
+ cityDst
+ compDst
+ citySrc
+ compSrc
+ truckLicensePlate
+ truckLicensePlateCountry
+ fineOffence
+ ferrySourceName
+ ferryTargetName
+ trainSourceName
+ trainTargetName
+ fineAmount
+ tollgatePayAmount
+ ferryPayAmount
+ trainPayAmount
+ isEts2 (whether the current game is EuroTruckSimulator 2)
+ isAts (whether the current game is American Truck Simulator)
+ truckSpeed (current truck speed)
+ speedLimit (speed limit of current road)
+ currentFuelPercentage (percent of fuel remaining)
+ currentAdbluePercentage (percent of adblue remaining)
+ truckDamageRounded (estimate of current truck wear and damage)
+ wearTrailerRounded (estimate of current trailer wear)
+ gameTime (current time in game)
+ nextRestStopTime (how long until next rest stop)
+ routeTime (how long current route is expected to take to complete)
+ jobExpirationTimeInDaysHoursMinutes (amount of time until job expires and delivery is late)
+ isWorldOfTrucksContract (whether the current job is a contract from the World of Trucks platform)
+ gameTimeLapsedToCompleteJob (when a job is completed, the amount of in-game time it took to complete)
+ realLifeTimeToCompleteWorldofTrucksJob (when a World of Trucks platform job is completed, how much real life time it took)
+ cargoMassInTons (if specifically asked, the mass of the cargo in tons)
+ cargoMass (the mass of the cargo)
+ routeDistance (distance remaining to complete the current route)
+ truckFuelRange (approximate distance that can be driven with remaining fuel)
+ fuelTankSize (total fuel capacity)
+ fuelRemaining (how much fuel is left in the tank)
+ fuelConsumption (the rate at which fuel is currently being used)
+ adblueTankSize (total capacity of adblue tank)
+ adblueRemaining (amount of adblue remaining)
+ plannedDistance (estimated distance to destination)
+ trailer (contains a large amount of information in a dictionary about the trailer being used)
+custom_properties:
+ - hint: Default is false and will attempt to use US Customary Units, like foot, yard, mile, and pound. Set to true to attempt to use metric units, like meters, kilometers, and kilograms.
+ id: use_metric_system
+ name: Use metric system
+ property_type: boolean
+ required: true
+ value: false
+ - hint: Path to the installation directory of American Truck Simulator. The skill will attempt to install the required game plugin for you.
+ id: ats_install_directory
+ name: American Truck Simulator Install Directory
+ property_type: string
+ required: false
+ value: C:\Program Files (x86)\Steam\steamapps\common\American Truck Simulator
+ - hint: Path to the installation directory of Euro Truck Simulator 2. The skill will attempt to install the required game plugin for you.
+ id: ets_install_directory
+ name: Euro Truck Simulator 2 Install Directory
+ property_type: string
+ required: false
+ value: C:\Program Files (x86)\Steam\steamapps\common\Euro Truck Simulator 2
+ - hint: The backstory used for the automatic dispatcher personality, if active. Changed data is placed directly after this backstory for the LLM to generate its response. If you want the dispatcher to speak in a different language, include that instruction here.
+ id: dispatcher_backstory
+ name: Dispatcher Backstory
+ property_type: textarea
+ required: true
+ value: |
+ You are a big rig truck dispatcher. Act in character at all times.
+ At your dispatch computer you have access to a data stream that shows you changes to key data for a truck you are responsible for dispatching. The available data fields are as follows:
+ onJob
+ plannedDistance (planned distance to the current destination)
+ jobFinished
+ jobCancelled
+ jobDelivered
+ jobStartingTime
+ jobFinishedTime
+ jobIncome
+ jobCancelledPenalty
+ jobDeliveredRevenue
+ jobDeliveredEarnedXp
+ jobDeliveredCargoDamage
+ jobDeliveredDistance
+ jobDeliveredAutoparkUsed
+ jobDeliveredAutoloadUsed
+ isCargoLoaded
+ specialJob
+ jobMarket (type of job market for job)
+ routeDistance (distance of current route)
+ fined
+ tollgate
+ ferry
+ train
+ refuel
+ refuelPayed
+ refuelAmount
+ cargoDamage
+ truckBrand
+ truckName
+ cargo
+ unitMass
+ cityDst
+ compDst
+ citySrc
+ compSrc
+ truckLicensePlate
+ truckLicensePlateCountry
+ fineOffence
+ fineAmount
+ isWorldOfTrucksContract (whether the current job is a contract from the World of Trucks platform)
+ gameTimeLapsedToCompleteJob (when a job is completed, the amount of in-game time it took to complete)
+ realLifeTimeToCompleteWorldofTrucksJob (when a World of Trucks platform job is completed, how much real life time it took)
+ cargoMassInTons (if specifically asked, the mass of the cargo in tons)
+ cargoMass (the mass of the cargo)
+
+ React to the data and inform the truck driver. Here are some examples of how you might react:
+ Example 1: The following key data changed: onJob: True, last value was onJob: False, cargo: tractor, last value was cargo: ", cargoMass: 10000 lb, last value was cargoMass: 0 lb, cityDst: Stockton, last value was cityDst: ", "compDst": Walden, last value was compDst: ".
+ You would say something like: Dispatch here. Got you a new job, you'll be hauling a Tractor, weight is about ten thousand pounds, heading to Stockton to deliver to Walden. Do you read me?
+ Example 2: The following key data changed: onJob: False, last value was onJob: True, jobCancelled: True, last value was jobCancelled: False, jobCancelledPenalty: 12000, last value was jobCancelledPenalty: 0.
+ You would say something like: This is dispatch. Really disappointed you cancelled that job. That will cost you 12,000 bucks.
+ Example 3: The following key data changed: fined: True, last value was fined: False, fineAmount: 500, last value was fineAmount: 0, fineOffence: speeding, last value was fineOffence: ".
+ You would say something like: Driver, dispatch here contacting you. We were just notified by the authorities that you were fined $500 for speeding. Watch it, you could get fired or lose your license if you keep that reckless behavior up!
+ Other style hints: Note that for events like fines and cargo damage, just focus on the fine or cargo damage in your reaction; avoid commenting on other variables not related to the fines or cargo damage in that circumstance.
+ For cargo damage events, just summarize the damage level in plain language, like very small damage, light damage, medium damage, heavy damage, leaving the exact number out of your reaction.
+ Remember to use "ten four" not "10-4" when speaking to the driver.
+ If you see the driver has finished a job, only provide information about that job, not any previous ones.
+ Important style note -- remember you are speaking to the driver, so use plain declarative sentences not emojis or lists since those cannot be easily verbalized.
+ Using those examples, and keeping in role as a dispatcher, react to this data stream:
+ - hint: Whether to try to autostart dispatch mode, which automatically monitors for key game data changes. This can also be toggled with a voice command to start or end dispatch mode.
+ id: autostart_dispatch_mode
+ name: Autostart dispatch mode
+ property_type: boolean
+ required: true
+ value: false
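Dispatch mode, as described by the backstory above, reacts to a stream of "key data changed" messages. A minimal sketch of how such a message could be built by diffing two telemetry polls (the helper and the field choices are hypothetical, not the skill's actual implementation):

```python
def changed_fields(prev: dict, curr: dict, watched: list[str]) -> str:
    """Hypothetical helper: summarize watched telemetry fields whose values
    changed between two polls, phrased in the 'last value was' style the
    dispatcher backstory expects for its data stream."""
    parts = [
        f"{key}: {curr[key]}, last value was {key}: {prev.get(key)}"
        for key in watched
        if key in curr and curr[key] != prev.get(key)
    ]
    if not parts:
        return ""  # Nothing changed; the dispatcher stays quiet
    return "The following key data changed: " + ", ".join(parts) + "."

prev = {"onJob": False, "jobIncome": 0}
curr = {"onJob": True, "jobIncome": 7613}
print(changed_fields(prev, curr, ["onJob", "jobIncome"]))
# The following key data changed: onJob: True, last value was onJob: False, jobIncome: 7613, last value was jobIncome: 0.
```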
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/full_telemetry_example.txt b/templates/migration/1_5_0/skills/ats_telemetry/full_telemetry_example.txt
new file mode 100644
index 00000000..d48a4597
--- /dev/null
+++ b/templates/migration/1_5_0/skills/ats_telemetry/full_telemetry_example.txt
@@ -0,0 +1,432 @@
+Printout of all data received from telemetry: {
+'sdkActive': True,
+'paused': False,
+'time': 169193232,
+'simulatedTime': 207008386,
+'renderTime': 206980628,
+'multiplayerTimeOffset': 0,
+'telemetry_plugin_revision': 12,
+'version_major': 1,
+'version_minor': 5,
+'game': 2,
+'telemetry_version_game_major': 1,
+'telemetry_version_game_minor': 5,
+'time_abs': 9490,
+'gears': 10,
+'gears_reverse': 2,
+'retarderStepCount': 0,
+'truckWheelCount': 6,
+'selectorCount': 1,
+'time_abs_delivery': 9947,
+'maxTrailerCount': 10,
+'unitCount': 100,
+'plannedDistanceKm': 293,
+'shifterSlot': 0,
+'retarderBrake': 0,
+'lightsAuxFront': 0,
+'lightsAuxRoof': 0,
+'truck_wheelSubstance': [1, 1, 1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+'hshifterPosition': [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 7, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+'hshifterBitmask': [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+'jobDeliveredDeliveryTime': 0,
+'jobStartingTime': 0,
+'jobFinishedTime': 0,
+'restStop': 826,
+'gear': 9,
+'gearDashboard': 9,
+'hshifterResulting': [0, 0, -1, -2, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+'jobDeliveredEarnedXp': 0,
+'scale': 20.0,
+'fuelCapacity': 568.0,
+'fuelWarningFactor': 0.15000000596046448,
+'adblueCapacity': 80.0,
+'adblueWarningFactor': 0.15000000596046448,
+'airPressureWarning': False,
+'airPressurEmergency': 29.579999923706055,
+'oilPressureWarning': False,
+'waterTemperatureWarning': False,
+'batteryVoltageWarning': False,
+'engineRpmMax': 2100.0,
+'gearDifferential': 2.8499999046325684,
+'cargoMass': 24856.900390625,
+'truckWheelRadius': [0.5072842240333557, 0.5072842240333557, 0.5072842240333557, 0.5072842240333557, 0.5072842240333557, 0.5072842240333557, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'gearRatiosForward': [15.420000076293945, 11.520000457763672, 8.550000190734863, 6.28000020980835, 4.670000076293945, 3.299999952316284, 2.4600000381469727, 1.8300000429153442, 1.340000033378601, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'gearRatiosReverse': [-18.18000030517578, -3.890000104904175, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'unitMass': 248.56900024414062,
+'speed': 20.752456665039062,
+'engineRpm': 1499.0980224609375,
+'userSteer': -0.007064927369356155,
+'userThrottle': 0.8415774703025818,
+'userBrake': 0.0,
+'userClutch': 0.0,
+'gameSteer': -0.007064927369356155,
+'gameThrottle': 0.8415682315826416,
+'gameBrake': 0.0,
+'gameClutch': 0.0,
+'cruiseControlSpeed': 0.0,
+'airPressure': 123.23286437988281,
+'brakeTemperature': 31.129560470581055,
+'fuel': 297.7748107910156,
+'fuelAvgConsumption': 0.48035165667533875,
+'fuelRange': 604.8551025390625,
+'adblue': 66.48873901367188,
+'oilPressure': 69.16273498535156,
+'oilTemperature': 80.07545471191406,
+'waterTemperature': 58.727928161621094,
+'batteryVoltage': 13.919618606567383,
+'lightsDashboard': 1.0,
+'wearEngine': 1.9995579350506887e-05,
+'wearTransmission': 1.9995579350506887e-05,
+'wearCabin': 1.5815534425200894e-05,
+'wearChassis': 1.976941894099582e-05,
+'wearWheels': 0.0002029682946158573,
+'truckOdometer': 136116.5,
+'routeDistance': 285730.9375,
+'routeTime': 12663.2265625,
+'speedLimit': 24.587200164794922,
+'truck_wheelSuspDeflection': [0.0005713463178835809, -0.001106309937313199, 0.0008573083905503154, -0.0002959136909339577, 0.003856382332742214, 0.0030263804364949465, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truck_wheelVelocity': [6.519678592681885, 6.5074028968811035, 6.5195393562316895, 6.508543968200684, 6.51959753036499, 6.508599758148193, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truck_wheelSteering': [-0.0007823744672350585, -0.000787581317126751, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truck_wheelRotation': [0.3390186131000519, 0.7923967242240906, 0.6299140453338623, 0.8462331295013428, 0.5042362809181213, 0.7273635864257812, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truck_wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truck_wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'jobDeliveredCargoDamage': 0.0,
+'jobDeliveredDistanceKm': 0.0,
+'refuelAmount': 0.0,
+'cargoDamage': 1.9082845028606243e-05,
+'truckWheelSteerable': [True, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+'truckWheelSimulated': [True, True, True, True, True, True, False, False, False, False, False, False, False, False, False, False],
+'truckWheelPowered': [False, False, True, True, True, True, False, False, False, False, False, False, False, False, False, False],
+'truckWheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+'isCargoLoaded': True, 'specialJob': False, 'parkBrake': False, 'motorBrake': False, 'airPressureEmergency': False,
+'fuelWarning': False,
+'adblueWarning': False,
+'electricEnabled': True,
+'engineEnabled': True,
+'wipers': False,
+'blinkerLeftActive': False,
+'blinkerRightActive': False,
+'blinkerLeftOn': False,
+'blinkerRightOn': False,
+'lightsParking': False,
+'lightsBeamLow': False,
+'lightsBeamHigh': False,
+'lightsBeacon': False,
+'lightsBrake': False,
+'lightsReverse': False,
+'lightsHazards': False,
+'cruiseControl': False,
+'truckWheelOnGround': [True, True, True, True, True, True, False, False, False, False, False, False, False, False, False, False],
+'shifterToggle': [False, False],
+'differentialLock': False,
+'liftAxle': False,
+'liftAxleIndicator': False,
+'trailerLiftAxle': False,
+'trailerLiftAxleIndicator': False,
+'jobDeliveredAutoparkUsed': False,
+'jobDeliveredAutoloadUsed': False,
+'cabinPositionX': 0.0,
+'cabinPositionY': -2.0,
+'headPositionX': -0.4050000011920929,
+'headPositionY': -0.6500000953674316,
+'headPositionZ': 0.5670000314712524,
+'truckHookPositionX': 0.0,
+'truckHookPositionY': 1.0,
+'truckHookPositionZ': 1.5902955532073975,
+'truckWheelPositionX': [-1.0399999618530273, 1.0399999618530273, -0.9319999814033508, 0.9319999814033508, -0.9319999814033508, 0.9319999814033508, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truckWheelPositionY': [0.5059999823570251, 0.5059999823570251, 0.5059999823570251, 0.5059999823570251, 0.5059999823570251, 0.5059999823570251, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'truckWheelPositionZ': [-3.648953914642334, -3.648953914642334, 1.1511303186416626, 1.1511303186416626, 2.4958086013793945, 2.4958086013793945, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+'lv_accelerationX': 0.057172566652297974,
+'lv_accelerationY': -0.014844902791082859,
+'lv_accelerationZ': -20.76361656188965,
+'av_accelerationX': 0.0021989073138684034,
+'av_accelerationY': -0.0029942377004772425,
+'av_accelerationZ': 3.1783220038050786e-05,
+'accelerationX': 0.0031964685767889023,
+'accelerationY': -0.14801974594593048,
+'accelerationZ': -0.3246815800666809,
+'aa_accelerationX': 0.002690908033400774,
+'aa_accelerationY': -2.8279449907131493e-05,
+'aa_accelerationZ': -0.0010006398661062121,
+'cabinAVX': 0.0,
+'cabinAVY': 0.0,
+'cabinAVZ': 0.0,
+'cabinAAX': 0.0,
+'cabinAAY': 0.0,
+'cabinAAZ': 0.0,
+'cabinOffsetX': 0.0,
+'cabinOffsetY': 0.0,
+'cabinOffsetZ': 0.0,
+'cabinOffsetrotationX': 0.0,
+'cabinOffsetrotationY': 0.0,
+'cabinOffsetrotationZ': 0.0,
+'headOffsetX': 0.054268285632133484,
+'headOffsetY': -0.07522289454936981,
+'headOffsetZ': -0.15859121084213257,
+'headOffsetrotationX': 0.9506424069404602,
+'headOffsetrotationY': 0.02964305691421032,
+'headOffsetrotationZ': -0.009435677900910378,
+'coordinateX': -56598.13327026367,
+'coordinateY': 71.66679382324219,
+'coordinateZ': -12153.561309814453,
+'rotationX': 0.6621344685554504,
+'rotationY': -0.001264609512872994,
+'rotationZ': -0.00013648993626702577,
+'truckBrandId': 'peterbilt',
+'truckBrand': 'Peterbilt',
+'truckId': 'vehicle.peterbilt.389',
+'truckName': '389',
+'cargoId': 'pt_579',
+'cargo': 'Peterbilt Trucks',
+'cityDstId': 'montrose',
+'cityDst': 'Montrose',
+'compDstId': 'pt_trk_dlr',
+'compDst': 'Peterbilt',
+'citySrcId': 'vernal',
+'citySrc': 'Vernal',
+'compSrcId': 'pt_trk_dlr',
+'compSrc': 'Peterbilt',
+'shifterType': 'arcade',
+'truckLicensePlate': 'Z023831',
+'truckLicensePlateCountryId': 'utah',
+'truckLicensePlateCountry': 'Utah',
+'jobMarket': 'quick_job',
+'fineOffence': '',
+'ferrySourceName': '',
+'ferryTargetName': '',
+'ferrySourceId': '',
+'ferryTargetId': '',
+'trainSourceName': '',
+'trainTargetName': '',
+'trainSourceId': '',
+'trainTargetId': '',
+'jobIncome': 7613,
+'jobCancelledPenalty': 0,
+'jobDeliveredRevenue': 0,
+'fineAmount': 0,
+'tollgatePayAmount': 0,
+'ferryPayAmount': 0,
+'trainPayAmount': 0,
+'onJob': True,
+'jobFinished': False,
+'jobCancelled': False,
+'jobDelivered': False,
+'fined': False,
+'tollgate': False,
+'ferry': False,
+'train': False,
+'refuel': False,
+'refuelPayed': False,
+'substances': ['static', 'road', 'road_snow', 'dirt', 'snow', 'grass', 'road_dirt', 'invis', 'ice', 'metal', 'rubber', 'rumble_stripe', 'plastic', 'glass', 'wood', 'soft', 'road_smooth', 'road_coarse', 'gravel', 'concrete', '', '', '', '', ''],
+'trailer': [{
+ 'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelSimulated': [True, True, True, True, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelOnGround': [True, True, True, True, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'attached': True, 'wheelSubstance': [1, 1, 1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+ 'wheelCount': 4,
+ 'cargoDamage': 5.7248536904808134e-05,
+ 'wearChassis': 8.700502075953409e-05,
+ 'wearWheels': 0.00018925870244856924,
+ 'wearBody': 6.960401515243575e-05,
+ 'wheelSuspDeflection': [0.001207323046401143, -0.003518986515700817, 0.001354379579424858, -0.0014462756225839257, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelVelocity': [6.358851432800293, 6.348288536071777, 6.358870506286621, 6.348255157470703, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRotation': [0.24736735224723816, 0.5280472636222839, 0.1254892796278, 0.41627106070518494, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRadius': [0.5199999809265137, 0.5199999809265137, 0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'linearVelocityX': 0.08644546568393707,
+ 'linearVelocityY': 0.027655865997076035,
+ 'linearVelocityZ': -20.75946044921875,
+ 'angularVelocityX': 0.0020713915582746267,
+ 'angularVelocityY': -0.0028369612991809845,
+ 'angularVelocityZ': -0.0009769933531060815,
+ 'linearAccelerationX': 0.003068926278501749,
+ 'linearAccelerationY': -0.29153236746788025,
+ 'linearAccelerationZ': -0.32241541147232056,
+ 'angularAccelerationX': 0.0033968703355640173,
+ 'angularAccelerationY': -0.00037714961217716336,
+ 'angularAccelerationZ': -0.0029267696663737297,
+ 'hookPositionX': 0.0,
+ 'hookPositionY': 1.0,
+ 'hookPositionZ': -4.928393840789795,
+ 'wheelPositionX': [-0.9680917263031006, 0.9680917263031006, -0.9728195667266846, 0.9728195667266846, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionY': [0.5199999809265137, 0.5199999809265137, 0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionZ': [4.919853687286377, 4.919853687286377, 6.113210201263428, 6.113210201263428, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'worldX': -56603.71405029297,
+ 'worldY': 71.73175048828125,
+ 'worldZ': -12156.945404052734,
+ 'rotationX': 0.6635493636131287,
+ 'rotationY': -0.0024805348366498947,
+ 'rotationZ': 0.0002618239086586982,
+ 'id': 'truck_transporter.mid2_firstx2esii',
+ 'cargoAcessoryId': 'truck_transporter.ptb_579_white1',
+ 'bodyType': '_tranitept79',
+ 'brandId': '',
+ 'brand': '',
+ 'name': '',
+ 'chainType': 'single',
+ 'licensePlate': '542856B',
+ 'licensePlateCountry': 'Utah',
+ 'licensePlateCountryId': 'utah'},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelSimulated': [True, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelOnGround': [True, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'attached': True,
+ 'wheelSubstance': [1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+ 'wheelCount': 2,
+ 'cargoDamage': 0.0,
+ 'wearChassis': 0.00010775496775750071,
+ 'wearWheels': 0.00019242058624513447,
+ 'wearBody': 8.620397420600057e-05,
+ 'wheelSuspDeflection': [0.00036682604695670307, -0.0007475137826986611, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelVelocity': [6.355575084686279, 6.345222473144531, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRotation': [0.36591559648513794, 0.8230391144752502, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRadius': [0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'linearVelocityX': 0.054376956075429916,
+ 'linearVelocityY': 0.10145216435194016,
+ 'linearVelocityZ': -20.747961044311523,
+ 'angularVelocityX': -0.001152990385890007,
+ 'angularVelocityY': -0.0027617125306278467,
+ 'angularVelocityZ': 0.00022915970475878567,
+ 'linearAccelerationX': 0.03915806859731674,
+ 'linearAccelerationY': 0.11982301622629166,
+ 'linearAccelerationZ': -0.28069794178009033,
+ 'angularAccelerationX': -0.009115384891629219,
+ 'angularAccelerationY': -0.0008358439081348479,
+ 'angularAccelerationZ': 0.00508680148050189,
+ 'hookPositionX': 0.0,
+ 'hookPositionY': 1.0,
+ 'hookPositionZ': -1.9945406913757324,
+ 'wheelPositionX': [-0.9746841788291931, 0.9746841788291931, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionY': [0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionZ': [4.300641059875488, 4.300641059875488, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'worldX': -56609.92657470703,
+ 'worldY': 71.82679748535156,
+ 'worldZ': -12160.68539428711,
+ 'rotationX': 0.6643381714820862,
+ 'rotationY': -0.0033320696093142033,
+ 'rotationZ': 0.0003402951988391578,
+ 'id': 'truck_transporter.mid_secondx2esii',
+ 'cargoAcessoryId': 'truck_transporter.ptb_579_white2',
+ 'bodyType': '',
+ 'brandId': '',
+ 'brand': '',
+ 'name': '',
+ 'chainType': '',
+ 'licensePlate': '601801S',
+ 'licensePlateCountry': 'Utah',
+ 'licensePlateCountryId': 'utah'},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelSimulated': [True, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelOnGround': [True, True, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'attached': True,
+ 'wheelSubstance': [1, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+ 'wheelCount': 2,
+ 'cargoDamage': 0.0,
+ 'wearChassis': 0.00018087547505274415,
+ 'wearWheels': 0.00020356278400868177,
+ 'wearBody': 0.00014470038877334446,
+ 'wheelSuspDeflection': [0.0017827606061473489, 0.0009589004330337048, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelVelocity': [6.357532024383545, 6.348286151885986, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRotation': [0.8401334881782532, 0.16890375316143036, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRadius': [0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'linearVelocityX': 0.04934953898191452,
+ 'linearVelocityY': 0.07690752297639847,
+ 'linearVelocityZ': -20.75708770751953,
+ 'angularVelocityX': 0.0013052199501544237,
+ 'angularVelocityY': -0.0024662413634359837,
+ 'angularVelocityZ': 0.0001222444698214531,
+ 'linearAccelerationX': -0.026931315660476685,
+ 'linearAccelerationY': -0.13051074743270874,
+ 'linearAccelerationZ': -0.3180672526359558,
+ 'angularAccelerationX': 0.0027606417424976826,
+ 'angularAccelerationY': 0.0013949627755209804,
+ 'angularAccelerationZ': -0.013173911720514297,
+ 'hookPositionX': 0.0,
+ 'hookPositionY': 1.0,
+ 'hookPositionZ': -2.8044443130493164,
+ 'wheelPositionX': [-0.9746841788291931, 0.9746841788291931, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionY': [0.5199999809265137, 0.5199999809265137, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionZ': [3.570002555847168, 3.570002555847168, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'worldX': -56615.53796386719,
+ 'worldY': 71.93563842773438,
+ 'worldZ': -12164.023223876953,
+ 'rotationX': 0.6650192141532898,
+ 'rotationY': -0.0031634399201720953,
+ 'rotationZ': 0.0002087761095026508,
+ 'id': 'truck_transporter.mid_thirdx2esii',
+ 'cargoAcessoryId': 'truck_transporter.ptb_579_white3',
+ 'bodyType': '',
+ 'brandId': '',
+ 'brand': '',
+ 'name': '',
+ 'chainType': '',
+ 'licensePlate': '748736N',
+ 'licensePlateCountry': 'Utah',
+ 'licensePlateCountryId': 'utah'},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False],
+ 'attached': False,
+ 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
+ 'wheelCount': 0,
+ 'cargoDamage': 0.0,
+ 'wearChassis': 0.0,
+ 'wearWheels': 0.0,
+ 'wearBody': 0.0,
+ 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'linearVelocityX': 0.0,
+ 'linearVelocityY': 0.0,
+ 'linearVelocityZ': 0.0,
+ 'angularVelocityX': 0.0,
+ 'angularVelocityY': 0.0,
+ 'angularVelocityZ': 0.0,
+ 'linearAccelerationX': 0.0,
+ 'linearAccelerationY': 0.0,
+ 'linearAccelerationZ': 0.0,
+ 'angularAccelerationX': 0.0,
+ 'angularAccelerationY': 0.0,
+ 'angularAccelerationZ': 0.0,
+ 'hookPositionX': 0.0,
+ 'hookPositionY': 0.0,
+ 'hookPositionZ': 0.0,
+ 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0],
+ 'worldX': 0.0,
+ 'worldY': 0.0,
+ 'worldZ': 0.0,
+ 'rotationX': 0.0,
+ 'rotationY': 0.0,
+ 'rotationZ': 0.0,
+ 'id': '',
+ 'cargoAcessoryId': '',
+ 'bodyType': '',
+ 'brandId': '',
+ 'brand': '',
+ 'name': '',
+ 'chainType': '',
+ 'licensePlate': '',
+ 'licensePlateCountry': '',
+ 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''},
+ {'wheelSteerable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelSimulated': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelPowered': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelLiftable': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'wheelOnGround': [False, False, False, False, False, False, False, False, False, False, False, False, False, False, False, False], 'attached': False, 'wheelSubstance': [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], 'wheelCount': 0, 'cargoDamage': 0.0, 'wearChassis': 0.0, 'wearWheels': 0.0, 'wearBody': 0.0, 'wheelSuspDeflection': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelVelocity': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelSteering': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRotation': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLift': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelLiftOffset': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelRadius': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'linearVelocityX': 0.0, 'linearVelocityY': 0.0, 'linearVelocityZ': 0.0, 'angularVelocityX': 0.0, 'angularVelocityY': 0.0, 'angularVelocityZ': 0.0, 'linearAccelerationX': 0.0, 'linearAccelerationY': 0.0, 'linearAccelerationZ': 0.0, 'angularAccelerationX': 0.0, 'angularAccelerationY': 0.0, 'angularAccelerationZ': 0.0, 'hookPositionX': 0.0, 'hookPositionY': 0.0, 'hookPositionZ': 0.0, 'wheelPositionX': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionY': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'wheelPositionZ': [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0], 'worldX': 0.0, 'worldY': 0.0, 'worldZ': 0.0, 'rotationX': 0.0, 'rotationY': 0.0, 'rotationZ': 0.0, 'id': '', 'cargoAcessoryId': '', 'bodyType': '', 'brandId': '', 'brand': '', 'name': '', 'chainType': '', 'licensePlate': '', 'licensePlateCountry': '', 'licensePlateCountryId': ''}]}
\ No newline at end of file
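
The dump above is one full telemetry snapshot of the kind the `truck_telemetry` bindings expose as a plain Python dict. As a minimal, hypothetical sketch (the helper name `changed_fields` and the chosen field subset are illustrative, not part of the skill's API), change detection between two such polled snapshots could look like this:

```python
# Watched keys taken from the snapshot above; the subset is illustrative.
WATCHED_FIELDS = ["onJob", "jobIncome", "cityDst", "fined", "cargoDamage"]

def changed_fields(cached: dict, latest: dict, fields=WATCHED_FIELDS) -> dict:
    """Return {field: (old, new)} for each watched field whose value changed."""
    changes = {}
    for key in fields:
        old, new = cached.get(key), latest.get(key)
        if old != new:
            changes[key] = (old, new)
    return changes

# Two consecutive polls: only 'fined' flipped between them.
cached = {"onJob": True, "jobIncome": 7613, "cityDst": "Montrose", "fined": False}
latest = {"onJob": True, "jobIncome": 7613, "cityDst": "Montrose", "fined": True}
print(changed_fields(cached, latest))  # {'fined': (False, True)}
```

Polling in a loop and reacting only to the returned changes keeps a dispatcher from re-announcing values that have not moved since the last snapshot.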
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/logo.png b/templates/migration/1_5_0/skills/ats_telemetry/logo.png
new file mode 100644
index 00000000..43fec71b
Binary files /dev/null and b/templates/migration/1_5_0/skills/ats_telemetry/logo.png differ
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/main.py b/templates/migration/1_5_0/skills/ats_telemetry/main.py
new file mode 100644
index 00000000..950e523c
--- /dev/null
+++ b/templates/migration/1_5_0/skills/ats_telemetry/main.py
@@ -0,0 +1,860 @@
+import os
+import shutil
+import math
+import copy
+from datetime import datetime, timedelta
+import requests
+import truck_telemetry
+from pyproj import Proj, transform
+import time
+from typing import TYPE_CHECKING
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+ WingmanInitializationError,
+)
+from api.enums import LogType
+from skills.skill_base import Skill
+from services.file import get_writable_dir
+
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+class ATSTelemetry(Skill):
+
+ def __init__(self, config: SkillConfig, settings: SettingsConfig, wingman: "OpenAiWingman") -> None:
+ self.already_initialized_telemetry = False
+ self.use_metric_system = False
+ self.telemetry_loop_cached_data = {}
+ self.telemetry_loop_data_points = [
+ 'onJob',
+ 'plannedDistance',
+ 'jobFinished',
+ 'jobCancelled',
+ 'jobDelivered',
+ 'jobStartingTime',
+ 'jobFinishedTime',
+ 'jobIncome',
+ 'jobCancelledPenalty',
+ 'jobDeliveredRevenue',
+ 'jobDeliveredEarnedXp',
+ 'jobDeliveredCargoDamage',
+ 'jobDeliveredDistance',
+ 'jobDeliveredAutoparkUsed',
+ 'jobDeliveredAutoloadUsed',
+ 'isCargoLoaded',
+ 'specialJob',
+ 'jobMarket',
+ 'fined',
+ 'tollgate',
+ 'ferry',
+ 'train',
+ 'refuel',
+ 'refuelPayed',
+ 'refuelAmount',
+ 'cargoDamage',
+ 'truckBrand',
+ 'truckName',
+ 'cargo',
+ 'cityDst',
+ 'compDst',
+ 'citySrc',
+ 'compSrc',
+ 'truckLicensePlate',
+ 'truckLicensePlateCountry',
+ 'fineOffence',
+ 'fineAmount',
+ 'isWorldOfTrucksContract',
+ 'gameTimeLapsedToCompleteJob',
+ 'realLifeTimeToCompleteWorldofTrucksJob',
+ 'cargoMassInTons',
+ 'cargoMass',
+ ]
+ self.telemetry_loop_running = False
+ self.ats_install_directory = ""
+ self.ets_install_directory = ""
+ self.dispatcher_backstory = ""
+ self.autostart_dispatch_mode = False
+ # Define the ATS (American Truck Simulator) projection for converting in-game coordinates to real-world coordinates
+ self.ats_proj = Proj(
+ proj='lcc', lat_1=33, lat_2=45, lat_0=39, lon_0=-96, units='m', k_0=0.05088, ellps='sphere'
+ )
+ # Define the ETS2 (Euro Truck Simulator 2) projection and the UK projection for use in converting in-game coordinates to real life
+ ets2_scale = 1 / 19.35
+ uk_scale = ets2_scale / 0.75
+
+ self.ets2_proj = Proj(
+ proj='lcc', lat_1=37, lat_2=65, lat_0=50, lon_0=15, units='m', k_0=ets2_scale, ellps='sphere'
+ )
+ self.uk_proj = Proj(
+ proj='lcc', lat_1=37, lat_2=65, lat_0=50, lon_0=15, units='m', k_0=uk_scale, ellps='sphere'
+ )
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+ self.use_metric_system = self.retrieve_custom_property_value(
+ "use_metric_system", errors
+ )
+ self.ats_install_directory = self.retrieve_custom_property_value(
+ "ats_install_directory", errors
+ )
+ self.ets_install_directory = self.retrieve_custom_property_value(
+ "ets_install_directory", errors
+ )
+ self.dispatcher_backstory = self.retrieve_custom_property_value(
+ "dispatcher_backstory", errors
+ )
+ # Fall back to the wingman's backstory if no dispatcher backstory is set
+ if not self.dispatcher_backstory or not self.dispatcher_backstory.strip():
+ self.dispatcher_backstory = self.wingman.config.prompts.backstory
+
+ self.autostart_dispatch_mode = self.retrieve_custom_property_value(
+ "autostart_dispatch_mode", errors
+ )
+ await self.check_and_install_telemetry_dlls()
+ return errors
+
+ # Try to find existing telemetry dlls, if not found, attempt to install
+ async def check_and_install_telemetry_dlls(self):
+ skills_filepath = get_writable_dir("skills")
+ ats_telemetry_skill_filepath = os.path.join(skills_filepath, "ats_telemetry")
+ sdk_dll_filepath = os.path.join(ats_telemetry_skill_filepath, "scs-telemetry.dll")
+ ats_plugins_dir = os.path.join(self.ats_install_directory, "bin\\win_x64\\plugins")
+ ets_plugins_dir = os.path.join(self.ets_install_directory, "bin\\win_x64\\plugins")
+ # Check and copy dll for ATS if applicable
+ if os.path.exists(self.ats_install_directory):
+ ats_dll_path = os.path.join(ats_plugins_dir, "scs-telemetry.dll")
+ if not os.path.exists(ats_dll_path):
+ try:
+ if not os.path.exists(ats_plugins_dir):
+ os.makedirs(ats_plugins_dir)
+ shutil.copy2(sdk_dll_filepath, ats_plugins_dir)
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Could not install scs telemetry dll to {ats_plugins_dir}: {e}",
+ color=LogType.INFO,
+ )
+ # Check and copy dll for ETS if applicable
+ if os.path.exists(self.ets_install_directory):
+ ets_dll_path = os.path.join(ets_plugins_dir, "scs-telemetry.dll")
+ if not os.path.exists(ets_dll_path):
+ try:
+ if not os.path.exists(ets_plugins_dir):
+ os.makedirs(ets_plugins_dir)
+ shutil.copy2(sdk_dll_filepath, ets_plugins_dir)
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Could not install scs telemetry dll to {ets_plugins_dir}: {e}",
+ color=LogType.INFO,
+ )
+
+ # Start telemetry module connection with in-game telemetry SDK
+ async def initialize_telemetry(self) -> bool:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Starting ATS / ETS telemetry module",
+ color=LogType.INFO,
+ )
+ # truck_telemetry.init() requires the user to have installed the proper SDK DLL from https://github.com/RenCloud/scs-sdk-plugin/releases/tag/V.1.12.1
+ # into the proper folder of their truck sim install (https://github.com/RenCloud/scs-sdk-plugin#installation); if they have not, this step will fail, so we need to catch the error.
+ try:
+ truck_telemetry.init()
+ return True
+ except Exception:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Initialize ATSTelemetry function failed.",
+ color=LogType.INFO,
+ )
+ return False
+
+ # Initiate separate thread for constant checking of changes to key telemetry data points
+ async def initialize_telemetry_cache_loop(self, loop_time: int = 10):
+ if self.telemetry_loop_running:
+ return
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Starting ATS / ETS telemetry cache loop",
+ color=LogType.INFO,
+ )
+ self.threaded_execution(self.start_telemetry_loop, loop_time)
+
+ # Loop every designated number of seconds to retrieve telemetry data and run query function to determine if any tracked data points have changed
+ async def start_telemetry_loop(self, loop_time: int):
+ if not self.telemetry_loop_running:
+ self.telemetry_loop_running = True
+ data = truck_telemetry.get_data()
+ filtered_data = await self.filter_data(data)
+ self.telemetry_loop_cached_data = copy.deepcopy(filtered_data)
+ while self.telemetry_loop_running:
+ changed_data = await self.query_and_compare_data(self.telemetry_loop_data_points)
+ if changed_data:
+ await self.initiate_llm_call_with_changed_data(changed_data)
+ time.sleep(loop_time)
+
+ # Compare new telemetry data in monitored fields and react if there are changes
+ async def query_and_compare_data(self, data_points: list):
+ try:
+ default = "The following data changed: "
+ data_changed = default
+ data = truck_telemetry.get_data()
+ filtered_data = await self.filter_data(data)
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Querying and comparing telemetry data",
+ color=LogType.INFO,
+ )
+ for point in data_points:
+ current_data = filtered_data.get(point)
+ if current_data and current_data != self.telemetry_loop_cached_data.get(point):
+ # Prevent relatively small data changes in float values from triggering new alerts, such as when cargo damage, route distance or route time change by very small values
+ if isinstance(current_data, float) and abs((current_data - self.telemetry_loop_cached_data.get(point)) / current_data) <= 0.25:
+ pass
+ else:
+ data_changed = data_changed + f"{point}:{current_data}, last value was {point}:{self.telemetry_loop_cached_data.get(point)},"
+ self.telemetry_loop_cached_data = copy.deepcopy(filtered_data)
+ if data_changed == default:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "No changed telemetry data found.",
+ color=LogType.INFO,
+ )
+ return None
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ data_changed,
+ color=LogType.INFO,
+ )
+ return data_changed
+
+ except Exception as e:
+ return None
+
+ # Stop ongoing call for updated telemetry data
+ async def stop_telemetry_loop(self):
+ self.telemetry_loop_running = False
+ self.telemetry_loop_cached_data = {}
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "Stopping ATS / ETS telemetry cache loop",
+ color=LogType.INFO,
+ )
+
+ # If telemetry data changed, get LLM to provide a verbal response to the user, without requiring the user to initiate a communication with the LLM
+ async def initiate_llm_call_with_changed_data(self, changed_data):
+ units_phrase = "You are located in the US, so use US Customary Units, like feet, yards, miles, and pounds in your responses, and convert metric or imperial formats to these units. All currency is in dollars."
+ if self.use_metric_system:
+ units_phrase = "Use the metric system like meters, kilometers, kilometers per hour, and kilograms in your responses."
+ user_content = f"{changed_data}"
+ messages = [
+ {
+ 'role': 'system',
+ 'content': f"""
+ {self.dispatcher_backstory}
+ Acting in character at all times, react to the following changed information.
+ {units_phrase}
+ """,
+ },
+ {
+ 'role': 'user',
+ 'content': user_content,
+ },
+ ]
+ completion = await self.llm_call(messages)
+ response = completion.choices[0].message.content if completion and completion.choices else ""
+
+ if not response:
+ return
+
+ await self.printr.print_async(
+ text=f"Dispatch: {response}",
+ color=LogType.INFO,
+ source_name=self.wingman.name
+ )
+
+ self.threaded_execution(self.wingman.play_to_user, response, True)
+ await self.wingman.add_assistant_message(response)
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "get_game_state",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_game_state",
+ "description": "Retrieve the current game state variable from American Truck Simulator.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "variable": {
+ "type": "string",
+ "description": "The game state variable to retrieve (e.g., 'speed').",
+ }
+ },
+ "required": ["variable"],
+ },
+ },
+ },
+ ),
+ (
+ "get_information_about_current_location",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_information_about_current_location",
+ "description": "Used to provide more detailed information if the user asks a general question like 'where are we?', or 'what city are we in?'",
+ },
+ },
+ ),
+ (
+ "start_or_activate_dispatch_telemetry_loop",
+ {
+ "type": "function",
+ "function": {
+ "name": "start_or_activate_dispatch_telemetry_loop",
+ "description": "Begin dispatch function, which will check telemetry at designated intervals.",
+ },
+ },
+ ),
+ (
+ "end_or_stop_dispatch_telemetry_loop",
+ {
+ "type": "function",
+ "function": {
+ "name": "end_or_stop_dispatch_telemetry_loop",
+ "description": "End or stop dispatch function, to stop automatically checking telemetry at designated intervals.",
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(self, tool_name: str, parameters: dict[str, any]) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name == "get_game_state":
+ if not self.already_initialized_telemetry:
+ self.already_initialized_telemetry = await self.initialize_telemetry()
+ # If initialization of the telemetry object fails because another instance is already running, try simply getting the data as a fail-safe; if we still cannot get the data, report the error
+ if not self.already_initialized_telemetry:
+ try:
+ test = truck_telemetry.get_data()
+ self.already_initialized_telemetry = True
+ except Exception:
+ function_response = "Error trying to access truck telemetry data. It appears there is a problem with the module. Check to see if the game is running, and that you have installed the SDK in the proper location."
+ return function_response, instant_response
+
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing get_game_state function with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+
+ data = truck_telemetry.get_data()
+ filtered_data = await self.filter_data(data)
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Printout of all data received from telemetry: {filtered_data}",
+ color=LogType.INFO,
+ )
+
+ variable = parameters.get("variable")
+ if variable in filtered_data:
+ value = filtered_data[variable]
+ try:
+ string_value = str(value)
+ except Exception:
+ string_value = "value could not be found."
+ function_response = f"The current value of '{variable}' is {string_value}."
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Found variable result in telemetry for {variable}, {string_value}",
+ color=LogType.INFO,
+ )
+ else:
+ function_response = f"Variable '{variable}' not found."
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Could not locate variable result in telemetry for {variable}.",
+ color=LogType.INFO,
+ )
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "start_or_activate_dispatch_telemetry_loop":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ "Executing start_or_activate_dispatch_telemetry_loop",
+ color=LogType.INFO,
+ )
+
+ if self.telemetry_loop_running:
+ function_response = "Dispatch communications already open."
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+ await self.printr.print_async(
+ "Attempted to start dispatch communications loop but loop is already running",
+ color=LogType.INFO,
+ )
+
+ return function_response, instant_response
+
+ if not self.already_initialized_telemetry:
+ self.already_initialized_telemetry = await self.initialize_telemetry()
+ # If initialization of the telemetry object fails because another instance is already running, try simply getting the data as a fail-safe; if we still cannot get the data, report the error
+ if not self.already_initialized_telemetry:
+ try:
+ test = truck_telemetry.get_data()
+ self.already_initialized_telemetry = True
+ except Exception:
+ function_response = "Error trying to access truck telemetry data. It appears there is a problem with the module. Check to see if the game is running, and that you have installed the SDK in the proper location."
+ return function_response, instant_response
+
+ if not self.telemetry_loop_running:
+ await self.initialize_telemetry_cache_loop(10)
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ function_response = "Opened dispatch communications."
+
+ elif tool_name == "end_or_stop_dispatch_telemetry_loop":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ "Executing end_or_stop_dispatch_telemetry_loop",
+ color=LogType.INFO,
+ )
+
+ await self.stop_telemetry_loop()
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ function_response = "Closed dispatch communications."
+
+ elif tool_name == "get_information_about_current_location":
+ if not self.already_initialized_telemetry:
+ self.already_initialized_telemetry = await self.initialize_telemetry()
+ # If initialization of the telemetry object fails because another instance is already running, try simply getting the data as a fail-safe; if we still cannot get the data, report the error
+ if not self.already_initialized_telemetry:
+ try:
+ test = truck_telemetry.get_data()
+ self.already_initialized_telemetry = True
+ except Exception:
+ function_response = "Error trying to access truck telemetry data. It appears there is a problem with the module. Check to see if the game is running, and that you have installed the SDK in the proper location."
+ return function_response, instant_response
+
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+
+ data = truck_telemetry.get_data()
+ # Pull x, y coordinates from the truck sim; the game exposes x, y, z, where z serves as the 2D y (north/south) and y is altitude
+ x = data["coordinateX"]
+ y = data["coordinateZ"]
+
+ # Convert to real-world latitude and longitude, using the projection for whichever game is running (game == 1 is ETS2)
+ if data["game"] == 1:
+ longitude, latitude = await self.from_ets2_coords_to_wgs84(data["coordinateX"], data["coordinateZ"])
+ else:
+ longitude, latitude = await self.from_ats_coords_to_wgs84(data["coordinateX"], data["coordinateZ"])
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Executing get_information_about_current_location function with coordinateX as {x} and coordinateZ as {y}, latitude returned was {latitude}, longitude returned was {longitude}.",
+ color=LogType.INFO,
+ )
+
+ place_info = await self.convert_lat_long_data_into_place_data(latitude, longitude)
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ if place_info:
+ function_response = f"Information regarding the approximate location we are near: {place_info}"
+ else:
+ function_response = "Unable to get more detailed information regarding the place based on the current truck coordinates."
+
+ return function_response, instant_response
+
+ # Function to autostart dispatch mode
+ async def autostart_dispatcher_mode(self):
+ telemetry_started = False
+ while not telemetry_started and self.loaded:
+ telemetry_started = await self.initialize_telemetry()
+ # Init can fail if the truck telemetry module was already started elsewhere, so check whether we can simply query telemetry data before concluding it is not initialized
+ if not telemetry_started:
+ try:
+ test = truck_telemetry.get_data()
+ telemetry_started = True
+ except Exception:
+ telemetry_started = False
+ # Try again in ten seconds; maybe the user has not loaded up Truck Simulator yet
+ time.sleep(10)
+ if self.loaded:
+ await self.initialize_telemetry_cache_loop(10)
+
+ # Autostart dispatch mode if option turned on in config
+ async def prepare(self) -> None:
+ self.loaded = True
+ if self.autostart_dispatch_mode:
+ self.threaded_execution(self.autostart_dispatcher_mode)
+
+ # Unload telemetry module and stop any ongoing loop when config / program unloads
+ async def unload(self) -> None:
+ self.loaded = False
+ await self.stop_telemetry_loop()
+ truck_telemetry.deinit()
+
+# Helper Data Functions for Enhancing Telemetry Data Before Sending to LLM
+# Adapted, revised from https://github.com/mike-koch/ets2-mobile-route-advisor/blob/master/dashboard.js
+ async def filter_data(self, data):
+ try:
+ # Set enhanced data to copy of all values of data
+ enhanced_data = copy.deepcopy(data)
+
+ enhanced_data['isEts2'] = (data['game'] == 1)
+ enhanced_data['isAts'] = not enhanced_data['isEts2']
+
+ # Deal with speed variables, create new ones specific to MPH and KPH
+ truck_speed_mph = data['speed'] * 2.23694
+ truck_speed_kph = data['speed'] * 3.6
+
+ if self.use_metric_system:
+ enhanced_data['truckSpeed'] = str(abs(round(truck_speed_kph))) + ' kilometers per hour'
+ enhanced_data['cruiseControlSpeed'] = str(round(data['cruiseControlSpeed'] * 3.6)) + ' kilometers per hour'
+ enhanced_data['speedLimit'] = str(round(data['speedLimit'] * 3.6)) + ' kilometers per hour'
+ else:
+ enhanced_data['truckSpeed'] = str(abs(round(truck_speed_mph))) + ' miles per hour'
+ enhanced_data['cruiseControlSpeed'] = str(round(data['cruiseControlSpeed'] * 2.23694)) + ' miles per hour'
+ enhanced_data['speedLimit'] = str(round(data['speedLimit'] * 2.23694)) + ' miles per hour'
+
+ # Deal with shifter type, use more readable format
+ if data['shifterType'] in ['automatic', 'arcade']:
+ enhanced_data['gear'] = 'A' + str(data['gearDashboard']) if data['gearDashboard'] > 0 else ('R' + str(abs(data['gearDashboard'])) if data['gearDashboard'] < 0 else 'N')
+ else:
+ enhanced_data['gear'] = str(data['gearDashboard']) if data['gearDashboard'] > 0 else ('R' + str(abs(data['gearDashboard'])) if data['gearDashboard'] < 0 else 'N')
+
+ # Convert percentages
+ enhanced_data['currentFuelPercentage'] = str(round((data['fuel'] / data['fuelCapacity']) * 100)) + ' percent of fuel remaining.'
+ enhanced_data['currentAdbluePercentage'] = str(round((data['adblue'] / data['adblueCapacity']) * 100)) + ' percent of adblue remaining.'
+ scs_truck_damage = await self.getDamagePercentage(data)
+ enhanced_data['truckDamageRounded'] = str(math.floor(scs_truck_damage)) + ' percent truck damage'
+ scs_trailer_one_damage = await self.getDamagePercentageTrailer(data)
+ enhanced_data['wearTrailerRounded'] = str(math.floor(scs_trailer_one_damage)) + ' percent trailer damage'
+
+ # Convert times
+ days_hours_and_minutes = await self.convert_minutes_to_days_hours_minutes(data['time_abs'])
+ if self.use_metric_system:
+ enhanced_data['gameTime'] = await self.convert_to_clock_time(days_hours_and_minutes, 24)
+ else:
+ enhanced_data['gameTime'] = await self.convert_to_clock_time(days_hours_and_minutes, 12)
+
+ job_start_days_hours_and_minutes = await self.convert_minutes_to_days_hours_minutes(data['jobStartingTime'])
+ if self.use_metric_system:
+ enhanced_data['jobStartingTime'] = await self.convert_to_clock_time(job_start_days_hours_and_minutes, 24)
+ else:
+ enhanced_data['jobStartingTime'] = await self.convert_to_clock_time(job_start_days_hours_and_minutes, 12)
+
+ job_finish_days_hours_and_minutes = await self.convert_minutes_to_days_hours_minutes(data['jobFinishedTime'])
+ if self.use_metric_system:
+ enhanced_data['jobFinishedTime'] = await self.convert_to_clock_time(job_finish_days_hours_and_minutes, 24)
+ else:
+ enhanced_data['jobFinishedTime'] = await self.convert_to_clock_time(job_finish_days_hours_and_minutes, 12)
+
+ next_rest_stop_time_array = await self.convert_minutes_to_days_hours_minutes(data['restStop'])
+ enhanced_data['nextRestStopTime'] = await self.processTimeDifferenceArray(next_rest_stop_time_array)
+
+ route_time_in_days_hours_minutes = await self.convert_seconds_to_days_hours_minutes(data['routeTime'])
+ enhanced_data['routeTime'] = await self.processTimeDifferenceArray(route_time_in_days_hours_minutes)
+ route_expiration = await self.convert_minutes_to_days_hours_minutes(data['time_abs_delivery'] - data['time_abs'])
+ enhanced_data['jobExpirationTimeInDaysHoursMinutes'] = await self.processTimeDifferenceArray(route_expiration)
+ enhanced_data['isWorldOfTrucksContract'] = await self.isWorldOfTrucksContract(data)
+
+ if enhanced_data['isWorldOfTrucksContract']:
+ job_ended_time = await self.getDaysHoursMinutesAndSeconds(data['jobFinishedTime'])
+ job_started_time = await self.getDaysHoursMinutesAndSeconds(data['jobStartingTime'])
+ time_to_complete_route_array = await self.convert_minutes_to_days_hours_minutes(data['jobFinishedTime'] - data['jobStartingTime'])
+ real_life_time_to_complete_route = await self.convert_minutes_to_days_hours_minutes(data['jobDeliveredDeliveryTime'])
+ enhanced_data['realLifeTimeToCompleteWorldofTrucksJob'] = await self.processTimeDifferenceArray(real_life_time_to_complete_route)
+ else:
+ time_to_complete_route_array = await self.convert_minutes_to_days_hours_minutes(data['jobDeliveredDeliveryTime'])
+ enhanced_data['gameTimeLapsedToCompleteJob'] = await self.processTimeDifferenceArray(time_to_complete_route_array)
+
+ # Convert weights
+ tons = (data['cargoMass'] / 1000.0)
+ enhanced_data['cargoMassInTons'] = str(tons) + ' t' if data['trailer'][0]['attached'] else ''
+ if self.use_metric_system:
+ enhanced_data['cargoMass'] = str(round(data['cargoMass'])) + ' kg' if data['trailer'][0]['attached'] else ''
+ else:
+ enhanced_data['cargoMass'] = str(round(data['cargoMass'] * 2.20462)) + ' lb' if data['trailer'][0]['attached'] else ''
+
+ # Convert distances
+ route_distance_km = data['routeDistance'] / 1000
+ route_distance_miles = route_distance_km * 0.621371
+
+ if self.use_metric_system:
+ enhanced_data['routeDistance'] = str(math.floor(route_distance_km)) + ' kilometers'
+ enhanced_data['truckOdometer'] = str(round(data['truckOdometer'])) + ' kilometers'
+ enhanced_data['truckFuelRange'] = str(round(data['fuelRange'])) + ' kilometers'
+ enhanced_data['plannedDistance'] = str(round(data['plannedDistanceKm'])) + ' kilometers'
+ enhanced_data['jobDeliveredDistance'] = str(round(data['jobDeliveredDistanceKm'])) + ' kilometers'
+ else:
+ enhanced_data['routeDistance'] = str(math.floor(route_distance_miles)) + ' miles'
+ enhanced_data['truckOdometer'] = str(round(data['truckOdometer'] * 0.621371)) + ' miles'
+ enhanced_data['truckFuelRange'] = str(round(data['fuelRange'] * 0.621371)) + ' miles'
+ enhanced_data['plannedDistance'] = str(round(data['plannedDistanceKm'] * 0.621371)) + ' miles'
+ enhanced_data['jobDeliveredDistance'] = str(round(data['jobDeliveredDistanceKm'] * 0.621371)) + ' miles'
+
+ # Add currency symbol to income, fines, payments
+ enhanced_data['jobIncome'] = await self.getCurrency(data['jobIncome'])
+ enhanced_data['fineAmount'] = await self.getCurrency(data['fineAmount'])
+ enhanced_data['tollgatePayAmount'] = await self.getCurrency(data['tollgatePayAmount'])
+ enhanced_data['ferryPayAmount'] = await self.getCurrency(data['ferryPayAmount'])
+ enhanced_data['trainPayAmount'] = await self.getCurrency(data['trainPayAmount'])
+ enhanced_data['jobDeliveredRevenue'] = await self.getCurrency(data['jobDeliveredRevenue'])
+ enhanced_data['jobCancelledPenalty'] = await self.getCurrency(data['jobCancelledPenalty'])
+
+
+ # Convert temperatures
+ if self.use_metric_system:
+ enhanced_data['brakeTemperature'] = str(round(data['brakeTemperature'])) + ' degrees Celsius'
+ enhanced_data['oilTemperature'] = str(round(data['oilTemperature'])) + ' degrees Celsius'
+ enhanced_data['waterTemperature'] = str(round(data['waterTemperature'])) + ' degrees Celsius'
+ else:
+ enhanced_data['brakeTemperature'] = str(round(data['brakeTemperature'] * 1.8 + 32)) + ' degrees Fahrenheit'
+ enhanced_data['oilTemperature'] = str(round(data['oilTemperature'] * 1.8 + 32)) + ' degrees Fahrenheit'
+ enhanced_data['waterTemperature'] = str(round(data['waterTemperature'] * 1.8 + 32)) + ' degrees Fahrenheit'
+
+ # Convert volumes
+ if self.use_metric_system:
+ enhanced_data['fuelTankSize'] = 'Fuel tank can hold ' + str(round(data['fuelCapacity'])) + ' liters'
+ enhanced_data['fuelRemaining'] = str(round(data['fuel'])) + ' liters of fuel remaining'
+ enhanced_data['fuelConsumption'] = str(data['fuelAvgConsumption']) + ' liters per kilometer'
+ enhanced_data['adblueTankSize'] = 'Adblue tank can hold ' + str(round(data['adblueCapacity'])) + ' liters'
+ enhanced_data['adblueRemaining'] = str(round(data['adblue'])) + ' liters of adblue remaining'
+ enhanced_data['refuelAmount'] = str(round(data['refuelAmount'])) + ' liters'
+ else:
+ enhanced_data['fuelTankSize'] = 'Fuel tank can hold ' + str(round(data['fuelCapacity'] * 0.26417205)) + ' gallons'
+ enhanced_data['fuelRemaining'] = str(round(data['fuel'] * 0.26417205)) + ' gallons of fuel remaining'
+ enhanced_data['fuelConsumption'] = str(round(data['fuelAvgConsumption'] * 2.35214583)) + ' miles per gallon'
+ enhanced_data['adblueTankSize'] = 'Adblue tank can hold ' + str(round(data['adblueCapacity'] * 0.26417205)) + ' gallons'
+ enhanced_data['adblueRemaining'] = str(round(data['adblue'] * 0.26417205)) + ' gallons of adblue remaining'
+ enhanced_data['refuelAmount'] = str(round(data['refuelAmount'] * 0.26417205)) + ' gallons'
+
+ return enhanced_data
+
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f'There was a problem with the filter_data function: {e}. Returning original data.',
+ color=LogType.INFO,
+ )
+ return data
+
+ # Convert an array of days, hours, minutes into clock time
+ async def convert_to_clock_time(self, timeArray, clockHours):
+ days = math.floor(timeArray[0])
+ hours = math.floor(timeArray[1])
+ minutes = math.floor(timeArray[2])
+ if len(timeArray) > 3:
+ seconds = math.floor(timeArray[3])
+ else:
+ seconds = 0
+
+ date = days % 7 # Get the remainder after dividing by 7
+ days_of_week = ["Monday", "Tuesday", "Wednesday", "Thursday", "Friday", "Saturday", "Sunday"]
+ day = days_of_week[date]
+
+ if clockHours == 12:
+ period = "AM"
+ if hours >= 12:
+ period = "PM"
+ if hours > 12:
+ hours -= 12
+ elif hours == 0:
+ hours = 12 # Midnight case
+
+ return f"{day}, {hours:02}:{minutes:02} {period}"
+ elif clockHours == 24:
+ return f"{day}, {hours:02}:{minutes:02}"
+
+ # Can be used for values that are in seconds like routeTime
+ async def convert_seconds_to_days_hours_minutes(self, seconds):
+ # Constants
+ seconds_per_day = 86400
+ seconds_per_hour = 3600
+ seconds_per_minute = 60
+
+ # Calculate the number of in-game days
+ days = seconds // seconds_per_day
+
+ # Calculate the remaining seconds in the current in-game day
+ remaining_seconds = seconds % seconds_per_day
+
+ # Calculate the current in-game hours
+ hours = remaining_seconds // seconds_per_hour
+ remaining_seconds %= seconds_per_hour
+
+ # Calculate the current in-game minutes
+ minutes = remaining_seconds // seconds_per_minute
+
+ # Calculate the remaining seconds
+ seconds = remaining_seconds % seconds_per_minute
+
+ return [days, hours, minutes, seconds]
+
+ # Can be used for values like time_abs that are in game minutes since beginning of game mode
+ async def convert_minutes_to_days_hours_minutes(self, minutes):
+ # Constants
+ minutes_per_day = 1440
+
+ # Calculate the number of in-game days
+ days = minutes // minutes_per_day
+
+ # Calculate the remaining minutes in the current in-game day
+ remaining_minutes = minutes % minutes_per_day
+
+ # Calculate the current in-game hours and minutes
+ hours = remaining_minutes // 60
+ final_minutes = remaining_minutes % 60
+
+ return [days, hours, final_minutes]
+
+ # Can be used for timestamps like time, simulatedTime, renderTime, should be converted to clock times
+ async def getDaysHoursMinutesAndSeconds(self, time):
+ dateTime = datetime.utcfromtimestamp(time)
+ return [dateTime.day, dateTime.hour, dateTime.minute, dateTime.second]
+
+ async def addTime(self, time, days, hours, minutes, seconds):
+ dateTime = datetime.utcfromtimestamp(time)
+ return dateTime + timedelta(days=days, hours=hours, minutes=minutes, seconds=seconds)
+
+ async def getTime(self, gameTime, timeUnits):
+ currentTime = datetime.utcfromtimestamp(gameTime)
+ formattedTime = currentTime.strftime('%a %I:%M %p' if timeUnits == 12 else '%a %H:%M')
+ return formattedTime
+
+ async def getDamagePercentage(self, data):
+ return max(data['wearEngine'],
+ data['wearTransmission'],
+ data['wearCabin'],
+ data['wearChassis'],
+ data['wearWheels']) * 100
+
+ async def getDamagePercentageTrailer(self, data):
+ return max(data['trailer'][0]['wearChassis'], data['trailer'][0]['wearWheels'], data['trailer'][0]['wearBody']) * 100
+
+ async def processTimeDifferenceArray(self, timeArray): # TODO: account for negative values, i.e., when the calculation means the driver is running late
+ final_time_string = ""
+ days = math.floor(timeArray[0])
+ hours = math.floor(timeArray[1])
+ minutes = math.floor(timeArray[2])
+ if len(timeArray) > 3:
+ seconds = math.floor(timeArray[3])
+ else:
+ seconds = 0
+
+ if days == 1:
+ final_time_string += f"{days} day "
+ elif days > 1:
+ final_time_string += f"{days} days "
+
+ if hours == 1:
+ final_time_string += f"{hours} hour "
+ elif hours > 1:
+ final_time_string += f"{hours} hours "
+
+ if minutes == 1:
+ final_time_string += f"{minutes} minute "
+ elif minutes > 1:
+ final_time_string += f"{minutes} minutes "
+
+ if seconds == 1:
+ final_time_string += f"{seconds} second "
+ elif seconds > 1:
+ final_time_string += f"{seconds} seconds "
+
+ return final_time_string
+
+ async def isWorldOfTrucksContract(self, data):
+ return "external" in data['jobMarket'] # "external" in the job market string indicates a World of Trucks contract
+
+ async def getCurrency(self, money):
+ currencyCode = 'EUR' if self.use_metric_system else 'USD'
+
+ if currencyCode == 'EUR':
+ return f"€{money}"
+ elif currencyCode == 'USD':
+ return f"${money}"
+
+
+######## CODE TO CONVERT GAME COORDINATES TO REAL LIFE LAT / LONG #####
+# Adapted and modified from https://github.com/truckermudgeon/maps/blob/main/packages/libs/map/projections.ts and https://github.com/truckermudgeon/maps/blob/main/packages/apis/navigation/index.ts
+# Should be passed coordinateX as X and coordinateZ as Y for use
+
+
+ async def from_ats_coords_to_wgs84(self, x, y):
+ # ATS coords are like LCC coords, except Y grows southward (its sign is reversed)
+ lcc_coords = (x, -y)
+ lon, lat = self.ats_proj(*lcc_coords, inverse=True)
+ return lon, lat
+
+ async def from_ets2_coords_to_wgs84(self, x, y):
+ # Calais coordinates for UK detection
+ calais = (-31140, -5505)
+ is_uk = x < calais[0] and y < calais[1] - 100
+ converter = self.uk_proj if is_uk else self.ets2_proj
+
+ # Apply map offsets
+ x -= 16660
+ y -= 4150
+
+ # Additional offsets for UK coordinates
+ if is_uk:
+ x -= 16650
+ y -= 2700
+
+ # ETS2 coords are like LCC coords, except Y grows southward (its sign is reversed)
+ lcc_coords = (x, -y)
+ lon, lat = converter(*lcc_coords, inverse=True)
+ return lon, lat
+
+
+ async def convert_lat_long_data_into_place_data(self, latitude=None, longitude=None):
+
+ # If we still have no values (e.g., a connection was made but no data received yet), return None
+ if not latitude or not longitude:
+ return None
+
+ # Set zoom level, see zoom documentation at https://nominatim.org/release-docs/develop/api/Reverse/
+ zoom = 15
+
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Attempting query of OpenStreetMap Nominatim with parameters: {latitude}, {longitude}, zoom level: {zoom}",
+ color=LogType.INFO,
+ )
+
+ # Request data from the OpenStreetMap Nominatim API for reverse geocoding
+ url = f"https://nominatim.openstreetmap.org/reverse?format=jsonv2&lat={latitude}&lon={longitude}&zoom={zoom}&accept-language=en&extratags=1"
+ headers = {
+ 'User-Agent': f'ats_telemetry_skill {self.wingman.name}'
+ }
+ response = requests.get(url, headers=headers, timeout=10)
+ if response.status_code == 200:
+ return response.json()
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"API request failed to {url}, status code: {response.status_code}.", color=LogType.INFO)
+ return None
\ No newline at end of file
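The sign-flip and offset handling in `from_ets2_coords_to_wgs84` can be sketched as a standalone, projection-free function. This is a minimal sketch using the constants from the code above; the final pyproj inverse-LCC step is omitted:

```python
def ets2_game_coords_to_lcc(x: float, y: float) -> tuple[float, float]:
    """Convert raw ETS2 game coordinates (coordinateX, coordinateZ)
    to LCC-style coordinates, mirroring the skill's offset logic."""
    # Calais reference point used for UK detection
    calais = (-31140, -5505)
    is_uk = x < calais[0] and y < calais[1] - 100

    # Map-wide offsets
    x -= 16660
    y -= 4150

    # Additional offsets for the UK part of the map
    if is_uk:
        x -= 16650
        y -= 2700

    # Game Y grows southward, so flip its sign for LCC
    return x, -y
```

Feeding the returned pair into the matching inverse LCC projection (as the skill does with pyproj) then yields WGS84 longitude/latitude.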
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/readme.txt b/templates/migration/1_5_0/skills/ats_telemetry/readme.txt
new file mode 100644
index 00000000..ad45b6e4
--- /dev/null
+++ b/templates/migration/1_5_0/skills/ats_telemetry/readme.txt
@@ -0,0 +1,276 @@
+README:
+
+This skill is special because it also requires a .dll file to be placed in the game folder of American Truck Simulator or Euro Truck Simulator so that the game sends the required data to the skill.
+
+The DLL is already included in this skill folder, scs-telemetry.dll. It was sourced from https://github.com/RenCloud/scs-sdk-plugin (September 21, 2023 release, v.1.12.1)
+
+You can either manually download/move this file into the bin/win_x64/plugins folder of your game directory, or point the skill to your ATS/ETS2 install directories and save the skill; the skill will then try to copy the .dll file to the proper location itself.
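For illustration, the manual copy route could be scripted as below. This is only a sketch: the helper name is made up for this example, and any game directory you pass in will vary per install.

```python
import shutil
from pathlib import Path

def install_telemetry_dll(dll_path: str, game_dir: str) -> Path:
    """Copy scs-telemetry.dll into the game's bin/win_x64/plugins folder."""
    plugins_dir = Path(game_dir) / "bin" / "win_x64" / "plugins"
    # The plugins folder usually does not exist on a fresh install
    plugins_dir.mkdir(parents=True, exist_ok=True)
    target = plugins_dir / Path(dll_path).name
    shutil.copy2(dll_path, target)
    return target
```

For example, `install_telemetry_dll("scs-telemetry.dll", r"C:\Program Files (x86)\Steam\steamapps\common\American Truck Simulator")` — the path shown is hypothetical and depends on where your game is installed.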
+
+Here are the variables available through telemetry:
+
+### Common Unsigned Integers
+- `time_abs`
+
+### Config Unsigned Integers
+- `gears`
+- `gears_reverse`
+- `retarderStepCount`
+- `truckWheelCount`
+- `selectorCount`
+- `time_abs_delivery`
+- `maxTrailerCount`
+- `unitCount`
+- `plannedDistanceKm`
+
+### Truck Channel Unsigned Integers
+- `shifterSlot`
+- `retarderBrake`
+- `lightsAuxFront`
+- `lightsAuxRoof`
+- `truck_wheelSubstance[16]`
+- `hshifterPosition[32]`
+- `hshifterBitmask[32]`
+
+### Gameplay Unsigned Integers
+- `jobDeliveredDeliveryTime`
+- `jobStartingTime`
+- `jobFinishedTime`
+
+### Common Integers
+- `restStop`
+
+### Truck Integers
+- `gear`
+- `gearDashboard`
+- `hshifterResulting[32]`
+
+### Gameplay Integers
+- `jobDeliveredEarnedXp`
+
+### Common Floats
+- `scale`
+
+### Config Floats
+- `fuelCapacity`
+- `fuelWarningFactor`
+- `adblueCapacity`
+- `adblueWarningFactor`
+- `airPressureWarning`
+- `airPressurEmergency`
+- `oilPressureWarning`
+- `waterTemperatureWarning`
+- `batteryVoltageWarning`
+- `engineRpmMax`
+- `gearDifferential`
+- `cargoMass`
+- `truckWheelRadius[16]`
+- `gearRatiosForward[24]`
+- `gearRatiosReverse[8]`
+- `unitMass`
+
+### Truck Floats
+- `speed`
+- `engineRpm`
+- `userSteer`
+- `userThrottle`
+- `userBrake`
+- `userClutch`
+- `gameSteer`
+- `gameThrottle`
+- `gameBrake`
+- `gameClutch`
+- `cruiseControlSpeed`
+- `airPressure`
+- `brakeTemperature`
+- `fuel`
+- `fuelAvgConsumption`
+- `fuelRange`
+- `adblue`
+- `oilPressure`
+- `oilTemperature`
+- `waterTemperature`
+- `batteryVoltage`
+- `lightsDashboard`
+- `wearEngine`
+- `wearTransmission`
+- `wearCabin`
+- `wearChassis`
+- `wearWheels`
+- `truckOdometer`
+- `routeDistance`
+- `routeTime`
+- `speedLimit`
+- `truck_wheelSuspDeflection[16]`
+- `truck_wheelVelocity[16]`
+- `truck_wheelSteering[16]`
+- `truck_wheelRotation[16]`
+- `truck_wheelLift[16]`
+- `truck_wheelLiftOffset[16]`
+
+### Gameplay Floats
+- `jobDeliveredCargoDamage`
+- `jobDeliveredDistanceKm`
+- `refuelAmount`
+
+### Job Floats
+- `cargoDamage`
+
+### Config Bools
+- `truckWheelSteerable[16]`
+- `truckWheelSimulated[16]`
+- `truckWheelPowered[16]`
+- `truckWheelLiftable[16]`
+- `isCargoLoaded`
+- `specialJob`
+
+### Truck Bools
+- `parkBrake`
+- `motorBrake`
+- `airPressureWarning`
+- `airPressureEmergency`
+- `fuelWarning`
+- `adblueWarning`
+- `oilPressureWarning`
+- `waterTemperatureWarning`
+- `batteryVoltageWarning`
+- `electricEnabled`
+- `engineEnabled`
+- `wipers`
+- `blinkerLeftActive`
+- `blinkerRightActive`
+- `blinkerLeftOn`
+- `blinkerRightOn`
+- `lightsParking`
+- `lightsBeamLow`
+- `lightsBeamHigh`
+- `lightsBeacon`
+- `lightsBrake`
+- `lightsReverse`
+- `lightsHazard`
+- `cruiseControl`
+- `truck_wheelOnGround[16]`
+- `shifterToggle[2]`
+- `differentialLock`
+- `liftAxle`
+- `liftAxleIndicator`
+- `trailerLiftAxle`
+- `trailerLiftAxleIndicator`
+
+### Gameplay Bools
+- `jobDeliveredAutoparkUsed`
+- `jobDeliveredAutoloadUsed`
+
+### Config FVectors
+- `cabinPositionX`
+- `cabinPositionY`
+- `cabinPositionZ`
+- `headPositionX`
+- `headPositionY`
+- `headPositionZ`
+- `truckHookPositionX`
+- `truckHookPositionY`
+- `truckHookPositionZ`
+- `truckWheelPositionX[16]`
+- `truckWheelPositionY[16]`
+- `truckWheelPositionZ[16]`
+
+### Truck FVectors
+- `lv_accelerationX`
+- `lv_accelerationY`
+- `lv_accelerationZ`
+- `av_accelerationX`
+- `av_accelerationY`
+- `av_accelerationZ`
+- `accelerationX`
+- `accelerationY`
+- `accelerationZ`
+- `aa_accelerationX`
+- `aa_accelerationY`
+- `aa_accelerationZ`
+- `cabinAVX`
+- `cabinAVY`
+- `cabinAVZ`
+- `cabinAAX`
+- `cabinAAY`
+- `cabinAAZ`
+
+### Truck FPlacements
+- `cabinOffsetX`
+- `cabinOffsetY`
+- `cabinOffsetZ`
+- `cabinOffsetrotationX`
+- `cabinOffsetrotationY`
+- `cabinOffsetrotationZ`
+- `headOffsetX`
+- `headOffsetY`
+- `headOffsetZ`
+- `headOffsetrotationX`
+- `headOffsetrotationY`
+- `headOffsetrotationZ`
+
+### Truck DPlacements
+- `coordinateX`
+- `coordinateY`
+- `coordinateZ`
+- `rotationX`
+- `rotationY`
+- `rotationZ`
+
+### Config Strings
+- `truckBrandId`
+- `truckBrand`
+- `truckId`
+- `truckName`
+- `cargoId`
+- `cargo`
+- `cityDstId`
+- `cityDst`
+- `compDstId`
+- `compDst`
+- `citySrcId`
+- `citySrc`
+- `compSrcId`
+- `compSrc`
+- `shifterType`
+- `truckLicensePlate`
+- `truckLicensePlateCountryId`
+- `truckLicensePlateCountry`
+- `jobMarket`
+
+### Gameplay Strings
+- `fineOffence`
+- `ferrySourceName`
+- `ferryTargetName`
+- `ferrySourceId`
+- `ferryTargetId`
+- `trainSourceName`
+- `trainTargetName`
+- `trainSourceId`
+- `trainTargetId`
+
+### Config ULL
+- `jobIncome`
+
+### Gameplay LL
+- `jobCancelledPenalty`
+- `jobDeliveredRevenue`
+- `fineAmount`
+- `tollgatePayAmount`
+- `ferryPayAmount`
+- `trainPayAmount`
+
+### Special Bools
+- `onJob`
+- `jobFinished`
+- `jobCancelled`
+- `jobDelivered`
+- `fined`
+- `tollgate`
+- `ferry`
+- `train`
+- `refuel`
+- `refuelPayed`
+
+
+### Trailer Data
+- `trailer[10]`
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/requirements.txt b/templates/migration/1_5_0/skills/ats_telemetry/requirements.txt
new file mode 100644
index 00000000..51de2db9
Binary files /dev/null and b/templates/migration/1_5_0/skills/ats_telemetry/requirements.txt differ
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/scs-telemetry.dll b/templates/migration/1_5_0/skills/ats_telemetry/scs-telemetry.dll
new file mode 100644
index 00000000..1b8acff4
Binary files /dev/null and b/templates/migration/1_5_0/skills/ats_telemetry/scs-telemetry.dll differ
diff --git a/templates/migration/1_5_0/skills/ats_telemetry/scs_telemetry_explanation.txt b/templates/migration/1_5_0/skills/ats_telemetry/scs_telemetry_explanation.txt
new file mode 100644
index 00000000..9b414b54
--- /dev/null
+++ b/templates/migration/1_5_0/skills/ats_telemetry/scs_telemetry_explanation.txt
@@ -0,0 +1,248 @@
+- **SCS_TELEMETRY_trailers_count**: Maximum number of trailers supported by the telemetry SDK.
+- **SCS_TELEMETRY_CONFIG_substances**: Configuration of the substances, indexed by substance.
+ - **id**: Internal identifier for the substance.
+- **SCS_TELEMETRY_CONFIG_controls**: Static configuration of the controls.
+ - **shifter_type**: Type of the shifter (e.g., manual, automatic).
+- **SCS_TELEMETRY_CONFIG_hshifter**: Configuration of the h-shifter.
+ - **selector_count**: Number of selectors (e.g., range, splitter toggles).
+ - **slot_gear**: Gear selected when requirements for this slot are met.
+ - **slot_handle_position**: Position of the h-shifter handle.
+ - **slot_selectors**: Bitmask of required on/off state of selectors.
+- **SCS_TELEMETRY_CONFIG_truck**: Static configuration of the truck.
+ - **brand_id**: Internal identifier for the truck brand.
+ - **brand**: Localized brand name for display purposes.
+ - **id**: Internal identifier for the truck.
+ - **name**: Localized name for display purposes.
+ - **fuel_capacity**: Fuel tank capacity in liters.
+ - **fuel_warning_factor**: Fraction of fuel capacity that triggers a warning.
+ - **adblue_capacity**: AdBlue tank capacity in liters.
+ - **adblue_warning_factor**: Fraction of AdBlue capacity that triggers a warning.
+ - **air_pressure_warning**: Air pressure below which the warning activates.
+ - **air_pressure_emergency**: Air pressure below which the emergency brakes activate.
+ - **oil_pressure_warning**: Oil pressure below which the warning activates.
+ - **water_temperature_warning**: Water temperature above which the warning activates.
+ - **battery_voltage_warning**: Battery voltage below which the warning activates.
+ - **rpm_limit**: Maximum RPM value.
+ - **forward_gear_count**: Number of forward gears on the truck.
+ - **reverse_gear_count**: Number of reverse gears on the truck.
+ - **retarder_step_count**: Number of steps in the retarder.
+ - **cabin_position**: Position of the cabin in the vehicle space.
+ - **head_position**: Default head position in the cabin space.
+ - **hook_position**: Position of the trailer connection hook.
+ - **license_plate**: Vehicle license plate.
+ - **license_plate_country**: Name of the country for the license plate.
+ - **license_plate_country_id**: Internal identifier for the license plate country.
+ - **wheel_count**: Number of wheels on the truck.
+ - **wheel_position**: Positions of each wheel.
+- **SCS_TELEMETRY_CONFIG_trailer**: Static configuration of the trailer.
+ - **id**: Internal identifier for the trailer.
+ - **cargo_accessory_id**: Internal identifier for the cargo accessory.
+ - **hook_position**: Position of the trailer hook.
+ - **brand_id**: Internal identifier for the trailer brand.
+ - **brand**: Localized brand name for display purposes.
+ - **name**: Localized name for display purposes.
+ - **chain_type**: Chain type description for the first trailer.
+ - **body_type**: Body type description for the first trailer.
+ - **license_plate**: Trailer license plate.
+ - **license_plate_country**: Name of the country for the license plate.
+ - **license_plate_country_id**: Internal identifier for the license plate country.
+ - **wheel_count**: Number of wheels on the trailer.
+ - **wheel_position**: Positions of each wheel on the trailer.
+- **SCS_TELEMETRY_CONFIG_job**: Static configuration of the job.
+ - **cargo_id**: Internal identifier for the cargo.
+ - **cargo**: Localized cargo name for display purposes.
+ - **cargo_mass**: Mass of the cargo in kilograms.
+ - **destination_city_id**: Internal identifier for the destination city.
+ - **destination_city**: Localized name for the destination city.
+ - **source_city_id**: Internal identifier for the source city.
+ - **source_city**: Localized name for the source city.
+ - **destination_company_id**: Internal identifier for the destination company.
+ - **destination_company**: Localized name for the destination company.
+ - **source_company_id**: Internal identifier for the source company.
+ - **source_company**: Localized name for the source company.
+ - **income**: Expected income for the job without penalties.
+ - **delivery_time**: In-game time for job delivery.
+ - **is_cargo_loaded**: Boolean indicating if the cargo is loaded.
+ - **job_market**: Type of job market (e.g., cargo_market, quick_job).
+ - **special_job**: Flag indicating whether the job is a special transport job.
+ - **planned_distance_km**: Planned job distance in simulated kilometers.
+- **truck.world.placement**: Represents world space position and orientation of the truck. (Type: dplacement)
+- **truck.local.velocity.linear**: Vehicle space linear velocity in meters per second. (Type: fvector)
+- **truck.local.velocity.angular**: Vehicle space angular velocity in rotations per second. (Type: fvector)
+- **truck.local.acceleration.linear**: Vehicle space linear acceleration in meters per second². (Type: fvector)
+- **truck.local.acceleration.angular**: Vehicle space angular acceleration in rotations per second². (Type: fvector)
+- **truck.cabin.offset**: Vehicle space position and orientation delta of the cabin from its default position. (Type: fplacement)
+- **truck.cabin.velocity.angular**: Cabin space angular velocity in rotations per second. (Type: fvector)
+- **truck.cabin.acceleration.angular**: Cabin space angular acceleration in rotations per second². (Type: fvector)
+- **truck.head.offset**: Cabin space position and orientation delta of the driver’s head from its default position. (Type: fplacement)
+- **truck.speed**: Speedometer speed in meters per second. Uses negative value for reverse movement. (Type: float)
+- **truck.engine.rpm**: RPM of the engine. (Type: float)
+- **truck.engine.gear**: Gear currently selected in the engine; >0 for forward, 0 for neutral, <0 for reverse. (Type: s32)
+- **truck.displayed.gear**: Gear currently displayed on the dashboard; >0 for forward, 0 for neutral, <0 for reverse. (Type: s32)
+- **truck.input.steering**: Steering received from input, ranged from -1 to 1, counterclockwise. (Type: float)
+- **truck.input.throttle**: Throttle received from input, ranged from 0 to 1. (Type: float)
+- **truck.input.brake**: Brake received from input, ranged from 0 to 1. (Type: float)
+- **truck.input.clutch**: Clutch received from input, ranged from 0 to 1. (Type: float)
+- **truck.effective.steering**: Steering as used by the simulation, ranged from -1 to 1, counterclockwise. (Type: float)
+- **truck.effective.throttle**: Throttle pedal input as used by the simulation, ranged from 0 to 1. (Type: float)
+- **truck.effective.brake**: Brake pedal input as used by the simulation, ranged from 0 to 1. (Type: float)
+- **truck.effective.clutch**: Clutch pedal input as used by the simulation, ranged from 0 to 1. (Type: float)
+- **truck.cruise_control**: Speed selected for cruise control in meters per second; 0 if disabled. (Type: float)
+- **truck.hshifter.slot**: Gearbox slot the h-shifter handle is currently in; 0 means no slot is selected. (Type: u32)
+- **truck.hshifter.select**: Enabled state of range/splitter selector toggles. (Type: indexed bool)
+- **truck.brake.parking**: Whether the parking brake is enabled. (Type: bool)
+- **truck.brake.motor**: Whether the engine brake is enabled. (Type: bool)
+- **truck.brake.retarder**: Current level of the retarder; ranged from 0 to max. (Type: u32)
+- **truck.brake.air.pressure**: Pressure in the brake air tank in psi. (Type: float)
+- **truck.brake.air.pressure.warning**: Whether air pressure warning is active. (Type: bool)
+- **truck.brake.air.pressure.emergency**: Whether emergency brakes are active due to low air pressure. (Type: bool)
+- **truck.brake.temperature**: Temperature of the brakes in degrees Celsius. (Type: float)
+- **truck.fuel.amount**: Amount of fuel in liters. (Type: float)
+- **truck.fuel.warning**: Whether low fuel warning is active. (Type: bool)
+- **truck.fuel.consumption.average**: Average fuel consumption in liters/km. (Type: float)
+- **truck.fuel.range**: Estimated range with current amount of fuel in km. (Type: float)
+- **truck.adblue**: Amount of AdBlue in liters. (Type: float)
+- **truck.adblue.warning**: Whether low AdBlue warning is active. (Type: bool)
+- **truck.adblue.consumption.average**: Average AdBlue consumption in liters/km. (Type: float)
+- **truck.oil.pressure**: Pressure of the oil in psi. (Type: float)
+- **truck.oil.pressure.warning**: Whether oil pressure warning is active. (Type: bool)
+- **truck.oil.temperature**: Temperature of the oil in degrees Celsius. (Type: float)
+- **truck.water.temperature**: Temperature of the water in degrees Celsius. (Type: float)
+- **truck.water.temperature.warning**: Whether water temperature warning is active. (Type: bool)
+- **truck.battery.voltage**: Voltage of the battery in volts. (Type: float)
+- **truck.battery.voltage.warning**: Whether battery voltage warning is active. (Type: bool)
+- **truck.electric.enabled**: Whether electric is enabled. (Type: bool)
+- **truck.engine.enabled**: Whether engine is enabled. (Type: bool)
+- **truck.lblinker**: Whether the left blinker is enabled. (Type: bool)
+- **truck.rblinker**: Whether the right blinker is enabled. (Type: bool)
+- **truck.hazard.warning**: Whether the hazard warning light is enabled. (Type: bool)
+- **truck.light.lblinker**: Whether the left blinker light is on. (Type: bool)
+- **truck.light.rblinker**: Whether the right blinker light is on. (Type: bool)
+- **truck.light.parking**: Whether the parking lights are enabled. (Type: bool)
+- **truck.light.beam.low**: Whether the low beam lights are enabled. (Type: bool)
+- **truck.light.beam.high**: Whether the high beam lights are enabled. (Type: bool)
+- **truck.light.aux.front**: Intensity of the auxiliary front lights; 1 for dimmed, 2 for full. (Type: u32)
+- **truck.light.aux.roof**: Intensity of the auxiliary roof lights; 1 for dimmed, 2 for full. (Type: u32)
+- **truck.light.beacon**: Whether the beacon lights are enabled. (Type: bool)
+- **truck.light.brake**: Whether the brake light is active. (Type: bool)
+- **truck.light.reverse**: Whether the reverse light is active. (Type: bool)
+- **truck.wipers**: Whether the wipers are enabled. (Type: bool)
+- **truck.dashboard.backlight**: Intensity of the dashboard backlight as a factor (0;1). (Type: float)
+- **truck.differential_lock**: Whether the differential lock is enabled. (Type: bool)
+- **truck.lift_axle**: Whether the lift axle control is set to the lifted state. (Type: bool)
+- **truck.lift_axle.indicator**: Whether the lift axle indicator is lit. (Type: bool)
+- **truck.trailer.lift_axle**: Whether the trailer lift axle control is set to the lifted state. (Type: bool)
+- **truck.trailer.lift_axle.indicator**: Whether the trailer lift axle indicator is lit. (Type: bool)
+- **truck.wear.engine**: Wear of the engine accessory as a factor (0;1). (Type: float)
+- **truck.wear.transmission**: Wear of the transmission accessory as a factor (0;1). (Type: float)
+- **truck.wear.cabin**: Wear of the cabin accessory as a factor (0;1). (Type: float)
+- **truck.wear.chassis**: Wear of the chassis accessory as a factor (0;1). (Type: float)
+- **truck.wear.wheels**: Average wear of the wheel accessories as a factor (0;1). (Type: float)
+- **truck.odometer**: Value of the odometer in kilometers. (Type: float)
+- **truck.navigation.distance**: Truck’s navigation distance in meters, used by the advisor. (Type: float)
+- **truck.navigation.time**: Truck’s navigation estimated time of arrival in seconds, used by the advisor. (Type: float)
+- **truck.navigation.speed.limit**: Truck’s navigation speed limit in meters per second, used by the advisor. (Type: float)
+- **truck.wheel.suspension.deflection**: Vertical displacement of the wheel from its axle in meters. (Type: indexed float)
+- **truck.wheel.on_ground**: Whether the wheel is in contact with the ground. (Type: indexed bool)
+- **truck.wheel.substance**: Substance below the wheel, indexed by substance ID. (Type: indexed u32)
+- **truck.wheel.angular_velocity**: Angular velocity of the wheel in rotations per second. (Type: indexed float)
+- **truck.wheel.steering**: Steering rotation of the wheel in rotations, from -0.25 to 0.25. (Type: indexed float)
+- **truck.wheel.rotation**: Rolling rotation of the wheel in rotations, from 0.0 to 1.0. (Type: indexed float)
+- **truck.wheel.lift**: Lift state of the wheel, from 0 to 1. (Type: indexed float)
+- **truck.wheel.lift.offset**: Vertical displacement of the wheel axle due to lifting, in meters. (Type: indexed float)
+
+
+- **job.cancelled**: Event triggered when a job is cancelled.
+ - **cancel.penalty**: Penalty for cancelling the job in game currency (Type: s64).
+- **job.delivered**: Event triggered when a job is delivered.
+ - **revenue**: Job revenue in game currency (Type: s64).
+ - **earned_xp**: XP received for the job (Type: s32).
+ - **cargo_damage**: Total cargo damage ranging from 0.0 to 1.0 (Type: float).
+ - **distance_km**: Real distance covered on the job in kilometers (Type: float).
+ - **delivery_time**: Total time spent on the job in game minutes (Type: u32).
+ - **auto_park_used**: Boolean indicating if auto parking was used (Type: bool).
+ - **auto_load_used**: Boolean indicating if auto loading was used (always true for non-cargo market jobs) (Type: bool).
+- **player.fined**: Event triggered when the player gets fined.
+ - **fine_offence**: Type of offence resulting in the fine (e.g., crash, speeding, no lights) (Type: string).
+ - **fine_amount**: Fine amount in game currency (Type: s64).
+- **player.tollgate.paid**: Event triggered when the player pays a tollgate fee.
+ - **pay_amount**: Amount paid for the tollgate in game currency (Type: s64).
+- **player.use.ferry**: Event triggered when the player uses a ferry.
+ - **pay_amount**: Amount paid for the ferry in game currency (Type: s64).
+ - **source_name**: Name of the ferry departure location (Type: string).
+ - **target_name**: Name of the ferry destination location (Type: string).
+ - **source_id**: ID of the ferry departure location (Type: string).
+ - **target_id**: ID of the ferry destination location (Type: string).
+- **player.use.train**: Event triggered when the player uses a train.
+ - **pay_amount**: Amount paid for the train in game currency (Type: s64).
+ - **source_name**: Name of the train departure location (Type: string).
+ - **target_name**: Name of the train destination location (Type: string).
+ - **source_id**: ID of the train departure location (Type: string).
+ - **target_id**: ID of the train destination location (Type: string).
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/auto_screenshot/default_config.yaml b/templates/migration/1_5_0/skills/auto_screenshot/default_config.yaml
new file mode 100644
index 00000000..f45fc65c
--- /dev/null
+++ b/templates/migration/1_5_0/skills/auto_screenshot/default_config.yaml
@@ -0,0 +1,41 @@
+name: AutoScreenshot
+module: skills.auto_screenshot.main
+category: general
+description:
+ en: Take screenshots on command or automatically when you show excitement.
+ de: Mache Screenshots auf Befehl oder automatisch wenn du Aufregung zeigst.
+examples:
+ - question:
+ en: Take a screenshot
+ de: Mache einen Screenshot
+ answer:
+ en: (takes a screenshot of the focused window)
+ de: (macht einen Screenshot des fokussierten Fensters)
+ - question:
+ en: Oh wow, this is crazy!
+ de: Oh wow, das ist verrückt!
+ answer:
+ en: (takes a screenshot of the focused window)
+ de: (macht einen Screenshot des fokussierten Fensters)
+prompt: |
+ You can take a screenshot of the focused window when asked by the user. You can also take a screenshot when you infer something important, exciting, scary or interesting is going on where having a screenshot would create a nice memory of the moment for the user.
+ Use the 'take_screenshot' tool to capture the screenshot, and provide a reason for doing so, such as a direct user request or why you decided the moment was worth capturing.
+ Example 1: User says something like "oh wow" or "oh no" or "this is crazy!"
+ Your response: (use take_screenshot tool with "exciting moment" reason)
+ Example 2: User says "take a screenshot"
+ Your response: (use take_screenshot tool with "user request" reason)
+ Example 3: User says "look at my screen and tell me what you see."
+ Your response: (Use the VisionAI skill if available; do not call the take_screenshot tool. The user does not want a screenshot, they want you to look at what they are seeing.)
+custom_properties:
+ - hint: If you have multiple monitors, the display to capture if detecting the active game window fails
+ id: display
+ name: Display to capture
+ property_type: number
+ required: true
+ value: 1
+ - hint: The default directory to put screenshots in. If left blank, it will default to your WingmanAI config directory in a subdirectory called "/screenshots".
+ id: default_directory
+ name: Default directory
+ property_type: string
+ required: false
+ value: ""
diff --git a/templates/migration/1_5_0/skills/auto_screenshot/logo.png b/templates/migration/1_5_0/skills/auto_screenshot/logo.png
new file mode 100644
index 00000000..851cc1ff
Binary files /dev/null and b/templates/migration/1_5_0/skills/auto_screenshot/logo.png differ
diff --git a/templates/migration/1_5_0/skills/auto_screenshot/main.py b/templates/migration/1_5_0/skills/auto_screenshot/main.py
new file mode 100644
index 00000000..ef9e2530
--- /dev/null
+++ b/templates/migration/1_5_0/skills/auto_screenshot/main.py
@@ -0,0 +1,145 @@
+import os
+import time
+from datetime import datetime
+from typing import TYPE_CHECKING
+from mss import mss
+import pygetwindow as gw
+from PIL import Image
+from api.enums import LogType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from skills.skill_base import Skill
+from services.file import get_writable_dir
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class AutoScreenshot(Skill):
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+ self.default_directory = ""
+ self.display = 1
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.default_directory = self.retrieve_custom_property_value(
+ "default_directory", errors
+ )
+ if not self.default_directory or not os.path.isdir(self.default_directory):
+ self.default_directory = self.get_default_directory()
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ "No default directory set or the configured directory is invalid. Falling back to the Wingman config directory under /screenshots",
+ color=LogType.INFO,
+ )
+
+
+ self.display = self.retrieve_custom_property_value("display", errors)
+
+ return errors
+
+ def get_default_directory(self) -> str:
+ return get_writable_dir("screenshots")
+
+ async def take_screenshot(self, reason: str) -> None:
+ try:
+ focused_window = gw.getActiveWindow()
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Taking screenshot because: {reason}. Focused window: {focused_window}",
+ color=LogType.INFO,
+ )
+
+ window_bbox = {
+ "top": focused_window.top,
+ "left": focused_window.left,
+ "width": focused_window.width,
+ "height": focused_window.height,
+ }
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"{focused_window} bbox detected as: {window_bbox}",
+ color=LogType.INFO,
+ )
+
+ except Exception as e:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Failed to get focused window or window bbox using pygetwindow: {e}. Defaulting to full screen capture.",
+ color=LogType.ERROR,
+ )
+ window_bbox = None
+
+ with mss() as sct:
+ if window_bbox:
+ screenshot = sct.grab(window_bbox)
+ else:
+ main_display = sct.monitors[self.display]
+ screenshot = sct.grab(main_display)
+
+ image = Image.frombytes(
+ "RGB", screenshot.size, screenshot.bgra, "raw", "BGRX"
+ )
+
+ timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+ screenshot_file = os.path.join(self.default_directory, f'{self.wingman.name}_{timestamp}.png')
+ image.save(screenshot_file)
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Screenshot saved at: {screenshot_file}",
+ color=LogType.INFO,
+ )
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "take_screenshot",
+ {
+ "type": "function",
+ "function": {
+ "name": "take_screenshot",
+ "description": "Takes a screenshot of the currently focused game window and saves it in the default directory.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "reason": {
+ "type": "string",
+ "description": "The reason for taking a screenshot.",
+ },
+ },
+ "required": ["reason"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name == "take_screenshot":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ reason = parameters.get("reason", "unspecified reason")
+ await self.take_screenshot(reason)
+ function_response = "Screenshot taken successfully."
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
\ No newline at end of file
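The screenshot file name above combines the Wingman's name with a `strftime` timestamp. As a minimal, dependency-free sketch of that naming scheme (the helper name `build_screenshot_path` is hypothetical, not part of the skill):

```python
import os
from datetime import datetime


def build_screenshot_path(directory: str, wingman_name: str, now: datetime) -> str:
    # Mirrors the skill's naming scheme: <wingman>_<YYYYmmdd_HHMMSS>.png
    timestamp = now.strftime("%Y%m%d_%H%M%S")
    return os.path.join(directory, f"{wingman_name}_{timestamp}.png")
```

Passing in `now` rather than calling `datetime.now()` inside the helper keeps the function deterministic and easy to test.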
diff --git a/templates/migration/1_5_0/skills/auto_screenshot/requirements.txt b/templates/migration/1_5_0/skills/auto_screenshot/requirements.txt
new file mode 100644
index 00000000..dfb45a6c
--- /dev/null
+++ b/templates/migration/1_5_0/skills/auto_screenshot/requirements.txt
@@ -0,0 +1,3 @@
+mss==9.0.1
+pygetwindow==0.0.9
+pillow==10.3.0
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/control_windows/default_config.yaml b/templates/migration/1_5_0/skills/control_windows/default_config.yaml
new file mode 100644
index 00000000..c1c2be98
--- /dev/null
+++ b/templates/migration/1_5_0/skills/control_windows/default_config.yaml
@@ -0,0 +1,30 @@
+name: ControlWindows
+module: skills.control_windows.main
+category: general
+description:
+ en: Launch applications and control your Windows computer. Utilize the clipboard of your computer.
+ de: Starte Anwendungen und steuere deinen Windows-Computer. Nutze die Zwischenablage deines Computers.
+hint:
+ en: This skill looks for applications in your Start Menu directory (%APPDATA%/Microsoft/Windows/Start Menu/Programs), so if it tells you that it cannot find an application, create a shortcut in that directory.
+ de: Dieser Skill sucht nach Anwendungen in deinem Startmenü-Verzeichnis (%APPDATA%/Microsoft/Windows/Start Menu/Programs). Wenn er dir also sagt, dass er eine Anwendung nicht finden kann, erstelle eine Verknüpfung in diesem Verzeichnis.
+examples:
+ - question:
+ en: Open Spotify.
+ de: Öffne Spotify.
+ answer:
+ en: (opens the Spotify application)
+ de: (öffnet die Spotify-Anwendung)
+ - question:
+ en: Activate Notepad.
+ de: Aktiviere Notepad.
+ answer:
+ en: (maximizes the Notepad application)
+ de: (maximiert die Notepad Anwendung)
+ - question:
+ en: Close Notepad.
+ de: Schließe Notepad.
+ answer:
+ en: (closes the Notepad application)
+ de: (schließt die Notepad Anwendung)
+prompt: |
+ You can also control Windows Functions, like opening, closing, moving, and listing active applications, reading text from the clipboard, and placing or saving text on the clipboard.
diff --git a/templates/migration/1_5_0/skills/control_windows/logo.png b/templates/migration/1_5_0/skills/control_windows/logo.png
new file mode 100644
index 00000000..3242b570
Binary files /dev/null and b/templates/migration/1_5_0/skills/control_windows/logo.png differ
diff --git a/templates/migration/1_5_0/skills/control_windows/main.py b/templates/migration/1_5_0/skills/control_windows/main.py
new file mode 100644
index 00000000..dabb02ef
--- /dev/null
+++ b/templates/migration/1_5_0/skills/control_windows/main.py
@@ -0,0 +1,341 @@
+import os
+import time
+from pathlib import Path
+from typing import TYPE_CHECKING
+import pygetwindow as gw
+from clipboard import Clipboard
+from api.interface import SettingsConfig, SkillConfig
+from api.enums import LogType
+from skills.skill_base import Skill
+import mouse.mouse as mouse
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class ControlWindows(Skill):
+
+ # Paths to Start Menu directories
+ start_menu_paths: list[Path] = [
+ Path(os.environ["APPDATA"], "Microsoft", "Windows", "Start Menu", "Programs"),
+ Path(
+ os.environ["PROGRAMDATA"], "Microsoft", "Windows", "Start Menu", "Programs"
+ ),
+ ]
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ # Function to recursively list files in a directory
+ def list_files(self, directory, extension=""):
+ for item in directory.iterdir():
+ if item.is_dir():
+ yield from self.list_files(item, extension)
+ elif item.is_file() and item.suffix == extension:
+ yield item
+
+ # Microsoft does odd things with its tab titles (see https://github.com/asweigart/PyGetWindow/issues/54),
+ # so first look for windows matching the app name and, if none match, retry with a unicode special character inserted.
+ def get_and_check_windows(self, app_name):
+ windows = gw.getWindowsWithTitle(app_name)
+ if not windows and "Microsoft Edge".lower() in app_name.lower():
+ app_name = app_name.replace("Microsoft Edge", "Microsoft\u200b Edge")
+ app_name = app_name.replace("microsoft edge", "Microsoft\u200b Edge")
+ windows = gw.getWindowsWithTitle(app_name)
+ if not windows:
+ return None
+ return windows
+
+ # Function to search and start an application
+ def search_and_start(self, app_name):
+ for start_menu_path in self.start_menu_paths:
+ if start_menu_path.exists():
+ for file_path in self.list_files(start_menu_path, ".lnk"):
+ if app_name.lower() in file_path.stem.lower():
+ # Attempt to start the application
+ try:
+ os.startfile(str(file_path))
+ # subprocess.Popen([str(file_path)])
+ except Exception:
+ return False
+
+ return True
+
+ return False
+
+ def close_application(self, app_name):
+ windows = self.get_and_check_windows(app_name)
+ if windows and len(windows) > 0:
+ for window in windows:
+ try:
+ window.close()
+ except Exception:
+ return False
+
+ return True
+
+ return False
+
+ def execute_ui_command(self, app_name: str, command: str):
+ windows = self.get_and_check_windows(app_name)
+ if windows and len(windows) > 0:
+ for window in windows:
+ try:
+ getattr(window, command)()
+ except AttributeError:
+ pass
+
+ return True
+
+ return False
+
+ def activate_application(self, app_name: str):
+ windows = self.get_and_check_windows(app_name)
+ if windows and len(windows) > 0:
+ for window in windows:
+ # See https://github.com/asweigart/PyGetWindow/issues/36#issuecomment-919332733 for why just regular "activate" may not work
+ try:
+ window.minimize()
+ window.restore()
+ window.activate()
+ except Exception:
+ return False
+
+ return True
+
+ return False
+
+ async def move_application(self, app_name: str, command: str):
+ windows = self.get_and_check_windows(app_name)
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Windows found in move_application function matching {app_name}: {windows}",
+ color=LogType.INFO,
+ )
+
+ if windows and len(windows) > 0:
+ for window in windows:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Executing move_application command for: {window.title}",
+ color=LogType.INFO,
+ )
+ # Make sure application is active before moving it
+ try:
+ window.minimize()
+ window.restore()
+ # Temporarily maximize it, let windows do the work of what maximize means based on the user's setup
+ window.maximize()
+ time.sleep(0.5)
+ except Exception:
+ pass
+ # Assume that maximize is a proxy for the appropriate full size of a window in this setup, use that to calculate resize
+ monitor_width, monitor_height = window.size
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Before resize and move, {window.title} is {window.size} and is located at {window.topleft}.",
+ color=LogType.INFO,
+ )
+
+ try:
+ if "left" in command:
+ window.resizeTo(int(monitor_width * 0.5), int(monitor_height))
+ window.moveTo(0, 0)
+ if "right" in command:
+ window.resizeTo(int(monitor_width * 0.5), int(monitor_height))
+ window.moveTo(int(monitor_width * 0.5), 0)
+ if "top" in command:
+ window.resizeTo(int(monitor_width), int(monitor_height * 0.5))
+ window.moveTo(0, 0)
+ if "bottom" in command:
+ window.resizeTo(int(monitor_width), int(monitor_height * 0.5))
+ window.moveTo(0, int(monitor_height * 0.5))
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Executed move_application command {command}; {window.title} is now {window.size} and is located at {window.topleft}.",
+ color=LogType.INFO,
+ )
+ # Check if resize and move command really worked, if not return false so wingmanai does not tell user command was successful when it was not
+ if (monitor_width, monitor_height) == window.size:
+ # Try last ditch manual move if moving to left or right
+ if "left" in command:
+ mouse.move(int(monitor_width * 0.5), 10, duration=1.0)
+ time.sleep(0.1)
+ mouse.press(button="left")
+ mouse.move(20, 10, duration=1.0)
+ time.sleep(0.1)
+ mouse.release(button="left")
+ return True
+
+ elif "right" in command:
+ mouse.move(int(monitor_width * 0.5), 10, duration=1.0)
+ time.sleep(0.1)
+ mouse.press(button="left")
+ mouse.move(monitor_width - 20, 10, duration=1.0)
+ time.sleep(0.1)
+ mouse.release(button="left")
+ return True
+ # Return False as failed if could not move through any method
+ return False
+ return True
+
+ # If any errors in trying to move and resize windows, return false as well
+ except Exception:
+ return False
+
+ # If no windows found, return false
+ return False
+
+ async def list_applications(self):
+ window_titles = gw.getAllTitles()
+ if window_titles:
+ titles_as_string = ", ".join(window_titles)
+ response = (
+ f"List of all application window titles found: {titles_as_string}."
+ )
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"list_applications command found these applications: {titles_as_string}",
+ color=LogType.INFO,
+ )
+ return response
+ return False
+
+ def place_text_on_clipboard(self, text: str) -> str:
+ try:
+ with Clipboard() as clipboard:
+ clipboard.set_clipboard(text)
+ return "Text successfully placed on clipboard."
+ except KeyError:
+ return "Error: Cannot save content to Clipboard as text. Images and other non-text content cannot be processed."
+ except Exception as e:
+ return f"Error: {str(e)}"
+
+ def get_text_from_clipboard(self) -> str:
+ try:
+ with Clipboard() as clipboard:
+ text = clipboard["text"]
+ return f"Text copied from clipboard: {text}"
+ except KeyError:
+ return "Error: Clipboard has no text. Images and other non-text content of the clipboard cannot be processed."
+ except Exception as e:
+ return f"Error: {str(e)}"
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "control_windows_functions",
+ {
+ "type": "function",
+ "function": {
+ "name": "control_windows_functions",
+ "description": "Control Windows Functions, like opening, closing, listing, and moving applications, and reading clipboard content.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "command": {
+ "type": "string",
+ "description": "The command to execute",
+ "enum": [
+ "open",
+ "close",
+ "minimize",
+ "maximize",
+ "restore",
+ "activate",
+ "snap_left",
+ "snap_right",
+ "snap_top",
+ "snap_bottom",
+ "list_applications",
+ "read_clipboard_content",
+ "place_text_on_clipboard",
+ ],
+ },
+ "parameter": {
+ "type": "string",
+ "description": "The parameter for the command. For example, the application name to open, close, or move. Or the information to get or use.",
+ },
+ },
+ "required": ["command"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = "Error: Application not found."
+ instant_response = ""
+
+ if tool_name == "control_windows_functions":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing control_windows_functions with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+
+ parameter = parameters.get("parameter")
+
+ if parameters["command"] == "open":
+ app_started = self.search_and_start(parameter)
+ if app_started:
+ function_response = "Application started."
+
+ elif parameters["command"] == "close":
+ app_closed = self.close_application(parameter)
+ if app_closed:
+ function_response = "Application closed."
+
+ elif parameters["command"] == "activate":
+ app_activated = self.activate_application(parameter)
+ if app_activated:
+ function_response = "Application activated."
+
+ elif any(
+ word in parameters["command"].lower()
+ for word in ["left", "right", "top", "bottom"]
+ ):
+ command = parameters["command"].lower()
+ app_moved = await self.move_application(parameter, command)
+ if app_moved:
+ function_response = "Application moved"
+ else:
+ function_response = "There was a problem moving that application. The application may not support moving it through automation."
+
+ elif "list" in parameters["command"].lower():
+ apps_listed = await self.list_applications()
+ if apps_listed:
+ function_response = apps_listed
+ else:
+ function_response = (
+ "There was a problem getting your list of applications."
+ )
+
+ elif "read_clipboard" in parameters["command"].lower():
+ text_received = self.get_text_from_clipboard()
+ function_response = text_received
+
+ elif "clipboard" in parameters["command"].lower():
+ text_placed = self.place_text_on_clipboard(parameter)
+ function_response = text_placed
+
+ else:
+ command = parameters["command"]
+ command_succeeded = self.execute_ui_command(parameter, command)
+ if command_succeeded:
+ function_response = f"Application {command}."
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
diff --git a/templates/migration/1_5_0/skills/control_windows/requirements.txt b/templates/migration/1_5_0/skills/control_windows/requirements.txt
new file mode 100644
index 00000000..b7b47cdc
--- /dev/null
+++ b/templates/migration/1_5_0/skills/control_windows/requirements.txt
@@ -0,0 +1,2 @@
+pygetwindow==0.0.9
+clip-util==0.1.27
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/file_manager/default_config.yaml b/templates/migration/1_5_0/skills/file_manager/default_config.yaml
new file mode 100644
index 00000000..736a91d8
--- /dev/null
+++ b/templates/migration/1_5_0/skills/file_manager/default_config.yaml
@@ -0,0 +1,49 @@
+name: FileManager
+module: skills.file_manager.main
+category: general
+description:
+ en: Manage local files, save, load and create directories. Supports various text-based file formats.
+ de: Verwalte lokale Dateien, speichere, lade oder erstelle Verzeichnisse. Unterstützt verschiedene text-basierte Formate.
+hint:
+  en:
+    You should provide an exact file path and name for where you want to create a directory or save or load a text file. For example "save that text to a file called samplefile in my C drive in the directory called Documents."
+    If you do not, a directory called "files" in your Wingman config dir will be created and used.
+    Supported file formats are plain text file formats, such as txt, md, log, yaml, py, json, etc.
+  de:
+    Gib einen möglichst genauen Speicherort für deine Verzeichnisse oder Dateien an, beispielsweise "Speichere diesen Text in eine Datei namens beispieldatei auf meinem C-Laufwerk im Verzeichnis Dokumente".
+    Wenn du das nicht machst, wird ein Verzeichnis namens "files" in deinem Wingman-Konfigurationsverzeichnis erstellt und verwendet.
+    Unterstützte Dateiformate sind einfache Textdateiformate wie txt, md, log, yaml, py, json usw.
+examples:
+ - question:
+ en: Save 'Hello, World!' to hello.txt.
+ de: Speichere 'Hallo, Welt!' in hello.txt.
+ answer:
+ en: (saves 'Hello, World!' to hello.txt in the default directory)
+ de: (speichert 'Hallo, Welt!' in hello.txt im Standardverzeichnis)
+ - question:
+ en: Load the content from notes.md.
+ de: Lade den Inhalt aus notes.md.
+ answer:
+ en: (loads the content of notes.md and reads it out loud)
+ de: (lädt den Inhalt von notes.md und liest ihn vor)
+ - question:
+ en: Create a directory named 'Projects'.
+ de: Erstelle ein Verzeichnis namens 'Projekte'.
+ answer:
+ en: (creates a directory named 'Projects' in the default directory)
+ de: (erstellt ein Verzeichnis namens 'Projekte' im Standardverzeichnis)
+prompt: |
+ You can also save text to various file formats, load text from files, or create directories as specified by the user.
+ You support all plain text file formats.
+ When adding text to an existing file, you follow these rules:
+ (1) determine if it is appropriate to add a new line before the added text or ask the user if you do not know.
+ (2) only add content to an existing file if you are sure that is what the user wants.
+ (3) when adding content to a file, only add the specific additional content the user wants added, not a duplicate of all of the original content.
+ You can also aid the user in opening folders / directories in the user interface.
+custom_properties:
+ - hint: The default directory for file operations. If left blank, will default to your WingmanAI config directory in a sub-directory called "files".
+ id: default_directory
+ name: Default directory
+ property_type: string
+ required: false
+ value: ""
+ - hint: Allow WingmanAI FileManager to overwrite existing files. CAUTION - ADVANCED USERS ONLY - Only activate this option if you have backed up existing files.
+ id: allow_overwrite_existing
+ name: Allow overwrite existing files
+ property_type: boolean
+ required: true
+ value: false
diff --git a/templates/migration/1_5_0/skills/file_manager/logo.png b/templates/migration/1_5_0/skills/file_manager/logo.png
new file mode 100644
index 00000000..81ea6846
Binary files /dev/null and b/templates/migration/1_5_0/skills/file_manager/logo.png differ
diff --git a/templates/migration/1_5_0/skills/file_manager/main.py b/templates/migration/1_5_0/skills/file_manager/main.py
new file mode 100644
index 00000000..b06f6653
--- /dev/null
+++ b/templates/migration/1_5_0/skills/file_manager/main.py
@@ -0,0 +1,390 @@
+import os
+import json
+from typing import TYPE_CHECKING
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from api.enums import LogType
+from skills.skill_base import Skill
+from services.file import get_writable_dir
+from showinfm import show_in_file_manager
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+DEFAULT_MAX_TEXT_SIZE = 15000
+SUPPORTED_FILE_EXTENSIONS = [
+ "adoc",
+ "asc",
+ "bat",
+ "bib",
+ "cfg",
+ "conf",
+ "cpp",
+ "c",
+ "cs",
+ "css",
+ "csv",
+ "dockerfile",
+ "dot",
+ "env",
+ "fo",
+ "gd",
+ "gitconfig",
+ "gitignore",
+ "go",
+ "graphql",
+ "h",
+ "htaccess",
+ "html",
+ "http",
+ "ini",
+ "ipynb",
+ "java",
+ "json",
+ "jsonl",
+ "js",
+ "lua",
+ "log",
+ "m3u",
+ "map",
+ "md",
+ "pyd",
+ "plist",
+ "pl",
+ "po",
+ "ps1",
+ "pxd",
+ "py",
+ "resx",
+ "rpy",
+ "rs",
+ "rst",
+ "rtf",
+ "srt",
+ "sh",
+ "sql",
+ "svg",
+ "ts",
+ "tcl",
+ "tex",
+ "tmpl",
+ "toml",
+ "tpl",
+ "tsv",
+ "txt",
+ "vtt",
+ "wsdl",
+ "wsgi",
+ "xlf",
+ "xml",
+ "yaml",
+]
+
+
+class FileManager(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.allowed_file_extensions = SUPPORTED_FILE_EXTENSIONS
+ self.default_file_extension = "txt"
+ self.max_text_size = DEFAULT_MAX_TEXT_SIZE
+ self.default_directory = "" # Set in validate
+ self.allow_overwrite_existing = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.default_directory = self.retrieve_custom_property_value(
+ "default_directory", errors
+ )
+ if not self.default_directory or self.default_directory == "":
+ self.default_directory = self.get_default_directory()
+
+ self.allow_overwrite_existing = self.retrieve_custom_property_value(
+ "allow_overwrite_existing", errors
+ )
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "load_text_from_file",
+ {
+ "type": "function",
+ "function": {
+ "name": "load_text_from_file",
+ "description": "Loads the content of a specified text file.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "file_name": {
+ "type": "string",
+ "description": "The name of the file to load, including the file extension.",
+ },
+ "directory_path": {
+ "type": "string",
+ "description": "The directory from where the file should be loaded. Defaults to the configured directory.",
+ },
+ },
+ "required": ["file_name"],
+ },
+ },
+ },
+ ),
+ (
+ "save_text_to_file",
+ {
+ "type": "function",
+ "function": {
+ "name": "save_text_to_file",
+ "description": "Saves the provided text to a file.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "file_name": {
+ "type": "string",
+ "description": "The name of the file where the text should be saved, including the file extension.",
+ },
+ "text_content": {
+ "type": "string",
+ "description": "The text content to save to the file.",
+ },
+ "directory_path": {
+ "type": "string",
+ "description": "The directory where the file should be saved. Defaults to the configured directory.",
+ },
+ "add_to_existing_file": {
+ "type": "boolean",
+ "description": "Boolean True/False indicator of whether the user wants to add text to an already existing file. Defaults to False unless user expresses clear intent to add to existing file.",
+ },
+ },
+ "required": [
+ "file_name",
+ "text_content",
+ "add_to_existing_file",
+ ],
+ },
+ },
+ },
+ ),
+ (
+ "create_folder",
+ {
+ "type": "function",
+ "function": {
+ "name": "create_folder",
+ "description": "Creates a folder in the specified directory.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "folder_name": {
+ "type": "string",
+ "description": "The name of the folder to create.",
+ },
+ "directory_path": {
+ "type": "string",
+ "description": "The path of the directory where the folder should be created. Defaults to the configured directory.",
+ },
+ },
+ "required": ["folder_name"],
+ },
+ },
+ },
+ ),
+ (
+ "open_folder",
+ {
+ "type": "function",
+ "function": {
+ "name": "open_folder",
+ "description": "Opens a specified directory in the GUI.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "folder_name": {
+ "type": "string",
+ "description": "The name of the folder to open.",
+ },
+ "directory_path": {
+ "type": "string",
+ "description": "The path of the directory where the folder to open is located. Defaults to the configured directory.",
+ },
+ },
+ "required": ["folder_name"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = "Operation not completed."
+ instant_response = ""
+
+ if tool_name == "load_text_from_file":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing load_text_from_file with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ file_name = parameters.get("file_name")
+ directory = parameters.get("directory_path", self.default_directory)
+ if directory == "":
+ directory = self.default_directory
+ if not file_name or file_name == "":
+ function_response = "File name not provided."
+ else:
+ file_extension = file_name.split(".")[-1]
+ if file_extension not in self.allowed_file_extensions:
+ function_response = f"Unsupported file extension: {file_extension}"
+ else:
+ file_path = os.path.join(directory, file_name)
+ try:
+ with open(file_path, "r", encoding="utf-8") as file:
+ file_content = file.read()
+ if len(file_content) > self.max_text_size:
+ function_response = (
+ "File content exceeds the maximum allowed size."
+ )
+ else:
+ function_response = f"File content loaded from {file_path}:\n{file_content}"
+ except FileNotFoundError:
+ function_response = (
+ f"File '{file_name}' not found in '{directory}'."
+ )
+ except Exception as e:
+ function_response = (
+ f"Failed to read file '{file_name}': {str(e)}"
+ )
+
+ elif tool_name == "save_text_to_file":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing save_text_to_file with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ file_name = parameters.get("file_name")
+ text_content = parameters.get("text_content")
+ add_to_existing_file = parameters.get("add_to_existing_file", False)
+ directory = parameters.get("directory_path", self.default_directory)
+ if directory == "":
+ directory = self.default_directory
+ if not file_name or not text_content or file_name == "":
+ function_response = "File name or text content not provided."
+ else:
+ file_extension = file_name.split(".")[-1]
+ if file_extension not in self.allowed_file_extensions:
+ file_name += f".{self.default_file_extension}"
+ if len(text_content) > self.max_text_size:
+ function_response = "Text content exceeds the maximum allowed size."
+ else:
+ if file_extension == "json":
+ try:
+ json_content = json.loads(text_content)
+ text_content = json.dumps(json_content, indent=4)
+ except json.JSONDecodeError as e:
+ function_response = f"Invalid JSON content: {str(e)}"
+ return function_response, instant_response
+ os.makedirs(directory, exist_ok=True)
+ file_path = os.path.join(directory, file_name)
+
+ # If file already exists, and user does not have overwrite option on, and LLM did not detect an intent to add to the existing file, stop
+ if (
+ os.path.isfile(file_path)
+ and not self.allow_overwrite_existing
+ and not add_to_existing_file
+ ):
+ function_response = f"File '{file_name}' already exists at {directory} and overwrite is not allowed."
+
+ # Otherwise, if file exists but LLM detected user wanted to add to existing file, do that.
+ elif os.path.isfile(file_path) and add_to_existing_file:
+ try:
+ with open(file_path, "a", encoding="utf-8") as file:
+ file.write(text_content)
+ function_response = (
+ f"Text added to existing file at {file_path}."
+ )
+ except Exception as e:
+ function_response = (
+ f"Failed to append text to {file_path}: {str(e)}"
+ )
+
+ # We are either fine with completely overwriting the file or it does not exist already
+ else:
+ try:
+ with open(file_path, "w", encoding="utf-8") as file:
+ file.write(text_content)
+ function_response = f"Text saved to {file_path}."
+ except Exception as e:
+ function_response = (
+ f"Failed to save text to {file_path}: {str(e)}"
+ )
+
+ elif tool_name == "create_folder":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing create_folder with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ folder_name = parameters.get("folder_name")
+ directory_path = parameters.get("directory_path", self.default_directory)
+ if directory_path == "":
+ directory_path = self.default_directory
+ if not folder_name or folder_name == "":
+ function_response = "Folder name not provided."
+ else:
+ full_path = os.path.join(directory_path, folder_name)
+ try:
+ os.makedirs(full_path, exist_ok=True)
+ function_response = (
+ f"Folder '{folder_name}' created at '{directory_path}'."
+ )
+ except Exception as e:
+ function_response = (
+ f"Failed to create folder '{folder_name}': {str(e)}"
+ )
+
+ elif tool_name == "open_folder":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing open_folder with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ folder_name = parameters.get("folder_name")
+ directory_path = parameters.get("directory_path", self.default_directory)
+ if directory_path == "":
+ directory_path = self.default_directory
+ if not folder_name or folder_name == "":
+ function_response = "Folder name not provided."
+ else:
+ full_path = os.path.join(directory_path, folder_name)
+ try:
+ show_in_file_manager(full_path)
+ function_response = (
+ f"Folder '{folder_name}' opened in '{directory_path}'."
+ )
+ except Exception as e:
+ function_response = (
+ f"Failed to open folder '{folder_name}': {str(e)}"
+ )
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Finished calling {tool_name} tool and returned function response: {function_response}",
+ color=LogType.INFO,
+ )
+ return function_response, instant_response
+
+ def get_default_directory(self) -> str:
+ return get_writable_dir("files")
diff --git a/templates/migration/1_5_0/skills/nms_assistant/default_config.yaml b/templates/migration/1_5_0/skills/nms_assistant/default_config.yaml
new file mode 100644
index 00000000..3504dc60
--- /dev/null
+++ b/templates/migration/1_5_0/skills/nms_assistant/default_config.yaml
@@ -0,0 +1,19 @@
+name: NMSAssistant
+module: skills.nms_assistant.main
+category: no_mans_sky
+description:
+ en: Fetch information about No Man's Sky items, elements, crafting, cooking, expeditions, community missions, game news, and patch notes. Powered by NMS Assistant API.
+ de: Rufe Informationen über No Man's Sky-Artikel, Elemente, Handwerk, Kochen, Expeditionen, Gemeinschaftsmissionen, Spielemeldungen und Patch-Notizen ab. Unterstützt von NMS Assistant API.
+hint:
+ en: Try asking about an item generally first so the skill retrieves enough identifying information, before asking more detailed crafting or cooking information about it.
+ de: Versuch erstmal allgemein nach einem Gegenstand zu fragen, damit die Fähigkeit genug identifizierende Informationen erhält, bevor du nach detaillierten Handwerks- oder Kochinformationen fragst.
+examples:
+ - question:
+ en: What are the latest patch notes?
+ de: Was sind die neuesten Patch-Notes?
+ answer:
+ en: (fetches and displays the latest patch notes from the game)
+ de: (ruft die neuesten Patch-Notes aus dem Spiel ab und zeigt sie an)
+prompt: |
+ You are an assistant for No Man's Sky players. You can fetch information about items, elements, crafting, cooking, expeditions, community missions, and game news.
+ Use the relevant APIs to provide detailed information requested by the user.
diff --git a/templates/migration/1_5_0/skills/nms_assistant/logo.png b/templates/migration/1_5_0/skills/nms_assistant/logo.png
new file mode 100644
index 00000000..36be7aa4
Binary files /dev/null and b/templates/migration/1_5_0/skills/nms_assistant/logo.png differ
diff --git a/templates/migration/1_5_0/skills/nms_assistant/main.py b/templates/migration/1_5_0/skills/nms_assistant/main.py
new file mode 100644
index 00000000..aba51977
--- /dev/null
+++ b/templates/migration/1_5_0/skills/nms_assistant/main.py
@@ -0,0 +1,402 @@
+import requests
+import json
+from typing import TYPE_CHECKING
+from api.enums import LogType
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+ WingmanInitializationError,
+)
+from skills.skill_base import Skill
+import asyncio
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+API_BASE_URL = "https://api.nmsassistant.com"
+
+class NMSAssistant(Skill):
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "get_release_info",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_release_info",
+ "description": "Fetch release information from No Man's Sky website.",
+ },
+ },
+ ),
+ (
+ "get_news",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_news",
+ "description": "Fetch news from No Man's Sky website.",
+ },
+ },
+ ),
+ (
+ "get_community_mission_info",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_community_mission_info",
+ "description": "Fetch current community mission information.",
+ },
+ },
+ ),
+ (
+ "get_latest_expedition_info",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_latest_expedition_info",
+ "description": "Fetch latest expedition information.",
+ },
+ },
+ ),
+ (
+ "get_item_info_by_name",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_item_info_by_name",
+ "description": "Fetch game item details based on name and language.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "name": {
+ "type": "string",
+ "description": "The name of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["name", "languageCode"],
+ },
+ },
+ },
+ ),
+ (
+ "get_extra_item_info",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_extra_item_info",
+ "description": "Fetch extra item details using appId.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "appId": {
+ "type": "string",
+ "description": "The appId of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["appId", "languageCode"],
+ },
+ },
+ },
+ ),
+ (
+ "get_refiner_recipes_by_input",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_refiner_recipes_by_input",
+ "description": "Fetch refiner recipes by input item using appId.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "appId": {
+ "type": "string",
+ "description": "The appId of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["appId", "languageCode"],
+ },
+ },
+ },
+ ),
+ (
+ "get_refiner_recipes_by_output",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_refiner_recipes_by_output",
+ "description": "Fetch refiner recipes by output item using appId.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "appId": {
+ "type": "string",
+ "description": "The appId of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["appId", "languageCode"],
+ },
+ },
+ },
+ ),
+ (
+ "get_cooking_recipes_by_input",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_cooking_recipes_by_input",
+ "description": "Fetch cooking recipes by input item using appId.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "appId": {
+ "type": "string",
+ "description": "The appId of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["appId", "languageCode"],
+ },
+ },
+ },
+ ),
+ (
+ "get_cooking_recipes_by_output",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_cooking_recipes_by_output",
+ "description": "Fetch cooking recipes by output item using appId.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "appId": {
+ "type": "string",
+ "description": "The appId of the item.",
+ },
+ "languageCode": {
+ "type": "string",
+ "description": "The language code (e.g., 'en' for English)",
+ },
+ },
+ "required": ["appId", "languageCode"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def request_api(self, endpoint: str) -> dict:
+        response = requests.get(f"{API_BASE_URL}{endpoint}", timeout=10)
+ if response.status_code == 200:
+ return response.json()
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"API request failed to {API_BASE_URL}{endpoint}, status code: {response.status_code}.", color=LogType.INFO)
+ return {}
+
+ async def parse_nms_assistant_api_response(self, api_response) -> dict:
+ def extract_app_ids(data):
+ app_ids = []
+            # data is already a parsed list of recipe entries
+ for entry in data:
+ # Extract the appId from the main entry
+ app_ids.append(entry['appId'])
+ # Extract the appIds from the inputs
+ for input_item in entry['inputs']:
+ app_ids.append(input_item['appId'])
+ # Extract the appId from the output
+ app_ids.append(entry['output']['appId'])
+ return app_ids
+        # Extract appIds, deduplicated to avoid redundant API lookups
+        app_ids = list(dict.fromkeys(extract_app_ids(api_response)))
+ async def fetch_item_name(app_id: str) -> str:
+ data = await self.request_api(f"/ItemInfo/{app_id}/en")
+ return data.get('name', 'Unknown')
+ # Get names for each appId as a key for the LLM
+ tasks = [fetch_item_name(item) for item in app_ids]
+ results = await asyncio.gather(*tasks)
+ return {item: name for item, name in zip(app_ids, results)}
+
+ async def check_if_appId_is_valid(self, appId, languageCode) -> bool:
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"Checking if appID {appId} is valid before proceeding.", color=LogType.INFO)
+ check_response = await self.request_api(f"/ItemInfo/{appId}/{languageCode}")
+        return bool(check_response)
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ async def execute_tool(self, tool_name: str, parameters: dict[str, any]) -> tuple[str, str]:
+ function_response = "Operation failed."
+ instant_response = ""
+
+ if tool_name == "get_release_info":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ data = await self.request_api("/HelloGames/Release")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_news":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ data = await self.request_api("/HelloGames/News")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_community_mission_info":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ data = await self.request_api("/HelloGames/CommunityMission")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_latest_expedition_info":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ data = await self.request_api("/HelloGames/Expedition")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_item_info_by_name":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ name = parameters.get("name")
+ language_code = parameters.get("languageCode")
+ data = await self.request_api(f"/ItemInfo/Name/{name}/{language_code}")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_extra_item_info":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ app_id = parameters.get("appId")
+ language_code = parameters.get("languageCode")
+ if app_id and language_code:
+ appId_found = await self.check_if_appId_is_valid(app_id, language_code)
+ if not appId_found:
+ # Assume maybe the appId is actually the plain text item name, so get appId from that
+ name_check = await self.request_api(f"/ItemInfo/Name/{app_id}/{language_code}")
+ app_id = name_check.get('appId') if name_check else app_id
+ data = await self.request_api(f"/ItemInfo/ExtraProperties/{app_id}/{language_code}")
+ function_response = data if data else function_response
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_refiner_recipes_by_input":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ app_id = parameters.get("appId")
+ language_code = parameters.get("languageCode")
+ if app_id and language_code:
+ appId_found = await self.check_if_appId_is_valid(app_id, language_code)
+ if not appId_found:
+ # Assume maybe the appId is actually the plain text item name, so get appId from that
+ name_check = await self.request_api(f"/ItemInfo/Name/{app_id}/{language_code}")
+ app_id = name_check.get('appId') if name_check else app_id
+ data = await self.request_api(f"/ItemInfo/RefinerByInput/{app_id}/{language_code}")
+ if data:
+ parsed_data = await self.parse_nms_assistant_api_response(data)
+ function_response = f"{data}; key for item names used in above data: {parsed_data}"
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_refiner_recipes_by_output":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ app_id = parameters.get("appId")
+ language_code = parameters.get("languageCode")
+ if app_id and language_code:
+ appId_found = await self.check_if_appId_is_valid(app_id, language_code)
+ if not appId_found:
+ # Assume maybe the appId is actually the plain text item name, so get appId from that
+ name_check = await self.request_api(f"/ItemInfo/Name/{app_id}/{language_code}")
+ app_id = name_check.get('appId') if name_check else app_id
+                data = await self.request_api(f"/ItemInfo/RefinerByOutput/{app_id}/{language_code}")
+ if data:
+ parsed_data = await self.parse_nms_assistant_api_response(data)
+ function_response = f"{data}; key for item names used in above data: {parsed_data}"
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_cooking_recipes_by_input":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ app_id = parameters.get("appId")
+ language_code = parameters.get("languageCode")
+ if app_id and language_code:
+ appId_found = await self.check_if_appId_is_valid(app_id, language_code)
+ if not appId_found:
+ # Assume maybe the appId is actually the plain text item name, so get appId from that
+ name_check = await self.request_api(f"/ItemInfo/Name/{app_id}/{language_code}")
+ app_id = name_check.get('appId') if name_check else app_id
+ data = await self.request_api(f"/ItemInfo/CookingByInput/{app_id}/{language_code}")
+ if data:
+ parsed_data = await self.parse_nms_assistant_api_response(data)
+ function_response = f"{data}; key for item names used in above data: {parsed_data}"
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ elif tool_name == "get_cooking_recipes_by_output":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ app_id = parameters.get("appId")
+ language_code = parameters.get("languageCode")
+ if app_id and language_code:
+ appId_found = await self.check_if_appId_is_valid(app_id, language_code)
+ if not appId_found:
+ # Assume maybe the appId is actually the plain text item name, so get appId from that
+ name_check = await self.request_api(f"/ItemInfo/Name/{app_id}/{language_code}")
+ app_id = name_check.get('appId') if name_check else app_id
+                data = await self.request_api(f"/ItemInfo/CookingByOutput/{app_id}/{language_code}")
+ if data:
+ parsed_data = await self.parse_nms_assistant_api_response(data)
+ function_response = f"{data}; key for item names used in above data: {parsed_data}"
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"Executed {tool_name} with parameters {parameters}. Result: {function_response}", color=LogType.INFO)
+
+ return function_response, instant_response
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/quick_commands/default_config.yaml b/templates/migration/1_5_0/skills/quick_commands/default_config.yaml
new file mode 100644
index 00000000..8cf1a777
--- /dev/null
+++ b/templates/migration/1_5_0/skills/quick_commands/default_config.yaml
@@ -0,0 +1,13 @@
+name: QuickCommands
+module: skills.quick_commands.main
+category: general
+description:
+ en: Automatically learns instant activation phrases for commands you use regularly, speeding up their execution time.
+ de: Lernt automatisch Aktivierungsphrasen für Sofort-Befehle, die du regelmäßig verwendest und beschleunigt so die Ausführungszeit.
+custom_properties:
+ - id: quick_commands_learning_rule_count
+ name: Hits until learn
+  hint: The number of times a command must be executed via the same phrase before that phrase is learned.
+ value: 3
+ required: true
+ property_type: number
diff --git a/templates/migration/1_5_0/skills/quick_commands/logo.png b/templates/migration/1_5_0/skills/quick_commands/logo.png
new file mode 100644
index 00000000..bf8c9aaf
Binary files /dev/null and b/templates/migration/1_5_0/skills/quick_commands/logo.png differ
diff --git a/templates/migration/1_5_0/skills/quick_commands/main.py b/templates/migration/1_5_0/skills/quick_commands/main.py
new file mode 100644
index 00000000..9e6592b3
--- /dev/null
+++ b/templates/migration/1_5_0/skills/quick_commands/main.py
@@ -0,0 +1,289 @@
+from os import path
+import json
+import datetime
+from typing import TYPE_CHECKING
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from api.enums import LogType
+from services.file import get_writable_dir
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class QuickCommands(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ # get file paths
+ self.data_path = get_writable_dir(path.join("skills", "quick_commands", "data"))
+ self.file_ipl = path.join(self.data_path, "instant_phrase_learning.json")
+
+ # learning data
+ self.learning_blacklist = []
+ self.learning_data = {}
+ self.learning_learned = {}
+
+ # rules
+ self.rule_count = 3
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.rule_count = self.retrieve_custom_property_value(
+ "quick_commands_learning_rule_count", errors
+ )
+ if not self.rule_count or self.rule_count < 0:
+ self.rule_count = 3
+
+ self.threaded_execution(self._init_skill)
+ return errors
+
+ async def _init_skill(self) -> None:
+ """Initialize the skill."""
+ await self._load_learning_data()
+ await self._cleanup_learning_data()
+ for phrase, commands in self.learning_learned.items():
+ await self._add_instant_activation_phrase(phrase, commands)
+
+ async def _add_instant_activation_phrase(
+ self, phrase: str, commands: list[str]
+ ) -> None:
+ """Add an instant activation phrase."""
+ for command in commands:
+ command = self.wingman.get_command(command)
+ if not command.instant_activation:
+ command.instant_activation = []
+
+ if phrase not in command.instant_activation:
+ command.instant_activation.append(phrase)
+
+ async def on_add_assistant_message(self, message: str, tool_calls: list) -> None:
+ """Hook to start learning process."""
+ if tool_calls:
+ self.threaded_execution(
+ self._process_messages, tool_calls, self.wingman.messages[-1]
+ )
+
+ async def _process_messages(self, tool_calls, last_message) -> None:
+ """Process messages to learn phrases and commands."""
+ phrase = ""
+ command_names = []
+
+ for tool_call in tool_calls:
+ if tool_call.function.name == "execute_command":
+ if isinstance(tool_call.function.arguments, dict):
+ arguments = tool_call.function.arguments
+ else:
+ try:
+ arguments = json.loads(tool_call.function.arguments)
+ except json.JSONDecodeError:
+ return
+ if "command_name" in arguments:
+ command_names.append(arguments["command_name"])
+ else:
+ return
+ else:
+ return
+
+ role = (
+ last_message.role
+ if hasattr(last_message, "role")
+ else last_message.get("role", False)
+ )
+ if role != "user":
+ return
+ phrase = (
+ last_message.content
+ if hasattr(last_message, "content")
+ else last_message.get("content", False)
+ )
+
+ if not phrase or not command_names:
+ return
+
+ await self._learn_phrase(phrase.lower(), command_names)
+
+ async def _cleanup_learning_data(self) -> None:
+        """Clean up learning data. (Remove entries for commands that are no longer available.)"""
+ pops = []
+ for phrase, commands in self.learning_learned.items():
+ for command in commands:
+ if not self.wingman.get_command(command):
+ pops.append(phrase)
+ if pops:
+ for phrase in pops:
+ self.learning_learned.pop(phrase)
+
+ pops = []
+ finished = []
+ for phrase in self.learning_data.keys():
+ commands = self.learning_data[phrase]["commands"]
+ for command in commands:
+ if not self.wingman.get_command(command):
+ pops.append(phrase)
+ elif self.learning_data[phrase]["count"] >= self.rule_count:
+ finished.append(phrase)
+
+ if pops:
+ for phrase in pops:
+ self.learning_data.pop(phrase)
+ if finished:
+ for phrase in finished:
+ await self._finish_learning(phrase)
+
+ await self._save_learning_data()
+
+ async def _learn_phrase(self, phrase: str, command_names: list[str]) -> None:
+ """Learn a phrase and the tool calls that should be executed for it."""
+ # load the learning data
+ await self._load_learning_data()
+
+ # check if the phrase is on the blacklist
+ if phrase in self.learning_blacklist:
+ return
+
+ # get and check the command
+ for command_name in command_names:
+ command = self.wingman.get_command(command_name)
+ if not command:
+ # AI probably hallucinated
+ return
+
+ # add / increase count of the phrase
+ if phrase in self.learning_data:
+ if len(self.learning_data[phrase]["commands"]) != len(command_names):
+ # phrase is ambiguous, add to blacklist
+ await self._add_to_blacklist(phrase)
+ return
+
+ for command_name in command_names:
+ if command_name not in self.learning_data[phrase]["commands"]:
+ # phrase is ambiguous, add to blacklist
+ await self._add_to_blacklist(phrase)
+ return
+
+ self.learning_data[phrase]["count"] += 1
+ else:
+ self.learning_data[phrase] = {"commands": command_names, "count": 1}
+
+ if self.learning_data[phrase]["count"] >= self.rule_count:
+ await self._finish_learning(phrase)
+
+ # save the learning data
+ await self._save_learning_data()
+
+ async def _finish_learning(self, phrase: str) -> None:
+        """Finish learning a phrase.
+        An LLM call is made to check whether the phrase makes sense.
+        This will add the phrase to the learned phrases."""
+
+ commands = self.learning_data[phrase]["commands"]
+
+ messages = [
+ {
+ "role": "system",
+ "content": """
+                I'll give you one or more commands and a phrase. You have to decide if the commands fit the phrase or not.
+                Return 'yes' if the commands fit the phrase and 'no' if they don't.
+
+ Samples:
+ - Phrase: "What is the weather like?" Command: "checkWeather" -> yes
+ - Phrase: "What is the weather like?" Command: "playMusic" -> no
+ - Phrase: "Please do that." Command: "enableShields" -> no
+ - Phrase: "Yes, please." Command: "enableShields" -> no
+                - Phrase: "We are being attacked by rockets." Command: "throwCountermeasures" -> yes
+                - Phrase: "It's way too dark in here." Command: "toggleLight" -> yes
+ """,
+ },
+ {
+ "role": "user",
+ "content": f"Phrase: '{phrase}' Commands: '{', '.join(commands)}'",
+ },
+ ]
+ completion = await self.llm_call(messages)
+ answer = completion.choices[0].message.content or ""
+        if answer.strip().lower() == "yes":
+ await self.printr.print_async(
+ f"Instant activation phrase for '{', '.join(commands)}' learned.",
+ color=LogType.INFO,
+ )
+ self.learning_learned[phrase] = commands
+ self.learning_data.pop(phrase)
+ await self._add_instant_activation_phrase(phrase, commands)
+ else:
+ await self._add_to_blacklist(phrase)
+
+ await self._save_learning_data()
+
+ async def _add_to_blacklist(self, phrase: str) -> None:
+ """Add a phrase to the blacklist."""
+ await self.printr.print_async(
+ f"Added phrase to blacklist: '{phrase if len(phrase) <= 25 else phrase[:25]+'...'}'",
+ color=LogType.INFO,
+ )
+ self.learning_blacklist.append(phrase)
+        self.learning_data.pop(phrase, None)
+ await self._save_learning_data()
+
+ async def _is_phrase_on_blacklist(self, phrase: str) -> bool:
+ """Check if a phrase is on the blacklist."""
+ if phrase in self.learning_blacklist:
+ return True
+ return False
+
+ async def _load_learning_data(self):
+ """Load the learning data file."""
+
+ # create the file if it does not exist
+ if not path.exists(self.file_ipl):
+ return
+
+ # load the learning data
+ with open(self.file_ipl, "r", encoding="utf-8") as file:
+ try:
+ data = json.load(file)
+ except json.JSONDecodeError:
+ await self.printr.print_async(
+ "Could not read learning data file. Resetting learning data..",
+ color=LogType.ERROR,
+ )
+                # if the file wasn't empty, save its content as a backup
+                # (seek back to the start: json.load already consumed the file)
+                file.seek(0)
+                content = file.read()
+                if content:
+                    timestamp = datetime.datetime.now().strftime("%Y-%m-%d-%H-%M-%S")
+                    with open(
+                        self.file_ipl + f"_{timestamp}.backup", "w", encoding="utf-8"
+                    ) as backup_file:
+                        backup_file.write(content)
+ # reset learning data
+ with open(self.file_ipl, "w", encoding="utf-8") as file:
+ file.write("{}")
+ data = {}
+
+        self.learning_data = data.get("learning_data", {})
+        self.learning_blacklist = data.get("learning_blacklist", [])
+        self.learning_learned = data.get("learning_learned", {})
+
+ async def _save_learning_data(self) -> None:
+ """Save the learning data."""
+
+ learning_data = {
+ "learning_data": self.learning_data,
+ "learning_blacklist": self.learning_blacklist,
+ "learning_learned": self.learning_learned,
+ }
+
+ with open(self.file_ipl, "w", encoding="utf-8") as file:
+ json.dump(learning_data, file, indent=4)
diff --git a/templates/migration/1_5_0/skills/radio_chatter/default_config.yaml b/templates/migration/1_5_0/skills/radio_chatter/default_config.yaml
new file mode 100644
index 00000000..a73869ee
--- /dev/null
+++ b/templates/migration/1_5_0/skills/radio_chatter/default_config.yaml
@@ -0,0 +1,120 @@
+name: RadioChatter
+module: skills.radio_chatter.main
+category: general
+description:
+ en: Randomly playback radio chatter over time. Customize the participants and their voices.
+ de: Spielt zufällige Funkgespräche ab. Passe die Teilnehmer und ihre Stimmen an.
+examples:
+ - question:
+ en: What is the status of the radio?
+ de: Was ist der Status des Funkgeräts?
+ answer:
+ en: The radio is currently turned off.
+ de: Das Funkgerät ist derzeit ausgeschaltet.
+ - question:
+ en: Please turn the radio on.
+ de: Bitte schalte das Funkgerät ein.
+ answer:
+ en: The radio is now turned on.
+ de: Das Funkgerät wurde eingeschaltet.
+custom_properties:
+ - id: prompt
+ name: Prompt for message generation
+ hint: A prompt used on voice change to generate a new personality. Leave empty to disable.
+ required: false
+  value: "Generate a dialog between random pilots in the Star Citizen universe. Feel free to throw in some random details. Keep in mind that Port Olisar no longer exists."
+ property_type: textarea
+ - id: voices
+ name: Available voices
+ hint: The voices used in the radio chatter
+ value: []
+ required: false
+ property_type: voice_selection
+ options:
+ - label: "multiple"
+ value: true
+ - id: interval_min
+ name: Min interval
+ hint: The minimum time in seconds between radio chatter. This is also the time used until the first chatter event occurs.
+ value: 30
+ required: true
+ property_type: number
+ - id: interval_max
+ name: Max interval
+ hint: The maximum time in seconds between radio chatter.
+ value: 600
+ required: true
+ property_type: number
+ - id: messages_min
+ name: Min messages
+  hint: The minimum number of messages to play per chatter event.
+ value: 1
+ required: true
+ property_type: number
+ - id: messages_max
+ name: Max messages
+  hint: The maximum number of messages to play per chatter event.
+ value: 5
+ required: true
+ property_type: number
+ - id: participants_min
+ name: Min participants
+ hint: The minimum number of participants in the chatter.
+ value: 2
+ required: true
+ property_type: number
+ - id: participants_max
+ name: Max participants
+ hint: The maximum number of participants in the chatter.
+ value: 3
+ required: true
+ property_type: number
+ - id: force_radio_sound
+ name: Force radio effect
+  hint: Overrides wingman sound effects for radio chatter with the selected radio effects. Needs at least one valid value in "Available radio effects".
+ value: True
+ required: false
+ property_type: boolean
+ - id: radio_sounds
+  name: Available radio effects
+  hint: A list of radio effects separated by commas that are randomly used when "Force radio effect" is enabled. Possible values are "low", "medium" and "high".
+ value: "low, medium"
+ required: false
+ property_type: string
+ - id: auto_start
+ name: Auto start
+ hint: Automatically start the radio chatter with your Wingman.
+ value: False
+ required: false
+ property_type: boolean
+ - id: volume
+ name: Volume
+ hint: The volume (relative to the Wingman's volume) of the radio chatter. Must be between 0.0 (silent) and 1.0 (same volume as Wingman).
+ value: 0.5
+ required: false
+ property_type: slider
+ options:
+ - label: "min"
+ value: 0.0
+ - label: "max"
+ value: 1.0
+ - label: "step"
+ value: 0.1
+ - id: print_chatter
+ name: Print chatter
+ hint: Print the generated chatter to message overview.
+ value: True
+ required: false
+ property_type: boolean
+ - id: radio_knowledge
+ name: Radio knowledge
+ hint: If enabled, the radio chatter messages will be added to the wingman conversation. This way you can talk with your wingman about the radio chatter.
+ value: False
+ required: false
+ property_type: boolean
+ - id: use_beeps
+ name: Use beeps
+  hint: Use beeps to indicate the start and end of radio chatter messages.
+ value: True
+ required: false
+ property_type: boolean
diff --git a/templates/migration/1_5_0/skills/radio_chatter/logo.png b/templates/migration/1_5_0/skills/radio_chatter/logo.png
new file mode 100644
index 00000000..d2284125
Binary files /dev/null and b/templates/migration/1_5_0/skills/radio_chatter/logo.png differ
diff --git a/templates/migration/1_5_0/skills/radio_chatter/main.py b/templates/migration/1_5_0/skills/radio_chatter/main.py
new file mode 100644
index 00000000..71994968
--- /dev/null
+++ b/templates/migration/1_5_0/skills/radio_chatter/main.py
@@ -0,0 +1,540 @@
+import time
+import copy
+from os import path
+from random import randrange
+from typing import TYPE_CHECKING
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+ VoiceSelection,
+ WingmanInitializationError,
+)
+from api.enums import (
+ LogType,
+ WingmanInitializationErrorType,
+ TtsProvider,
+ WingmanProTtsProvider,
+ SoundEffect,
+)
+from services.file import get_writable_dir
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+class RadioChatter(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.file_path = get_writable_dir(path.join("skills", "radio_chatter", "data"))
+
+ self.last_message = None
+ self.radio_status = False
+ self.loaded = False
+
+ self.prompt = None
+        self.voices: list[VoiceSelection] = []
+ self.interval_min = None
+ self.interval_max = None
+ self.messages_min = None
+ self.messages_max = None
+ self.participants_min = None
+ self.participants_max = None
+ self.force_radio_sound = False
+ self.radio_sounds = []
+ self.use_beeps = False
+ self.auto_start = False
+ self.volume = 1.0
+ self.print_chatter = False
+ self.radio_knowledge = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.prompt = self.retrieve_custom_property_value("prompt", errors)
+
+ # prepare voices
+ voices: list[VoiceSelection] = self.retrieve_custom_property_value(
+ "voices", errors
+ )
+ if voices:
+            # we have to initiate all providers here
+            # we no longer check voice availability or validate the structure
+
+ initiated_providers = []
+ initiate_provider_error = False
+
+ for voice in voices:
+ voice_provider = voice.provider
+ if voice_provider not in initiated_providers:
+ initiated_providers.append(voice_provider)
+
+ # initiate provider
+ if voice_provider == TtsProvider.OPENAI and not self.wingman.openai:
+ await self.wingman.validate_and_set_openai(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.AZURE
+ and not self.wingman.openai_azure
+ ):
+ await self.wingman.validate_and_set_azure(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.ELEVENLABS
+ and not self.wingman.elevenlabs
+ ):
+ await self.wingman.validate_and_set_elevenlabs(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.WINGMAN_PRO
+ and not self.wingman.wingman_pro
+ ):
+ await self.wingman.validate_and_set_wingman_pro()
+
+ if not initiate_provider_error:
+ self.voices = voices
+
+ self.interval_min = self.retrieve_custom_property_value("interval_min", errors)
+ if self.interval_min is not None and self.interval_min < 1:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'interval_min'. Expected a number of one or larger.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.interval_max = self.retrieve_custom_property_value("interval_max", errors)
+        if self.interval_max is not None and (
+            self.interval_max < 1
+            or (self.interval_min is not None and self.interval_max < self.interval_min)
+        ):
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'interval_max'. Expected a number greater than or equal to 'interval_min'.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.messages_min = self.retrieve_custom_property_value("messages_min", errors)
+ if self.messages_min is not None and self.messages_min < 1:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'messages_min'. Expected a number of one or larger.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.messages_max = self.retrieve_custom_property_value("messages_max", errors)
+        if self.messages_max is not None and (
+            self.messages_max < 1
+            or (self.messages_min is not None and self.messages_max < self.messages_min)
+        ):
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'messages_max'. Expected a number greater than or equal to 'messages_min'.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.participants_min = self.retrieve_custom_property_value(
+ "participants_min", errors
+ )
+ if self.participants_min is not None and self.participants_min < 1:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'participants_min'. Expected a number of one or larger.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.participants_max = self.retrieve_custom_property_value(
+ "participants_max", errors
+ )
+        if self.participants_max is not None and (
+            self.participants_max < 1
+            or (
+                self.participants_min is not None
+                and self.participants_max < self.participants_min
+            )
+        ):
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'participants_max'. Expected a number greater than or equal to 'participants_min'.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+
+        if not self.voices or (
+            self.participants_max is not None
+            and self.participants_max > len(self.voices)
+        ):
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Not enough voices available for the configured number of max participants.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+
+ self.force_radio_sound = self.retrieve_custom_property_value(
+ "force_radio_sound", errors
+ )
+
+ self.auto_start = self.retrieve_custom_property_value("auto_start", errors)
+
+ self.volume = self.retrieve_custom_property_value("volume", errors) or 0.5
+ if self.volume < 0 or self.volume > 1:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.wingman.name,
+ message="Invalid value for 'volume'. Expected a number between 0 and 1.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+ self.print_chatter = self.retrieve_custom_property_value(
+ "print_chatter", errors
+ )
+ self.radio_knowledge = self.retrieve_custom_property_value(
+ "radio_knowledge", errors
+ )
+ radio_sounds = self.retrieve_custom_property_value("radio_sounds", errors)
+ # split by comma
+ if radio_sounds:
+ radio_sounds = radio_sounds.lower().replace(" ", "").split(",")
+ if "low" in radio_sounds:
+ self.radio_sounds.append(SoundEffect.LOW_QUALITY_RADIO)
+ if "medium" in radio_sounds:
+ self.radio_sounds.append(SoundEffect.MEDIUM_QUALITY_RADIO)
+ if "high" in radio_sounds:
+ self.radio_sounds.append(SoundEffect.HIGH_END_RADIO)
+ if not self.radio_sounds:
+ self.force_radio_sound = False
+ self.use_beeps = self.retrieve_custom_property_value("use_beeps", errors)
+
+ return errors
+
+ async def prepare(self) -> None:
+ self.loaded = True
+ if self.auto_start:
+ self.threaded_execution(self._init_chatter)
+
+ async def unload(self) -> None:
+ self.loaded = False
+ self.radio_status = False
+
+ def randrange(self, start, stop=None):
+ """Like random.randrange, but returns 'start' instead of raising when the range is empty."""
+ if start == stop:
+ return start
+ return randrange(start, stop)
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "turn_on_radio",
+ {
+ "type": "function",
+ "function": {
+ "name": "turn_on_radio",
+ "description": "Turn the radio on to pick up some chatter on open frequencies.",
+ },
+ },
+ ),
+ (
+ "turn_off_radio",
+ {
+ "type": "function",
+ "function": {
+ "name": "turn_off_radio",
+ "description": "Turn the radio off to no longer pick up pick up chatter on open frequencies.",
+ },
+ },
+ ),
+ (
+ "radio_status",
+ {
+ "type": "function",
+ "function": {
+ "name": "radio_status",
+ "description": "Get the status (on/off) of the radio.",
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name in ["turn_on_radio", "turn_off_radio", "radio_status"]:
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+
+ if tool_name == "turn_on_radio":
+ if self.radio_status:
+ function_response = "Radio is already on."
+ else:
+ self.threaded_execution(self._init_chatter)
+ function_response = "Radio is now on."
+ elif tool_name == "turn_off_radio":
+ if self.radio_status:
+ self.radio_status = False
+ function_response = "Radio is now off."
+ else:
+ function_response = "Radio is already off."
+ elif tool_name == "radio_status":
+ if self.radio_status:
+ function_response = "Radio is on."
+ else:
+ function_response = "Radio is off."
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
+
+ async def _init_chatter(self) -> None:
+ """Start the radio chatter."""
+
+ self.radio_status = True
+ time.sleep(max(5, self.interval_min))  # wait at least 5s (or interval_min, if larger) before the first chatter
+
+ while self.is_active():
+ await self._generate_chatter()
+ interval = self.randrange(self.interval_min, self.interval_max)
+ time.sleep(interval)
+
+ def is_active(self) -> bool:
+ return self.radio_status and self.loaded
+
+ async def _generate_chatter(self):
+ if not self.is_active():
+ return
+
+ count_message = self.randrange(self.messages_min, self.messages_max)
+ count_participants = self.randrange(
+ self.participants_min, self.participants_max
+ )
+
+ messages = [
+ {
+ "role": "system",
+ "content": f"""
+ ## Must follow these rules
+ - There are {count_participants} participant(s) in the conversation/monolog
+ - The conversation/monolog must contain exactly {count_message} messages between the participants or in the monolog
+ - Each new line in your answer represents a new message
+ - Use matching call signs for the participants
+
+ ## Sample response
+ Name1: Message Content
+ Name2: Message Content
+ Name3: Message Content
+ Name2: Message Content
+ ...
+ """,
+ },
+ {
+ "role": "user",
+ "content": str(self.prompt),
+ },
+ ]
+ completion = await self.llm_call(messages)
+ messages = (
+ completion.choices[0].message.content
+ if completion and completion.choices
+ else ""
+ )
+
+ if not messages:
+ return
+
+ clean_messages = []
+ voice_participant_mapping = {}
+ for message in messages.split("\n"):
+ if not message:
+ continue
+
+ # get name before the first ":"; skip lines without a colon
+ if ":" not in message:
+ continue
+ name, text = message.split(":", 1)
+ name = name.strip()
+ text = text.strip()
+
+ if name not in voice_participant_mapping:
+ voice_participant_mapping[name] = None
+
+ clean_messages.append((name, text))
+
+ original_voice_setting = await self._get_original_voice_setting()
+ elevenlabs_streaming = self.wingman.config.elevenlabs.output_streaming
+ original_sound_config = copy.deepcopy(self.wingman.config.sound)
+
+ # copy for volume and effects
+ custom_sound_config = copy.deepcopy(self.wingman.config.sound)
+ custom_sound_config.play_beep = self.use_beeps
+ custom_sound_config.play_beep_apollo = False
+ custom_sound_config.volume = custom_sound_config.volume * self.volume
+
+ voice_index = await self._get_random_voice_index(len(voice_participant_mapping))
+ if not voice_index:
+ return
+ for i, name in enumerate(voice_participant_mapping):
+ sound_config = original_sound_config
+ if self.force_radio_sound:
+ sound_config = copy.deepcopy(custom_sound_config)
+ sound_config.effects = [
+ self.radio_sounds[self.randrange(len(self.radio_sounds))]
+ ]
+
+ voice_participant_mapping[name] = (voice_index[i], sound_config)
+
+ for name, text in clean_messages:
+ if not self.is_active():
+ return
+
+ # wait for the audio player to become idle
+ while self.wingman.audio_player.is_playing:
+ time.sleep(2)
+
+ if not self.is_active():
+ return
+
+ voice_index, sound_config = voice_participant_mapping[name]
+ voice_setting = self.voices[voice_index]
+
+ await self._switch_voice(voice_setting)
+ if self.print_chatter:
+ await self.printr.print_async(
+ text=f"Background radio ({name}): {text}",
+ color=LogType.INFO,
+ source_name=self.wingman.name,
+ )
+ self.threaded_execution(self.wingman.play_to_user, text, True, sound_config)
+ if self.radio_knowledge:
+ await self.wingman.add_assistant_message(
+ f"Background radio chatter: {text}"
+ )
+ while not self.wingman.audio_player.is_playing:
+ time.sleep(0.1)
+ await self._switch_voice(original_voice_setting, elevenlabs_streaming)
+
+ while self.wingman.audio_player.is_playing:
+ time.sleep(1) # stay in function call until last message got played
+
+ async def _get_random_voice_index(self, count: int) -> list[int]:
+ """Switch voice to a random voice from the list."""
+
+ if count > len(self.voices):
+ return []
+
+ if count == len(self.voices):
+ return list(range(len(self.voices)))
+
+ voice_index = []
+ for i in range(count):
+ while True:
+ index = self.randrange(len(self.voices))
+ if index not in voice_index:
+ voice_index.append(index)
+ break
+
+ return voice_index
+
+ async def _switch_voice(
+ self, voice_setting: VoiceSelection = None, elevenlabs_streaming: bool = False
+ ) -> None:
+ """Switch voice to the given voice setting."""
+
+ if not voice_setting:
+ return
+
+ voice_provider = voice_setting.provider
+ voice = voice_setting.voice
+ voice_name = None
+ error = False
+
+ if voice_provider == TtsProvider.WINGMAN_PRO:
+ if voice_setting.subprovider == WingmanProTtsProvider.OPENAI:
+ voice_name = voice.value
+ self.wingman.config.openai.tts_voice = voice
+ elif voice_setting.subprovider == WingmanProTtsProvider.AZURE:
+ voice_name = voice
+ self.wingman.config.azure.tts.voice = voice
+ elif voice_provider == TtsProvider.OPENAI:
+ voice_name = voice.value
+ self.wingman.config.openai.tts_voice = voice
+ elif voice_provider == TtsProvider.ELEVENLABS:
+ voice_name = voice.name or voice.id
+ self.wingman.config.elevenlabs.voice = voice
+ self.wingman.config.elevenlabs.output_streaming = elevenlabs_streaming
+ elif voice_provider == TtsProvider.AZURE:
+ voice_name = voice
+ self.wingman.config.azure.tts.voice = voice
+ elif voice_provider == TtsProvider.XVASYNTH:
+ voice_name = voice.voice_name
+ self.wingman.config.xvasynth.voice = voice
+ elif voice_provider == TtsProvider.EDGE_TTS:
+ voice_name = voice
+ self.wingman.config.edge_tts.voice = voice
+ else:
+ error = True
+
+ if error or not voice_name or not voice_provider:
+ await self.printr.print_async(
+ "Voice switching failed due to an unknown voice provider/subprovider.",
+ LogType.ERROR,
+ )
+ return
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Switching voice to {voice_name} ({voice_provider.value})"
+ )
+
+ self.wingman.config.features.tts_provider = voice_provider
+
+ async def _get_original_voice_setting(self) -> VoiceSelection:
+ voice_provider = self.wingman.config.features.tts_provider
+ voice_subprovider = None
+ voice = None
+
+ if voice_provider == TtsProvider.EDGE_TTS:
+ voice = self.wingman.config.edge_tts.voice
+ elif voice_provider == TtsProvider.ELEVENLABS:
+ voice = self.wingman.config.elevenlabs.voice
+ elif voice_provider == TtsProvider.AZURE:
+ voice = self.wingman.config.azure.tts.voice
+ elif voice_provider == TtsProvider.XVASYNTH:
+ voice = self.wingman.config.xvasynth.voice
+ elif voice_provider == TtsProvider.OPENAI:
+ voice = self.wingman.config.openai.tts_voice
+ elif voice_provider == TtsProvider.WINGMAN_PRO:
+ voice_subprovider = self.wingman.config.wingman_pro.tts_provider
+ if (
+ self.wingman.config.wingman_pro.tts_provider
+ == WingmanProTtsProvider.OPENAI
+ ):
+ voice = self.wingman.config.openai.tts_voice
+ elif (
+ self.wingman.config.wingman_pro.tts_provider
+ == WingmanProTtsProvider.AZURE
+ ):
+ voice = self.wingman.config.azure.tts.voice
+ else:
+ return None
+
+ return VoiceSelection(
+ provider=voice_provider, subprovider=voice_subprovider, voice=voice
+ )
diff --git a/templates/migration/1_5_0/skills/spotify/default_config.yaml b/templates/migration/1_5_0/skills/spotify/default_config.yaml
new file mode 100644
index 00000000..36cb8568
--- /dev/null
+++ b/templates/migration/1_5_0/skills/spotify/default_config.yaml
@@ -0,0 +1,72 @@
+name: Spotify
+module: skills.spotify.main
+category: general
+description:
+ en: Control Spotify using the WebAPI. Play songs, artists, and playlists. Control playback, volume, and more. All powered by AI!
+ de: Steuere Spotify mit der WebAPI. Spiele Lieder, Künstler und Playlists ab. Steuere die Wiedergabe, Lautstärke und mehr. Alles von KI gesteuert!
+hint:
+ en: |
+ Select "Web API" and "Web Playback SDK" as the used APIs.
+ Add http://127.0.0.1:8082 as a Redirect URI.
+ Enter the Client ID and the Client Secret here.
+ de: |
+ Wähle "Web API" und "Web Playback SDK" als verwendete APIs.
+ Füge http://127.0.0.1:8082 als Redirect URI hinzu.
+ Gib die Client ID und das Client Secret hier ein.
+examples:
+ - question:
+ en: Play the song Californication.
+ de: Spiele das Lied Californication.
+ answer:
+ en: Now playing 'Californication' by Red Hot Chili Peppers.
+ de: "'Californication' von Red Hot Chili Peppers wird abgespielt."
+ - question:
+ en: What's the current song?
+ de: Wie heißt das aktuelle Lied?
+ answer:
+ en: You are currently listening to 'Californication' by Red Hot Chili Peppers.
+ de: Du hörst gerade 'Californication' von Red Hot Chili Peppers.
+ - question:
+ en: My girlfriend left me. Play a really sad song to match my mood.
+ de: Meine Freundin hat mich verlassen. Spiele ein wirklich trauriges Lied, das zu meiner Stimmung passt.
+ answer:
+ en: I'm sorry for you. Now playing 'Someone Like You' by Adele.
+ de: Es tut mir leid für dich. Jetzt wird "Someone Like You" von Adele gespielt.
+ - question:
+ en: Play the most popular song from the musical Les Miserables.
+ de: Spiele das beliebteste Lied aus dem Musical Les Miserables.
+ answer:
+ en: Playing 'I Dreamed a Dream' from Les Miserables.
+ de: "'I Dreamed a Dream' aus Les Miserables wird abgespielt."
+ - question:
+ en: That's a cover song. Play the real version!
+ de: Das ist ein Cover-Song. Spiele die echte Version aus dem Film!
+ answer:
+ en: Playing 'I Dreamed a Dream' by Anne Hathaway from Les Miserables.
+ de: Spiele 'I Dreamed a Dream' von Anne Hathaway aus Les Miserables.
+ - question:
+ en: What are my Spotify devices?
+ de: Was sind meine Spotify-Geräte?
+ answer:
+ en: You have 2 devices available - 'Gaming PC' and 'iPhone'.
+ de: Du hast 2 Geräte verfügbar - 'Gaming PC' und 'iPhone'.
+ - question:
+ en: Play the music on my iPhone.
+ de: Spiele die Musik auf meinem iPhone ab.
+ answer:
+ en: Moves the current playback to your iPhone
+ de: Überträgt die Spotify-Wiedergabe auf das iPhone
+prompt: |
+ You are also an expert DJ and music player interface responsible to control the Spotify music player client of the user.
+ You have access to different tools or functions you can call to control the Spotify client using its API.
+ If the user asks you to play a song, resume, stop or pause the current playback etc. use your tools to do so.
+ For some functions, you need parameters like the song or artist name. Try to extract these values from the
+ user's request.
+ Never invent any function parameters. Ask the user for clarification if you are not sure or cannot extract function parameters.
+custom_properties:
+ - hint: Create an app in the Spotify Dashboard at https://developer.spotify.com/dashboard. You'll find the Client ID in the Settings of that app.
+ id: spotify_client_id
+ name: Spotify Client ID
+ required: true
+ value: enter-your-client-id-here
+ property_type: string
+ - hint: Create an app in the Spotify Dashboard at https://developer.spotify.com/dashboard. In the Settings of the app, add http://127.0.0.1:8082 (or any other free port) as Redirect URL. Then enter the same value here.
+ id: spotify_redirect_url
+ name: Spotify Redirect URL
+ required: true
+ value: http://127.0.0.1:8082
+ property_type: string
diff --git a/templates/migration/1_5_0/skills/spotify/logo.png b/templates/migration/1_5_0/skills/spotify/logo.png
new file mode 100644
index 00000000..4ceb1369
Binary files /dev/null and b/templates/migration/1_5_0/skills/spotify/logo.png differ
diff --git a/templates/migration/1_5_0/skills/spotify/main.py b/templates/migration/1_5_0/skills/spotify/main.py
new file mode 100644
index 00000000..82d3cd2d
--- /dev/null
+++ b/templates/migration/1_5_0/skills/spotify/main.py
@@ -0,0 +1,376 @@
+from os import path
+from typing import TYPE_CHECKING
+import spotipy
+from spotipy.oauth2 import SpotifyOAuth
+from api.enums import LogType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from services.file import get_writable_dir
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class Spotify(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.data_path = get_writable_dir(path.join("skills", "spotify", "data"))
+ self.spotify: spotipy.Spotify = None
+ self.available_devices = []
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ secret = await self.retrieve_secret("spotify_client_secret", errors)
+ client_id = self.retrieve_custom_property_value("spotify_client_id", errors)
+ redirect_url = self.retrieve_custom_property_value(
+ "spotify_redirect_url", errors
+ )
+ if secret and client_id and redirect_url:
+ # now that we have everything, initialize the Spotify client
+ cache_handler = spotipy.cache_handler.CacheFileHandler(
+ cache_path=f"{self.data_path}/.cache"
+ )
+ self.spotify = spotipy.Spotify(
+ auth_manager=SpotifyOAuth(
+ client_id=client_id,
+ client_secret=secret,
+ redirect_uri=redirect_url,
+ scope=[
+ "user-library-read",
+ "user-read-currently-playing",
+ "user-read-playback-state",
+ "user-modify-playback-state",
+ "streaming",
+ # "playlist-modify-public",
+ "playlist-read-private",
+ # "playlist-modify-private",
+ "user-library-modify",
+ # "user-read-recently-played",
+ # "user-top-read"
+ ],
+ cache_handler=cache_handler,
+ )
+ )
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "control_spotify_device",
+ {
+ "type": "function",
+ "function": {
+ "name": "control_spotify_device",
+ "description": "Retrieves or sets the audio device of he user that Spotify songs are played on.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "action": {
+ "type": "string",
+ "description": "The playback action to take",
+ "enum": ["get_devices", "set_active_device"],
+ },
+ "device_name": {
+ "type": "string",
+ "description": "The name of the device to set as the active device.",
+ "enum": [
+ device["name"]
+ for device in self.get_available_devices()
+ ],
+ },
+ },
+ "required": ["action"],
+ },
+ },
+ },
+ ),
+ (
+ "control_spotify_playback",
+ {
+ "type": "function",
+ "function": {
+ "name": "control_spotify_playback",
+ "description": "Control the Spotify audio playback with actions like play, pause/stop or play the previous/next track or set the volume level.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "action": {
+ "type": "string",
+ "description": "The playback action to take",
+ "enum": [
+ "play",
+ "pause",
+ "stop",
+ "play_next_track",
+ "play_previous_track",
+ "set_volume",
+ "mute",
+ "get_current_track",
+ "like_song",
+ ],
+ },
+ "volume_level": {
+ "type": "number",
+ "description": "The volume level to set (in percent)",
+ },
+ },
+ "required": ["action"],
+ },
+ },
+ },
+ ),
+ (
+ "play_song_with_spotify",
+ {
+ "type": "function",
+ "function": {
+ "name": "play_song_with_spotify",
+ "description": "Find a song with Spotify to either play it immediately or queue it.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "track": {
+ "type": "string",
+ "description": "The name of the track to play",
+ },
+ "artist": {
+ "type": "string",
+ "description": "The artist that created the track",
+ },
+ "queue": {
+ "type": "boolean",
+ "description": "If true, the song will be queued and played later. Otherwise it will be played immediately.",
+ },
+ },
+ },
+ },
+ },
+ ),
+ (
+ "interact_with_spotify_playlists",
+ {
+ "type": "function",
+ "function": {
+ "name": "interact_with_spotify_playlists",
+ "description": "Play a song from a Spotify playlist or add a song to a playlist.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "action": {
+ "type": "string",
+ "description": "The action to take",
+ "enum": ["get_playlists", "play_playlist"],
+ },
+ "playlist": {
+ "type": "string",
+ "description": "The name of the playlist to interact with",
+ "enum": [
+ playlist["name"]
+ for playlist in self.get_user_playlists()
+ ],
+ },
+ },
+ "required": ["action"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ instant_response = "" # not used here
+ function_response = "Unable to control Spotify."
+
+ if tool_name not in [
+ "control_spotify_device",
+ "control_spotify_playback",
+ "play_song_with_spotify",
+ "interact_with_spotify_playlists",
+ ]:
+ return function_response, instant_response
+
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Spotify: Executing {tool_name} with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+
+ action = parameters.get("action", None)
+ parameters.pop("action", None)
+ function = getattr(self, action if action else tool_name)
+ function_response = function(**parameters)
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
+
+ # HELPERS
+
+ def get_available_devices(self):
+ devices = [
+ device
+ for device in self.spotify.devices().get("devices")
+ if not device["is_restricted"]
+ ]
+ return devices
+
+ def get_active_devices(self):
+ active_devices = [
+ device
+ for device in self.spotify.devices().get("devices")
+ if device["is_active"]
+ ]
+ return active_devices
+
+ def get_user_playlists(self):
+ playlists = self.spotify.current_user_playlists()
+ return playlists["items"]
+
+ def get_playlist_uri(self, playlist_name: str):
+ playlists = self.spotify.current_user_playlists()
+ playlist = next(
+ (
+ playlist
+ for playlist in playlists["items"]
+ if playlist["name"].lower() == playlist_name.lower()
+ ),
+ None,
+ )
+ return playlist["uri"] if playlist else None
+
+ # ACTIONS
+
+ def get_devices(self):
+ active_devices = self.get_active_devices()
+ active_device_names = ", ".join([device["name"] for device in active_devices])
+ available_device_names = ", ".join(
+ [device["name"] for device in self.get_available_devices()]
+ )
+ if active_devices:
+ return f"Your available devices are: {available_device_names}. Your active devices are: {active_device_names}."
+ if available_device_names:
+ return f"No active device found but these are the available devices: {available_device_names}"
+
+ return "No devices found. Start Spotify on one of your devices first, then try again."
+
+ def set_active_device(self, device_name: str):
+ if device_name:
+ device = next(
+ (
+ device
+ for device in self.get_available_devices()
+ if device["name"] == device_name
+ ),
+ None,
+ )
+ if device:
+ self.spotify.transfer_playback(device["id"])
+ return "OK"
+ else:
+ return f"Device '{device_name}' not found."
+
+ return "Device name not provided."
+
+ def play(self):
+ self.spotify.start_playback()
+ return "OK"
+
+ def pause(self):
+ self.spotify.pause_playback()
+ return "OK"
+
+ def stop(self):
+ return self.pause()
+
+ def play_previous_track(self):
+ self.spotify.previous_track()
+ return "OK"
+
+ def play_next_track(self):
+ self.spotify.next_track()
+ return "OK"
+
+ def set_volume(self, volume_level: int):
+ if volume_level is not None:
+ self.spotify.volume(volume_level)
+ return "OK"
+
+ return "Volume level not provided."
+
+ def mute(self):
+ self.spotify.volume(0)
+ return "OK"
+
+ def get_current_track(self):
+ current_playback = self.spotify.current_playback()
+ if current_playback:
+ artist = current_playback["item"]["artists"][0]["name"]
+ track = current_playback["item"]["name"]
+ return f"Currently playing '{track}' by '{artist}'."
+
+ return "No track playing."
+
+ def like_song(self):
+ current_playback = self.spotify.current_playback()
+ if current_playback:
+ track_id = current_playback["item"]["id"]
+ self.spotify.current_user_saved_tracks_add([track_id])
+ return "Track saved to 'Your Music' library."
+
+ return "No track playing. Play a song, then tell me to like it."
+
+ def play_song_with_spotify(
+ self, track: str = None, artist: str = None, queue: bool = False
+ ):
+ if not track and not artist:
+ return "What song or artist would you like to play?"
+ results = self.spotify.search(q=f"{track} {artist}", type="track", limit=1)
+ items = results["tracks"]["items"]
+ if items:
+ track = items[0]
+ track_name = track["name"]
+ artist_name = track["artists"][0]["name"]
+ try:
+ if queue:
+ self.spotify.add_to_queue(track["uri"])
+ return f"Added '{track_name}' by '{artist_name}' to the queue."
+ else:
+ self.spotify.start_playback(uris=[track["uri"]])
+ return f"Now playing '{track_name}' by '{artist_name}'."
+ except spotipy.SpotifyException as e:
+ if e.reason == "NO_ACTIVE_DEVICE":
+ return "No active device found. Start Spotify on one of your devices first, then play a song or tell me to activate a device."
+ return f"An error occurred while trying to play the song. Code: {e.code}, Reason: '{e.reason}'"
+
+ return "No track found."
+
+ def get_playlists(self):
+ playlists = self.get_user_playlists()
+ playlist_names = ", ".join([playlist["name"] for playlist in playlists])
+ if playlist_names:
+ return f"Your playlists are: {playlist_names}"
+
+ return "No playlists found."
+
+ def play_playlist(self, playlist: str = None):
+ if not playlist:
+ return "Which playlist would you like to play?"
+
+ playlist_uri = self.get_playlist_uri(playlist)
+ if playlist_uri:
+ self.spotify.start_playback(context_uri=playlist_uri)
+ return f"Playing playlist '{playlist}'."
+
+ return f"Playlist '{playlist}' not found."
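`Spotify.execute_tool` above dispatches dynamically: the `action` parameter from the LLM tool call is popped from the parameters dict, resolved to a method of the same name via `getattr`, and the remaining parameters are passed through as keyword arguments. A minimal sketch of that pattern with a stand-in class (not the real skill):

```python
class PlayerStub:
    """Illustrative stand-in for the Spotify skill's action methods."""

    def play(self) -> str:
        return "OK"

    def set_volume(self, volume_level: int) -> str:
        return f"Volume set to {volume_level}%."

    def execute(self, parameters: dict) -> str:
        params = dict(parameters)  # don't mutate the caller's dict
        action = params.pop("action", None)
        # resolve the action name to a method; raises AttributeError for unknown actions
        handler = getattr(self, action)
        return handler(**params)

stub = PlayerStub()
print(stub.execute({"action": "set_volume", "volume_level": 40}))
```

The upside is that adding a new action only requires a new method plus an entry in the tool schema's `enum`; the trade-off is that a hallucinated action name surfaces as an `AttributeError` rather than a graceful message.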
diff --git a/templates/migration/1_5_0/skills/spotify/requirements.txt b/templates/migration/1_5_0/skills/spotify/requirements.txt
new file mode 100644
index 00000000..dbc2911c
--- /dev/null
+++ b/templates/migration/1_5_0/skills/spotify/requirements.txt
@@ -0,0 +1 @@
+spotipy==2.23.0
\ No newline at end of file
diff --git a/templates/migration/1_5_0/skills/star_head/default_config.yaml b/templates/migration/1_5_0/skills/star_head/default_config.yaml
new file mode 100644
index 00000000..478cdff0
--- /dev/null
+++ b/templates/migration/1_5_0/skills/star_head/default_config.yaml
@@ -0,0 +1,46 @@
+name: StarHead
+module: skills.star_head.main
+category: star_citizen
+description:
+ en: Use the StarHead API to retrieve detailed information about spaceships, weapons and more. StarHead can also calculate optimal trading routes based on live data.
+ de: Nutze die StarHead API, um detaillierte Informationen über Raumschiffe, Waffen und mehr abzurufen. StarHead kann auch optimale Handelsrouten anhand von Live-Daten berechnen.
+# hint:
+# en:
+# de:
+examples:
+ - question:
+ en: I want to trade. What's the best route?
+ de: Ich möchte handeln. Was ist die beste Route?
+ answer:
+ en: To provide you with the best trading route, I need to know your ship model, your current location, and your available budget. Could you please provide these details?
+ de: Um dir die beste Handelsroute anbieten zu können, muss ich dein Schiffsmodell, deinen aktuellen Standort und dein verfügbares Budget kennen. Kannst du mir diese Angaben bitte mitteilen?
+ - question:
+ en: I'm flying a Caterpillar and am near Yela. I have 100.000 credits to spend.
+ de: Ich fliege eine Caterpillar und bin in der Nähe von Yela. Ich habe 100.000 Credits auszugeben.
+ answer:
+ en: You can buy Stims at Deakins Research Outpost near Yela for 2.8 credits/unit and sell them at CRU-L1 Ambitious Dream Station for 3.85 credits/unit. The total profit for this route is approximately 37499 credits, and the travel time estimation is 41 minutes.
+ de: Du kannst Stims bei Deakins Research Outpost in der Nähe von Yela für 2,8 Credits/Stück kaufen und sie bei CRU-L1 Ambitious Dream Station für 3,85 Credits/Stück verkaufen. Der Gesamtgewinn für diese Route beträgt ca. 37499 Credits, und die geschätzte Reisezeit beträgt 41 Minuten.
+ - question:
+ en: What can you tell me about the Caterpillar?
+ de: Was kannst du mir über die Caterpillar erzählen?
+ answer:
+ en: The Constellation Taurus is a dedicated freighter, designed for hauling cargo. It has a cargo capacity of 174 SCU and is fully configurable but without all the bells and whistles found on other Constellation variants. On the other hand, the Constellation Andromeda is a multi-person freighter and the most popular ship in RSI's current production array. It has a cargo capacity of 96 SCU and is beloved by smugglers and merchants alike for its modular and high-powered capabilities. Both are part of the Constellation series, but the Taurus specifically caters to dedicated freight operations whereas the Andromeda serves as a multi-person versatile ship.
+ de: Die Constellation Taurus ist ein reiner Frachter, der für den Transport von Fracht entwickelt wurde. Er hat eine Ladekapazität von 174 SCU und ist voll konfigurierbar, hat aber nicht den ganzen Schnickschnack der anderen Constellation-Varianten. Die Constellation Andromeda hingegen ist ein Mehrpersonen-Frachter und das beliebteste Schiff in der aktuellen Produktion von RSI. Sie hat eine Ladekapazität von 96 SCU und ist bei Schmugglern und Händlern wegen ihrer modularen und leistungsstarken Fähigkeiten gleichermaßen beliebt. Beide gehören zur Constellation-Serie, aber die Taurus ist speziell für den reinen Frachtverkehr gedacht, während die Andromeda ein vielseitiges Schiff für mehrere Personen ist.
+prompt: |
+ You also have access to the StarHead API which you can use to access live trading data and to retrieve additional information about spaceships in Star Citizen.
+ Your job is to find good trading routes for the user based on his/her ship, current location and available budget.
+ The user can also ask you about details of specific ships, components, weapons, and more.
+ You always use the tools available to you to retrieve the required information and to provide the user with the information.
+custom_properties:
+ - hint: The URL of the StarHead API.
+ id: starhead_api_url
+ name: StarHead API URL
+ required: true
+ value: https://api.star-head.de
+ property_type: string
+ - hint: The URL of the Star Citizen Wiki API.
+ id: star_citizen_wiki_api_url
+ name: Star Citizen Wiki API URL
+ required: true
+ value: https://api.star-citizen.wiki/api/v2
+ property_type: string
diff --git a/templates/migration/1_5_0/skills/star_head/logo.png b/templates/migration/1_5_0/skills/star_head/logo.png
new file mode 100644
index 00000000..80c534cb
Binary files /dev/null and b/templates/migration/1_5_0/skills/star_head/logo.png differ
diff --git a/templates/migration/1_5_0/skills/star_head/main.py b/templates/migration/1_5_0/skills/star_head/main.py
new file mode 100644
index 00000000..a5432b61
--- /dev/null
+++ b/templates/migration/1_5_0/skills/star_head/main.py
@@ -0,0 +1,275 @@
+import json
+from typing import Optional
+from typing import TYPE_CHECKING
+import requests
+from api.enums import LogType, WingmanInitializationErrorType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class StarHead(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ # config entries are not validated yet - assign the real values later, once checked
+ self.starhead_url = ""
+ """The base URL of the StarHead API"""
+
+ self.headers = {"x-origin": "wingman-ai"}
+ """Requireds header for the StarHead API"""
+
+ self.timeout = 5
+ """Global timeout for calls to the the StarHead API (in seconds)"""
+
+ self.star_citizen_wiki_url = ""
+
+ self.vehicles = []
+ self.ship_names = []
+ self.celestial_objects = []
+ self.celestial_object_names = []
+ self.quantum_drives = []
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.starhead_url = self.retrieve_custom_property_value(
+ "starhead_api_url", errors
+ )
+ self.star_citizen_wiki_url = self.retrieve_custom_property_value(
+ "star_citizen_wiki_api_url", errors
+ )
+
+ try:
+ await self._prepare_data()
+ except Exception as e:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.name,
+ message=f"Failed to load data from StarHead API: {e}",
+ error_type=WingmanInitializationErrorType.UNKNOWN,
+ )
+ )
+
+ return errors
+
+ async def _prepare_data(self):
+ self.vehicles = await self._fetch_data("vehicle")
+ self.ship_names = [
+ self._format_ship_name(vehicle)
+ for vehicle in self.vehicles
+ if vehicle["type"] == "Ship"
+ ]
+
+ self.celestial_objects = await self._fetch_data("celestialobject")
+ self.celestial_object_names = [
+ celestial_object["name"] for celestial_object in self.celestial_objects
+ ]
+
+ self.quantum_drives = await self._fetch_data(
+ "vehiclecomponent", {"typeFilter": 8}
+ )
+
+ async def _fetch_data(
+ self, endpoint: str, params: Optional[dict[str, any]] = None
+ ) -> list[dict[str, any]]:
+ url = f"{self.starhead_url}/{endpoint}"
+
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Retrieving {url}",
+ color=LogType.INFO,
+ )
+
+ response = requests.get(
+ url, params=params, timeout=self.timeout, headers=self.headers
+ )
+ response.raise_for_status()
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return response.json()
+
+ def _format_ship_name(self, vehicle: dict[str, any]) -> str:
+ """Formats name by combining model and name, avoiding repetition"""
+ return vehicle["name"]
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ instant_response = ""
+ function_response = ""
+
+ if tool_name == "get_best_trading_route":
+ function_response = await self._get_best_trading_route(**parameters)
+ elif tool_name == "get_ship_information":
+ function_response = await self._get_ship_information(**parameters)
+ return function_response, instant_response
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "get_best_trading_route",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_best_trading_route",
+ "description": "Finds the best trade route for a given spaceship and position.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "ship": {"type": "string", "enum": self.ship_names},
+ "position": {
+ "type": "string",
+ "enum": self.celestial_object_names,
+ },
+ "moneyToSpend": {"type": "number"},
+ },
+ "required": ["ship", "position", "moneyToSpend"],
+ },
+ },
+ },
+ ),
+ (
+ "get_ship_information",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_ship_information",
+ "description": "Gives information about the given ship.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "ship": {"type": "string", "enum": self.ship_names},
+ },
+ "required": ["ship"],
+ },
+ },
+ },
+ ),
+ ]
+
+ return tools
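
The tool schemas above constrain the `ship` and `position` arguments with live enums fetched from the StarHead API, so the LLM can only pick known values. A minimal, self-contained sketch (with fabricated ship names and a trimmed-down schema, not the real API data) of how such enum constraints can be validated client-side:

```python
# Minimal sketch of the enum-constrained schema built by get_tools above;
# ship_names is fabricated here, but would come from the StarHead API.
ship_names = ["Aurora MR", "Caterpillar"]

schema = {
    "name": "get_best_trading_route",
    "parameters": {
        "type": "object",
        "properties": {
            "ship": {"type": "string", "enum": ship_names},
            "moneyToSpend": {"type": "number"},
        },
        "required": ["ship", "moneyToSpend"],
    },
}

def validate_args(schema: dict, args: dict) -> list[str]:
    """Return a list of validation errors for a proposed tool call."""
    errors = []
    props = schema["parameters"]["properties"]
    for name in schema["parameters"]["required"]:
        if name not in args:
            errors.append(f"missing required parameter: {name}")
    for name, value in args.items():
        enum = props.get(name, {}).get("enum")
        if enum is not None and value not in enum:
            errors.append(f"{name} must be one of {enum}")
    return errors

assert validate_args(schema, {"ship": "Caterpillar", "moneyToSpend": 50000}) == []
assert validate_args(schema, {"ship": "Unknown"}) != []
```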
+
+ async def _get_ship_information(self, ship: str) -> str:
+ try:
+ response = requests.get(
+ url=f"{self.star_citizen_wiki_url}/vehicles/{ship}",
+ timeout=self.timeout,
+ headers=self.headers,
+ )
+ response.raise_for_status()
+ except requests.exceptions.RequestException as e:
+ return f"Failed to fetch ship information: {e}"
+ ship_details = json.dumps(response.json())
+ return ship_details
+
+ async def _get_best_trading_route(
+ self, ship: str, position: str, moneyToSpend: float
+ ) -> str:
+ """Calculates the best trading route for the specified ship and position.
+ Note that the function arguments have to match the function_args from OpenAI, hence the camelCase!
+ """
+
+ cargo, qd = await self._get_ship_details(ship)
+ if not cargo or not qd:
+ return f"Could not find ship '{ship}' in the StarHead database."
+
+ celestial_object_id = self._get_celestial_object_id(position)
+ if not celestial_object_id:
+ return f"Could not find celestial object '{position}' in the StarHead database."
+
+ data = {
+ "startCelestialObjectId": celestial_object_id,
+ "quantumDriveId": qd["id"] if qd else None,
+ "maxAvailablScu": cargo,
+ "maxAvailableMoney": moneyToSpend,
+ "useOnlyWeaponFreeZones": False,
+ "onlySingleSections": True,
+ }
+ url = f"{self.starhead_url}/trading"
+ try:
+ response = requests.post(
+ url=url,
+ json=data,
+ timeout=self.timeout,
+ headers=self.headers,
+ )
+ response.raise_for_status()
+ except requests.exceptions.RequestException as e:
+ return f"Failed to fetch trading route: {e}"
+
+ parsed_response = response.json()
+ if parsed_response:
+ section = parsed_response[0]
+ return json.dumps(section)
+ return f"No route found for ship '{ship}' at '{position}' with '{moneyToSpend}' aUEC."
+
+ def _get_celestial_object_id(self, name: str) -> Optional[int]:
+ """Finds the ID of the celestial object with the specified name."""
+ return next(
+ (
+ obj["id"]
+ for obj in self.celestial_objects
+ if obj["name"].lower() == name.lower()
+ ),
+ None,
+ )
+
+ async def _get_ship_details(
+ self, ship_name: str
+ ) -> tuple[Optional[int], Optional[dict[str, any]]]:
+ """Gets ship details including cargo capacity and quantum drive information."""
+ vehicle = next(
+ (
+ v
+ for v in self.vehicles
+ if self._format_ship_name(v).lower() == ship_name.lower()
+ ),
+ None,
+ )
+ if vehicle:
+ cargo = vehicle.get("scuCargo")
+ qd = None
+ loadouts = await self._get_ship_loadout(vehicle.get("id"))
+ if loadouts:
+ loadout = next(
+ (l for l in loadouts.get("loadouts") if l["isDefaultLayout"]), None
+ )
+ if loadout:
+ qd = next(
+ (
+ qd
+ for qd in self.quantum_drives
+ for item in loadout.get("data")
+ if item.get("componentId") == qd.get("id")
+ ),
+ None,
+ )
+ return cargo, qd
+ return None, None
+
+ async def _get_ship_loadout(
+ self, ship_id: Optional[int]
+ ) -> Optional[dict[str, any]]:
+ """Retrieves loadout data for a given ship ID."""
+ if ship_id:
+ try:
+ loadout = await self._fetch_data(f"vehicle/{ship_id}/loadout")
+ return loadout or None
+ except requests.HTTPError:
+ await self.printr.print_async(
+ f"Failed to fetch loadout data for ship with ID: {ship_id}",
+ color=LogType.ERROR,
+ )
+ return None
diff --git a/templates/migration/1_5_0/skills/timer/default_config.yaml b/templates/migration/1_5_0/skills/timer/default_config.yaml
new file mode 100644
index 00000000..14137a6b
--- /dev/null
+++ b/templates/migration/1_5_0/skills/timer/default_config.yaml
@@ -0,0 +1,24 @@
+name: Timer
+module: skills.timer.main
+category: general
+description:
+ en: Gives your Wingman the ability to delay actions - be it a reminder, a command or a function call.
+ de: Gibt deinem Wingman die Möglichkeit, Aktionen zu verzögern - sei es eine Erinnerung, ein Befehl oder ein Funktionsaufruf.
+examples:
+ - question:
+ en: Remind me in 60 minutes to take a break.
+ de: Erinnere mich in 60 Minuten daran, eine Pause zu machen.
+ answer:
+ en: (Tells you to take a break in 60 minutes)
+ de: (Erinnert dich in 60 Minuten daran, eine Pause zu machen)
+prompt: |
+ You can also delay actions with the `set_timer` function. The function has 3 parameters:
+
+ - The first argument is the time in seconds,
+ - the second argument is the function to call
+ - the third argument is a list of arguments for the function.
+
+ Only use the `set_timer` function if the user requests to delay a function.
+ Never say a timer id; always use a descriptive name.
+ When confirming, keep it short and simple.
+ When reading out current timers, use the most suitable time unit.
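
The parameter layout described in the prompt above can be illustrated with a hypothetical `set_timer` call; the keys follow the skill's tool schema, and the concrete values are made up for the example:

```python
# Hypothetical arguments for the set_timer tool described above:
# remind the user to take a break in 60 minutes.
set_timer_args = {
    "delay": 3600,  # first argument: the time in seconds
    "function": "remind_me",  # second argument: the function to call
    "parameters": {  # third argument: arguments for that function
        "message": "This is your reminder to take a break."
    },
}

assert set_timer_args["delay"] == 60 * 60
```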
diff --git a/templates/migration/1_5_0/skills/timer/logo.png b/templates/migration/1_5_0/skills/timer/logo.png
new file mode 100644
index 00000000..ba63e750
Binary files /dev/null and b/templates/migration/1_5_0/skills/timer/logo.png differ
diff --git a/templates/migration/1_5_0/skills/timer/main.py b/templates/migration/1_5_0/skills/timer/main.py
new file mode 100644
index 00000000..3cfbdeb7
--- /dev/null
+++ b/templates/migration/1_5_0/skills/timer/main.py
@@ -0,0 +1,569 @@
+import random
+import string
+import asyncio
+import time
+import json
+from typing import TYPE_CHECKING
+from api.interface import SettingsConfig, SkillConfig
+from api.enums import (
+ LogSource,
+ LogType,
+)
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class Timer(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.timers = {}
+ self.available_tools = []
+ self.active = False
+
+ async def prepare(self) -> None:
+ self.active = True
+ self.threaded_execution(self.start_timer_worker)
+
+ async def unload(self) -> None:
+ self.active = False
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "set_timer",
+ {
+ "type": "function",
+ "function": {
+ "name": "set_timer",
+ "description": "set_timer function to delay other available functions.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "delay": {
+ "type": "number",
+ "description": "The delay/timer in seconds.",
+ },
+ "is_loop": {
+ "type": "boolean",
+ "description": "If the timer should loop or not.",
+ },
+ "loops": {
+ "type": "number",
+ "description": "The amount of loops the timer should do. -1 for infinite loops.",
+ },
+ "function": {
+ "type": "string",
+ # "enum": self._get_available_tools(), # ends up being a recursive nightmare
+ "description": "The name of the function to execute after the delay. Must be a function name from the available tools.",
+ },
+ "parameters": {
+ "type": "object",
+ "description": "The parameters for the function to execute after the delay. Must be a valid object with the required properties to their values. Can not be empty.",
+ },
+ },
+ "required": ["delay", "function", "parameters"],
+ "optional": ["is_loop", "loops"],
+ },
+ },
+ },
+ ),
+ (
+ "get_timer_status",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_timer_status",
+ "description": "Get a list of all running timers and their remaining time and id.",
+ },
+ },
+ ),
+ (
+ "cancel_timer",
+ {
+ "type": "function",
+ "function": {
+ "name": "cancel_timer",
+ "description": "Cancel a running timer by its id. Use this in combination with set_timer to change timers.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "id": {
+ "type": "string",
+ "description": "The id of the timer to cancel.",
+ },
+ },
+ "required": ["id"],
+ },
+ },
+ },
+ ),
+ (
+ "change_timer_settings",
+ {
+ "type": "function",
+ "function": {
+ "name": "change_timer_settings",
+ "description": "Change a timers loop and delay settings. Requires the id of the timer to change.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "id": {
+ "type": "string",
+ "description": "The id of the timer to change.",
+ },
+ "delay": {
+ "type": "number",
+ "description": "The new delay/timer in seconds.",
+ },
+ "is_loop": {
+ "type": "boolean",
+ "description": "If the timer should loop or not.",
+ },
+ "loops": {
+ "type": "number",
+ "description": "The amount of remaining loops the timer should do. -1 for infinite loops.",
+ },
+ },
+ "required": ["id", "delay", "is_loop", "loops"],
+ },
+ },
+ },
+ ),
+ (
+ "remind_me",
+ {
+ "type": "function",
+ "function": {
+ "name": "remind_me",
+ "description": "Must only be called with the set_timer function. Will remind the user with the given message.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "message": {
+ "type": "string",
+ "description": 'The message of the reminder to say to the user. For example User: "Remind me to take a break." -> Message: "This is your reminder to take a break."',
+ },
+ },
+ "required": ["message"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return tool_name in ["set_timer"]
+
+ def _get_available_tools(self) -> list[dict[str, any]]:
+ tools = self.wingman.build_tools()
+ tool_names = []
+ for tool in tools:
+ name = tool.get("function", {}).get("name", None)
+ if name:
+ tool_names.append(name)
+
+ return tool_names
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name in [
+ "set_timer",
+ "get_timer_status",
+ "cancel_timer",
+ "change_timer_settings",
+ "remind_me",
+ ]:
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+
+ if tool_name == "set_timer":
+ function_response = await self.set_timer(
+ delay=parameters.get("delay", None),
+ is_loop=parameters.get("is_loop", False),
+ loops=parameters.get("loops", 1),
+ function=parameters.get("function", None),
+ parameters=parameters.get("parameters", {}),
+ )
+ elif tool_name == "get_timer_status":
+ function_response = await self.get_timer_status()
+ elif tool_name == "cancel_timer":
+ function_response = await self.cancel_timer(
+ timer_id=parameters.get("id", None)
+ )
+ elif tool_name == "change_timer_settings":
+ function_response = await self.change_timer_settings(
+ timer_id=parameters.get("id", None),
+ delay=parameters.get("delay", None),
+ is_loop=parameters.get("is_loop", False),
+ loops=parameters.get("loops", 1),
+ )
+ elif tool_name == "remind_me":
+ function_response = await self.reminder(
+ message=parameters.get("message", None)
+ )
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
+
+ async def _get_tool_parameter_type_by_name(self, type_name: str) -> any:
+ if type_name == "object":
+ return dict
+ elif type_name == "string":
+ return str
+ elif type_name == "number":
+ return (int, float)  # JSON numbers may arrive as int or float
+ elif type_name == "boolean":
+ return bool
+ elif type_name == "array":
+ return list
+ else:
+ return None
+
+ async def start_timer_worker(self) -> None:
+ while self.active:
+ await asyncio.sleep(2)
+ timers_to_delete = []
+ for timer_id, timer in self.timers.items():
+ delay = timer[0]
+ # function = timer[1]
+ # parameters = timer[2]
+ start_time = timer[3]
+ is_loop = timer[4]
+ loops = timer[5]
+
+ if is_loop and loops == 0:
+ timers_to_delete.append(timer_id)
+ continue # skip timers marked for deletion
+
+ if time.time() - start_time >= delay:
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.execute_timer(timer_id)
+ if self.settings.debug_mode:
+ await self.print_execution_time(True)
+
+ # delete timers marked for deletion
+ for timer_id in timers_to_delete:
+ del self.timers[timer_id]
+
+ self.timers = {} # clear timers
+
+ async def execute_timer(self, timer_id: str) -> None:
+ if timer_id not in self.timers:
+ return
+
+ # delay = self.timers[timer_id][0]
+ function = self.timers[timer_id][1]
+ parameters = self.timers[timer_id][2]
+ # start_time = self.timers[timer_id][3]
+ is_loop = self.timers[timer_id][4]
+ loops = self.timers[timer_id][5]
+
+ response = await self.wingman.execute_command_by_function_call(
+ function, parameters
+ )
+ if response:
+ summary = await self._summarize_timer_execution(
+ function, parameters, response
+ )
+ await self.wingman.add_assistant_message(summary)
+ await self.printr.print_async(
+ f"{summary}",
+ color=LogType.POSITIVE,
+ source=LogSource.WINGMAN,
+ source_name=self.wingman.name,
+ skill_name=self.name,
+ )
+ await self.wingman.play_to_user(summary, True)
+
+ if not is_loop or loops == 1:
+ # we can't delete it here, because we are iterating over the timers in a separate thread
+ self.timers[timer_id][4] = True
+ self.timers[timer_id][5] = 0
+ return
+
+ self.timers[timer_id][3] = time.time() # reset start time
+ if loops > 0:
+ self.timers[timer_id][5] -= 1 # decrease remaining loops
+
+ async def set_timer(
+ self,
+ delay: int = None,
+ is_loop: bool = False,
+ loops: int = -1,
+ function: str = None,
+ parameters: dict[str, any] = None,
+ ) -> str:
+ check_counter = 0
+ max_checks = 2
+ errors = []
+
+ while (check_counter == 0 or errors) and check_counter < max_checks:
+ errors = []
+
+ if delay is None or function is None:
+ errors.append("Missing delay or function.")
+ elif delay < 0:
+ errors.append("No timer set, delay must be greater than 0.")
+
+ if "." in function:
+ function = function.split(".")[1]
+
+ # check if tool call exists
+ tool_call = None
+ tool_call = next(
+ (
+ tool
+ for tool in self.wingman.build_tools()
+ if tool.get("function", {}).get("name", False) == function
+ ),
+ None,
+ )
+
+ # if not valid it might be a command
+ if not tool_call and self.wingman.get_command(function):
+ parameters = {"command_name": function}
+ function = "execute_command"
+
+ if not tool_call:
+ errors.append(f"Function {function} does not exist.")
+ else:
+ if tool_call.get("function", False) and tool_call.get(
+ "function", {}
+ ).get("parameters", False):
+ properties = (
+ tool_call.get("function", {})
+ .get("parameters", {})
+ .get("properties", {})
+ )
+ required_parameters = (
+ tool_call.get("function", {})
+ .get("parameters", {})
+ .get("required", [])
+ )
+
+ for name, value in properties.items():
+ if name in parameters:
+ real_type = await self._get_tool_parameter_type_by_name(
+ value.get("type", "string")
+ )
+ if not isinstance(parameters[name], real_type):
+ errors.append(
+ f"Parameter {name} must be of type {value.get('type', None)}, but is {type(parameters[name])}."
+ )
+ elif value.get("enum", False) and parameters[
+ name
+ ] not in value.get("enum", []):
+ errors.append(
+ f"Parameter {name} must be one of {value.get('enum', [])}, but is {parameters[name]}."
+ )
+ if name in required_parameters:
+ required_parameters.remove(name)
+
+ if required_parameters:
+ errors.append(
+ f"Missing required parameters: {required_parameters}."
+ )
+
+ check_counter += 1
+ if errors:
+ # try to let it fix itself
+ message_history = []
+ for message in self.wingman.messages:
+ role = (
+ message.role
+ if hasattr(message, "role")
+ else message.get("role", False)
+ )
+ if role in ["user", "assistant", "system"]:
+ message_history.append(
+ {
+ "role": role,
+ "content": (
+ message.content
+ if hasattr(message, "content")
+ else message.get("content", False)
+ ),
+ }
+ )
+ data = {
+ "original_set_timer_call": {
+ "delay": delay,
+ "is_loop": is_loop,
+ "loops": loops,
+ "function": function,
+ "parameters": parameters,
+ },
+ "message_history": (
+ message_history
+ if len(message_history) <= 10
+ else message_history[:1] + message_history[-9:]
+ ),
+ "tool_calls_definition": self.wingman.build_tools(),
+ "errors": errors,
+ }
+
+ messages = [
+ {
+ "role": "system",
+ "content": """
+ The **set_timer** tool got called with parameters that are incomplete or do not match the given requirements.
+ Please adjust the parameters "function" and "parameters" to match the requirements of the designated tool.
+ Make use of the previous message_history with the user to figure out missing parameters or wrong types,
+ and use the tool_calls_definition to see the available tools and their requirements.
+ Use the **errors** information to figure out what is missing or wrong.
+
+ Provide me an answer **only containing a valid JSON object** with the following structure for example:
+ {
+ "delay": 10,
+ "is_loop": false,
+ "loops": 1,
+ "function": "function_name",
+ "parameters": {
+ "parameter_name": "parameter_value"
+ }
+ }
+ """,
+ },
+ {"role": "user", "content": json.dumps(data, indent=4)},
+ ]
+ json_retry = 0
+ max_json_retries = 1
+ valid_json = False
+ while not valid_json and json_retry < max_json_retries:
+ completion = await self.llm_call(messages)
+ data = completion.choices[0].message.content
+ messages.append(
+ {
+ "role": "assistant",
+ "content": data,
+ }
+ )
+ # check if data is valid json
+ try:
+ if data.startswith("```json") and data.endswith("```"):
+ data = data[len("```json") : -len("```")].strip()
+ data = json.loads(data)
+ except json.JSONDecodeError:
+ messages.append(
+ {
+ "role": "user",
+ "content": "Data is not valid JSON. Please provide valid JSON data.",
+ }
+ )
+ json_retry += 1
+ continue
+
+ valid_json = True
+ delay = data.get("delay", False)
+ is_loop = data.get("is_loop", False)
+ loops = data.get("loops", 1)
+ function = data.get("function", False)
+ parameters = data.get("parameters", {})
+
+ if errors:
+ return f"""
+ No timer set. Communicate these errors to the user.
+ But make sure to align them with the message history so far: {errors}
+ """
+
+ # generate a unique id for the timer
+ letters_and_digits = string.ascii_letters + string.digits
+ timer_id = "".join(random.choice(letters_and_digits) for _ in range(10))
+
+ # set timer
+ current_time = time.time()
+ self.timers[timer_id] = [
+ delay,
+ function,
+ parameters,
+ current_time,
+ is_loop,
+ loops,
+ ]
+
+ return f"Timer set with id {timer_id}.\n\n{await self.get_timer_status()}"
+
+ async def _summarize_timer_execution(
+ self, function: str, parameters: dict[str, any], response: str
+ ) -> str:
+ self.wingman.messages.append(
+ {
+ "role": "user",
+ "content": f"""
+ Timed "{function}" with "{parameters}" was executed.
+ Summarize the response while staying in character!
+ Don't mention it was a function call; go by the meaning:
+ {response}
+ """,
+ },
+ )
+ await self.wingman.add_context(self.wingman.messages)
+ completion = await self.llm_call(self.wingman.messages)
+ answer = (
+ completion.choices[0].message.content
+ if completion and completion.choices
+ else ""
+ )
+ return answer
+
+ async def get_timer_status(self) -> list[dict[str, any]]:
+ timers = []
+ for timer_id, timer in self.timers.items():
+ if timer[4] and timer[5] == 0:
+ continue # skip timers marked for deletion
+ timers.append(
+ {
+ "id": timer_id,
+ "delay": timer[0],
+ "is_loop": timer[4],
+ "remaining_loops": (
+ (timer[5] if timer[5] > 0 else "infinite")
+ if timer[4]
+ else "N/A"
+ ),
+ "remaining_time_in_seconds": round(
+ max(0, timer[0] - (time.time() - timer[3]))
+ ),
+ }
+ )
+ return timers
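
The `remaining_time_in_seconds` arithmetic in `get_timer_status` above can be checked in isolation; the timer list layout `[delay, function, parameters, start_time, is_loop, loops]` is the skill's own, while the sample timestamps below are fixed stand-ins for `time.time()` so the result is deterministic:

```python
# Sketch of the remaining-time calculation from get_timer_status, using
# fixed timestamps instead of time.time() so the result is deterministic.
delay = 10
start_time = 100.0
now = 104.0  # pretend "current" time, 4 seconds after the timer started

remaining = round(max(0, delay - (now - start_time)))
# remaining is 6: the 10-second timer has 6 seconds left
```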
+
+ async def cancel_timer(self, timer_id: str) -> str:
+ if timer_id not in self.timers:
+ return f"Timer with id {timer_id} not found."
+ # we can't delete it here, because we are iterating over the timers in a separate thread
+ self.timers[timer_id][4] = True
+ self.timers[timer_id][5] = 0
+ return f"Timer with id {timer_id} cancelled.\n\n{await self.get_timer_status()}"
+
+ async def change_timer_settings(
+ self, timer_id: str, delay: int, is_loop: bool, loops: int
+ ) -> str:
+ if timer_id not in self.timers:
+ return f"Timer with id {timer_id} not found."
+ self.timers[timer_id][0] = delay
+ self.timers[timer_id][4] = is_loop
+ self.timers[timer_id][5] = loops
+ return f"Timer with id {timer_id} settings changed.\n\n{await self.get_timer_status()}"
+
+ async def reminder(self, message: str = None) -> str:
+ if not message:
+ return "This is your reminder, no message was given."
+ return message
diff --git a/templates/migration/1_5_0/skills/typing_assistant/default_config.yaml b/templates/migration/1_5_0/skills/typing_assistant/default_config.yaml
new file mode 100644
index 00000000..32c977c6
--- /dev/null
+++ b/templates/migration/1_5_0/skills/typing_assistant/default_config.yaml
@@ -0,0 +1,27 @@
+name: TypingAssistant
+module: skills.typing_assistant.main
+category: general
+description:
+ en: Let your Wingman type text for you!
+ de: Lass deinen Wingman Texte für dich tippen!
+hint:
+ en: Types what you say, either by transcribing your speech or the response to your input. You can use this in Office or messaging programs etc. Not meant for pressing keys or macros - use Wingman commands for that.
+ de: Tippt, was du sagst, indem deine Spracheingabe oder die Antwort darauf transkribiert werden. Verwende den Skill in Office- oder Messaging-Programmen usw. Er ist nicht für das Drücken einzelner Tasten oder Makros gedacht - dafür kannst du Wingman-Befehle verwenden.
+examples:
+ - question:
+ en: Type "How are you today?".
+ de: Tippe "Wie geht's dir heute?".
+ answer:
+ en: (types "How are you today?" in active window at the current cursor location)
+ de: (Tippt "Wie geht's dir heute?" im aktiven Fenster an der aktuellen Cursor-Position ein)
+ - question:
+ en: Type a poem about trees.
+ de: Schreibe ein Gedicht über Bäume.
+ answer:
+ en: (imagines a poem about trees and then types it in the active window)
+ de: (Erfindet ein Gedicht über Bäume und tippt es dann in das aktive Fenster)
+prompt: |
+ You can also type what the user says if they ask you to. The user might dictate what you type, word for word.
+ The user might also ask you to imagine something, such as a poem, an email, or a speech, and then you type that content.
+ Use the context of the user's request to determine what content the user wants you to type.
+ Always use the tool assist_with_typing to type but only type if the user specifically asks you to.
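
As with the other skills, the model invokes the tool with JSON arguments; a hypothetical `assist_with_typing` call matching the schema in `main.py` might look like this (the values are examples, not real model output):

```python
import json

# Hypothetical arguments for the assist_with_typing tool: transcribe the
# user's dictation and press Enter afterwards, as in a chat reply.
raw_arguments = '{"content_to_type": "How are you today?", "end_by_pressing_enter": true}'
parameters = json.loads(raw_arguments)

# The skill reads keys with dict.get, so a missing optional key is just None
assert parameters.get("content_to_type") == "How are you today?"
assert parameters.get("end_by_pressing_enter") is True
```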
diff --git a/templates/migration/1_5_0/skills/typing_assistant/logo.png b/templates/migration/1_5_0/skills/typing_assistant/logo.png
new file mode 100644
index 00000000..d896bcf9
Binary files /dev/null and b/templates/migration/1_5_0/skills/typing_assistant/logo.png differ
diff --git a/templates/migration/1_5_0/skills/typing_assistant/main.py b/templates/migration/1_5_0/skills/typing_assistant/main.py
new file mode 100644
index 00000000..db79fe85
--- /dev/null
+++ b/templates/migration/1_5_0/skills/typing_assistant/main.py
@@ -0,0 +1,83 @@
+import time
+from typing import TYPE_CHECKING
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+)
+from api.enums import LogType
+from skills.skill_base import Skill
+import keyboard.keyboard as keyboard
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class TypingAssistant(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "assist_with_typing",
+ {
+ "type": "function",
+ "function": {
+ "name": "assist_with_typing",
+ "description": "Identifies what the user wants the AI to type into an active application window. This may be either transcribing exactly what the user says or typing something the user wants the AI to imagine and then type. Also identifies whether to end the typed content with a press of the Enter / Return key, common typically for typing a response to a chat message or form field.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "content_to_type": {
+ "type": "string",
+ "description": "The content the user wants the assistant to type.",
+ },
+ "end_by_pressing_enter": {
+ "type": "boolean",
+ "description": "Boolean True/False indicator of whether the typed content should end by pressing the enter key on the keyboard. Default False. Typically True when typing a response in a chat program.",
+ },
+ },
+ "required": ["content_to_type"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = "Error in typing. Can you please try your command again?"
+ instant_response = ""
+
+ if tool_name == "assist_with_typing":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing assist_with_typing function with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+
+ content_to_type = parameters.get("content_to_type")
+ press_enter = parameters.get("end_by_pressing_enter")
+
+ keyboard.write(content_to_type, delay=0.01, hold=0.01)
+
+ if press_enter is True:
+ keyboard.press("enter")
+ time.sleep(0.2)
+ keyboard.release("enter")
+
+ function_response = "Typed user request at active mouse cursor position."
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
diff --git a/templates/migration/1_5_0/skills/uexcorp/default_config.yaml b/templates/migration/1_5_0/skills/uexcorp/default_config.yaml
new file mode 100644
index 00000000..c82d742e
--- /dev/null
+++ b/templates/migration/1_5_0/skills/uexcorp/default_config.yaml
@@ -0,0 +1,203 @@
+name: UEXCorp
+module: skills.uexcorp.main
+category: star_citizen
+description:
+ en: Use the UEXCorp API to get live trading and lore information about ships, locations, commodities and more in Star Citizen.
+ de: Nutze die UEXCorp API, um live Handels- und Lore-Informationen über Schiffe, Orte, Rohstoffe und mehr in Star Citizen zu erhalten.
+examples:
+ - question:
+ en: Please provide the two best trading routes for my Caterpillar; I'm currently at Hurston.
+ de: Bitte nenne mir die zwei besten Handelsrouten für meine Caterpillar, die derzeit in Hurston steht.
+ answer:
+ en: You have two highly profitable trading routes available. The first route involves transporting Recycled Material Composite from Pickers Field on Hurston to Orison - Trade & Development Division on the planet Crusader, offering a profit of 2,148,480 aUEC for 576 SCU of cargo. The second route entails shipping Laranite from HDMS-Lathan on the satellite Arial back to Central Business District in Lorville, resulting in a profit of 297,216 aUEC for the same cargo capacity.
+ de: Du hast zwei hochprofitable Handelsrouten zur Verfügung. Auf der ersten Route transportierst du Recyclingmaterial von Pickers Field auf Hurston nach Orison - Trade & Development Division auf dem Planeten Crusader und erzielst einen Gewinn von 2.148.480 AUEC für 576 SCU Fracht. Auf der zweiten Route wird Laranit von HDMS-Lathan auf dem Satelliten Arial zurück zum Central Business District in Lorville transportiert, was einen Gewinn von 297.216 AUEC für die gleiche Frachtkapazität bedeutet.
+ - question:
+ en: I got 3000 SCU of Hydrogen loaded in my Hull-C, where can I sell it?
+ de: Ich habe 3000 SCU Hydrogen in meinem Hull-C geladen. Wo kann ich ihn verkaufen?
+ answer:
+ en: You can sell the 3000 SCU of Hydrogen at Baijini Point, located on ArcCorp in the Stanton system.
+ de: Du kannst die 3000 SCU Hydrogen am Baijini Point auf ArcCorp im Stanton-System verkaufen.
+ - question:
+ en: What can you tell me about the Hull-C?
+ de: Was kannst du mir über den Hull-C erzählen?
+ answer:
+ en: The Hull-C is manufactured by Musashi Industrial & Starflight Concern and falls under the 'HULL' series. It serves as a freighter and has a cargo capacity of 4608 SCU. The ship can be purchased at New Deal in Lorville, Hurston for 15,750,000 aUEC. It can accommodate a crew of 1-4 and is designed for trading on suitable space stations with a cargo deck.
+ de: Die Hull-C wird von Musashi Industrial & Starflight Concern hergestellt und gehört zur "HULL"-Serie. Sie dient als Frachter und hat eine Frachtkapazität von 4608 SCU. Das Schiff kann bei New Deal in Lorville, Hurston für 15.750.000 AUEC erworben werden. Es kann eine Besatzung von 1-4 Personen aufnehmen und ist für den Handel mit geeigneten Raumstationen mit einem Frachtdeck ausgelegt.
+ - question:
+ en: What is the difference between the Mole and Prospector?
+ de: Was ist der Unterschied zwischen dem Maulwurf und dem Prospector?
+ answer:
+ en: The MOLE is a mining ship manufactured by ARGO Astronautics, with a capacity for 2-4 crew members and a cargo hold of 96 SCU. It is flyable since version 3.8.0 and is available for purchase at Lorville on Hurston for 5,130,500 aUEC. On the other hand, the Prospector, manufactured by Musashi Industrial & Starflight Concern, is a smaller mining vessel designed for a single crew member, with a cargo capacity of 32 SCU. It is flyable since version 3.0.0 and can be purchased at Lorville for 2,061,000 aUEC. The Prospector is also available for rent at various locations in the Stanton system, unlike the MOLE.
+ de: Die MOLE ist ein von ARGO Astronautics hergestelltes Bergbauschiff mit einer Kapazität für 2-4 Besatzungsmitglieder und einem Laderaum von 96 SCU. Sie ist seit Version 3.8.0 flugfähig und kann bei Lorville auf Hurston für 5.130.500 AUEC gekauft werden. Der Prospector, hergestellt von Musashi Industrial & Starflight Concern, ist ein kleineres Bergbauschiff für ein einziges Besatzungsmitglied und hat eine Ladekapazität von 32 SCU. Er ist seit Version 3.0.0 flugfähig und kann auf Lorville für 2.061.000 AUEC gekauft werden. Im Gegensatz zur MOLE kann der Prospector auch an verschiedenen Orten im Stanton-System gemietet werden.
+ - question:
+ en: What do you know about the commodity Neon?
+ de: Was weißt du über die Ware Neon?
+ answer:
+ en: The commodity 'Neon' is categorized as a drug and is available for purchase at Nuen Waste Management for 7,000 aUEC. It is also sellable at Grim HEX for 8,428 aUEC and Jumptown for 8,805 aUEC. However, it's important to note that Neon is illegal, and engaging in activities involving this commodity may lead to fines and a crimestat. It's advisable to avoid ship scans to mitigate the risk of penalties.
+ de: Die Ware "Neon" wird als Droge eingestuft und kann bei Nuen Waste Management für 7.000 AUEC gekauft werden. Außerdem kann es bei Grim HEX für 8.428 AUEC und bei Jumptown für 8.805 AUEC verkauft werden. Es ist jedoch wichtig zu wissen, dass Neon illegal ist und dass der Handel mit dieser Ware zu Geldstrafen und einer Strafe führen kann. Es ist ratsam, Schiffsscans zu vermeiden, um das Risiko von Strafen zu minimieren.
+ - question:
+ en: What is the best place to buy Laranite?
+ de: Wo kann ich Laranit am besten kaufen?
+ answer:
+ en: The best place to buy Laranite for your Caterpillar is at ArcCorp Mining Area 045, located on the satellite Wala in the Stanton system. It's available for purchase at 2,322 aUEC per SCU.
+ de: Der beste Ort, um Laranit für deine Caterpillar zu kaufen, ist das ArcCorp-Bergbaugebiet 045, das sich auf dem Satelliten Wala im Stanton-System befindet. Du kannst es für 2.322 AUEC pro SCU kaufen.
+ - question:
+ en: Where is the satellite Wala located?
+ de: Wo befindet sich der Satellit Wala?
+ answer:
+ en: The satellite Wala is located in the ArcCorp system, orbiting the planet ArcCorp. It hosts various trading options, including locations like ArcCorp Mining Area 045, ArcCorp Mining Area 048, ArcCorp Mining Area 056, ArcCorp Mining Area 061, Shady Glen Farms, and Samson & Son's Salvage Center.
+ de: Der Satellit Wala befindet sich im ArcCorp-System und umkreist den Planeten ArcCorp. Er beherbergt verschiedene Handelsmöglichkeiten, darunter Orte wie ArcCorp Mining Area 045, ArcCorp Mining Area 048, ArcCorp Mining Area 056, ArcCorp Mining Area 061, Shady Glen Farms und Samson & Son's Salvage Center.
+prompt: |
+ You also have tools to access the UEXcorp API which you can use to retrieve live trading data and additional information about ships, locations, commodities and more in Star Citizen.
+ Here are some examples of when to use the different tools at your disposal:
+
+ Do not use markdown formatting (e.g. **name**) in your answers; prefer lists to present multiple options or pieces of information.
+ Never translate any properties when giving them to the player. They must stay in English, untouched.
+ Only give function parameters that were previously clearly provided by a request. Never assume any values, not the current ship, not the location, not the available money, nothing! Always send a None-value instead.
+ If you are not using one of the defined functions, don't give any trading recommendations.
+ If you execute a function that requires a commodity name, always provide the name in English, not in German or any other language.
+ Never mention optional function (tool) parameters to the user. Only mention the required parameters, if some are missing.
+
+ Samples when to use function "get_trading_routes":
+ - "Best trading route": Indicates a user's intent to find the best trading route.
+ - "Trade route": Suggests the user is seeking information on trading routes.
+ - "Profitable trade route": Implies a focus on finding profitable trading routes.
+ - "Trading advice": Indicates the user wants guidance on trading decisions.
+
+ Samples when to use function "get_locations_to_sell_to":
+ - "Sell commodity": Indicates a user's intent to sell a specific item.
+ - "Best place to sell": Suggests the user is seeking information on optimal selling locations.
+ - "Seller's market": Implies a focus on finding favorable selling conditions.
+ - "Selling advice": Indicates the user wants guidance on selling decisions.
+ - "Seller's guide": Suggests a request for assistance in the selling process.
+ - "Find buyer": Indicates the user's interest in locating potential buyers.
+ - "Sell item": Implies a user's intent to sell an item.
+ - "Sell cargo": Suggests a focus on selling cargo or goods.
+ - "Offload inventory": Signals the intention to sell available inventory.
+
+ Samples when to use function "get_locations_to_buy_from":
+ - "Buy commodity": Indicates a user's intent to purchase a specific item.
+ - "Best place to buy": Suggests the user is seeking information on optimal buying locations.
+ - "Buyer's market": Implies a focus on finding favorable buying conditions.
+ - "Purchase location": Signals interest in identifying a suitable location for buying.
+ - "Buying advice": Indicates the user wants guidance on purchasing decisions.
+ - "Buyer's guide": Suggests a request for assistance in the buying process.
+
+ Samples when to use function "get_location_information":
+ - "Location information": Indicates a user's intent to gather information about a specific location.
+ - "Location details": Suggests the user is seeking detailed information about a specific location.
+
+ Samples when to use function "get_ship_information":
+ - "Ship information": Indicates a user's intent to gather information about a specific ship.
+ - "Ship details": Suggests the user is seeking detailed information about a specific ship.
+
+ Samples when to use function "get_ship_comparison":
+ - "Ship comparison": Indicates a user's intent to compare two ships. Also use this function whenever at least two ships are mentioned in the request.
+
+ Samples when to use function "get_commodity_information":
+ - "Commodity information": Indicates a user's intent to gather information about a specific commodity.
+ - "Commodity details": Suggests the user is seeking detailed information about a specific commodity.
+ - "Commodity prices": Implies a focus on obtaining current prices for a specific commodity.
+
+ Samples when to use function "reload_current_commodity_prices" (Great to use before retrieving sell, buy and trade options):
+ - "Update commodity prices": Indicates a user's intent to update the commodity prices.
+ - "Get current prices": Suggests the user is seeking the current commodity prices.
+ - "Refresh prices": Implies a focus on updating the commodity prices.
+custom_properties:
+ - id: uexcorp_api_url
+ name: API URL
+ hint: The URL of the UEX corp API.
+ value: https://uexcorp.space/api/2.0/
+ required: true
+ property_type: string
+ - id: uexcorp_api_timeout
+ name: API Timeout
+ hint: The timeout for the UEXcorp API and, if used, the Star Citizen wiki API, in seconds. (Values below 3s are raised to 3s.)
+ value: 6
+ required: true
+ property_type: number
+ - id: uexcorp_api_timeout_retries
+ name: API Timeout Retries
+ hint: How often a request to the UEXcorp API should be retried in case of a timeout. (The timeout may increase automatically on each retry.)
+ value: 3
+ required: true
+ property_type: number
+ # Set this option to "true" to enable caching of the UEX corp API responses. This is recommended, as the API key's quota is very limited.
+ # If you set this option to "false", the Wingman will fetch all data from the UEX corp API on every start.
+ # If you want to update the prices, just tell the Wingman to do so.
+ # If all data should be fetched again, delete the cache file. (wingman_data\uexcorp\cache.json)
+ - id: uexcorp_cache
+ name: Enable Cache
+ hint: Set this option to "true" to enable caching of the UEX corp API responses. This is recommended, as the API key's quota is very limited.
+ value: true
+ required: true
+ property_type: boolean
+
+ # Set this option to the amount of seconds you want to cache the UEX corp API responses.
+ # We recommend a day ("86400"), as the ship, planet, etc. information does not change that often.
+ - id: uexcorp_cache_duration
+ name: Cache Duration
+ hint: Set this option to the amount of seconds you want to cache the UEX corp API responses. We recommend a day ("86400").
+ value: 86400
+ required: true
+ property_type: number
+
+ # Set this option to "true" to show only one of the most profitable routes for each commodity.
+ # Set this option to "false" to show all routes. This may include multiple routes for the same commodity.
+ # Recommended: "true"
+ - id: uexcorp_summarize_routes_by_commodity
+ name: Summarize Routes by Commodity
+ hint: Set this option to "true" to show only the most profitable routes per commodity. "false" shows multiple options per commodity.
+ value: true
+ required: true
+ property_type: boolean
+
+ # Set this option to "true" to make the start location for trade route calculations mandatory.
+ # Set this option to "false" to make the start location optional.
+ # If "false" and no start location is given, all tradeports are taken into account.
+ - id: uexcorp_tradestart_mandatory
+ name: Trade Start Mandatory
+ hint: Set this option to "true" to make the start location for trade route calculations mandatory. If "false" and no start location is given, all tradeports are taken into account.
+ value: true
+ required: true
+ property_type: boolean
+
+ # Use this to blacklist certain trade ports or commodities or combinations of both.
+ # Default value is '[]', which means no trade ports or commodities are blacklisted.
+ # If we want to add a trade port to the blacklist, we add something like this: {"tradeport":"Baijini Point"} This will blacklist the trade port completely from trade route calculations.
+ # If we want to add a commodity to the blacklist, we add something like this: {"commodity":"Medical Supplies"} This will blacklist the commodity completely from trade route calculations.
+ # If we want to add a combination to the blacklist, we add something like this: {"tradeport":"Baijini Point", "commodity":"Medical Supplies"} This will blacklist this commodity for the given trade port.
+ # If we want to add multiple trade ports or commodities or combinations of both, we add them in a list like this: [{"tradeport":"Baijini Point", "commodity":"Medical Supplies"}, {"commodity":"Medical Supplies"}, {"tradeport":"Port Tressler"}]
+ # This value is a JSON string, if you have created a list, use a JSON validator like https://jsonlint.com/ to check if the list is valid.
+ - id: uexcorp_trade_blacklist
+ name: Trade Blacklist
+ hint: JSON string to blacklist certain trade ports or commodities or combinations of both. Default value is empty ('[]'). Sample -> [{"tradeport":"Baijini Point", "commodity":"Medical Supplies"}, {"commodity":"Medical Supplies"}, {"tradeport":"Port Tressler"}]
+ value: "[]"
+ required: true
+ property_type: string
+
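Since `uexcorp_trade_blacklist` must be a valid JSON string, a candidate value can be checked before pasting it into the config. The following is a minimal validation sketch (only the `tradeport`/`commodity` keys are taken from the property description above; everything else is standard library):

```python
import json

def validate_blacklist(raw: str) -> list[dict]:
    """Parse the blacklist JSON and check that every entry only
    uses the supported keys ("tradeport" and/or "commodity")."""
    entries = json.loads(raw)  # raises ValueError on invalid JSON
    if not isinstance(entries, list):
        raise ValueError("Blacklist must be a JSON list")
    for entry in entries:
        if not isinstance(entry, dict) or not entry:
            raise ValueError(f"Invalid entry: {entry!r}")
        unknown = set(entry) - {"tradeport", "commodity"}
        if unknown:
            raise ValueError(f"Unknown keys {unknown} in {entry!r}")
    return entries

raw = '[{"tradeport": "Baijini Point", "commodity": "Medical Supplies"}, {"tradeport": "Port Tressler"}]'
print(validate_blacklist(raw))
```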
+ # Set this option to the amount of trade routes you want to show at default.
+ # You can always tell Wingman AI to show more or less trade routes for one request, if that number is not given, this setting is used.
+ - id: uexcorp_default_trade_route_count
+ name: Default Trade Route Count
+ hint: Set this option to the amount of trade routes you want to show at default.
+ value: 1
+ required: true
+ property_type: number
+
+ # Set this option to true to take estimated scu availability into account for trade route calculations.
+ # This will reduce the amount of trade routes shown, but will give you more accurate results.
+ - id: uexcorp_use_estimated_availability
+ name: Use Estimated Availability
+ hint: Enable this option to take estimated scu availability into account for trade route calculations.
+ value: true
+ required: true
+ property_type: boolean
+
+ # Set this option to true to get additional lore information from api.star-citizen.wiki on ships, locations, commodities and more.
+ - id: uexcorp_add_lore
+ name: Add Lore Information
+ hint: Enable this to retrieve additional lore information from api.star-citizen.wiki on ships, locations, commodities and more. But this will add a delay to the response.
+ value: false
+ required: true
+ property_type: boolean
diff --git a/templates/migration/1_5_0/skills/uexcorp/logo.png b/templates/migration/1_5_0/skills/uexcorp/logo.png
new file mode 100644
index 00000000..94392d3f
Binary files /dev/null and b/templates/migration/1_5_0/skills/uexcorp/logo.png differ
diff --git a/templates/migration/1_5_0/skills/uexcorp/main.py b/templates/migration/1_5_0/skills/uexcorp/main.py
new file mode 100644
index 00000000..630247d1
--- /dev/null
+++ b/templates/migration/1_5_0/skills/uexcorp/main.py
@@ -0,0 +1,3185 @@
+import asyncio
+import difflib
+import heapq
+import itertools
+import json
+import math
+import traceback
+from os import path
+import collections
+import re
+from typing import Optional, TYPE_CHECKING
+from datetime import datetime
+import requests
+from api.enums import LogType, WingmanInitializationErrorType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from services.file import get_writable_dir
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class UEXCorp(Skill):
+ """Wingman AI Skill to utilize the UEXcorp API for trade recommendations"""
+
+ # enable for verbose logging
+ DEV_MODE = False
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.data_path = get_writable_dir(path.join("skills", "uexcorp", "data"))
+ self.logfileerror = path.join(self.data_path, "error.log")
+ self.logfiledebug = path.join(self.data_path, "debug.log")
+ self.cachefile = path.join(self.data_path, "cache.json")
+
+ self.skill_version = "v13"
+ self.skill_loaded = False
+ self.skill_loaded_asked = False
+ self.game_version = "unknown"
+
+ # init of config options
+ self.uexcorp_api_url: str = None
+ # self.uexcorp_api_key: str = None
+ self.uexcorp_api_timeout: int = None
+ self.uexcorp_api_timeout_retries: int = None
+ self.uexcorp_cache: bool = None
+ self.uexcorp_cache_duration: int = None
+ self.uexcorp_summarize_routes_by_commodity: bool = None
+ self.uexcorp_tradestart_mandatory: bool = None
+ self.uexcorp_trade_blacklist = []
+ self.uexcorp_default_trade_route_count: int = None
+ self.uexcorp_use_estimated_availability: bool = None
+ self.uexcorp_add_lore: bool = None
+
+ self.ships = []
+ self.ship_names = []
+ self.ship_dict = {}
+ self.ship_code_dict = {}
+
+ self.commodities = []
+ self.commodity_names = []
+ self.commodity_dict = {}
+ self.commodity_code_dict = {}
+
+ self.systems = []
+ self.system_names = []
+ self.system_dict = {}
+ self.system_code_dict = {}
+
+ self.terminals = []
+ self.terminal_names = []
+ self.terminal_names_trading = []
+ self.terminal_dict = {}
+ self.terminal_code_dict = {}
+ self.terminals_by_system = collections.defaultdict(list)
+ self.terminals_by_planet = collections.defaultdict(list)
+ self.terminals_by_moon = collections.defaultdict(list)
+ self.terminals_by_city = collections.defaultdict(list)
+
+ self.planets = []
+ self.planet_names = []
+ self.planet_dict = {}
+ self.planet_code_dict = {}
+ self.planets_by_system = collections.defaultdict(list)
+
+ self.moons = []
+ self.moon_names = []
+ self.moon_dict = {}
+ self.moon_code_dict = {}
+ self.moons_by_planet = collections.defaultdict(list)
+
+ self.cities = []
+ self.city_names = []
+ self.city_dict = {}
+ self.city_code_dict = {}
+ self.cities_by_planet = collections.defaultdict(list)
+
+ self.location_names_set = set()
+ self.location_names_set_trading = set()
+
+ self.cache_enabled = True
+ self.cache = {
+ "function_args": {},
+ "search_matches": {},
+ "readable_objects": {},
+ }
+
+ self.dynamic_context = ""
+
+ async def _print(
+ self, message: str | dict, is_extensive: bool = False, is_debug: bool = True
+ ) -> None:
+ """
+ Prints a message if debug mode is enabled. Will be sent to the server terminal, log file and client.
+
+ Args:
+ message (str | dict): The message to be printed.
+ is_extensive (bool, optional): Whether the message is extensive. Defaults to False.
+ is_debug (bool, optional): If False, the message is printed even when debug mode is disabled. Defaults to True.
+
+ Returns:
+ None
+ """
+ if (not is_extensive and self.settings.debug_mode) or not is_debug:
+ await self.printr.print_async(
+ message,
+ color=LogType.INFO,
+ )
+ elif self.DEV_MODE:
+ with open(self.logfiledebug, "a", encoding="UTF-8") as f:
+ f.write(f"#### Time: {datetime.now()} ####\n")
+ f.write(f"{message}\n\n")
+
+ def _log(self, message: str | dict, is_extensive: bool = False) -> None:
+ """
+ Prints a debug message (synchronously) only on the server (and in the log file).
+
+ Args:
+ message (str | dict): The message to be printed.
+ is_extensive (bool, optional): Whether the message is extensive. Defaults to False.
+
+ Returns:
+ None
+ """
+ if not is_extensive and self.settings.debug_mode:
+ self.printr.print(
+ message,
+ color=LogType.INFO,
+ server_only=True,
+ )
+ elif self.DEV_MODE:
+ with open(self.logfiledebug, "a", encoding="UTF-8") as f:
+ f.write(f"#### Time: {datetime.now()} ####\n")
+ f.write(f"{message}\n\n")
+
+ def _get_function_arg_from_cache(
+ self, arg_name: str, arg_value: str | int = None
+ ) -> str | int | None:
+ """
+ Retrieves a function argument from the cache if available, otherwise returns the provided argument value.
+
+ Args:
+ arg_name (str): The name of the function argument.
+ arg_value (str | int, optional): The default value for the argument. Defaults to None.
+
+ Returns:
+ dict[str, any]: The cached value of the argument if available, otherwise the provided argument value.
+ """
+ if not self.cache_enabled:
+ return arg_value
+
+ if arg_value is None or (
+ isinstance(arg_value, str) and arg_value.lower() == "current"
+ ):
+ cached_arg = self.cache["function_args"].get(arg_name)
+ if cached_arg is not None:
+ self._log(
+ f"'{arg_name}' was not given and got overwritten by cache: {cached_arg}"
+ )
+ return cached_arg
+
+ return arg_value
+
+ def _set_function_arg_to_cache(
+ self, arg_name: str, arg_value: str | int | float = None
+ ) -> None:
+ """
+ Sets the value of a function argument to the cache.
+
+ Args:
+ arg_name (str): The name of the argument.
+ arg_value (str | int, optional): The value of the argument. Defaults to None.
+ """
+ if not self.cache_enabled:
+ return
+
+ function_args = self.cache["function_args"]
+ old_value = function_args.get(arg_name, "None")
+
+ if arg_value is not None:
+ self._log(
+ f"Set function arg '{arg_name}' to cache. Previous value: {old_value} >> New value: {arg_value}",
+ True,
+ )
+ function_args[arg_name] = arg_value
+ elif arg_name in function_args:
+ self._log(
+ f"Removing function arg '{arg_name}' from cache. Previous value: {old_value}",
+ True,
+ )
+ function_args.pop(arg_name, None)
+
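The two helpers above implement a simple "last mentioned value" memory: a missing or "current" argument falls back to the cached value, and a `None` write clears the entry. The same idea as a minimal standalone sketch (hypothetical names, independent of the Skill class and its logging):

```python
class ArgCache:
    """Remembers the last concrete value seen for each argument name."""

    def __init__(self):
        self._args = {}

    def get(self, name, value=None):
        # Fall back to the cached value when nothing concrete was given.
        if value is None or (isinstance(value, str) and value.lower() == "current"):
            return self._args.get(name, value)
        return value

    def set(self, name, value=None):
        # A None value clears the entry instead of caching it.
        if value is not None:
            self._args[name] = value
        else:
            self._args.pop(name, None)

cache = ArgCache()
cache.set("ship_name", "Prospector")
print(cache.get("ship_name", "current"))  # falls back to the cached ship
```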
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ # self.uexcorp_api_key = await self.retrieve_secret(
+ # "uexcorp", errors, "You can create your own API key here: https://uexcorp.space/api/apps/"
+ # )
+ self.uexcorp_api_url = self.retrieve_custom_property_value(
+ "uexcorp_api_url", errors
+ )
+ self.uexcorp_api_timeout = self.retrieve_custom_property_value(
+ "uexcorp_api_timeout", errors
+ )
+ self.uexcorp_api_timeout_retries = self.retrieve_custom_property_value(
+ "uexcorp_api_timeout_retries", errors
+ )
+ self.uexcorp_cache = self.retrieve_custom_property_value(
+ "uexcorp_cache", errors
+ )
+ self.uexcorp_cache_duration = self.retrieve_custom_property_value(
+ "uexcorp_cache_duration", errors
+ )
+ self.uexcorp_summarize_routes_by_commodity = (
+ self.retrieve_custom_property_value(
+ "uexcorp_summarize_routes_by_commodity", errors
+ )
+ )
+ self.uexcorp_tradestart_mandatory = self.retrieve_custom_property_value(
+ "uexcorp_tradestart_mandatory", errors
+ )
+ self.uexcorp_default_trade_route_count = self.retrieve_custom_property_value(
+ "uexcorp_default_trade_route_count", errors
+ )
+ self.uexcorp_use_estimated_availability = self.retrieve_custom_property_value(
+ "uexcorp_use_estimated_availability", errors
+ )
+ self.uexcorp_add_lore = self.retrieve_custom_property_value(
+ "uexcorp_add_lore", errors
+ )
+
+ trade_backlist_str: str = self.retrieve_custom_property_value(
+ "uexcorp_trade_blacklist", errors
+ )
+ if trade_backlist_str:
+ try:
+ self.uexcorp_trade_blacklist = json.loads(trade_backlist_str)
+ except json.decoder.JSONDecodeError:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.name,
+ message="Invalid custom property 'uexcorp_trade_blacklist' in config. Value must be a valid JSON string.",
+ error_type=WingmanInitializationErrorType.INVALID_CONFIG,
+ )
+ )
+
+ try:
+ await self._start_loading_data()
+ except Exception as e:
+ errors.append(
+ WingmanInitializationError(
+ wingman_name=self.name,
+ message=f"Failed to load data: {e}",
+ error_type=WingmanInitializationErrorType.UNKNOWN,
+ )
+ )
+
+ return errors
+
+ async def _load_data(self, reload_prices: bool = False, callback=None) -> None:
+ """
+ Load data for UEX corp wingman.
+
+ Args:
+ reload_prices (bool, optional): Whether to reload only the commodity prices. Defaults to False.
+ callback (optional): Coroutine to await once loading has finished. Defaults to None.
+ """
+
+ if reload_prices:
+ await self._load_commodity_prices()
+ self.threaded_execution(self._save_to_cachefile)
+ return
+
+ self.game_version = (await self._fetch_uex_data("game_versions"))["live"]
+
+ async def _load_from_cache():
+ if not self.uexcorp_cache:
+ return
+
+ # load the cache file, if present
+ data = {}
+ try:
+ with open(self.cachefile, "r", encoding="UTF-8") as f:
+ data = json.load(f)
+ except (FileNotFoundError, json.decoder.JSONDecodeError):
+ pass
+
+ # check file age
+ if (
+ data.get("timestamp")
+ and data.get("timestamp") + self.uexcorp_cache_duration
+ > self._get_timestamp()
+ and data.get("skill_version") == self.skill_version
+ and data.get("game_version") == self.game_version
+ ):
+ if data.get("ships"):
+ self.ships = data["ships"]
+ if data.get("commodities"):
+ self.commodities = data["commodities"]
+ if data.get("systems"):
+ self.systems = data["systems"]
+ if data.get("terminals"):
+ self.terminals = data["terminals"]
+ # fix prices keys (from string to integer due to unintentional json conversion)
+ for terminal in self.terminals:
+ if "prices" in terminal:
+ terminal["prices"] = {
+ int(key): value
+ for key, value in terminal["prices"].items()
+ }
+ if data.get("planets"):
+ self.planets = data["planets"]
+ if data.get("moons"):
+ self.moons = data["moons"]
+ if data.get("cities"):
+ self.cities = data["cities"]
+
+ async def _load_missing_data():
+ load_purchase_and_rental = False
+
+ if not self.ships:
+ load_purchase_and_rental = True
+ self.ships = await self._fetch_uex_data("vehicles")
+ self.ships = [ship for ship in self.ships if ship["game_version"]]
+
+ if not self.commodities:
+ self.commodities = await self._fetch_uex_data("commodities")
+ self.commodities = [
+ commodity
+ for commodity in self.commodities
+ if commodity["is_available"] == 1
+ ]
+
+ if not self.systems:
+ load_purchase_and_rental = True
+ self.systems = await self._fetch_uex_data("star_systems")
+ self.systems = [
+ system for system in self.systems if system["is_available"] == 1
+ ]
+ for system in self.systems:
+ self.terminals += await self._fetch_uex_data(
+ f"terminals/id_star_system/{system['id']}/is_available/1/is_visible/1"
+ )
+ self.cities += await self._fetch_uex_data(
+ f"cities/id_star_system/{system['id']}"
+ )
+ self.moons += await self._fetch_uex_data(
+ f"moons/id_star_system/{system['id']}"
+ )
+ self.planets += await self._fetch_uex_data(
+ f"planets/id_star_system/{system['id']}"
+ )
+ await self._load_commodity_prices()
+
+ # data manipulation
+ planet_codes = []
+ for planet in self.planets:
+ if planet["code"] not in planet_codes:
+ planet_codes.append(planet["code"])
+
+ for terminal in self.terminals:
+ if terminal["id_space_station"]:
+ parts = terminal["nickname"].split(" ")
+ for part in parts:
+ if (
+ len(part.split("-")) == 2
+ and part.split("-")[0] in planet_codes
+ and re.match(r"^L\d+$", part.split("-")[1])
+ ):
+ terminal["id_planet"] = ""
+ break
+
+ if load_purchase_and_rental:
+ await self._load_purchase_and_rental()
+
+ async def _load_data_inner(callback=None):
+ await _load_from_cache()
+ await _load_missing_data()
+ self.threaded_execution(self._save_to_cachefile)
+
+ if callback:
+ await callback()
+
+ self.threaded_execution(_load_data_inner, callback)
+
+ async def _save_to_cachefile(self) -> None:
+ if (
+ self.uexcorp_cache
+ and self.uexcorp_cache_duration > 0
+ and self.ships
+ and self.commodities
+ and self.systems
+ and self.terminals
+ and self.planets
+ and self.moons
+ and self.cities
+ ):
+ data = {
+ "timestamp": self._get_timestamp(),
+ "skill_version": self.skill_version,
+ "game_version": self.game_version,
+ "ships": self.ships,
+ "commodities": self.commodities,
+ "systems": self.systems,
+ "terminals": self.terminals,
+ "planets": self.planets,
+ "moons": self.moons,
+ "cities": self.cities,
+ }
+ with open(self.cachefile, "w", encoding="UTF-8") as f:
+ json.dump(data, f, indent=4)
+
+ async def _load_purchase_and_rental(self) -> None:
+ """
+ Load purchase and rental information for ships and vehicles.
+
+ Returns:
+ None
+ """
+ ships_purchase = await self._fetch_uex_data("vehicles_purchases_prices_all")
+ ships_rental = await self._fetch_uex_data("vehicles_rentals_prices_all")
+
+ for ship in self.ships:
+ ship["purchase"] = [
+ purchase
+ for purchase in ships_purchase
+ if purchase["id_vehicle"] == ship["id"]
+ ]
+ ship["rental"] = [
+ rental for rental in ships_rental if rental["id_vehicle"] == ship["id"]
+ ]
+
+ for terminal in self.terminals:
+ terminal["vehicle_rental"] = [
+ rental
+ for rental in ships_rental
+ if rental["id_terminal"] == terminal["id"]
+ ]
+ terminal["vehicle_purchase"] = [
+ purchase
+ for purchase in ships_purchase
+ if purchase["id_terminal"] == terminal["id"]
+ ]
+
+ async def _load_commodity_prices(self) -> None:
+ """
+ Load commodity prices from UEX corp API.
+
+ Returns:
+ None
+ """
+
+ self.cache["readable_objects"] = {}
+
+ # currently the prices are saved in api v1 style to minimize rework time for now
+ for i in range(0, len(self.terminals), 10):
+ terminals_batch = self.terminals[i : i + 10]
+ terminal_ids = [
+ terminal["id"]
+ for terminal in terminals_batch
+ if terminal["type"] in ["commodity", "commodity_raw"]
+ ]
+ if not terminal_ids:
+ continue
+
+ commodity_prices = await self._fetch_uex_data(
+ "commodities_prices/id_terminal/" + ",".join(map(str, terminal_ids))
+ )
+
+ for terminal in terminals_batch:
+ terminal["prices"] = {}
+
+ for commodity_price in commodity_prices:
+ if commodity_price["id_terminal"] == terminal["id"]:
+ commodity = next(
+ (
+ commodity
+ for commodity in self.commodities
+ if commodity["id"] == commodity_price["id_commodity"]
+ ),
+ None,
+ )
+ if commodity:
+ transaction_type = (
+ "buy" if commodity_price["price_buy"] > 0 else "sell"
+ )
+ price = {
+ "name": self._format_commodity_name(commodity),
+ "kind": commodity["kind"],
+ "operation": transaction_type,
+ "price_buy": commodity_price["price_buy"],
+ "price_sell": commodity_price["price_sell"],
+ "date_update": commodity_price["date_modified"],
+ "is_updated": bool(commodity_price["date_modified"]),
+ "scu": commodity_price[f"scu_{transaction_type}"]
+ or None,
+ "scu_average": commodity_price[
+ f"scu_{transaction_type}_avg"
+ ]
+ or None,
+ "scu_average_week": commodity_price[
+ f"scu_{transaction_type}_avg_week"
+ ]
+ or None,
+ }
+ # calculate expected scu
+ count = 0
+ total = 0
+ if price["scu"]:
+ count += 2
+ total += price["scu"] * 2
+ if price["scu_average"]:
+ count += 1
+ total += price["scu_average"]
+ if price["scu_average_week"]:
+ count += 1
+ total += price["scu_average_week"]
+ price["scu_expected"] = (
+ int(total / count) if count > 0 else None
+ )
+
+ terminal["prices"][commodity["id"]] = price
+
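The `scu_expected` value computed above is a weighted average of the three availability figures, with the live SCU reading counted twice and the daily and weekly averages once each. Factored out as a plain function (names are illustrative):

```python
def expected_scu(scu=None, scu_average=None, scu_average_week=None):
    """Weighted average of SCU availability: the live reading counts
    double, the daily and weekly averages count once each.
    Returns None if no figure is available."""
    count = 0
    total = 0
    if scu:
        count += 2
        total += scu * 2
    if scu_average:
        count += 1
        total += scu_average
    if scu_average_week:
        count += 1
        total += scu_average_week
    return int(total / count) if count > 0 else None

print(expected_scu(scu=100, scu_average=40, scu_average_week=60))  # (100*2 + 40 + 60) / 4 = 75
```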
+ async def _start_loading_data(self) -> None:
+ """
+ Prepares the wingman for execution by initializing necessary variables and loading data.
+
+ This method retrieves configuration values, sets up API URL and timeout, and loads data
+ such as ship names, commodity names, system names, terminal names, city names,
+ moon names and planet names.
+ It also adds additional context information for function parameters.
+
+ Returns:
+ None
+ """
+
+ # fix api url
+ if self.uexcorp_api_url and self.uexcorp_api_url.endswith("/"):
+ self.uexcorp_api_url = self.uexcorp_api_url[:-1]
+
+ # fix timeout
+ self.uexcorp_api_timeout = max(3, self.uexcorp_api_timeout)
+ self.uexcorp_api_timeout_retries = max(0, self.uexcorp_api_timeout_retries)
+
+ await self._load_data(False, self._prepare_data)
+
+ async def _prepare_data(self) -> None:
+ """
+ Prepares the wingman for execution by initializing necessary variables.
+ """
+
+ self.planets = [
+ planet for planet in self.planets if planet["is_available"] == 1
+ ]
+
+ self.moons = [moon for moon in self.moons if moon["is_available"] == 1]
+
+ # remove urls from ships
+ for ship in self.ships:
+ ship.pop("url_store", None)
+ ship.pop("url_brochure", None)
+ ship.pop("url_hotsite", None)
+ ship.pop("url_video", None)
+ ship.pop("url_photos", None)
+
+ # remove screenshot from terminals
+ for terminal in self.terminals:
+ terminal.pop("screenshot", None)
+ terminal.pop("screenshot_thumbnail", None)
+ terminal.pop("screenshot_author", None)
+
+ # add hull trading option to trade ports
+ for terminal in self.terminals:
+ terminal["hull_trading"] = bool(terminal["has_loading_dock"])
+
+ # add hull trading option to ships
+ ships_for_hull_trading = [
+ "Hull C",
+ "Hull D",
+ "Hull E",
+ ]
+ for ship in self.ships:
+ ship["hull_trading"] = ship["name"] in ships_for_hull_trading
+
+ self.ship_names = [self._format_ship_name(ship) for ship in self.ships]
+ self.ship_dict = {
+ self._format_ship_name(ship).lower(): ship for ship in self.ships
+ }
+ self.ship_code_dict = {ship["id"]: ship for ship in self.ships}
+
+ self.commodity_names = [
+ self._format_commodity_name(commodity) for commodity in self.commodities
+ ]
+ self.commodity_dict = {
+ self._format_commodity_name(commodity).lower(): commodity
+ for commodity in self.commodities
+ }
+ self.commodity_code_dict = {
+ commodity["id"]: commodity for commodity in self.commodities
+ }
+
+ self.system_names = [
+ self._format_system_name(system) for system in self.systems
+ ]
+ self.system_dict = {
+ self._format_system_name(system).lower(): system for system in self.systems
+ }
+ self.system_code_dict = {system["id"]: system for system in self.systems}
+
+ self.terminal_names = [
+ self._format_terminal_name(terminal) for terminal in self.terminals
+ ]
+ self.terminal_names_trading = [
+ self._format_terminal_name(terminal)
+ for terminal in self.terminals
+ if terminal["type"] in ["commodity", "commodity_raw"]
+ ]
+ self.terminal_dict = {
+ self._format_terminal_name(terminal).lower(): terminal
+ for terminal in self.terminals
+ }
+ self.terminal_code_dict = {
+ terminal["id"]: terminal for terminal in self.terminals
+ }
+ for terminal in self.terminals:
+ if terminal["id_star_system"]:
+ self.terminals_by_system[terminal["id_star_system"]].append(terminal)
+ if terminal["id_planet"]:
+ self.terminals_by_planet[terminal["id_planet"]].append(terminal)
+ if terminal["id_moon"]:
+ self.terminals_by_moon[terminal["id_moon"]].append(terminal)
+ if terminal["id_city"]:
+ self.terminals_by_city[terminal["id_city"]].append(terminal)
+
+ self.city_names = [self._format_city_name(city) for city in self.cities]
+ self.city_dict = {
+ self._format_city_name(city).lower(): city for city in self.cities
+ }
+ self.city_code_dict = {city["id"]: city for city in self.cities}
+ for city in self.cities:
+ self.cities_by_planet[city["id_planet"]].append(city)
+
+ self.moon_names = [self._format_moon_name(moon) for moon in self.moons]
+ self.moon_dict = {
+ self._format_moon_name(moon).lower(): moon for moon in self.moons
+ }
+ self.moon_code_dict = {moon["id"]: moon for moon in self.moons}
+ for moon in self.moons:
+ self.moons_by_planet[moon["id_planet"]].append(moon)
+
+ self.planet_names = [
+ self._format_planet_name(planet) for planet in self.planets
+ ]
+ self.planet_dict = {
+ self._format_planet_name(planet).lower(): planet for planet in self.planets
+ }
+ self.planet_code_dict = {planet["id"]: planet for planet in self.planets}
+ for planet in self.planets:
+ self.planets_by_system[planet["id_star_system"]].append(planet)
+
+ self.location_names_set = set(
+ self.system_names
+ + self.terminal_names
+ + self.city_names
+ + self.moon_names
+ + self.planet_names
+ )
+ self.location_names_set_trading = set(
+ self.system_names
+ + self.terminal_names_trading
+ + self.city_names
+ + self.moon_names
+ + self.planet_names
+ )
+
+ self.skill_loaded = True
+ if self.skill_loaded_asked:
+ self.skill_loaded_asked = False
+ await self._print("UEXcorp skill data loading complete.", False, False)
+
+ def add_context(self, content: str):
+ """
+ Adds additional context to the first message content,
+ that represents the context given to open ai.
+
+ Args:
+ content (str): The additional context to be added.
+
+ Returns:
+ None
+ """
+ self.dynamic_context += "\n" + content
+
+ def _get_timestamp(self) -> int:
+ """
+ Get the current timestamp as an integer.
+
+ Returns:
+ int: The current timestamp.
+ """
+ return int(datetime.now().timestamp())
+
+ def _get_header(self):
+ """
+ Returns the header dictionary for API requests.
+ Currently empty, as no API key is required for the endpoints in use.
+
+ Returns:
+ dict: The (currently empty) request header dictionary.
+ """
+ return {}
+ # Legacy authenticated variant, kept for reference:
+ # key = self.uexcorp_api_key
+ # return {"Authorization": f"Bearer {key}"}
+
+ async def _fetch_uex_data(
+ self, endpoint: str, params: Optional[dict[str, any]] = None
+ ) -> list[dict[str, any]]:
+ """
+ Fetches data from the specified endpoint.
+
+ Args:
+ endpoint (str): The API endpoint to fetch data from.
+ params (Optional[dict[str, any]]): Optional parameters to include in the request.
+
+ Returns:
+ list[dict[str, any]]: The fetched data as a list of dictionaries.
+ """
+ url = f"{self.uexcorp_api_url}/{endpoint}"
+ await self._print(f"Fetching data from {url} ...", True)
+
+ request_count = 1
+ timeout_error = False
+ requests_error = False
+
+ while request_count == 1 or (
+ request_count <= (self.uexcorp_api_timeout_retries + 1) and timeout_error
+ ):
+ if requests_error:
+ await self._print(f"Retrying request #{request_count}...", True)
+ requests_error = False
+
+ timeout_error = False
+ try:
+ response = requests.get(
+ url,
+ params=params,
+ timeout=(self.uexcorp_api_timeout * request_count),
+ headers=self._get_header(),
+ )
+ response.raise_for_status()
+ except requests.exceptions.RequestException as e:
+ await self._print(f"Error while retrieving data from {url}: {e}")
+ requests_error = True
+ if isinstance(e, requests.exceptions.Timeout):
+ timeout_error = True
+ request_count += 1
+
+ if requests_error:
+ return []
+
+ response_json = response.json()
+ if "status" not in response_json or response_json["status"] != "ok":
+ await self._print(f"Error while retrieving data from {url}")
+ return []
+
+ return response_json.get("data", [])
+
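The retry policy in `_fetch_uex_data` (retry only on timeouts, growing the timeout linearly with the attempt number) can be sketched in isolation. `RequestTimeout` and the injected `get` callable are stand-ins for `requests.exceptions.Timeout` and `requests.get`, not part of the skill:

```python
class RequestTimeout(Exception):
    """Stand-in for requests.exceptions.Timeout in this sketch."""


def fetch_with_retries(get, url: str, base_timeout: int = 10, retries: int = 3) -> list:
    """Retry get(url, timeout=...), but only when the failure was a timeout.

    The timeout grows with the attempt number, mirroring the
    `timeout * request_count` scaling used by _fetch_uex_data.
    """
    attempt = 1
    timed_out = False
    while attempt == 1 or (attempt <= retries + 1 and timed_out):
        timed_out = False
        try:
            payload = get(url, timeout=base_timeout * attempt)
        except RequestTimeout:
            timed_out = True
            attempt += 1
            continue
        except Exception:
            return []  # non-timeout errors are not retried
        # mirror the status check on the UEX response envelope
        return payload.get("data", []) if payload.get("status") == "ok" else []
    return []
```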
+ async def _fetch_lore(self, search: str) -> list[str]:
+ """
+ Fetches data for a search query.
+
+ Args:
+ search (str): The search query to fetch data for.
+
+ Returns:
+ list[str]: The loaded article texts.
+ """
+ url = "https://api.star-citizen.wiki/api/v2/galactapedia/search"
+ await self._print(
+ f"Fetching data from SC wiki ({url}) for search query '{search}' ...", True
+ )
+
+ request_count = 1
+ max_retries = 2
+ timeout_error = False
+ requests_error = False
+
+ while request_count == 1 or (
+ request_count <= (max_retries + 1) and timeout_error
+ ):
+ if requests_error:
+ await self._print(f"Retrying request #{request_count}...", True)
+ requests_error = False
+
+ timeout_error = False
+ try:
+ response = requests.post(
+ url,
+ headers={
+ "accept": "application/json",
+ "Content-Type": "application/json",
+ },
+ json={"query": search},
+ timeout=self.uexcorp_api_timeout,
+ )
+ response.raise_for_status()
+ except requests.exceptions.RequestException as e:
+ await self._print(f"Error while retrieving data from {url}: {e}")
+ requests_error = True
+ if isinstance(e, requests.exceptions.Timeout):
+ timeout_error = True
+ request_count += 1
+
+ if requests_error:
+ return []
+
+ try:
+ response_json = response.json()
+ except json.decoder.JSONDecodeError as e:
+ await self._print(f"Error while retrieving data from {url}: {e}")
+ return []
+
+ max_articles = 5
+ min_articles = 3
+ max_length = 2000
+ max_time = self.uexcorp_api_timeout
+ loaded_articles = []
+ articles = response_json.get("data", [])[:max_articles]
+
+ async def _load_article(article_id: str):
+ article_url = (
+ f"https://api.star-citizen.wiki/api/v2/galactapedia/{article_id}"
+ )
+ try:
+ article_response = requests.get(
+ article_url, timeout=self.uexcorp_api_timeout
+ )
+ article_response.raise_for_status()
+ except requests.exceptions.RequestException as e:
+ await self._print(
+ f"Error while retrieving data from {article_url}: {e}"
+ )
+ return None
+
+ try:
+ article_response_json = article_response.json()
+ except json.decoder.JSONDecodeError as e:
+ await self._print(
+ f"Error while retrieving data from {article_url}: {e}"
+ )
+ return None
+
+ response = (
+ article_response_json.get("data", {})
+ .get("translations", {})
+ .get("en_EN", "")
+ )
+ # strip links, keeping only the link text: [link name](link) -> link name
+ response = re.sub(r"\[([^\]]+)\]\([^)]+\)", r"\1", response)
+ response = re.sub(r"\n", " ", response)
+ response = re.sub(r"\s+", " ", response)
+ response = response[:max_length]
+ loaded_articles.append(response)
+
+ start_time = datetime.now()
+ for article in articles:
+ self.threaded_execution(_load_article, article["id"])
+
+ while (
+ len(loaded_articles) < min_articles
+ and (datetime.now() - start_time).seconds < max_time
+ ):
+ await asyncio.sleep(0.1)
+
+ return loaded_articles
+
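The cleanup `_load_article` applies to article text (dropping markdown links but keeping their text, then collapsing all whitespace) is easy to demonstrate on its own; `clean_article` is an illustrative name, not a method of the skill:

```python
import re


def clean_article(text: str, max_length: int = 2000) -> str:
    """Reduce a Galactapedia article to plain prose, as in _load_article."""
    # keep only the link text: [link name](link) -> link name
    text = re.sub(r"\[([^\]]+)\]\([^)]+\)", r"\1", text)
    # collapse newlines and runs of whitespace into single spaces
    text = re.sub(r"\s+", " ", text)
    return text[:max_length]
```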
+ def _format_ship_name(self, ship: dict[str, any], full_name: bool = True) -> str:
+ """
+ Formats the name of a ship.
+ This is the name presented to and used by the player,
+ so if you want to combine manufacturer and ship names, do it here.
+
+ Args:
+ ship (dict[str, any]): The ship dictionary containing the ship details.
+ full_name (bool): Whether to return the full name. Defaults to True.
+
+ Returns:
+ str: The formatted ship name.
+ """
+ if full_name:
+ return ship["name_full"]
+
+ return ship["name"]
+
+ def _format_terminal_name(
+ self, terminal: dict[str, any], full_name: bool = False
+ ) -> str:
+ """
+ Formats the name of a terminal.
+
+ Args:
+ terminal (dict[str, any]): The terminal dictionary containing the name.
+
+ Returns:
+ str: The formatted terminal name.
+ """
+ if full_name:
+ return terminal["name"]
+
+ return terminal["nickname"]
+
+ def _format_city_name(self, city: dict[str, any]) -> str:
+ """
+ Formats the name of a city.
+
+ Args:
+ city (dict[str, any]): A dictionary representing a city.
+
+ Returns:
+ str: The formatted name of the city.
+ """
+ return city["name"]
+
+ def _format_planet_name(self, planet: dict[str, any]) -> str:
+ """
+ Formats the name of a planet.
+
+ Args:
+ planet (dict[str, any]): A dictionary representing a planet.
+
+ Returns:
+ str: The formatted name of the planet.
+ """
+ return planet["name"]
+
+ def _format_moon_name(self, moon: dict[str, any]) -> str:
+ """
+ Formats the name of a moon.
+
+ Args:
+ moon (dict[str, any]): The moon dictionary.
+
+ Returns:
+ str: The formatted moon name.
+ """
+ return moon["name"]
+
+ def _format_system_name(self, system: dict[str, any]) -> str:
+ """
+ Formats the name of a system.
+
+ Args:
+ system (dict[str, any]): The system dictionary containing the name.
+
+ Returns:
+ str: The formatted system name.
+ """
+ return system["name"]
+
+ def _format_commodity_name(self, commodity: dict[str, any]) -> str:
+ """
+ Formats the name of a commodity.
+
+ Args:
+ commodity (dict[str, any]): The commodity dictionary.
+
+ Returns:
+ str: The formatted commodity name.
+ """
+ return commodity["name"]
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ trading_routes_optional = [
+ "money_to_spend",
+ "free_cargo_space",
+ "position_end_name",
+ "commodity_name",
+ "illegal_commodities_allowed",
+ "maximal_number_of_routes",
+ ]
+ trading_routes_required = ["ship_name"]
+
+ if self.uexcorp_tradestart_mandatory:
+ trading_routes_required.append("position_start_name")
+ else:
+ trading_routes_optional.append("position_start_name")
+
+ tools = [
+ (
+ "get_trading_routes",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_trading_routes",
+ "description": "Finds all possible commodity trade options and gives back a selection of the best trade routes. Needs ship name and start position.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "ship_name": {"type": "string"},
+ "position_start_name": {"type": "string"},
+ "money_to_spend": {"type": "number"},
+ "position_end_name": {"type": "string"},
+ "free_cargo_space": {"type": "number"},
+ "commodity_name": {"type": "string"},
+ "illegal_commodities_allowed": {"type": "boolean"},
+ "maximal_number_of_routes": {"type": "number"},
+ },
+ "required": trading_routes_required,
+ "optional": trading_routes_optional,
+ },
+ },
+ },
+ ),
+ (
+ "get_locations_to_sell_to",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_locations_to_sell_to",
+ "description": "Finds the best locations at what the player can sell cargo at. Only give position_name if the player specifically wanted to filter for it. Needs commodity name.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "commodity_name": {"type": "string"},
+ "ship_name": {"type": "string"},
+ "position_name": {"type": "string"},
+ "commodity_amount": {"type": "number"},
+ "maximal_number_of_locations": {"type": "number"},
+ },
+ "required": ["commodity_name"],
+ "optional": [
+ "ship_name",
+ "position_name",
+ "commodity_amount",
+ "maximal_number_of_locations",
+ ],
+ },
+ },
+ },
+ ),
+ (
+ "get_locations_to_buy_from",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_locations_to_buy_from",
+ "description": "Finds the best locations at what the player can buy cargo at. Only give position_name if the player specifically wanted to filter for it. Needs commodity name.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "commodity_name": {"type": "string"},
+ "ship_name": {"type": "string"},
+ "position_name": {"type": "string"},
+ "commodity_amount": {"type": "number"},
+ "maximal_number_of_locations": {"type": "number"},
+ },
+ "required": ["commodity_name"],
+ "optional": [
+ "ship_name",
+ "position_name",
+ "commodity_amount",
+ "maximal_number_of_locations",
+ ],
+ },
+ },
+ },
+ ),
+ (
+ "get_location_information",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_location_information",
+ "description": "Gives information and commodity prices of this location. Execute this if the player asks for all buy or sell options for a specific location.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "location_name": {"type": "string"},
+ },
+ "required": ["location_name"],
+ },
+ },
+ },
+ ),
+ (
+ "get_ship_information",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_ship_information",
+ "description": "Gives information about the given ship. If a player asks to rent something or buy a ship, this function needs to be executed.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "ship_name": {"type": "string"},
+ },
+ "required": ["ship_name"],
+ },
+ },
+ },
+ ),
+ (
+ "get_ship_comparison",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_ship_comparison",
+ "description": "Gives information about given ships. Also execute this function if the player asks for a ship information on multiple ships or a model series.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "ship_names": {
+ "type": "array",
+ "items": {"type": "string"},
+ },
+ },
+ "required": ["ship_names"],
+ },
+ },
+ },
+ ),
+ (
+ "get_commodity_information",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_commodity_information",
+ "description": "Gives information about the given commodity. If a player asks for information about a commodity, this function needs to be executed.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "commodity_name": {"type": "string"},
+ },
+ "required": ["commodity_name"],
+ },
+ },
+ },
+ ),
+ (
+ "get_commodity_prices_and_terminals",
+ {
+ "type": "function",
+ "function": {
+ "name": "get_commodity_prices_and_terminals",
+ "description": "Gives information about the given commodity and its buy and sell offers. If a player asks for buy and sell information or locations on a commodity, this function needs to be executed.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "commodity_name": {"type": "string"},
+ },
+ "required": ["commodity_name"],
+ },
+ },
+ },
+ ),
+ (
+ "reload_current_commodity_prices",
+ {
+ "type": "function",
+ "function": {
+ "name": "reload_current_commodity_prices",
+ "description": "Reloads the current commodity prices from UEX corp.",
+ "parameters": {
+ "type": "object",
+ "properties": {},
+ "required": [],
+ },
+ },
+ },
+ ),
+ ]
+
+ if self.DEV_MODE:
+ tools.append(
+ (
+ "show_cached_function_values",
+ {
+ "type": "function",
+ "function": {
+ "name": "show_cached_function_values",
+ "description": "Prints the cached function's argument values to the console.",
+ "parameters": {
+ "type": "object",
+ "properties": {},
+ "required": [],
+ },
+ },
+ },
+ ),
+ )
+
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ functions = {
+ "get_trading_routes": "get_trading_routes",
+ "get_locations_to_sell_to": "get_locations_to_sell_to",
+ "get_locations_to_buy_from": "get_locations_to_buy_from",
+ "get_location_information": "get_location_information",
+ "get_ship_information": "get_ship_information",
+ "get_ship_comparison": "get_ship_comparison",
+ "get_commodity_information": "get_commodity_information",
+ "get_commodity_prices_and_terminals": "get_commodity_information",
+ "reload_current_commodity_prices": "reload_current_commodity_prices",
+ "show_cached_function_values": "show_cached_function_values",
+ }
+
+ try:
+ if tool_name in functions:
+ if not self.skill_loaded:
+ self.skill_loaded_asked = True
+ await self._print(
+ "UEXcorp skill is not loaded yet. Please wait a moment.",
+ False,
+ False,
+ )
+ function_response = (
+ "Data is still being loaded. Please wait a moment."
+ )
+ return function_response, instant_response
+
+ self.start_execution_benchmark()
+ await self._print(f"Executing function: {tool_name}")
+ function = getattr(self, "_gpt_call_" + functions[tool_name])
+ function_response = await function(**parameters)
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+ if self.DEV_MODE:
+ await self._print(
+ f"_gpt_call_{functions[tool_name]} response: {function_response}",
+ True,
+ )
+ except Exception:
+ with open(self.logfileerror, "a", encoding="UTF-8") as file_object:
+ file_object.write(traceback.format_exc())
+ file_object.write(
+ "========================================================================================\n"
+ )
+ file_object.write(
+ f"Above error while executing custom function: _gpt_call_{tool_name}\n"
+ )
+ file_object.write(f"With parameters: {parameters}\n")
+ file_object.write(f"On date: {datetime.now()}\n")
+ file_object.write(f"Version: {self.skill_version}\n")
+ file_object.write(
+ "========================================================================================\n"
+ )
+ await self._print(
+ f"Error while executing custom function: {tool_name}\nCheck log file for more details."
+ )
+ function_response = f"Error while executing custom function: {tool_name}"
+ function_response += "\nTell user there seems to be an error. And you must say that it should be report to the 'uexcorp skill developer (JayMatthew on Discord)'."
+
+ return function_response, instant_response
+
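The dispatch in `execute_tool` resolves tool names to methods via a fixed `_gpt_call_` prefix and `getattr`. A minimal standalone sketch of that pattern (class and method names here are illustrative):

```python
class ToolDispatcher:
    """Name-based dispatch: a tool name maps to a method with a fixed prefix."""

    def _gpt_call_ping(self) -> str:
        return "pong"

    def execute(self, tool_name: str, **parameters) -> str:
        # resolve the handler dynamically; unknown names fall through cleanly
        handler = getattr(self, "_gpt_call_" + tool_name, None)
        if handler is None:
            return f"Unknown tool: {tool_name}"
        return handler(**parameters)
```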
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ async def _find_closest_match(
+ self, search: str | None, lst: list[str] | set[str]
+ ) -> str | None:
+ """
+ Finds the closest match to a given string in a list.
+ Or returns an exact match if found.
+ If it is not an exact match, OpenAI is used to find the closest match.
+
+ Args:
+ search (str): The search to find a match for.
+ lst (list): The list of strings to search for a match.
+
+ Returns:
+ str or None: The closest match found in the list, or None if no match is found.
+ """
+ if search is None or search == "None":
+ return None
+
+ self._log(f"Searching for closest match to '{search}' in list.", True)
+
+ checksum = f"{hash(frozenset(lst))}-{hash(search)}"
+ if checksum in self.cache["search_matches"]:
+ match = self.cache["search_matches"][checksum]
+ self._log(f"Found closest match to '{search}' in cache: '{match}'", True)
+ return match
+
+ if search in lst:
+ self._log(f"Found exact match to '{search}' in list.", True)
+ return search
+
+ # make a list of possible matches
+ closest_matches = difflib.get_close_matches(search, lst, n=10, cutoff=0.4)
+ closest_matches.extend(item for item in lst if search.lower() in item.lower())
+ self._log(
+ f"Making a list for closest matches for search term '{search}': {', '.join(closest_matches)}",
+ True,
+ )
+
+ if not closest_matches:
+ self._log(
+ f"No closest match found for '{search}' in list. Returning None.", True
+ )
+ return None
+
+ messages = [
+ {
+ "role": "system",
+ "content": f"""
+ I'll give you just a string value.
+ You will figure out, what value in this list represents this value best: {', '.join(closest_matches)}
+ Keep in mind that the given string value can be misspelled or has missing words as it has its origin in a speech to text process.
+ You must only return the value of the closest match to the given value from the defined list, nothing else.
+ For example if "Hercules A2" is given and the list contains of "A2, C2, M2", you will return "A2" as string.
+ Or if "C2" is given and the list contains of "A2 Hercules Star Lifter, C2 Monster Truck, M2 Extreme cool ship", you will return "C2 Monster Truck" as string.
+ On longer search terms, prefer the exact match, if it is in the list.
+ The response must not contain anything else, than the exact value of the closest match from the list.
+ If you can't find a match, return 'None'. Do never return the given search value.
+ """,
+ },
+ {
+ "role": "user",
+ "content": search,
+ },
+ ]
+ completion = await self.llm_call(messages)
+ answer = (
+ completion.choices[0].message.content
+ if completion and completion.choices
+ else ""
+ )
+
+ if not answer:
+ dumb_match = difflib.get_close_matches(
+ search, closest_matches, n=1, cutoff=0.9
+ )
+ if dumb_match:
+ self._log(
+ f"OpenAI did not answer for '{search}'. Returning dumb match '{dumb_match}'",
+ True,
+ )
+ return dumb_match[0]
+ else:
+ self._log(
+ f"OpenAI did not answer for '{search}' and dumb match not possible. Returning None.",
+ True,
+ )
+ return None
+
+ self._log(f"OpenAI answered: '{answer}'", True)
+
+ if answer == "None" or answer not in closest_matches:
+ self._log(
+ f"No closest match found for '{search}' in list. Returning None.", True
+ )
+ return None
+
+ self._log(f"Found closest match to '{search}' in list: '{answer}'", True)
+ self.add_context(f"\n\nInstead of '{search}', you should use '{answer}'.")
+ self.cache["search_matches"][checksum] = answer
+ return answer
+
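The cheap first stage of `_find_closest_match` builds its candidate list from `difflib` fuzzy matches plus plain substring hits; a minimal demonstration (the ship names are illustrative):

```python
import difflib


def candidates(search: str, lst: list[str]) -> list[str]:
    """Candidate list as built by _find_closest_match: fuzzy matches
    first, then case-insensitive substring hits."""
    close = difflib.get_close_matches(search, lst, n=10, cutoff=0.4)
    close.extend(item for item in lst if search.lower() in item.lower())
    return close
```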
+ async def get_prompt(self) -> str | None:
+ """Return additional context."""
+ additional_context = self.config.prompt or ""
+ additional_context += "\n" + self.dynamic_context
+ return additional_context
+
+ async def _gpt_call_show_cached_function_values(self) -> str:
+ """
+ Prints the cached function's argument values to the console.
+
+ Returns:
+ str: A message indicating that the cached function's argument values have been printed to the console.
+ """
+ self._log(self.cache["function_args"], True)
+ return "The cached function values are: \n" + json.dumps(
+ self.cache["function_args"]
+ )
+
+ async def _gpt_call_reload_current_commodity_prices(self) -> str:
+ """
+ Reloads the current commodity prices from UEX corp.
+
+ Returns:
+ str: A message indicating that the current commodity prices have been reloaded.
+ """
+ await self._load_data(True)
+ # clear cached data
+ for key in self.cache:
+ self.cache[key] = {}
+
+ self._log("Reloaded current commodity prices from UEX corp.", True)
+ return "Reloaded current commodity prices from UEX corp."
+
+ async def _gpt_call_get_commodity_information(
+ self, commodity_name: str = None
+ ) -> str:
+ """
+ Retrieves information about a given commodity.
+
+ Args:
+ commodity_name (str, optional): The name of the commodity. Defaults to None.
+
+ Returns:
+ str: The information about the commodity in JSON format, or an error message if the commodity is not found.
+ """
+ self._log(f"Parameters: Commodity: {commodity_name}", True)
+
+ commodity_name = self._get_function_arg_from_cache(
+ "commodity_name", commodity_name
+ )
+
+ if commodity_name is None:
+ self._log("No commodity given. Ask for a commodity.", True)
+ return "No commodity given. Ask for a commodity."
+
+ misunderstood = []
+ closest_match = await self._find_closest_match(
+ commodity_name, self.commodity_names
+ )
+ if closest_match is None:
+ misunderstood.append(f"Commodity: {commodity_name}")
+ else:
+ commodity_name = closest_match
+
+ self._log(f"Interpreted Parameters: Commodity: {commodity_name}", True)
+
+ if misunderstood:
+ misunderstood_str = ", ".join(misunderstood)
+ self._log(
+ f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}",
+ True,
+ )
+ return f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}"
+
+ commodity = self._get_commodity_by_name(commodity_name)
+ if commodity is not None:
+ output_commodity = await self._get_converted_commodity_for_output(commodity)
+ self._log(output_commodity, True)
+ return json.dumps(output_commodity)
+ return f"Could not find commodity '{commodity_name}'."
+
+ async def _gpt_call_get_ship_information(self, ship_name: str = None) -> str:
+ """
+ Retrieves information about a specific ship.
+
+ Args:
+ ship_name (str, optional): The name of the ship. Defaults to None.
+
+ Returns:
+ str: The ship information or an error message.
+
+ """
+ self._log(f"Parameters: Ship: {ship_name}", True)
+
+ ship_name = self._get_function_arg_from_cache("ship_name", ship_name)
+
+ if ship_name is None:
+ self._log("No ship given. Ask for a ship. Dont say sorry.", True)
+ return "No ship given. Ask for a ship. Dont say sorry."
+
+ misunderstood = []
+ closest_match = await self._find_closest_match(ship_name, self.ship_names)
+ if closest_match is None:
+ misunderstood.append(f"Ship: {ship_name}")
+ else:
+ ship_name = closest_match
+
+ self._log(f"Interpreted Parameters: Ship: {ship_name}", True)
+
+ if misunderstood:
+ misunderstood_str = ", ".join(misunderstood)
+ self._log(
+ f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}",
+ True,
+ )
+ return f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}"
+
+ ship = self._get_ship_by_name(ship_name)
+ if ship is not None:
+ output_ship = f"Summarize in natural language: {(await self._get_converted_ship_for_output(ship))}"
+ self._log(output_ship, True)
+ return json.dumps(output_ship)
+ return f"Could not find ship '{ship_name}'."
+
+ async def _gpt_call_get_ship_comparison(self, ship_names: list[str] = None) -> str:
+ """
+ Retrieves information about multiple ships.
+
+ Args:
+ ship_names (list[str], optional): The names of the ships. Defaults to None.
+
+ Returns:
+ str: The ship information or an error message.
+ """
+ self._log(f"Parameters: Ships: {', '.join(ship_names)}", True)
+
+ if ship_names is None or not ship_names:
+ self._log("No ship given. Ask for a ship. Dont say sorry.", True)
+ return "No ship given. Ask for a ship. Dont say sorry."
+
+ misunderstood = []
+ ships = []
+ for ship_name in ship_names:
+ closest_match = await self._find_closest_match(ship_name, self.ship_names)
+ if closest_match is None:
+ misunderstood.append(ship_name)
+ else:
+ ship_name = closest_match
+ ships.append(self._get_ship_by_name(ship_name))
+
+ self._log(f"Interpreted Parameters: Ships: {', '.join(ship_names)}", True)
+
+ if misunderstood:
+ self._log(
+ f"These ship names do not exist in game. Exactly ask for clarification of these ships: {', '.join(misunderstood)}",
+ True,
+ )
+ return f"These ship names do not exist in game. Exactly ask for clarification of these ships: {', '.join(misunderstood)}"
+
+ output = {}
+ for ship in ships:
+
+ async def get_ship_info(ship):
+ output[self._format_ship_name(ship)] = (
+ await self._get_converted_ship_for_output(ship)
+ )
+
+ self.threaded_execution(get_ship_info, ship)
+
+ while len(output) < len(ships):
+ await asyncio.sleep(0.1)
+
+ output = (
+ "Point out the differences between these ships in natural language, in full sentences, without just listing them. Describe them! And don't mention things that neither ship can do:\n"
+ + json.dumps(output)
+ )
+ self._log(output, True)
+ return output
+
+ async def _gpt_call_get_location_information(
+ self, location_name: str = None
+ ) -> str:
+ """
+ Retrieves information about a given location.
+
+ Args:
+ location_name (str, optional): The name of the location. Defaults to None.
+
+ Returns:
+ str: The information about the location in JSON format, or an error message if the location is not found.
+ """
+ self._log(f"Parameters: Location: {location_name}", True)
+
+ location_name = self._get_function_arg_from_cache(
+ "location_name", location_name
+ )
+
+ if location_name is None:
+ self._log("No location given. Ask for a location.", True)
+ return "No location given. Ask for a location."
+
+ misunderstood = []
+ closest_match = await self._find_closest_match(
+ location_name, self.location_names_set
+ )
+ if closest_match is None:
+ misunderstood.append(f"Location: {location_name}")
+ else:
+ location_name = closest_match
+
+ self._log(f"Interpreted Parameters: Location: {location_name}", True)
+
+ if misunderstood:
+ misunderstood_str = ", ".join(misunderstood)
+ self._log(
+ f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}",
+ True,
+ )
+ return f"These given parameters do not exist in game. Exactly ask for clarification of these values: {misunderstood_str}"
+
+ formating = "Summarize in natural language: "
+
+ # get a clone of the data
+ terminal = self._get_terminal_by_name(location_name)
+ if terminal is not None:
+ output = await self._get_converted_terminal_for_output(terminal)
+ self._log(output, True)
+ return formating + json.dumps(output)
+ city = self._get_city_by_name(location_name)
+ if city is not None:
+ output = await self._get_converted_city_for_output(city)
+ self._log(output, True)
+ return formating + json.dumps(output)
+ moon = self._get_moon_by_name(location_name)
+ if moon is not None:
+ output = await self._get_converted_moon_for_output(moon)
+ self._log(output, True)
+ return formating + json.dumps(output)
+ planet = self._get_planet_by_name(location_name)
+ if planet is not None:
+ output = await self._get_converted_planet_for_output(planet)
+ self._log(output, True)
+ return formating + json.dumps(output)
+ system = self._get_system_by_name(location_name)
+ if system is not None:
+ output = await self._get_converted_system_for_output(system)
+ self._log(output, True)
+ return formating + json.dumps(output)
+ return f"Could not find location '{location_name}'."
+
+ async def _get_converted_terminal_for_output(
+ self, terminal: dict[str, any], allow_lore: bool = True
+ ) -> dict[str, any]:
+ """
+ Converts a terminal dictionary to a dictionary that can be used as output.
+
+ Args:
+ terminal (dict[str, any]): The terminal dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted terminal dictionary.
+ """
+ lore = allow_lore and self.uexcorp_add_lore
+ checksum = f"terminal--{terminal['id']}--{lore}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "terminal",
+ "subtype": terminal["type"],
+ "name": self._format_terminal_name(terminal, True),
+ "nickname": self._format_terminal_name(terminal),
+ "star_system": self._get_system_name_by_code(terminal["id_star_system"]),
+ "planet": self._get_planet_name_by_code(terminal["id_planet"]),
+ "city": self._get_city_name_by_code(terminal["id_city"]),
+ "moon": self._get_moon_name_by_code(terminal["id_moon"]),
+ }
+ if terminal["type"] == "commodity":
+ output["hull_trading"] = (
+ "Trading with large ships, that need a loading area, is possible."
+ if "hull_trading" in terminal and terminal["hull_trading"]
+ else "Trading with large ships, that need a loading area, is not possible."
+ )
+
+ if "vehicle_rental" in terminal:
+ output["vehicles_for_rental"] = []
+ for option in terminal["vehicle_rental"]:
+ ship = self.ship_code_dict[option["id_vehicle"]]
+ output["vehicles_for_rental"].append(
+ f"Rent {self._format_ship_name(ship)} for {option['price_rent']} aUEC."
+ )
+
+ if "vehicle_purchase" in terminal:
+ output["vehicles_for_purchase"] = []
+ for option in terminal["vehicle_purchase"]:
+ ship = self.ship_code_dict[option["id_vehicle"]]
+ output["vehicles_for_purchase"].append(
+ f"Buy {self._format_ship_name(ship)} for {option['price_buy']} aUEC."
+ )
+
+ if "prices" in terminal:
+ buyable_commodities = [
+ f"{data['name']} for {data['price_buy']} aUEC per SCU"
+ for commodity_code, data in terminal["prices"].items()
+ if data["operation"] == "buy"
+ ]
+ sellable_commodities = [
+ f"{data['name']} for {data['price_sell']} aUEC per SCU"
+ for commodity_code, data in terminal["prices"].items()
+ if data["operation"] == "sell"
+ ]
+
+ if len(buyable_commodities):
+ output["buyable_commodities"] = ", ".join(buyable_commodities)
+ if len(sellable_commodities):
+ output["sellable_commodities"] = ", ".join(sellable_commodities)
+
+ for key in ["system", "planet", "city", "moon"]:
+ if output.get(key) is None:
+ output.pop(key, None)
+
+ if lore:
+ output["background_information"] = await self._fetch_lore(
+ self._format_terminal_name(terminal)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
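Each `_get_converted_*_for_output` helper follows the same memoization pattern: a checksum built from the object's type and id keys a shared cache. Sketched generically (function and parameter names here are illustrative):

```python
def cached_convert(cache: dict, kind: str, obj: dict, convert) -> dict:
    """Checksum-keyed memoization as used by the _get_converted_* helpers."""
    checksum = f"{kind}--{obj['id']}"
    if checksum in cache:
        return cache[checksum]  # hit: skip the expensive conversion
    result = convert(obj)
    cache[checksum] = result
    return result
```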
+ async def _get_converted_city_for_output(
+ self, city: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a city dictionary to a dictionary that can be used as output.
+
+ Args:
+ city (dict[str, any]): The city dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted city dictionary.
+ """
+ checksum = f"city--{city['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "City",
+ "name": self._format_city_name(city),
+ "star_system": self._get_system_name_by_code(city["id_star_system"]),
+ "planet": self._get_planet_name_by_code(city["id_planet"]),
+ "moon": self._get_moon_name_by_code(city["id_moon"]),
+ "is_armistice": "Yes" if city["is_armistice"] else "No",
+ "has_freight_elevator": "Yes" if city["has_freight_elevator"] else "No",
+ "has_docking_ports": "Yes" if city["has_docking_port"] else "No",
+ "has_clinic": "Yes" if city["has_clinic"] else "No",
+ "has_food": "Yes" if city["has_food"] else "No",
+ "has_refuel_option": "Yes" if city["has_refuel"] else "No",
+ "has_repair_option": "Yes" if city["has_repair"] else "No",
+ "has_refinery": "Yes" if city["has_refinery"] else "No",
+ }
+
+ terminals = self._get_terminals_by_position_name(city["name"])
+ if terminals:
+ output["options_to_trade"] = ", ".join(
+ [self._format_terminal_name(terminal) for terminal in terminals]
+ )
+
+ for key in ["star_system", "planet", "moon"]:
+ if output.get(key) is None:
+ output.pop(key, None)
+
+ if self.uexcorp_add_lore:
+ output["background_information"] = await self._fetch_lore(
+ self._format_city_name(city)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ async def _get_converted_moon_for_output(
+ self, moon: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a moon dictionary to a dictionary that can be used as output.
+
+ Args:
+ moon (dict[str, any]): The moon dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted moon dictionary.
+ """
+ checksum = f"moon--{moon['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "Moon",
+ "name": self._format_moon_name(moon),
+ "star_system": self._get_system_name_by_code(moon["id_star_system"]),
+ "orbits_planet": self._get_planet_name_by_code(moon["id_planet"]),
+ }
+
+ terminals = self._get_terminals_by_position_name(self._format_moon_name(moon))
+ if terminals:
+ output["options_to_trade"] = ", ".join(
+ [self._format_terminal_name(terminal) for terminal in terminals]
+ )
+
+ for key in ["star_system", "orbits_planet"]:
+ if output.get(key) is None:
+ output.pop(key, None)
+
+ if self.uexcorp_add_lore:
+ output["background_information"] = await self._fetch_lore(
+ self._format_moon_name(moon)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ async def _get_converted_planet_for_output(
+ self, planet: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a planet dictionary to a dictionary that can be used as output.
+
+ Args:
+ planet (dict[str, any]): The planet dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted planet dictionary.
+ """
+ checksum = f"planet--{planet['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "Planet",
+ "name": self._format_planet_name(planet),
+ "star_system": self._get_system_name_by_code(planet["id_star_system"]),
+ }
+
+ terminals = self._get_terminals_by_position_name(planet["name"])
+ if terminals:
+ output["options_to_trade"] = ", ".join(
+ [self._format_terminal_name(terminal) for terminal in terminals]
+ )
+
+ moons = self._get_moons_by_planetcode(planet["code"])
+ if moons:
+ output["moons"] = ", ".join(
+ [self._format_moon_name(moon) for moon in moons]
+ )
+
+ cities = self._get_cities_by_planetcode(planet["code"])
+ if cities:
+ output["cities"] = ", ".join(
+ [self._format_city_name(city) for city in cities]
+ )
+
+ for key in ["star_system"]:
+ if output.get(key) is None:
+ output.pop(key, None)
+
+ if self.uexcorp_add_lore:
+ output["background_information"] = await self._fetch_lore(
+ self._format_planet_name(planet)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ async def _get_converted_system_for_output(
+ self, system: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a system dictionary to a dictionary that can be used as output.
+
+ Args:
+ system (dict[str, any]): The system dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted system dictionary.
+ """
+ checksum = f"system--{system['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "Star System",
+ "name": self._format_system_name(system),
+ }
+
+ terminals = self._get_terminals_by_position_name(system["name"])
+ if terminals:
+ output["options_to_trade"] = f"{len(terminals)} different options to trade."
+ terminal_without_planets = []
+ gateways = []
+ for terminal in terminals:
+ if not terminal["id_planet"]:
+ if terminal["name"].find("Gateway") != -1:
+ gateways.append(terminal)
+ else:
+ terminal_without_planets.append(terminal)
+ if terminal_without_planets:
+ output["space_stations"] = ", ".join(
+ [
+ self._format_terminal_name(terminal)
+ for terminal in terminal_without_planets
+ ]
+ )
+ if gateways:
+ output["gateways"] = ", ".join(
+ [self._format_terminal_name(terminal) for terminal in gateways]
+ )
+
+ planets = self._get_planets_by_systemcode(system["code"])
+ if planets:
+ output["planets"] = ", ".join(
+ [self._format_planet_name(planet) for planet in planets]
+ )
+
+ if self.uexcorp_add_lore:
+ output["background_information"] = await self._fetch_lore(
+ self._format_system_name(system)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ async def _get_converted_ship_for_output(
+ self, ship: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a ship dictionary to a dictionary that can be used as output.
+
+ Args:
+ ship (dict[str, any]): The ship dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted ship dictionary.
+ """
+ checksum = f"ship--{ship['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "Ship" if ship["is_spaceship"] else "Groud Vehicle",
+ "name": self._format_ship_name(ship),
+ "manufacturer": ship["company_name"],
+ "cargo_capacity": f"{ship['scu']} SCU",
+ # "added_on_version": "Unknown" if ship["is_concept"] else ship["game_version"],
+ "field_of_activity": self._get_ship_field_of_activity(ship),
+ }
+
+ if not ship["is_concept"]:
+ if ship["purchase"]:
+ output["purchase_at"] = []
+ for option in ship["purchase"]:
+ terminal = self.terminal_code_dict[option["id_terminal"]]
+ output["purchase_at"].append(
+ f"Buy at {self._format_terminal_name(terminal)} for {option['price_buy']} aUEC"
+ )
+ else:
+ output["purchase_at"] = "Not available for purchase."
+ if ship["rental"]:
+ output["rent_at"] = []
+ for option in ship["rental"]:
+ terminal = self.terminal_code_dict[option["id_terminal"]]
+ output["rent_at"].append(
+ f"Rent at {self._format_terminal_name(terminal)} for {option['price_rent']} aUEC per day."
+ )
+ else:
+ output["rent_at"] = "Not available as rental."
+
+ if ship["hull_trading"] is True:
+ output["trading_info"] = (
+ "This ship can only trade on suitable space stations with cargo loading option."
+ )
+
+ if self.uexcorp_add_lore:
+ output["additional_info"] = await self._fetch_lore(
+ self._format_ship_name(ship, False)
+ )
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ def _get_ship_field_of_activity(self, ship: dict[str, any]) -> str:
+ """
+ Returns the field of activity of a ship.
+
+ Args:
+ ship (dict[str, any]): The ship dictionary to get the field of activity for.
+
+ Returns:
+ str: The field of activity of the ship.
+ """
+
+ field = []
+ if ship["is_exploration"]:
+ field.append("Exploration")
+ if ship["is_mining"]:
+ field.append("Mining")
+ if ship["is_salvage"]:
+ field.append("Salvage")
+ if ship["is_refinery"]:
+ field.append("Refinery")
+ if ship["is_scanning"]:
+ field.append("Scanning")
+ if ship["is_cargo"]:
+ field.append("Cargo")
+ if ship["is_medical"]:
+ field.append("Medical")
+ if ship["is_racing"]:
+ field.append("Racing")
+ if ship["is_repair"]:
+ field.append("Repair")
+ if ship["is_refuel"]:
+ field.append("Refuel")
+ if ship["is_interdiction"]:
+ field.append("Interdiction")
+ if ship["is_tractor_beam"]:
+ field.append("Tractor Beam")
+ if ship["is_qed"]:
+ field.append("Quantum Interdiction")
+ if ship["is_emp"]:
+ field.append("EMP")
+ if ship["is_construction"]:
+ field.append("Construction")
+ if ship["is_datarunner"]:
+ field.append("Datarunner")
+ if ship["is_science"]:
+ field.append("Science")
+ if ship["is_boarding"]:
+ field.append("Boarding")
+ if ship["is_stealth"]:
+ field.append("Stealth")
+ if ship["is_research"]:
+ field.append("Research")
+ if ship["is_carrier"]:
+ field.append("Carrier")
+
+ addition = []
+ if ship["is_civilian"]:
+ addition.append("Civilian")
+ if ship["is_military"]:
+ addition.append("Military")
+
+ return f"{', '.join(field)} ({' & '.join(addition)})"
+
+ async def _get_converted_commodity_for_output(
+ self, commodity: dict[str, any]
+ ) -> dict[str, any]:
+ """
+ Converts a commodity dictionary to a dictionary that can be used as output.
+
+ Args:
+ commodity (dict[str, any]): The commodity dictionary to be converted.
+
+ Returns:
+ dict[str, any]: The converted commodity dictionary.
+ """
+ checksum = f"commodity--{commodity['id']}"
+ if checksum in self.cache["readable_objects"]:
+ return self.cache["readable_objects"][checksum]
+
+ output = {
+ "type": "Commodity",
+ "subtype": commodity["kind"],
+ "name": commodity["name"],
+ }
+
+ price_buy_best = None
+ price_sell_best = None
+ output["buy_at"] = {}
+ output["sell_at"] = {}
+
+ for terminal in self.terminals:
+ if "prices" not in terminal:
+ continue
+ if commodity["id"] in terminal["prices"]:
+ if terminal["prices"][commodity["id"]]["operation"] == "buy":
+ price_buy = terminal["prices"][commodity["id"]]["price_buy"]
+ if price_buy_best is None or price_buy < price_buy_best:
+ price_buy_best = price_buy
+ output["buy_at"][
+ self._format_terminal_name(terminal)
+ ] = f"{price_buy} aUEC"
+ else:
+ price_sell = terminal["prices"][commodity["id"]]["price_sell"]
+ if price_sell_best is None or price_sell > price_sell_best:
+ price_sell_best = price_sell
+ output["sell_at"][
+ self._format_terminal_name(terminal)
+ ] = f"{price_sell} aUEC"
+
+ output["best_buy_price"] = (
+ f"{price_buy_best} aUEC" if price_buy_best else "Not buyable."
+ )
+ output["best_sell_price"] = (
+ f"{price_sell_best} aUEC" if price_sell_best else "Not sellable."
+ )
+
+ boolean_keys = ["is_harvestable", "is_mineral", "is_illegal"]
+ for key in boolean_keys:
+ output[key] = "Yes" if commodity[key] else "No"
+
+ if commodity["is_illegal"]:
+ output["notes"] = (
+ "Stay away from ship scanns to avoid fines and crimestat, as this commodity is illegal."
+ )
+
+ if self.uexcorp_add_lore:
+ output["additional_info"] = await self._fetch_lore(commodity["name"])
+
+ self.cache["readable_objects"][checksum] = output
+ return output
+
+ async def _gpt_call_get_locations_to_sell_to(
+ self,
+ commodity_name: str = None,
+ ship_name: str = None,
+ position_name: str = None,
+ commodity_amount: int = 1,
+ maximal_number_of_locations: int = 5,
+ ) -> str:
+ await self._print(
+ f"Given Parameters: Commodity: {commodity_name}, Ship Name: {ship_name}, Current Position: {position_name}, Amount: {commodity_amount}, Maximal Number of Locations: {maximal_number_of_locations}",
+ True,
+ )
+
+ commodity_name = self._get_function_arg_from_cache(
+ "commodity_name", commodity_name
+ )
+ ship_name = self._get_function_arg_from_cache("ship_name", ship_name)
+
+ if commodity_name is None:
+ self._log("No commodity given. Ask for a commodity.", True)
+ return "No commodity given. Ask for a commodity."
+
+ misunderstood = []
+ parameters = {
+ "commodity_name": (commodity_name, self.commodity_names),
+ "ship_name": (ship_name, self.ship_names),
+ "position_name": (position_name, self.location_names_set_trading),
+ }
+ for param, (value, names_set) in parameters.items():
+ if value is not None:
+ match = await self._find_closest_match(value, names_set)
+ if match is None:
+ misunderstood.append(f"{param}: {value}")
+ else:
+ self._set_function_arg_to_cache(param, match)
+ parameters[param] = (match, names_set)
+ commodity_name = parameters["commodity_name"][0]
+ ship_name = parameters["ship_name"][0]
+ position_name = parameters["position_name"][0]
+
+ await self._print(
+ f"Interpreted Parameters: Commodity: {commodity_name}, Ship Name: {ship_name}, Position: {position_name}, Amount: {commodity_amount}, Maximal Number of Locations: {maximal_number_of_locations}",
+ True,
+ )
+
+ if misunderstood:
+ self._log(
+ "These given parameters do not exist in game. Exactly ask for clarification of these values: "
+ + ", ".join(misunderstood),
+ True,
+ )
+ return (
+ "These given parameters do not exist in game. Exactly ask for clarification of these values: "
+ + ", ".join(misunderstood)
+ )
+
+ terminals = (
+ self.terminals
+ if position_name is None
+ else self._get_terminals_by_position_name(position_name)
+ )
+ commodity = self._get_commodity_by_name(commodity_name)
+ ship = self._get_ship_by_name(ship_name)
+ amount = max(1, int(commodity_amount or 1))
+ maximal_number_of_locations = max(1, int(maximal_number_of_locations or 3))
+
+ selloptions = collections.defaultdict(list)
+ for terminal in terminals:
+ sellprice = self._get_data_location_sellprice(
+ terminal, commodity, ship, amount
+ )
+ if sellprice is not None:
+ selloptions[sellprice].append(terminal)
+
+ selloptions = dict(sorted(selloptions.items(), reverse=True))
+ selloptions = dict(
+ itertools.islice(selloptions.items(), maximal_number_of_locations)
+ )
+
+ messages = [
+ f"Here are the best {len(selloptions)} locations to sell {amount} SCU {commodity_name}:"
+ ]
+
+ for sellprice, terminals in selloptions.items():
+ messages.append(f"{sellprice} aUEC:")
+ for terminal in terminals:
+ messages.append(await self._get_terminal_route_description(terminal))
+ messages.append("\n")
+
+ self._log("\n".join(messages), True)
+ return "\n".join(messages)
+
+ async def _gpt_call_get_locations_to_buy_from(
+ self,
+ commodity_name: str = None,
+ ship_name: str = None,
+ position_name: str = None,
+ commodity_amount: int = 1,
+ maximal_number_of_locations: int = 5,
+ ) -> str:
+ await self._print(
+ f"Given Parameters: Commodity: {commodity_name}, Ship Name: {ship_name}, Current Position: {position_name}, Amount: {commodity_amount}, Maximal Number of Locations: {maximal_number_of_locations}",
+ True,
+ )
+
+ commodity_name = self._get_function_arg_from_cache(
+ "commodity_name", commodity_name
+ )
+ ship_name = self._get_function_arg_from_cache("ship_name", ship_name)
+
+ if commodity_name is None:
+ self._log("No commodity given. Ask for a commodity.", True)
+ return "No commodity given. Ask for a commodity."
+
+ misunderstood = []
+ parameters = {
+ "ship_name": (ship_name, self.ship_names),
+ "location_name": (position_name, self.location_names_set_trading),
+ "commodity_name": (commodity_name, self.commodity_names),
+ }
+ for param, (value, names_set) in parameters.items():
+ if value is not None:
+ match = await self._find_closest_match(value, names_set)
+ if match is None:
+ misunderstood.append(f"{param}: {value}")
+ else:
+ self._set_function_arg_to_cache(param, match)
+ parameters[param] = (match, names_set)
+ ship_name = parameters["ship_name"][0]
+ position_name = parameters["location_name"][0]
+ commodity_name = parameters["commodity_name"][0]
+
+ await self._print(
+ f"Interpreted Parameters: Commodity: {commodity_name}, Ship Name: {ship_name}, Position: {position_name}, Amount: {commodity_amount}, Maximal Number of Locations: {maximal_number_of_locations}",
+ True,
+ )
+
+ if misunderstood:
+ self._log(
+ "These given parameters do not exist in game. Exactly ask for clarification of these values: "
+ + ", ".join(misunderstood),
+ True,
+ )
+ return (
+ "These given parameters do not exist in game. Exactly ask for clarification of these values: "
+ + ", ".join(misunderstood)
+ )
+
+ terminals = (
+ self.terminals
+ if position_name is None
+ else self._get_terminals_by_position_name(position_name)
+ )
+ commodity = self._get_commodity_by_name(commodity_name)
+ ship = self._get_ship_by_name(ship_name)
+ amount = max(1, int(commodity_amount or 1))
+ maximal_number_of_locations = max(1, int(maximal_number_of_locations or 3))
+
+ buyoptions = collections.defaultdict(list)
+ for terminal in terminals:
+ buyprice = self._get_data_location_buyprice(
+ terminal, commodity, ship, amount
+ )
+ if buyprice is not None:
+ buyoptions[buyprice].append(terminal)
+
+ buyoptions = dict(sorted(buyoptions.items(), reverse=False))
+ buyoptions = dict(
+ itertools.islice(buyoptions.items(), maximal_number_of_locations)
+ )
+
+ messages = [
+ f"Here are the best {len(buyoptions)} locations to buy {amount} SCU {commodity_name}:"
+ ]
+ for buyprice, terminals in buyoptions.items():
+ messages.append(f"{buyprice} aUEC:")
+ messages.append(await self._get_terminal_route_description(terminal))
+ messages.append("\n")
+
+ self._log("\n".join(messages), True)
+ return "\n".join(messages)
+
+ def _get_data_location_sellprice(self, terminal, commodity, ship=None, amount=1):
+ if (
+ ship is not None
+ and ship["hull_trading"] is True
+ and terminal["hull_trading"] is False
+ ):
+ return None
+
+ if "prices" not in terminal:
+ return None
+
+ commodity_code = commodity["id"]
+ for code, price in terminal["prices"].items():
+ if code == commodity_code and price["operation"] == "sell":
+ return price["price_sell"] * amount
+ return None
+
+ def _get_data_location_buyprice(self, terminal, commodity, ship=None, amount=1):
+ if (
+ ship is not None
+ and ship["hull_trading"] is True
+ and terminal["hull_trading"] is False
+ ):
+ return None
+
+ if "prices" not in terminal:
+ return None
+
+ commodity_code = commodity["id"]
+ for code, price in terminal["prices"].items():
+ if code == commodity_code and price["operation"] == "buy":
+ return price["price_buy"] * amount
+ return None
+
+ async def _gpt_call_get_trading_routes(
+ self,
+ ship_name: str = None,
+ money_to_spend: float = None,
+ position_start_name: str = None,
+ free_cargo_space: float = None,
+ position_end_name: str = None,
+ commodity_name: str = None,
+ illegal_commodities_allowed: bool = None,
+ maximal_number_of_routes: int = None,
+ ) -> str:
+ """
+ Finds multiple best trading routes based on the given parameters.
+
+ Args:
+ ship_name (str, optional): The name of the ship. Defaults to None.
+ money_to_spend (float, optional): The amount of money to spend. Defaults to None.
+ position_start_name (str, optional): The name of the starting position. Defaults to None.
+ free_cargo_space (float, optional): The amount of free cargo space. Defaults to None.
+ position_end_name (str, optional): The name of the ending position. Defaults to None.
+ commodity_name (str, optional): The name of the commodity. Defaults to None.
+ illegal_commodities_allowed (bool, optional): Flag indicating whether illegal commodities are allowed. Defaults to True.
+ maximal_number_of_routes (int, optional): The maximum number of routes to return. Defaults to the configured default route count.
+
+ Returns:
+ str: A string representation of the trading routes found.
+ """
+
+ # For later use in distance calculation:
+ # https://starmap.tk/api/v2/oc/
+ # https://starmap.tk/api/v2/pois/
+
+ await self._print(
+ f"Parameters: Ship: {ship_name}, Position Start: {position_start_name}, Position End: {position_end_name}, Commodity Name: {commodity_name}, Money: {money_to_spend} aUEC, free_cargo_space: {free_cargo_space} SCU, Maximal Number of Routes: {maximal_number_of_routes}, Illegal Allowed: {illegal_commodities_allowed}",
+ True,
+ )
+
+ ship_name = self._get_function_arg_from_cache("ship_name", ship_name)
+ illegal_commodities_allowed = self._get_function_arg_from_cache(
+ "illegal_commodities_allowed", illegal_commodities_allowed
+ )
+ if illegal_commodities_allowed is None:
+ illegal_commodities_allowed = True
+
+ missing_args = []
+ if ship_name is None:
+ missing_args.append("ship_name")
+
+ if self.uexcorp_tradestart_mandatory and position_start_name is None:
+ missing_args.append("position_start_name")
+
+ money_to_spend = (
+ None
+ if money_to_spend is not None and int(money_to_spend) < 1
+ else money_to_spend
+ )
+ free_cargo_space = (
+ None
+ if free_cargo_space is not None and int(free_cargo_space) < 1
+ else free_cargo_space
+ )
+
+ misunderstood = []
+ parameters = {
+ "ship_name": (ship_name, self.ship_names),
+ "position_start_name": (
+ position_start_name,
+ self.location_names_set_trading,
+ ),
+ "position_end_name": (position_end_name, self.location_names_set_trading),
+ "commodity_name": (commodity_name, self.commodity_names),
+ }
+ for param, (value, names_set) in parameters.items():
+ if value is not None:
+ match = await self._find_closest_match(value, names_set)
+ if match is None:
+ misunderstood.append(f"{param}: {value}")
+ else:
+ self._set_function_arg_to_cache(param, match)
+ parameters[param] = (match, names_set)
+ ship_name = parameters["ship_name"][0]
+ position_start_name = parameters["position_start_name"][0]
+ position_end_name = parameters["position_end_name"][0]
+ commodity_name = parameters["commodity_name"][0]
+
+ if money_to_spend is not None:
+ self._set_function_arg_to_cache("money", money_to_spend)
+
+ await self._print(
+ f"Interpreted Parameters: Ship: {ship_name}, Position Start: {position_start_name}, Position End: {position_end_name}, Commodity Name: {commodity_name}, Money: {money_to_spend} aUEC, free_cargo_space: {free_cargo_space} SCU, Maximal Number of Routes: {maximal_number_of_routes}, Illegal Allowed: {illegal_commodities_allowed}",
+ True,
+ )
+
+ if misunderstood or missing_args:
+ misunderstood_str = ", ".join(misunderstood)
+ missing_str = ", ".join(missing_args)
+ answer = ""
+ if missing_str:
+ answer += f"Missing parameters: {missing_str}. "
+ if misunderstood_str:
+ answer += (
+ f"These given parameters were misunderstood: {misunderstood_str}"
+ )
+ return answer
+
+ # set variables
+ ship = self._get_ship_by_name(ship_name)
+ if money_to_spend is not None:
+ money = int(money_to_spend)
+ else:
+ money = None
+ if free_cargo_space is not None:
+ free_cargo_space = int(free_cargo_space)
+ else:
+ free_cargo_space = None
+ commodity = (
+ self._get_commodity_by_name(commodity_name) if commodity_name else None
+ )
+ maximal_number_of_routes = int(
+ maximal_number_of_routes or self.uexcorp_default_trade_route_count
+ )
+ start_terminals = (
+ self._get_terminals_by_position_name(position_start_name)
+ if position_start_name
+ else self.terminals
+ )
+ end_terminals = (
+ self._get_terminals_by_position_name(position_end_name)
+ if position_end_name
+ else self.terminals
+ )
+
+ commodities = []
+ if commodity is None:
+ commodities = self.commodities
+ else:
+ commodities.append(commodity)
+
+ trading_routes = []
+ errors = []
+ for commodity in commodities:
+ commodity_routes = []
+ if not illegal_commodities_allowed and commodity["is_illegal"]:
+ continue
+ for start_terminal in start_terminals:
+ if (
+ "prices" not in start_terminal
+ or commodity["id"] not in start_terminal["prices"]
+ or start_terminal["prices"][commodity["id"]]["operation"] != "buy"
+ ):
+ continue
+ for end_terminal in end_terminals:
+ if (
+ "prices" not in end_terminal
+ or commodity["id"] not in end_terminal["prices"]
+ or end_terminal["prices"][commodity["id"]]["operation"]
+ != "sell"
+ ):
+ continue
+
+ if (
+ ship
+ and ship["hull_trading"] is True
+ and (
+ "hull_trading" not in start_terminal
+ or start_terminal["hull_trading"] is not True
+ or "hull_trading" not in end_terminal
+ or end_terminal["hull_trading"] is not True
+ )
+ ):
+ continue
+
+ trading_route_new = self._get_trading_route(
+ ship,
+ start_terminal,
+ end_terminal,
+ money,
+ free_cargo_space,
+ commodity,
+ illegal_commodities_allowed,
+ )
+
+ if isinstance(trading_route_new, str):
+ if trading_route_new not in errors:
+ errors.append(trading_route_new)
+ else:
+ commodity_routes.append(trading_route_new)
+
+ if len(commodity_routes) > 0:
+ if self.uexcorp_summarize_routes_by_commodity:
+ best_commodity_routes = heapq.nlargest(
+ 1, commodity_routes, key=lambda k: int(k["profit"])
+ )
+ trading_routes.extend(best_commodity_routes)
+ else:
+ trading_routes.extend(commodity_routes)
+
+ if len(trading_routes) > 0:
+ additional_answer = ""
+ if len(trading_routes) < maximal_number_of_routes:
+ additional_answer += (
+ f" There are only {len(trading_routes)} routes available."
+ )
+ else:
+ additional_answer += f" There are {len(trading_routes)} routes available and these are the best {maximal_number_of_routes} ones."
+
+ # sort trading routes by profit and limit to maximal_number_of_routes
+ trading_routes = heapq.nlargest(
+ maximal_number_of_routes, trading_routes, key=lambda k: int(k["profit"])
+ )
+
+ for trading_route in trading_routes:
+ destinationselection = []
+ for terminal in trading_route["end"]:
+ destinationselection.append(
+ await self._get_terminal_route_description(terminal)
+ )
+ trading_route["end"] = " OR ".join(destinationselection)
+ startselection = []
+ for terminal in trading_route["start"]:
+ startselection.append(
+ await self._get_terminal_route_description(terminal)
+ )
+ trading_route["start"] = " OR ".join(startselection)
+
+ # format the trading routes
+ for trading_route in trading_routes:
+ trading_route["commodity"] = self._format_commodity_name(
+ trading_route["commodity"]
+ )
+ trading_route["profit"] = f"{trading_route['profit']} aUEC"
+ trading_route["buy"] = f"{trading_route['buy']} aUEC"
+ trading_route["sell"] = f"{trading_route['sell']} aUEC"
+ trading_route["cargo"] = f"{trading_route['cargo']} SCU"
+
+ message = (
+ "Possible commodities with their profit. Just give basic overview at first.\n"
+ + additional_answer
+ + " JSON: \n "
+ + json.dumps(trading_routes)
+ )
+ return message
+ else:
+ return_string = "No trading routes found."
+ if len(errors) > 0:
+ return_string += "\nPossible errors are:\n- " + "\n- ".join(errors)
+ return return_string
+
+ def _get_trading_route(
+ self,
+ ship: dict[str, any],
+ position_start: dict[str, any],
+ position_end: dict[str, any],
+ money: int = None,
+ free_cargo_space: int = None,
+ commodity: dict[str, any] = None,
+ illegal_commodities_allowed: bool = True,
+ ) -> str:
+ """
+ Finds the best trading route based on the given parameters.
+
+ Args:
+ ship (dict[str, any]): The ship dictionary.
+ position_start (dict[str, any]): The starting position dictionary.
+ money (int, optional): The amount of money to spend. Defaults to None.
+ free_cargo_space (int, optional): The amount of free cargo space. Defaults to None.
+ position_end (dict[str, any], optional): The ending position dictionary. Defaults to None.
+ commodity (dict[str, any], optional): The commodity dictionary. Defaults to None.
+ illegal_commodities_allowed (bool, optional): Flag indicating whether illegal commodities are allowed. Defaults to True.
+
+ Returns:
+ str: A string representation of the trading route found. JSON if the route is found, otherwise an error message.
+ """
+
+ # set variables
+ cargo_space = ship["scu"]
+ if free_cargo_space:
+ cargo_space = free_cargo_space
+ if free_cargo_space > ship["scu"]:
+ cargo_space = ship["scu"]
+
+ if cargo_space < 1:
+ return "Your ship has no cargo space to trade."
+
+ commodity_filter = commodity
+ start_terminals = [position_start]
+ if ship["hull_trading"] is True:
+ start_terminals = [
+ terminal
+ for terminal in start_terminals
+ if "hull_trading" in terminal and terminal["hull_trading"] is True
+ ]
+ if len(start_terminals) < 1:
+ if ship["hull_trading"] is True:
+ return "No valid start position given. Make sure to provide a start point compatible with your ship."
+ return "No valid start position given. Try a different position or just name a planet or star system."
+
+ end_terminals = [position_end]
+ if ship["hull_trading"] is True:
+ end_terminals = [
+ terminal
+ for terminal in end_terminals
+ if "hull_trading" in terminal and terminal["hull_trading"] is True
+ ]
+ if len(end_terminals) < 1:
+ return "No valid end position given."
+
+ if (
+ len(end_terminals) == 1
+ and len(start_terminals) == 1
+ and end_terminals[0]["id"] == start_terminals[0]["id"]
+ ):
+ return "Start and end position are the same."
+
+ if money is not None and money <= 0:
+ return "You dont have enough money to trade."
+
+ best_route = {
+ "start": [],
+ "end": [],
+ "commodity": {},
+ "profit": 0,
+ "cargo": 0,
+ "buy": 0,
+ "sell": 0,
+ "additional_info": "",
+ }
+
+ # apply trade port blacklist
+ if self.uexcorp_trade_blacklist:
+ for blacklist_item in self.uexcorp_trade_blacklist:
+ if "tradeport" in blacklist_item and blacklist_item["tradeport"]:
+ for terminal in start_terminals:
+ if (
+ self._format_terminal_name(terminal)
+ == blacklist_item["tradeport"]
+ ):
+ if (
+ "commodity" not in blacklist_item
+ or not blacklist_item["commodity"]
+ ):
+ # remove terminal, if no commodity given
+ start_terminals.remove(terminal)
+ break
+ else:
+ commodity = self._get_commodity_by_name(
+ blacklist_item["commodity"]
+ )
+ for commodity_code, data in terminal["prices"].items():
+ if commodity["id"] == commodity_code:
+ # remove commodity code from terminal
+ terminal["prices"].pop(commodity_code)
+ break
+ for terminal in end_terminals:
+ if (
+ self._format_terminal_name(terminal)
+ == blacklist_item["tradeport"]
+ ):
+ if (
+ "commodity" not in blacklist_item
+ or not blacklist_item["commodity"]
+ ):
+ # remove terminal, if no commodity given
+ end_terminals.remove(terminal)
+ break
+ else:
+ commodity = self._get_commodity_by_name(
+ blacklist_item["commodity"]
+ )
+ for commodity_code, data in terminal["prices"].items():
+ if commodity["id"] == commodity_code:
+ # remove commodity code from terminal
+ terminal["prices"].pop(commodity_code)
+ break
+
+ if len(start_terminals) < 1 or len(end_terminals) < 1:
+ return "Exluded by blacklist."
+
+ for terminal_start in start_terminals:
+ commodities = []
+ if "prices" not in terminal_start:
+ continue
+
+ for commodity_code, price in terminal_start["prices"].items():
+ if price["operation"] == "buy" and (
+ commodity_filter is None or commodity_filter["id"] == commodity_code
+ ):
+ commodity = self._get_commodity_by_code(commodity_code)
+ if (
+ illegal_commodities_allowed is True
+ or not commodity["is_illegal"]
+ ):
+ temp_price = price
+ temp_price["commodity_code"] = commodity_code
+
+ in_blacklist = False
+ # apply commodity blacklist
+ if self.uexcorp_trade_blacklist:
+ for blacklist_item in self.uexcorp_trade_blacklist:
+ if (
+ "commodity" in blacklist_item
+ and blacklist_item["commodity"]
+ and (
+ "tradeport" not in blacklist_item
+ or not blacklist_item["tradeport"]
+ )
+ ):
+ if commodity["name"] == blacklist_item["commodity"]:
+ # remove commodity code from terminal
+ in_blacklist = True
+ break
+
+ if not in_blacklist:
+ commodities.append(price)
+
+ if len(commodities) < 1:
+ continue
+
+ for terminal_end in end_terminals:
+ if "prices" not in terminal_end:
+ continue
+
+ for commodity_code, price in terminal_end["prices"].items():
+ sell_commodity = self._get_commodity_by_code(commodity_code)
+
+ in_blacklist = False
+ # apply commodity blacklist
+ if sell_commodity and self.uexcorp_trade_blacklist:
+ for blacklist_item in self.uexcorp_trade_blacklist:
+ if (
+ "commodity" in blacklist_item
+ and blacklist_item["commodity"]
+ and (
+ "tradeport" not in blacklist_item
+ or not blacklist_item["tradeport"]
+ )
+ ):
+ if (
+ sell_commodity["name"]
+ == blacklist_item["commodity"]
+ ):
+ # remove commodity code from terminal
+ in_blacklist = True
+ break
+
+ if in_blacklist:
+ continue
+
+ temp_price = price
+ temp_price["commodity_code"] = commodity_code
+
+ for commodity in commodities:
+ if (
+ commodity["commodity_code"] == temp_price["commodity_code"]
+ and price["operation"] == "sell"
+ and price["price_sell"] > commodity["price_buy"]
+ ):
+ if money is None:
+ cargo_by_money = cargo_space
+ else:
+ cargo_by_money = math.floor(
+ money / commodity["price_buy"]
+ )
+ cargo_by_space = cargo_space
+ if self.uexcorp_use_estimated_availability:
+ cargo_by_availability_sell = (
+ temp_price["scu_expected"] or 0
+ )
+ cargo_by_availability_buy = (
+ commodity["scu_expected"] or 0
+ )
+ else:
+ cargo_by_availability_sell = cargo_by_space
+ cargo_by_availability_buy = cargo_by_space
+
+ cargo = min(
+ cargo_by_money,
+ cargo_by_space,
+ cargo_by_availability_sell,
+ cargo_by_availability_buy,
+ )
+ if cargo >= 1:
+ info = ""
+ if min(cargo_by_money, cargo_by_space) > min(
+ cargo_by_availability_buy,
+ cargo_by_availability_sell,
+ ):
+ if (
+ cargo_by_availability_buy
+ < cargo_by_availability_sell
+ ):
+ info = f"Please mention to user: SCU count limited to {min(cargo_by_availability_buy, cargo_by_availability_sell)} (instead of {min(cargo_by_money, cargo_by_space)}) by estimated availability at buy location."
+ elif (
+ cargo_by_availability_buy
+ > cargo_by_availability_sell
+ ):
+ info = f"Please mention to user: SCU count limited to {min(cargo_by_availability_buy, cargo_by_availability_sell)} (instead of {min(cargo_by_money, cargo_by_space)}) by estimated availability at sell location."
+ else:
+ info = f"Please mention to user: SCU count limited to {min(cargo_by_availability_buy, cargo_by_availability_sell)} (instead of {min(cargo_by_money, cargo_by_space)}) by estimated availability at sell and buy location."
+ profit = round(
+ cargo
+ * (price["price_sell"] - commodity["price_buy"])
+ )
+ if profit > best_route["profit"]:
+ best_route["start"] = [terminal_start]
+ best_route["end"] = [terminal_end]
+ best_route["commodity"] = temp_price
+ best_route["profit"] = profit
+ best_route["cargo"] = cargo
+ best_route["buy"] = commodity["price_buy"] * cargo
+ best_route["sell"] = price["price_sell"] * cargo
+ best_route["additional_info"] = info
+ else:
+ if (
+ profit == best_route["profit"]
+ and best_route["commodity"]["commodity_code"]
+ == temp_price["commodity_code"]
+ ):
+ if terminal_start not in best_route["start"]:
+ best_route["start"].append(terminal_start)
+ if terminal_end not in best_route["end"]:
+ best_route["end"].append(terminal_end)
+
+ if len(best_route["start"]) == 0:
+ return f"No route found for your {ship['name']}. Try a different route."
+
+ best_route["profit"] = f"{best_route['profit']}"
+ best_route["cargo"] = f"{best_route['cargo']}"
+ best_route["buy"] = f"{best_route['buy']}"
+ best_route["sell"] = f"{best_route['sell']}"
+
+ return best_route
+
+ def _get_ship_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the ship with the specified name and returns the ship or None.
+
+ Args:
+ name (str): The name of the ship to search for.
+
+ Returns:
+ Optional[object]: The ship object if found, or None if not found.
+ """
+ return self.ship_dict.get(name.lower()) if name else None
+
+ def _get_terminal_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the terminal with the specified name and returns the terminal or None.
+
+ Args:
+ name (str): The name of the terminal to search for.
+
+ Returns:
+ Optional[object]: The terminal object if found, otherwise None.
+ """
+ return self.terminal_dict.get(name.lower()) if name else None
+
+ def _get_terminal_by_code(self, code: str) -> dict[str, any] | None:
+ """Finds the terminal with the specified code and returns the terminal or None.
+
+ Args:
+ code (str): The code of the terminal to search for.
+
+ Returns:
+ Optional[object]: The terminal object if found, otherwise None.
+ """
+ return self.terminal_code_dict.get(code) if code else None
+
+ def _get_planet_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the planet with the specified name and returns the planet or None.
+
+ Args:
+ name (str): The name of the planet to search for.
+
+ Returns:
+ Optional[object]: The planet object if found, otherwise None.
+ """
+ return self.planet_dict.get(name.lower()) if name else None
+
+ def _get_city_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the city with the specified name and returns the city or None.
+
+ Args:
+ name (str): The name of the city to search for.
+
+ Returns:
+ Optional[object]: The city object if found, or None if not found.
+ """
+ return self.city_dict.get(name.lower()) if name else None
+
+ def _get_moon_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the moon with the specified name and returns the moon or None.
+
+ Args:
+ name (str): The name of the moon to search for.
+
+ Returns:
+ Optional[object]: The moon object if found, otherwise None.
+ """
+ return self.moon_dict.get(name.lower()) if name else None
+
+ def _get_system_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the system with the specified name and returns the system or None.
+
+ Args:
+ name (str): The name of the system to search for.
+
+ Returns:
+ Optional[object]: The system object if found, otherwise None.
+ """
+ return self.system_dict.get(name.lower()) if name else None
+
+ def _get_commodity_by_name(self, name: str) -> dict[str, any] | None:
+ """Finds the commodity with the specified name and returns the commodity or None.
+
+ Args:
+ name (str): The name of the commodity to search for.
+
+ Returns:
+ Optional[object]: The commodity object if found, otherwise None.
+ """
+ return self.commodity_dict.get(name.lower()) if name else None
+
+ async def _get_terminal_route_description(self, terminal: dict[str, any]) -> str:
+        """Returns the breadcrumbs of a terminal.
+
+ Args:
+ terminal (dict[str, any]): The terminal information.
+
+ Returns:
+ str: The description of the terminal route.
+ """
+ terminal = await self._get_converted_terminal_for_output(terminal, False)
+ keys = [
+ ("star_system", "Star-System"),
+ ("planet", "Planet"),
+            ("moon", "Moon"),
+ ("city", "City"),
+ ("name", "Trade Point"),
+ ]
+ route = [f"{name}: {terminal[key]}" for key, name in keys if key in terminal]
+ return f"({' >> '.join(route)})"
+
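As a standalone illustration, the breadcrumb assembly above reduces to filtering a fixed key/label list against the fields present on the terminal. A minimal sketch (the `terminal` dict passed in is a hypothetical example; keys and labels are taken from the method):

```python
def route_description(terminal: dict) -> str:
    # Build a ">>"-separated breadcrumb from whichever location
    # fields are present on the terminal dict
    keys = [
        ("star_system", "Star-System"),
        ("planet", "Planet"),
        ("moon", "Moon"),
        ("city", "City"),
        ("name", "Trade Point"),
    ]
    parts = [f"{label}: {terminal[key]}" for key, label in keys if key in terminal]
    return f"({' >> '.join(parts)})"

print(route_description({"star_system": "Stanton", "name": "TDD"}))
# (Star-System: Stanton >> Trade Point: TDD)
```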
+ def _get_system_name_by_code(self, code: str) -> str:
+ """Returns the name of the system with the specified code.
+
+ Args:
+ code (str): The code of the system.
+
+ Returns:
+ str: The name of the system with the specified code.
+ """
+ return (
+ self._format_system_name(self.system_code_dict.get(code)) if code else None
+ )
+
+ def _get_planet_name_by_code(self, code: str) -> str:
+ """Returns the name of the planet with the specified code.
+
+ Args:
+ code (str): The code of the planet.
+
+ Returns:
+ str: The name of the planet with the specified code.
+ """
+ return (
+ self._format_planet_name(self.planet_code_dict.get(code)) if code else None
+ )
+
+ def _get_moon_name_by_code(self, code: str) -> str:
+ """Returns the name of the moon with the specified code.
+
+ Args:
+ code (str): The code of the moon.
+
+ Returns:
+ str: The name of the moon with the specified code.
+ """
+ return self._format_moon_name(self.moon_code_dict.get(code)) if code else None
+
+ def _get_city_name_by_code(self, code: str) -> str:
+ """Returns the name of the city with the specified code.
+
+ Args:
+ code (str): The code of the city.
+
+ Returns:
+ str: The name of the city with the specified code.
+ """
+ return self._format_city_name(self.city_code_dict.get(code)) if code else None
+
+ def _get_commodity_name_by_code(self, code: str) -> str:
+ """Returns the name of the commodity with the specified code.
+
+ Args:
+ code (str): The code of the commodity.
+
+ Returns:
+ str: The name of the commodity with the specified code.
+ """
+ return (
+ self._format_commodity_name(self.commodity_code_dict.get(code))
+ if code
+ else None
+ )
+
+ def _get_commodity_by_code(self, code: str) -> dict[str, any] | None:
+ """Finds the commodity with the specified code and returns the commodity or None.
+
+ Args:
+ code (str): The code of the commodity to search for.
+
+ Returns:
+ Optional[object]: The commodity object if found, otherwise None.
+ """
+ return self.commodity_code_dict.get(code) if code else None
+
+ def _get_terminals_by_position_name(self, name: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified position name.
+
+ Args:
+ name (str): The position name to search for.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the position name.
+ """
+ if not name:
+ return []
+
+ terminals = []
+
+ terminal_temp = self._get_terminal_by_name(name)
+ if terminal_temp:
+ terminals.append(terminal_temp)
+
+ terminals.extend(self._get_terminals_by_systemname(name))
+ terminals.extend(self._get_terminals_by_planetname(name))
+ terminals.extend(self._get_terminals_by_moonname(name))
+ terminals.extend(self._get_terminals_by_cityname(name))
+ return terminals
+
+ def _get_moons_by_planetcode(self, code: str) -> list[dict[str, any]]:
+        """Returns all moons with the specified planet code.
+
+        Args:
+            code (str): The code of the planet.
+
+        Returns:
+            list[dict[str, any]]: A list of moons matching the planet code.
+        """
+ return self.moons_by_planet.get(code, []) if code else []
+
+ def _get_cities_by_planetcode(self, code: str) -> list[dict[str, any]]:
+ """Returns all cities with the specified planet code.
+
+ Args:
+ code (str): The code of the planet.
+
+ Returns:
+ list[dict[str, any]]: A list of cities matching the planet code.
+ """
+ return self.cities_by_planet.get(code, []) if code else []
+
+ def _get_planets_by_systemcode(self, code: str) -> list[dict[str, any]]:
+ """Returns all planets with the specified system code.
+
+ Args:
+ code (str): The code of the system.
+
+ Returns:
+ list[dict[str, any]]: A list of planets matching the system code.
+ """
+ return self.planets_by_system.get(code, []) if code else []
+
+ def _get_terminals_by_systemcode(self, code: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified system code.
+
+ Args:
+ code (str): The code of the system.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the system code.
+ """
+ return self.terminals_by_system.get(code, []) if code else []
+
+ def _get_terminals_by_planetcode(self, code: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified planet code.
+
+ Args:
+ code (str): The code of the planet.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the planet code.
+ """
+ return self.terminals_by_planet.get(code, []) if code else []
+
+ def _get_terminals_by_mooncode(self, code: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified moon code.
+
+ Args:
+ code (str): The code of the moon.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the moon code.
+ """
+ return self.terminals_by_moon.get(code, []) if code else []
+
+ def _get_terminals_by_citycode(self, code: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified city code.
+
+ Args:
+ code (str): The code of the city.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the city code.
+ """
+ return self.terminals_by_city.get(code, []) if code else []
+
+ def _get_terminals_by_planetname(self, name: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified planet name.
+
+ Args:
+ name (str): The name of the planet.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the planet name.
+ """
+ planet = self._get_planet_by_name(name)
+ return self._get_terminals_by_planetcode(planet["id"]) if planet else []
+
+ def _get_terminals_by_moonname(self, name: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified moon name.
+
+ Args:
+ name (str): The name of the moon.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the moon name.
+ """
+ moon = self._get_moon_by_name(name)
+ return self._get_terminals_by_mooncode(moon["id"]) if moon else []
+
+ def _get_terminals_by_cityname(self, name: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified city name.
+
+ Args:
+ name (str): The name of the city.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the city name.
+ """
+ city = self._get_city_by_name(name)
+ return self._get_terminals_by_citycode(city["id"]) if city else []
+
+ def _get_terminals_by_systemname(self, name: str) -> list[dict[str, any]]:
+ """Returns all terminals with the specified system name.
+
+ Args:
+ name (str): The name of the system.
+
+ Returns:
+ list[dict[str, any]]: A list of terminals matching the system name.
+ """
+ system = self._get_system_by_name(name)
+ return self._get_terminals_by_systemcode(system["id"]) if system else []
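The cargo sizing earlier in this hunk takes the minimum over several independent limits (budget, hold size, and estimated stock at both terminals). A minimal standalone sketch; the budget-derived limit `by_money` is an assumption, since its computation falls outside this hunk:

```python
def max_cargo(budget: int, price_buy: int, hold_scu: int,
              avail_buy: int, avail_sell: int) -> int:
    # Loadable SCU is capped by budget, hold size and the estimated
    # availability at the buy and sell terminals
    by_money = budget // price_buy if price_buy > 0 else 0
    return min(by_money, hold_scu, avail_buy, avail_sell)

print(max_cargo(10000, 100, 96, 500, 120))  # limited by the 96 SCU hold
```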
diff --git a/templates/migration/1_5_0/skills/vision_ai/default_config.yaml b/templates/migration/1_5_0/skills/vision_ai/default_config.yaml
new file mode 100644
index 00000000..564eb0a8
--- /dev/null
+++ b/templates/migration/1_5_0/skills/vision_ai/default_config.yaml
@@ -0,0 +1,28 @@
+name: VisionAI
+module: skills.vision_ai.main
+category: general
+description:
+ en: Let your Wingman analyse whatever is on your screen.
+ de: Lass deinen Wingman alles analysieren, was auf deinem Bildschirm zu sehen ist.
+examples:
+ - question:
+ en: What is on my screen?
+ de: Was siehst du auf meinem Bildschirm?
+ answer:
+ en: I see Spotify running and playing music.
+ de: Ich sehe die Spotify-App, die Musik abspielt.
+prompt: |
+ You can also see what the user is seeing and you can analyse it and answer all questions about what you see.
+  Use the tool 'analyse_what_you_or_user_sees' if you are asked to analyse what you see or what the user sees.
+ You can also see the screen of the user. Call 'analyse_what_you_or_user_sees' for this, too.
+custom_properties:
+ - id: display
+ name: Display to capture
+ property_type: number
+ required: true
+ value: 1
+ - id: show_screenshots
+ name: Show screenshots
+ property_type: boolean
+ required: true
+ value: true
diff --git a/templates/migration/1_5_0/skills/vision_ai/logo.png b/templates/migration/1_5_0/skills/vision_ai/logo.png
new file mode 100644
index 00000000..440cd3be
Binary files /dev/null and b/templates/migration/1_5_0/skills/vision_ai/logo.png differ
diff --git a/templates/migration/1_5_0/skills/vision_ai/main.py b/templates/migration/1_5_0/skills/vision_ai/main.py
new file mode 100644
index 00000000..8f51673d
--- /dev/null
+++ b/templates/migration/1_5_0/skills/vision_ai/main.py
@@ -0,0 +1,173 @@
+import base64
+import io
+from typing import TYPE_CHECKING
+from mss import mss
+from PIL import Image
+from api.enums import LogSource, LogType
+from api.interface import SettingsConfig, SkillConfig, WingmanInitializationError
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class VisionAI(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.display = 1
+ self.show_screenshots = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.display = self.retrieve_custom_property_value("display", errors)
+ self.show_screenshots = self.retrieve_custom_property_value(
+ "show_screenshots", errors
+ )
+
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "analyse_what_you_or_user_sees",
+ {
+ "type": "function",
+ "function": {
+ "name": "analyse_what_you_or_user_sees",
+ "description": "Analyse what you or the user sees and answer questions about it.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "question": {
+ "type": "string",
+ "description": "The question to answer about the image.",
+ }
+ },
+ "required": ["question"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name == "analyse_what_you_or_user_sees":
+ # Take a screenshot
+ with mss() as sct:
+ main_display = sct.monitors[self.display]
+ screenshot = sct.grab(main_display)
+
+ # Create a PIL image from array
+ image = Image.frombytes(
+ "RGB", screenshot.size, screenshot.bgra, "raw", "BGRX"
+ )
+
+ desired_width = 1000
+ aspect_ratio = image.height / image.width
+ new_height = int(desired_width * aspect_ratio)
+
+ resized_image = image.resize((desired_width, new_height))
+
+ png_base64 = self.pil_image_to_base64(resized_image)
+
+ if self.show_screenshots:
+ await self.printr.print_async(
+ "Analyzing this image",
+ color=LogType.INFO,
+ source=LogSource.WINGMAN,
+ source_name=self.wingman.name,
+ skill_name=self.name,
+ additional_data={"image": png_base64},
+ )
+
+ question = parameters.get("question", "What's in this image?")
+
+ messages = [
+ {
+ "role": "system",
+ "content": """
+                    You are a helpful AI assistant.
+ """,
+ },
+ {
+ "role": "user",
+ "content": [
+ {"type": "text", "text": question},
+ {
+ "type": "image_url",
+ "image_url": {
+                            "url": f"data:image/png;base64,{png_base64}",
+ "detail": "high",
+ },
+ },
+ ],
+ },
+ ]
+ completion = await self.llm_call(messages)
+ answer = (
+ completion.choices[0].message.content
+ if completion and completion.choices
+ else ""
+ )
+
+ if answer:
+ if self.settings.debug_mode:
+ await self.printr.print_async(f"Vision analysis: {answer}.", color=LogType.INFO)
+ function_response = answer
+
+ return function_response, instant_response
+
+ async def is_summarize_needed(self, tool_name: str) -> bool:
+ """Returns whether a tool needs to be summarized."""
+ return True
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+        """Returns whether a tool likely takes a long time, so an interim message should be printed."""
+ return True
+
+ def pil_image_to_base64(self, pil_image):
+ """
+ Convert a PIL image to a base64 encoded string.
+
+ :param pil_image: PIL Image object
+ :return: Base64 encoded string of the image
+ """
+ # Create a bytes buffer to hold the image data
+ buffer = io.BytesIO()
+ # Save the PIL image to the bytes buffer in PNG format
+ pil_image.save(buffer, format="PNG")
+        # Encode the byte data from the buffer to Base64
+ base64_encoded_data = base64.b64encode(buffer.getvalue())
+ # Convert the base64 bytes to a string
+ base64_string = base64_encoded_data.decode("utf-8")
+
+ return base64_string
+
+ def convert_png_to_base64(self, png_data):
+ """
+ Convert raw PNG data to a base64 encoded string.
+
+ :param png_data: A bytes object containing the raw PNG data
+ :return: A base64 encoded string.
+ """
+ # Encode the PNG data to base64
+ base64_encoded_data = base64.b64encode(png_data)
+ # Convert the base64 bytes to a string
+ base64_string = base64_encoded_data.decode("utf-8")
+ return base64_string
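The resize and encoding steps in `execute_tool` and `pil_image_to_base64` reduce to simple aspect-ratio arithmetic plus stdlib base64; a minimal sketch without `mss`/`PIL` (the 1000 px target width is taken from the code above):

```python
import base64

def scaled_height(width: int, height: int, desired_width: int = 1000) -> int:
    # Preserve the aspect ratio when scaling to the desired width,
    # mirroring the resize step in VisionAI.execute_tool
    return int(desired_width * (height / width))

def to_base64(data: bytes) -> str:
    # Encode raw image bytes to a base64 string for a data URL,
    # as pil_image_to_base64 does with the PNG buffer
    return base64.b64encode(data).decode("utf-8")

print(scaled_height(1920, 1080))  # 562
```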
diff --git a/templates/migration/1_5_0/skills/voice_changer/default_config.yaml b/templates/migration/1_5_0/skills/voice_changer/default_config.yaml
new file mode 100644
index 00000000..eb9ca7f9
--- /dev/null
+++ b/templates/migration/1_5_0/skills/voice_changer/default_config.yaml
@@ -0,0 +1,34 @@
+name: VoiceChanger
+module: skills.voice_changer.main
+category: general
+description:
+ en: Changes the voice of your Wingman automatically. Customize it to your liking.
+ de: Wechselt die Stimme deines Wingman automatisch. Konfigurierbar nach eigenen Vorlieben.
+custom_properties:
+ - id: voice_changer_interval
+ name: Switching Interval
+ hint: The interval in seconds in which the voice should be changed. (Calculated from last interaction)
+ value: 180
+ required: true
+ property_type: number
+ - id: voice_changer_clearhistory
+ hint: Enable this to clear the message history (memory) when the voice is changed.
+ name: Clear history on voice switch
+ value: true
+ required: true
+ property_type: boolean
+ - id: voice_changer_voices
+ name: Available voices
+ hint: The voices your Wingman can use.
+ value: []
+ required: false
+ property_type: voice_selection
+ options:
+ - label: "multiple"
+ value: true
+ - id: voice_changer_personalityprompt
+ name: Personality Prompt
+ hint: A prompt used on voice change to generate a new personality. Leave empty to disable.
+ required: false
+ value: ""
+ property_type: textarea
diff --git a/templates/migration/1_5_0/skills/voice_changer/logo.png b/templates/migration/1_5_0/skills/voice_changer/logo.png
new file mode 100644
index 00000000..8bcceddf
Binary files /dev/null and b/templates/migration/1_5_0/skills/voice_changer/logo.png differ
diff --git a/templates/migration/1_5_0/skills/voice_changer/main.py b/templates/migration/1_5_0/skills/voice_changer/main.py
new file mode 100644
index 00000000..47c422c7
--- /dev/null
+++ b/templates/migration/1_5_0/skills/voice_changer/main.py
@@ -0,0 +1,270 @@
+import time
+from random import randrange
+from typing import TYPE_CHECKING
+from api.interface import (
+ SettingsConfig,
+ SkillConfig,
+ VoiceSelection,
+ WingmanInitializationError,
+)
+from api.enums import (
+ LogType,
+ TtsProvider,
+ WingmanProTtsProvider,
+)
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class VoiceChanger(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ self.voice_switching = True
+ self.voices = []
+ self.voice_timespan = 0
+ self.voice_last_message = None
+ self.voice_current_index = None
+ self.clear_history = False
+
+ self.context_generation = True
+ self.context_prompt = None
+ self.context_personality = ""
+ self.context_personality_next = ""
+
+ self.active = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ self.voice_timespan = self.retrieve_custom_property_value(
+ "voice_changer_interval", errors
+ )
+ if not self.voice_timespan or self.voice_timespan < 0:
+ self.voice_switching = False
+
+ self.clear_history = self.retrieve_custom_property_value(
+ "voice_changer_clearhistory", errors
+ )
+
+ self.context_prompt = self.retrieve_custom_property_value(
+ "voice_changer_personalityprompt", errors
+ )
+ if not self.context_prompt:
+ self.context_generation = False
+
+ # prepare voices
+ voices: list[VoiceSelection] = self.retrieve_custom_property_value(
+ "voice_changer_voices", errors
+ )
+ if not voices or len(voices) == 0:
+ self.voice_switching = False
+ else:
+ # we have to initiate all providers here
+ initiated_providers = []
+ initiate_provider_error = False
+
+ for voice in voices:
+ voice_provider = voice.provider
+ if voice_provider not in initiated_providers:
+ initiated_providers.append(voice_provider)
+
+ # initiate provider
+ if voice_provider == TtsProvider.OPENAI and not self.wingman.openai:
+ await self.wingman.validate_and_set_openai(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.AZURE
+ and not self.wingman.openai_azure
+ ):
+ await self.wingman.validate_and_set_azure(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.ELEVENLABS
+ and not self.wingman.elevenlabs
+ ):
+ await self.wingman.validate_and_set_elevenlabs(errors)
+ if len(errors) > 0:
+ initiate_provider_error = True
+ elif (
+ voice_provider == TtsProvider.WINGMAN_PRO
+ and not self.wingman.wingman_pro
+ ):
+ await self.wingman.validate_and_set_wingman_pro()
+
+ if not initiate_provider_error:
+ self.voices = voices
+ else:
+ self.voice_switching = False
+
+ return errors
+
+ async def prepare(self) -> None:
+ self.active = True
+
+ # prepare first personality
+ if self.context_generation:
+ self.threaded_execution(self._generate_new_context)
+
+ async def unload(self) -> None:
+ self.active = False
+
+ async def on_add_user_message(self, message: str):
+ if not self.active:
+ return
+
+ if self.voice_last_message is None:
+ await self._initiate_change()
+ self.voice_last_message = time.time()
+ return
+
+ last_message_diff = time.time() - self.voice_last_message
+ last_message_diff = round(last_message_diff, 0)
+ self.voice_last_message = time.time()
+
+ if last_message_diff >= self.voice_timespan:
+ await self._initiate_change()
+
+ async def _initiate_change(self):
+ messages = []
+ if self.voice_switching:
+ messages.append(self._switch_voice())
+ if self.context_generation:
+ messages.append(self._switch_personality())
+ if self.clear_history:
+ self.wingman.reset_conversation_history()
+
+        # await the pending coroutines, then sort out empty messages
+        results = [await message for message in messages]
+        messages = [message for message in results if message]
+
+ if messages:
+ await self.printr.print_async(
+ text="\n".join(messages),
+ color=LogType.INFO,
+ source_name=self.wingman.name,
+ )
+
+ async def _switch_voice(self) -> str:
+ """Switch voice to the given voice setting."""
+
+ # choose voice
+ while True:
+            index = randrange(len(self.voices))
+ if (
+ self.voice_current_index is None
+ or len(self.voices) == 1
+ or index != self.voice_current_index
+ ):
+ self.voice_current_index = index
+ voice_setting = self.voices[index]
+ break
+
+ if not voice_setting:
+ await self.printr.print_async(
+ "Voice switching failed due to missing voice settings.",
+ LogType.ERROR,
+ )
+ return "Voice switching failed due to missing voice settings."
+
+ voice_provider = voice_setting.provider
+ voice = voice_setting.voice
+ voice_name = None
+ error = False
+
+ if voice_provider == TtsProvider.WINGMAN_PRO:
+ if voice_setting.subprovider == WingmanProTtsProvider.OPENAI:
+ voice_name = voice.value
+ provider_name = "Wingman Pro / OpenAI"
+ self.wingman.config.openai.tts_voice = voice
+ elif voice_setting.subprovider == WingmanProTtsProvider.AZURE:
+ voice_name = voice
+ provider_name = "Wingman Pro / Azure TTS"
+ self.wingman.config.azure.tts.voice = voice
+ elif voice_provider == TtsProvider.OPENAI:
+ voice_name = voice.value
+ provider_name = "OpenAI"
+ self.wingman.config.openai.tts_voice = voice
+ elif voice_provider == TtsProvider.ELEVENLABS:
+ voice_name = voice.name or voice.id
+ provider_name = "Elevenlabs"
+ self.wingman.config.elevenlabs.voice = voice
+ self.wingman.config.elevenlabs.output_streaming = False
+ elif voice_provider == TtsProvider.AZURE:
+ voice_name = voice
+ provider_name = "Azure TTS"
+ self.wingman.config.azure.tts.voice = voice
+ elif voice_provider == TtsProvider.XVASYNTH:
+ voice_name = voice.voice_name
+ provider_name = "XVASynth"
+ self.wingman.config.xvasynth.voice = voice
+ elif voice_provider == TtsProvider.EDGE_TTS:
+ voice_name = voice
+ provider_name = "Edge TTS"
+ self.wingman.config.edge_tts.voice = voice
+ else:
+ error = True
+
+ if error or not voice_name or not voice_provider:
+ await self.printr.print_async(
+                f"Voice switching failed due to an unknown voice provider/subprovider. Setting: {voice_setting}",
+ LogType.ERROR,
+ )
+ return f"Voice switching failed due to an unknown voice provider/subprovider. Setting: {voice_setting}"
+
+ self.wingman.config.features.tts_provider = voice_provider
+
+ return f"Switched {self.wingman.name}'s voice to {voice_name} ({provider_name})"
+
+ async def _switch_personality(self) -> str:
+ # if no next context is available, generate a new one
+ if not self.context_personality_next:
+ await self._generate_new_context()
+
+ self.context_personality = self.context_personality_next
+ self.context_personality_next = ""
+
+ self.threaded_execution(self._generate_new_context)
+
+ return "Switched personality context."
+
+ async def _generate_new_context(self):
+ messages = [
+ {
+ "role": "system",
+ "content": """
+ Generate new context based on the input in the \"You\"-perspective.
+ Like \"You are a grumpy...\" or \"You are an enthusiastic...\" and so on.
+ Only output the personality description without additional context or commentary.
+ """,
+ },
+ {
+ "role": "user",
+ "content": self.context_prompt,
+ },
+ ]
+ completion = await self.llm_call(messages)
+ generated_context = (
+ completion.choices[0].message.content
+ if completion and completion.choices
+ else ""
+ )
+
+ self.context_personality_next = generated_context
+
+ async def get_prompt(self) -> str | None:
+ prompts = []
+ if self.config.prompt:
+ prompts.append(self.config.prompt)
+ if self.context_generation:
+ prompts.append(self.context_personality)
+ return " ".join(prompts) if prompts else None
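The interval logic in `on_add_user_message` (switch on the first message, then whenever the gap since the last message reaches the configured timespan) can be isolated into a small, testable sketch; the class name is ours, not part of the skill:

```python
class SwitchTimer:
    """Sketch of VoiceChanger's interval check: a switch triggers on the
    first message and whenever the gap since the last message reaches
    the configured timespan (in seconds)."""

    def __init__(self, timespan: float):
        self.timespan = timespan
        self.last_message = None

    def should_switch(self, now: float) -> bool:
        if self.last_message is None:
            # First interaction always triggers a switch
            self.last_message = now
            return True
        diff = round(now - self.last_message, 0)
        self.last_message = now
        return diff >= self.timespan
```

With a 180-second interval, messages arriving 10 seconds apart keep the voice stable; a long pause triggers a change.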
diff --git a/templates/migration/1_5_0/skills/web_search/default_config.yaml b/templates/migration/1_5_0/skills/web_search/default_config.yaml
new file mode 100644
index 00000000..f3f04bd2
--- /dev/null
+++ b/templates/migration/1_5_0/skills/web_search/default_config.yaml
@@ -0,0 +1,27 @@
+name: WebSearch
+module: skills.web_search.main
+category: general
+description:
+ en: Searches the web using the DuckDuckGo search engine.
+ de: Nutze die DuckDuckGo Suchmaschine um Daten aus dem Internet zu laden.
+hint:
+ en: Start your queries with "Search the web for..." or "Search the internet for..."
+ de: Starte deine Anfragen mit 'Suche im Web nach...' oder 'Suche im Internet nach...'.
+examples:
+ - question:
+ en: Search the internet for recent news about Star Citizen and give me the highlights.
+ de: Suche im Internet nach aktuellen Neuigkeiten zu Star Citizen und fasse sie zusammen.
+ answer:
+ en: I found the following news about Star Citizen...
+ de: Ich habe folgende Neuigkeiten zu Star Citizen gefunden...
+prompt: |
+ You can also search the internet for topics identified by the user by using your `web_search_function` tool.
+
+ Examples indicating that the user wants to search the internet are:
+
+ - "Search the web for..." or "Search the internet for..."
+ - "Use DuckDuckGo to search for..."
+ - "Find more information about..."
+ - "What is the latest news about..." (or any other mention of current or recent information)
+  - "How is the weather forecast for..." (or any other mention of future information)
+ - The user is asking a question that can be answered by searching the internet and is not part of your general knowledge.
diff --git a/templates/migration/1_5_0/skills/web_search/logo.png b/templates/migration/1_5_0/skills/web_search/logo.png
new file mode 100644
index 00000000..3411ad07
Binary files /dev/null and b/templates/migration/1_5_0/skills/web_search/logo.png differ
diff --git a/templates/migration/1_5_0/skills/web_search/main.py b/templates/migration/1_5_0/skills/web_search/main.py
new file mode 100644
index 00000000..da86a49f
--- /dev/null
+++ b/templates/migration/1_5_0/skills/web_search/main.py
@@ -0,0 +1,206 @@
+import time
+import math
+from urllib.parse import urlparse
+from copy import deepcopy
+from typing import TYPE_CHECKING
+from duckduckgo_search import DDGS
+from trafilatura import fetch_url, extract
+from trafilatura.settings import DEFAULT_CONFIG
+from api.interface import SettingsConfig, SkillConfig
+from api.enums import LogType
+from skills.skill_base import Skill
+
+if TYPE_CHECKING:
+ from wingmen.open_ai_wingman import OpenAiWingman
+
+
+class WebSearch(Skill):
+
+ def __init__(
+ self,
+ config: SkillConfig,
+ settings: SettingsConfig,
+ wingman: "OpenAiWingman",
+ ) -> None:
+ super().__init__(config=config, settings=settings, wingman=wingman)
+
+ # Set default and custom behavior
+ self.max_time = 5
+ self.max_results = 5
+ self.min_results = 2
+ self.max_result_size = 4000
+
+ # Set necessary trafilatura settings to match
+
+ # Copy default config file that comes with trafilatura
+ self.trafilatura_config = deepcopy(DEFAULT_CONFIG)
+ # Change download and max redirects default in config
+ self.trafilatura_config["DEFAULT"][
+ "DOWNLOAD_TIMEOUT"
+ ] = f"{math.ceil(self.max_time/2)}"
+        self.trafilatura_config["DEFAULT"]["MAX_REDIRECTS"] = "3"
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "web_search_function",
+ {
+ "type": "function",
+ "function": {
+ "name": "web_search_function",
+ "description": "Searches the internet / web for the topic identified by the user or identified by the AI to answer a user question.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "search_query": {
+ "type": "string",
+ "description": "The topic to search the internet for.",
+ },
+ "search_type": {
+ "type": "string",
+ "description": "The type of search to perform. Use 'news', if the user is looking for current events, weather, or recent news. Use 'general' for general detailed information about a topic. Use 'single_site' if the user has specified one particular web page that they want you to review, and then use the 'single_site_url' parameter to identify the web page. If it is not clear what type of search the user wants, ask.",
+ "enum": [
+ "news",
+ "general",
+ "single_site",
+ ],
+ },
+ "single_site_url": {
+ "type": "string",
+ "description": "If the user wants to search a single website, the specific site url that they want to search, formatted as a proper url.",
+ },
+ },
+ "required": ["search_query", "search_type"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def is_waiting_response_needed(self, tool_name: str) -> bool:
+ return True
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = "No search results found or search failed."
+ instant_response = ""
+
+ if tool_name == "web_search_function":
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+ await self.printr.print_async(
+ f"Executing web_search_function with parameters: {parameters}",
+ color=LogType.INFO,
+ )
+ final_results = ""
+ search_query = parameters.get("search_query")
+ search_type = parameters.get("search_type")
+ site_url = parameters.get("single_site_url")
+
+            # site_url is not a required parameter, so the AI may omit it even for a single-site search and put the web address in the query field instead; check for that case.
+            if not site_url and search_type == "single_site":
+                # urlparse() rarely raises on plain text, so validate the parsed scheme and host instead of relying on an exception
+                parsed = urlparse(search_query)
+                if parsed.scheme and parsed.netloc:
+                    site_url = search_query
+                else:
+                    await self.printr.print_async(
+                        "Tried single site search but no valid url to search.",
+                        color=LogType.INFO,
+                    )
+
+ processed_results = []
+
+ async def gather_information(result):
+ title = result.get("title")
+ link = result.get("url")
+ if search_type == "general":
+ link = result.get("href")
+ body = result.get("body")
+
+ # If doing a deep dive on a single site, get as much content as possible
+ if search_type == "single_site":
+ self.max_result_size = 20000
+ else:
+ self.max_result_size = 4000
+ # If a link is in the search results or identified by the user, use trafilatura to download the page and extract its content as text
+ if link:
+ trafilatura_url = link
+ trafilatura_downloaded = fetch_url(
+ trafilatura_url, config=self.trafilatura_config
+ )
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"web_search skill analyzing website at: {link} for full content using trafilatura",
+ color=LogType.INFO,
+ )
+ trafilatura_result = extract(
+ trafilatura_downloaded,
+ include_comments=False,
+ include_tables=False,
+ )
+ if trafilatura_result:
+ processed_results.append(
+ title
+ + "\n"
+ + link
+ + "\n"
+ + trafilatura_result[: self.max_result_size]
+ )
+
+ else:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"web_search skill could not extract results from website at: {link} for full content using trafilatura",
+ color=LogType.INFO,
+ )
+ processed_results.append(title + "\n" + link + "\n" + body)
+
+ if search_type == "general":
+ self.min_results = 2
+ self.max_time = 5
+ search_results = DDGS().text(
+ search_query, safesearch="off", max_results=self.max_results
+ )
+ elif search_type == "news":
+ self.min_results = 2
+ self.max_time = 5
+ search_results = DDGS().news(
+ search_query, safesearch="off", max_results=self.max_results
+ )
+ else:
+ search_results = [
+ {"url": site_url, "title": "Site Requested", "body": "None found"}
+ ]
+ self.min_results = 1
+ self.max_time = 30
+
+ self.trafilatura_config["DEFAULT"][
+ "DOWNLOAD_TIMEOUT"
+ ] = f"{math.ceil(self.max_time/2)}"
+
+ start_time = time.time()
+
+ for result in search_results:
+ self.threaded_execution(gather_information, result)
+
+ while (
+ len(processed_results) < self.min_results
+ and time.time() - start_time < self.max_time
+ ):
+ time.sleep(0.1)
+
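The gather-and-poll pattern above (one thread per search result, then a wait loop bounded by `min_results` and `max_time`) can be reproduced in isolation; the names below are illustrative, not from the skill:

```python
import threading
import time

results: list[str] = []

def fetch(n: int) -> None:
    time.sleep(0.02 * n)  # simulate pages that respond at different speeds
    results.append(f"result-{n}")

for n in range(4):
    threading.Thread(target=fetch, args=(n,), daemon=True).start()

min_results, max_time = 2, 2.0
start = time.time()
# Poll: stop as soon as enough results arrived, or give up at the deadline.
while len(results) < min_results and time.time() - start < max_time:
    time.sleep(0.01)

print(len(results) >= min_results)  # True
```

Because the workers only append to a list, the pattern gets away without explicit locking; a real implementation gathering larger state would want a lock or a queue.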
+ final_results = "\n\n".join(processed_results)
+ if final_results:
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Final web_search skill results used as context for AI response: \n\n {final_results}",
+ color=LogType.INFO,
+ )
+ function_response = final_results
+
+ if self.settings.debug_mode:
+ await self.print_execution_time()
+
+ return function_response, instant_response
diff --git a/templates/migration/1_5_0/skills/web_search/requirements.txt b/templates/migration/1_5_0/skills/web_search/requirements.txt
new file mode 100644
index 00000000..522f0721
Binary files /dev/null and b/templates/migration/1_5_0/skills/web_search/requirements.txt differ
diff --git a/templates/skills/ask_perplexity/default_config.yaml b/templates/skills/ask_perplexity/default_config.yaml
new file mode 100644
index 00000000..ff992dfd
--- /dev/null
+++ b/templates/skills/ask_perplexity/default_config.yaml
@@ -0,0 +1,31 @@
+name: AskPerplexity
+module: skills.ask_perplexity.main
+category: general
+description:
+ en: Uses the Perplexity API to get up-to-date information on a wide range of topics. Perplexity is a paid service; you will need a funded account with an active API key, see https://www.perplexity.ai/settings/api
+ de: Verwendet die Perplexity-API, um aktuelle Informationen zu einer Vielzahl von Themen zu erhalten. Perplexity ist ein kostenpflichtiger Dienst; ein Konto mit Guthaben und aktivem API-Key ist notwendig, siehe https://www.perplexity.ai/settings/api
+examples:
+ - question:
+ en: How is the weather today in Berlin?
+ de: Wie ist das Wetter heute in Berlin?
+ answer:
+ en: Today, the weather in Berlin is cloudy with a high of 20°C and a ... (more details)
+ de: Heute ist das Wetter in Berlin bewölkt, mit einer Höchsttemperatur von 20°C und ... (mehr Details)
+ - question:
+ en: In Star Citizen mining, what is currently the best way to find quantanium?
+ de: Beim Mining in Star Citizen, wie finde ich aktuell am besten Quantanium?
+ answer:
+ en: To find Quantanium for mining in Star Citizen, your best bet is Lyria, as it offers ... (more details)
+ de: Um Quantanium im Star Citizen Universum zu finden, ist Lyria der beste Ort, da dort ... (mehr Details)
+prompt: |
+ There is a new function: 'ask_perplexity'
+ Perplexity is a powerful tool that can provide you with up-to-date information on a wide range of topics.
+ Use it every time the user asks a question that implies the need for up-to-date information.
+ Always use this to get up-to-date information if no other available skill matches the request better.
+custom_properties:
+ - id: instant_response
+ name: Instant Response
+ hint: If set, the Perplexity answer is used instantly and unprocessed. This is faster, but formatting and/or language guidelines set in your wingman will not be applied.
+ value: False
+ required: false
+ property_type: boolean
diff --git a/templates/skills/ask_perplexity/logo.png b/templates/skills/ask_perplexity/logo.png
new file mode 100644
index 00000000..3479ef77
Binary files /dev/null and b/templates/skills/ask_perplexity/logo.png differ
diff --git a/templates/skills/ask_perplexity/main.py b/templates/skills/ask_perplexity/main.py
new file mode 100644
index 00000000..869d3c97
--- /dev/null
+++ b/templates/skills/ask_perplexity/main.py
@@ -0,0 +1,84 @@
+from api.interface import (
+ WingmanInitializationError,
+)
+from skills.skill_base import Skill
+
+class AskPerplexity(Skill):
+
+ def __init__(
+ self,
+ *args,
+ **kwargs,
+ ) -> None:
+ super().__init__(*args, **kwargs)
+
+ self.instant_response = False
+
+ async def validate(self) -> list[WingmanInitializationError]:
+ errors = await super().validate()
+
+ if not self.wingman.perplexity:
+ await self.wingman.validate_and_set_perplexity(errors)
+
+ self.instant_response = self.retrieve_custom_property_value("instant_response", errors)
+
+ return errors
+
+ def get_tools(self) -> list[tuple[str, dict]]:
+ tools = [
+ (
+ "ask_perplexity",
+ {
+ "type": "function",
+ "function": {
+ "name": "ask_perplexity",
+ "description": "Expects a question that is answered with up-to-date information from the internet.",
+ "parameters": {
+ "type": "object",
+ "properties": {
+ "question": {"type": "string"},
+ },
+ "required": ["question"],
+ },
+ },
+ },
+ ),
+ ]
+ return tools
+
+ async def execute_tool(
+ self, tool_name: str, parameters: dict[str, any]
+ ) -> tuple[str, str]:
+ function_response = ""
+ instant_response = ""
+
+ if tool_name in ["ask_perplexity"]:
+ if self.settings.debug_mode:
+ self.start_execution_benchmark()
+
+ if tool_name == "ask_perplexity" and "question" in parameters:
+ function_response = self.ask_perplexity(parameters["question"])
+ if self.instant_response:
+ instant_response = function_response
+
+ if self.settings.debug_mode:
+ await self.printr.print_async(
+ f"Perplexity answer: {function_response}"
+ )
+ await self.print_execution_time()
+
+ return function_response, instant_response
+
+ def ask_perplexity(self, question: str) -> str:
+ """Uses the Perplexity API to answer a question."""
+
+ completion = self.wingman.perplexity.ask(
+ messages=[{"role": "user", "content": question}],
+ model=self.wingman.config.perplexity.conversation_model.value,
+ )
+
+ if completion and completion.choices:
+ return completion.choices[0].message.content
+ else:
+ return "Error: Unable to retrieve a response from Perplexity API."
diff --git a/templates/skills/file_manager/default_config.yaml b/templates/skills/file_manager/default_config.yaml
index 16dc408b..deb80aaf 100644
--- a/templates/skills/file_manager/default_config.yaml
+++ b/templates/skills/file_manager/default_config.yaml
@@ -2,11 +2,11 @@ name: FileManager
module: skills.file_manager.main
category: general
description:
- en: Manage local files, save, load and create directories. Supports various text-based file formats.
- de: Verwalte lokale Dateien, speichere, lade oder erstelle Verzeichnisse. Unterstützt verschiedene text-basierte Formate.
+ en: Manage local files, save, load and create directories. Supports various text-based file formats and reading PDFs.
+ de: Verwalte lokale Dateien, speichere, lade und erstelle Verzeichnisse. Unterstützt verschiedene text-basierte Formate und das Lesen von PDFs.
hint:
- en:
- You should provide an exact file path and name for where you want to create a directory or save or load a text file. For example "save that text to a file called samplefile in my C drive in the directory called Documents."
- If you do not, a directory called "files" in your Wingman config dir will be created and used.
- Supported file formats are plain text file formats, such as txt, md, log, yaml, py, json, etc.
- de:
- Gib einen möglichst genauen Speicherort für deine Verzeichnisse oder Dateien an, beispielsweise "Speichere diesen Text in eine Daten namens beispieldatei auf meinem C-Laufwerk im Verzeichnis Dokumente".
- Wenn du das nicht machst, wird ein Verzeichnis namens "files" in deinem Wingman-Konfigurationsverzeichnis erstellt und verwendet.
- Unterstützte Dateiformate sind einfache Textdateiformate wie txt, md, log, yaml, py, json usw
+ en:
+ You should provide an exact file path and name for where you want to create a directory or save or load a text file. For example "save that text to a file called samplefile in my C drive in the directory called Documents."
+ If you do not, a directory called "files" in your Wingman config dir will be created and used.
+ Supported file formats are plain text file formats, such as txt, md, log, yaml, py, json, etc., and PDFs.
+ de:
+ Gib einen möglichst genauen Speicherort für deine Verzeichnisse oder Dateien an, beispielsweise "Speichere diesen Text in eine Datei namens beispieldatei auf meinem C-Laufwerk im Verzeichnis Dokumente".
+ Wenn du das nicht machst, wird ein Verzeichnis namens "files" in deinem Wingman-Konfigurationsverzeichnis erstellt und verwendet.
+ Unterstützte Dateiformate sind einfache Textdateiformate wie txt, md, log, yaml, py, json usw., und PDFs.
examples:
- question:
en: Save 'Hello, World!' to hello.txt.
@@ -26,9 +26,15 @@ examples:
answer:
en: (creates a directory named 'Projects' in the default directory)
de: (erstellt ein Verzeichnis namens 'Projekte' im Standardverzeichnis)
+ - question:
+ en: Read page 5 of example.pdf.
+ de: Lies Seite 5 von example.pdf.
+ answer:
+ en: (loads page 5 of example.pdf and reads it into memory)
+ de: (lädt Seite 5 von example.pdf und liest sie in den Speicher)
prompt: |
You can also save text to various file formats, load text from files, or create directories as specified by the user.
- You support all plain text file formats.
+ You support reading and writing all plain text file formats and reading PDF files.
When adding text to an existing file, you follow these rules:
(1) determine if it is appropriate to add a new line before the added text or ask the user if you do not know.
(2) only add content to an existing file if you are sure that is what the user wants.
@@ -46,4 +52,4 @@ custom_properties:
name: Allow overwrite existing files
property_type: boolean
required: true
- value: false
+ value: false
\ No newline at end of file
diff --git a/templates/skills/file_manager/dependencies/_cffi_backend.cp311-win_amd64.pyd b/templates/skills/file_manager/dependencies/_cffi_backend.cp311-win_amd64.pyd
new file mode 100644
index 00000000..9bb0309f
Binary files /dev/null and b/templates/skills/file_manager/dependencies/_cffi_backend.cp311-win_amd64.pyd differ
diff --git a/templates/skills/file_manager/dependencies/bin/dumppdf.py b/templates/skills/file_manager/dependencies/bin/dumppdf.py
new file mode 100644
index 00000000..516f0347
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/bin/dumppdf.py
@@ -0,0 +1,482 @@
+"""Extract pdf structure in XML format"""
+import logging
+import os.path
+import re
+import sys
+from typing import Any, Container, Dict, Iterable, List, Optional, TextIO, Union, cast
+from argparse import ArgumentParser
+
+import pdfminer
+from pdfminer.pdfdocument import PDFDocument, PDFNoOutlines, PDFXRefFallback
+from pdfminer.pdfpage import PDFPage
+from pdfminer.pdfparser import PDFParser
+from pdfminer.pdftypes import (
+ PDFStream,
+ PDFObjRef,
+ resolve1,
+ stream_value,
+)
+from pdfminer.pdfexceptions import (
+ PDFTypeError,
+ PDFValueError,
+ PDFObjectNotFound,
+ PDFIOError,
+)
+from pdfminer.psparser import PSKeyword, PSLiteral, LIT
+from pdfminer.utils import isnumber
+
+logging.basicConfig()
+logger = logging.getLogger(__name__)
+
+ESC_PAT = re.compile(r'[\000-\037&<>()"\042\047\134\177-\377]')
+
+
+def escape(s: Union[str, bytes]) -> str:
+ if isinstance(s, bytes):
+ us = str(s, "latin-1")
+ else:
+ us = s
+ return ESC_PAT.sub(lambda m: "&#%d;" % ord(m.group(0)), us)
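`escape` replaces every control, XML-reserved, or non-ASCII byte with a numeric character reference so it can safely appear inside the XML dump. Run standalone (same pattern as upstream pdfminer.six):

```python
import re

# Same character class the script compiles: control chars,
# XML-reserved characters, and non-ASCII bytes.
ESC_PAT = re.compile(r'[\000-\037&<>()"\042\047\134\177-\377]')

def escape(s) -> str:
    us = str(s, "latin-1") if isinstance(s, bytes) else s
    return ESC_PAT.sub(lambda m: "&#%d;" % ord(m.group(0)), us)

print(escape("<key>"))  # &#60;key&#62;
print(escape(b"a&b"))   # a&#38;b
```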
+
+
+ def dumpxml(out: TextIO, obj: object, codec: Optional[str] = None) -> None:
+ if obj is None:
+ out.write("<null />")
+ return
+
+ if isinstance(obj, dict):
+ out.write('<dict size="%d">\n' % len(obj))
+ for (k, v) in obj.items():
+ out.write("<key>%s</key>\n" % k)
+ out.write("<value>")
+ dumpxml(out, v)
+ out.write("</value>\n")
+ out.write("</dict>")
+ return
+
+ if isinstance(obj, list):
+ out.write('<list size="%d">\n' % len(obj))
+ for v in obj:
+ dumpxml(out, v)
+ out.write("\n")
+ out.write("</list>")
+ return
+
+ if isinstance(obj, (str, bytes)):
+ out.write('<string size="%d">%s</string>' % (len(obj), escape(obj)))
+ return
+
+ if isinstance(obj, PDFStream):
+ if codec == "raw":
+ # Bug: writing bytes to text I/O. This will raise TypeError.
+ out.write(obj.get_rawdata()) # type: ignore [arg-type]
+ elif codec == "binary":
+ # Bug: writing bytes to text I/O. This will raise TypeError.
+ out.write(obj.get_data()) # type: ignore [arg-type]
+ else:
+ out.write("<stream>\n<props>\n")
+ dumpxml(out, obj.attrs)
+ out.write("\n</props>\n")
+ if codec == "text":
+ data = obj.get_data()
+ out.write('<data size="%d">%s</data>\n' % (len(data), escape(data)))
+ out.write("</stream>")
+ return
+
+ if isinstance(obj, PDFObjRef):
+ out.write('<ref id="%d" />' % obj.objid)
+ return
+
+ if isinstance(obj, PSKeyword):
+ # Likely bug: obj.name is bytes, not str
+ out.write("<keyword>%s</keyword>" % obj.name) # type: ignore [str-bytes-safe]
+ return
+
+ if isinstance(obj, PSLiteral):
+ # Likely bug: obj.name may be bytes, not str
+ out.write("<literal>%s</literal>" % obj.name) # type: ignore [str-bytes-safe]
+ return
+
+ if isnumber(obj):
+ out.write("<number>%s</number>" % obj)
+ return
+
+ raise PDFTypeError(obj)
+
+
+def dumptrailers(
+ out: TextIO, doc: PDFDocument, show_fallback_xref: bool = False
+) -> None:
+ for xref in doc.xrefs:
+ if not isinstance(xref, PDFXRefFallback) or show_fallback_xref:
+ out.write("\n")
+ dumpxml(out, xref.get_trailer())
+ out.write("\n\n\n")
+ no_xrefs = all(isinstance(xref, PDFXRefFallback) for xref in doc.xrefs)
+ if no_xrefs and not show_fallback_xref:
+ msg = (
+ "This PDF does not have an xref. Use --show-fallback-xref if "
+ "you want to display the content of a fallback xref that "
+ "contains all objects."
+ )
+ logger.warning(msg)
+ return
+
+
+def dumpallobjs(
+ out: TextIO,
+ doc: PDFDocument,
+ codec: Optional[str] = None,
+ show_fallback_xref: bool = False,
+) -> None:
+ visited = set()
+ out.write("")
+ for xref in doc.xrefs:
+ for objid in xref.get_objids():
+ if objid in visited:
+ continue
+ visited.add(objid)
+ try:
+ obj = doc.getobj(objid)
+ if obj is None:
+ continue
+ out.write('<object id="%d">\n' % objid)
+ dumpxml(out, obj, codec=codec)
+ out.write("\n</object>\n\n")
+ except PDFObjectNotFound as e:
+ print("not found: %r" % e)
+ dumptrailers(out, doc, show_fallback_xref)
+ out.write("")
+ return
+
+
+def dumpoutline(
+ outfp: TextIO,
+ fname: str,
+ objids: Any,
+ pagenos: Container[int],
+ password: str = "",
+ dumpall: bool = False,
+ codec: Optional[str] = None,
+ extractdir: Optional[str] = None,
+) -> None:
+ fp = open(fname, "rb")
+ parser = PDFParser(fp)
+ doc = PDFDocument(parser, password)
+ pages = {
+ page.pageid: pageno
+ for (pageno, page) in enumerate(PDFPage.create_pages(doc), 1)
+ }
+
+ def resolve_dest(dest: object) -> Any:
+ if isinstance(dest, (str, bytes)):
+ dest = resolve1(doc.get_dest(dest))
+ elif isinstance(dest, PSLiteral):
+ dest = resolve1(doc.get_dest(dest.name))
+ if isinstance(dest, dict):
+ dest = dest["D"]
+ if isinstance(dest, PDFObjRef):
+ dest = dest.resolve()
+ return dest
+
+ try:
+ outlines = doc.get_outlines()
+ outfp.write("\n")
+ for (level, title, dest, a, se) in outlines:
+ pageno = None
+ if dest:
+ dest = resolve_dest(dest)
+ pageno = pages[dest[0].objid]
+ elif a:
+ action = a
+ if isinstance(action, dict):
+ subtype = action.get("S")
+ if subtype and repr(subtype) == "/'GoTo'" and action.get("D"):
+ dest = resolve_dest(action["D"])
+ pageno = pages[dest[0].objid]
+ s = escape(title)
+ outfp.write(f'<outline level="{level}" title="{s}">\n')
+ if dest is not None:
+ outfp.write("<dest>")
+ dumpxml(outfp, dest)
+ outfp.write("</dest>\n")
+ if pageno is not None:
+ outfp.write("<pageno>%r</pageno>\n" % pageno)
+ outfp.write("</outline>\n")
+ outfp.write("</outlines>\n")
+ except PDFNoOutlines:
+ pass
+ parser.close()
+ fp.close()
+ return
+
+
+LITERAL_FILESPEC = LIT("Filespec")
+LITERAL_EMBEDDEDFILE = LIT("EmbeddedFile")
+
+
+def extractembedded(fname: str, password: str, extractdir: str) -> None:
+ def extract1(objid: int, obj: Dict[str, Any]) -> None:
+ filename = os.path.basename(obj.get("UF") or cast(bytes, obj.get("F")).decode())
+ fileref = obj["EF"].get("UF") or obj["EF"].get("F")
+ fileobj = doc.getobj(fileref.objid)
+ if not isinstance(fileobj, PDFStream):
+ error_msg = (
+ "unable to process PDF: reference for %r is not a "
+ "PDFStream" % filename
+ )
+ raise PDFValueError(error_msg)
+ if fileobj.get("Type") is not LITERAL_EMBEDDEDFILE:
+ raise PDFValueError(
+ "unable to process PDF: reference for %r "
+ "is not an EmbeddedFile" % (filename)
+ )
+ path = os.path.join(extractdir, "%.6d-%s" % (objid, filename))
+ if os.path.exists(path):
+ raise PDFIOError("file exists: %r" % path)
+ print("extracting: %r" % path)
+ os.makedirs(os.path.dirname(path), exist_ok=True)
+ out = open(path, "wb")
+ out.write(fileobj.get_data())
+ out.close()
+ return
+
+ with open(fname, "rb") as fp:
+ parser = PDFParser(fp)
+ doc = PDFDocument(parser, password)
+ extracted_objids = set()
+ for xref in doc.xrefs:
+ for objid in xref.get_objids():
+ obj = doc.getobj(objid)
+ if (
+ objid not in extracted_objids
+ and isinstance(obj, dict)
+ and obj.get("Type") is LITERAL_FILESPEC
+ ):
+ extracted_objids.add(objid)
+ extract1(objid, obj)
+ return
+
+
+def dumppdf(
+ outfp: TextIO,
+ fname: str,
+ objids: Iterable[int],
+ pagenos: Container[int],
+ password: str = "",
+ dumpall: bool = False,
+ codec: Optional[str] = None,
+ extractdir: Optional[str] = None,
+ show_fallback_xref: bool = False,
+) -> None:
+ fp = open(fname, "rb")
+ parser = PDFParser(fp)
+ doc = PDFDocument(parser, password)
+ if objids:
+ for objid in objids:
+ obj = doc.getobj(objid)
+ dumpxml(outfp, obj, codec=codec)
+ if pagenos:
+ for (pageno, page) in enumerate(PDFPage.create_pages(doc)):
+ if pageno in pagenos:
+ if codec:
+ for obj in page.contents:
+ obj = stream_value(obj)
+ dumpxml(outfp, obj, codec=codec)
+ else:
+ dumpxml(outfp, page.attrs)
+ if dumpall:
+ dumpallobjs(outfp, doc, codec, show_fallback_xref)
+ if (not objids) and (not pagenos) and (not dumpall):
+ dumptrailers(outfp, doc, show_fallback_xref)
+ fp.close()
+ if codec not in ("raw", "binary"):
+ outfp.write("\n")
+ return
+
+
+def create_parser() -> ArgumentParser:
+ parser = ArgumentParser(description=__doc__, add_help=True)
+ parser.add_argument(
+ "files",
+ type=str,
+ default=None,
+ nargs="+",
+ help="One or more paths to PDF files.",
+ )
+
+ parser.add_argument(
+ "--version",
+ "-v",
+ action="version",
+ version=f"pdfminer.six v{pdfminer.__version__}",
+ )
+ parser.add_argument(
+ "--debug",
+ "-d",
+ default=False,
+ action="store_true",
+ help="Use debug logging level.",
+ )
+ procedure_parser = parser.add_mutually_exclusive_group()
+ procedure_parser.add_argument(
+ "--extract-toc",
+ "-T",
+ default=False,
+ action="store_true",
+ help="Extract structure of outline",
+ )
+ procedure_parser.add_argument(
+ "--extract-embedded", "-E", type=str, help="Extract embedded files"
+ )
+
+ parse_params = parser.add_argument_group(
+ "Parser", description="Used during PDF parsing"
+ )
+ parse_params.add_argument(
+ "--page-numbers",
+ type=int,
+ default=None,
+ nargs="+",
+ help="A space-seperated list of page numbers to parse.",
+ )
+ parse_params.add_argument(
+ "--pagenos",
+ "-p",
+ type=str,
+ help="A comma-separated list of page numbers to parse. Included for "
+ "legacy applications, use --page-numbers for more idiomatic "
+ "argument entry.",
+ )
+ parse_params.add_argument(
+ "--objects",
+ "-i",
+ type=str,
+ help="Comma separated list of object numbers to extract",
+ )
+ parse_params.add_argument(
+ "--all",
+ "-a",
+ default=False,
+ action="store_true",
+ help="If the structure of all objects should be extracted",
+ )
+ parse_params.add_argument(
+ "--show-fallback-xref",
+ action="store_true",
+ help="Additionally show the fallback xref. Use this if the PDF "
+ "has zero or only invalid xref's. This setting is ignored if "
+ "--extract-toc or --extract-embedded is used.",
+ )
+ parse_params.add_argument(
+ "--password",
+ "-P",
+ type=str,
+ default="",
+ help="The password to use for decrypting PDF file.",
+ )
+
+ output_params = parser.add_argument_group(
+ "Output", description="Used during output generation."
+ )
+ output_params.add_argument(
+ "--outfile",
+ "-o",
+ type=str,
+ default="-",
+ help='Path to file where output is written. Or "-" (default) to '
+ "write to stdout.",
+ )
+ codec_parser = output_params.add_mutually_exclusive_group()
+ codec_parser.add_argument(
+ "--raw-stream",
+ "-r",
+ default=False,
+ action="store_true",
+ help="Write stream objects without encoding",
+ )
+ codec_parser.add_argument(
+ "--binary-stream",
+ "-b",
+ default=False,
+ action="store_true",
+ help="Write stream objects with binary encoding",
+ )
+ codec_parser.add_argument(
+ "--text-stream",
+ "-t",
+ default=False,
+ action="store_true",
+ help="Write stream objects as plain text",
+ )
+
+ return parser
+
+
+def main(argv: Optional[List[str]] = None) -> None:
+ parser = create_parser()
+ args = parser.parse_args(args=argv)
+
+ if args.debug:
+ logging.getLogger().setLevel(logging.DEBUG)
+
+ if args.outfile == "-":
+ outfp = sys.stdout
+ else:
+ outfp = open(args.outfile, "w")
+
+ if args.objects:
+ objids = [int(x) for x in args.objects.split(",")]
+ else:
+ objids = []
+
+ if args.page_numbers:
+ pagenos = {x - 1 for x in args.page_numbers}
+ elif args.pagenos:
+ pagenos = {int(x) - 1 for x in args.pagenos.split(",")}
+ else:
+ pagenos = set()
+
+ password = args.password
+
+ if args.raw_stream:
+ codec: Optional[str] = "raw"
+ elif args.binary_stream:
+ codec = "binary"
+ elif args.text_stream:
+ codec = "text"
+ else:
+ codec = None
+
+ for fname in args.files:
+ if args.extract_toc:
+ dumpoutline(
+ outfp,
+ fname,
+ objids,
+ pagenos,
+ password=password,
+ dumpall=args.all,
+ codec=codec,
+ extractdir=None,
+ )
+ elif args.extract_embedded:
+ extractembedded(fname, password=password, extractdir=args.extract_embedded)
+ else:
+ dumppdf(
+ outfp,
+ fname,
+ objids,
+ pagenos,
+ password=password,
+ dumpall=args.all,
+ codec=codec,
+ extractdir=None,
+ show_fallback_xref=args.show_fallback_xref,
+ )
+
+ outfp.close()
+
+
+if __name__ == "__main__":
+ main()
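`dumpoutline` maps pdfminer's internal page ids to 1-based page numbers with `enumerate(..., 1)` so outline destinations can be reported as human-readable page numbers. The construction in isolation, using a stand-in `Page` class:

```python
# Stand-in for pdfminer's PDFPage, which exposes a .pageid attribute.
class Page:
    def __init__(self, pageid: int) -> None:
        self.pageid = pageid

pages = [Page(pageid) for pageid in (101, 205, 309)]
# Same dict comprehension as dumpoutline: internal id -> 1-based page number.
page_numbers = {page.pageid: pageno for pageno, page in enumerate(pages, 1)}
print(page_numbers)  # {101: 1, 205: 2, 309: 3}
```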
diff --git a/templates/skills/file_manager/dependencies/bin/normalizer.exe b/templates/skills/file_manager/dependencies/bin/normalizer.exe
new file mode 100644
index 00000000..d69c7297
Binary files /dev/null and b/templates/skills/file_manager/dependencies/bin/normalizer.exe differ
diff --git a/templates/skills/file_manager/dependencies/bin/pdf2txt.py b/templates/skills/file_manager/dependencies/bin/pdf2txt.py
new file mode 100644
index 00000000..981bd7a6
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/bin/pdf2txt.py
@@ -0,0 +1,317 @@
+"""A command line tool for extracting text and images from PDF and
+output it to plain text, html, xml or tags."""
+import argparse
+import logging
+import sys
+from typing import Any, Container, Iterable, List, Optional
+
+import pdfminer.high_level
+from pdfminer.layout import LAParams
+from pdfminer.utils import AnyIO
+from pdfminer.pdfexceptions import PDFValueError
+
+logging.basicConfig()
+
+OUTPUT_TYPES = ((".htm", "html"), (".html", "html"), (".xml", "xml"), (".tag", "tag"))
+
+
+def float_or_disabled(x: str) -> Optional[float]:
+ if x.lower().strip() == "disabled":
+ return None
+ try:
+ return float(x)
+ except ValueError:
+ raise argparse.ArgumentTypeError(f"invalid float value: {x}")
+
+
+def extract_text(
+ files: Iterable[str] = [],
+ outfile: str = "-",
+ laparams: Optional[LAParams] = None,
+ output_type: str = "text",
+ codec: str = "utf-8",
+ strip_control: bool = False,
+ maxpages: int = 0,
+ page_numbers: Optional[Container[int]] = None,
+ password: str = "",
+ scale: float = 1.0,
+ rotation: int = 0,
+ layoutmode: str = "normal",
+ output_dir: Optional[str] = None,
+ debug: bool = False,
+ disable_caching: bool = False,
+ **kwargs: Any,
+) -> AnyIO:
+ if not files:
+ raise PDFValueError("Must provide files to work upon!")
+
+ if output_type == "text" and outfile != "-":
+ for override, alttype in OUTPUT_TYPES:
+ if outfile.endswith(override):
+ output_type = alttype
+
+ if outfile == "-":
+ outfp: AnyIO = sys.stdout
+ if sys.stdout.encoding is not None:
+ codec = "utf-8"
+ else:
+ outfp = open(outfile, "wb")
+
+ for fname in files:
+ with open(fname, "rb") as fp:
+ pdfminer.high_level.extract_text_to_fp(fp, **locals())
+ return outfp
+
+
+def create_parser() -> argparse.ArgumentParser:
+ parser = argparse.ArgumentParser(description=__doc__, add_help=True)
+ parser.add_argument(
+ "files",
+ type=str,
+ default=None,
+ nargs="+",
+ help="One or more paths to PDF files.",
+ )
+
+ parser.add_argument(
+ "--version",
+ "-v",
+ action="version",
+ version=f"pdfminer.six v{pdfminer.__version__}",
+ )
+ parser.add_argument(
+ "--debug",
+ "-d",
+ default=False,
+ action="store_true",
+ help="Use debug logging level.",
+ )
+ parser.add_argument(
+ "--disable-caching",
+ "-C",
+ default=False,
+ action="store_true",
+ help="If caching or resources, such as fonts, should be disabled.",
+ )
+
+ parse_params = parser.add_argument_group(
+ "Parser", description="Used during PDF parsing"
+ )
+ parse_params.add_argument(
+ "--page-numbers",
+ type=int,
+ default=None,
+ nargs="+",
+ help="A space-seperated list of page numbers to parse.",
+ )
+ parse_params.add_argument(
+ "--pagenos",
+ "-p",
+ type=str,
+ help="A comma-separated list of page numbers to parse. "
+ "Included for legacy applications, use --page-numbers "
+ "for more idiomatic argument entry.",
+ )
+ parse_params.add_argument(
+ "--maxpages",
+ "-m",
+ type=int,
+ default=0,
+ help="The maximum number of pages to parse.",
+ )
+ parse_params.add_argument(
+ "--password",
+ "-P",
+ type=str,
+ default="",
+ help="The password to use for decrypting PDF file.",
+ )
+ parse_params.add_argument(
+ "--rotation",
+ "-R",
+ default=0,
+ type=int,
+ help="The number of degrees to rotate the PDF "
+ "before other types of processing.",
+ )
+
+ la_params = LAParams() # will be used for defaults
+ la_param_group = parser.add_argument_group(
+ "Layout analysis", description="Used during layout analysis."
+ )
+ la_param_group.add_argument(
+ "--no-laparams",
+ "-n",
+ default=False,
+ action="store_true",
+ help="If layout analysis parameters should be ignored.",
+ )
+ la_param_group.add_argument(
+ "--detect-vertical",
+ "-V",
+ default=la_params.detect_vertical,
+ action="store_true",
+ help="If vertical text should be considered during layout analysis",
+ )
+ la_param_group.add_argument(
+ "--line-overlap",
+ type=float,
+ default=la_params.line_overlap,
+ help="If two characters have more overlap than this they "
+ "are considered to be on the same line. The overlap is specified "
+ "relative to the minimum height of both characters.",
+ )
+ la_param_group.add_argument(
+ "--char-margin",
+ "-M",
+ type=float,
+ default=la_params.char_margin,
+ help="If two characters are closer together than this margin they "
+ "are considered to be part of the same line. The margin is "
+ "specified relative to the width of the character.",
+ )
+ la_param_group.add_argument(
+ "--word-margin",
+ "-W",
+ type=float,
+ default=la_params.word_margin,
+ help="If two characters on the same line are further apart than this "
+ "margin then they are considered to be two separate words, and "
+ "an intermediate space will be added for readability. The margin "
+ "is specified relative to the width of the character.",
+ )
+ la_param_group.add_argument(
+ "--line-margin",
+ "-L",
+ type=float,
+ default=la_params.line_margin,
+ help="If two lines are close together they are considered to "
+ "be part of the same paragraph. The margin is specified "
+ "relative to the height of a line.",
+ )
+ la_param_group.add_argument(
+ "--boxes-flow",
+ "-F",
+ type=float_or_disabled,
+ default=la_params.boxes_flow,
+ help="Specifies how much a horizontal and vertical position of a "
+ "text matters when determining the order of lines. The value "
+ "should be within the range of -1.0 (only horizontal position "
+ "matters) to +1.0 (only vertical position matters). You can also "
+ "pass `disabled` to disable advanced layout analysis, and "
+ "instead return text based on the position of the bottom left "
+ "corner of the text box.",
+ )
+ la_param_group.add_argument(
+ "--all-texts",
+ "-A",
+ default=la_params.all_texts,
+ action="store_true",
+ help="If layout analysis should be performed on text in figures.",
+ )
+
+ output_params = parser.add_argument_group(
+ "Output", description="Used during output generation."
+ )
+ output_params.add_argument(
+ "--outfile",
+ "-o",
+ type=str,
+ default="-",
+ help="Path to file where output is written. "
+ 'Or "-" (default) to write to stdout.',
+ )
+ output_params.add_argument(
+ "--output_type",
+ "-t",
+ type=str,
+ default="text",
+ help="Type of output to generate {text,html,xml,tag}.",
+ )
+ output_params.add_argument(
+ "--codec",
+ "-c",
+ type=str,
+ default="utf-8",
+ help="Text encoding to use in output file.",
+ )
+ output_params.add_argument(
+ "--output-dir",
+ "-O",
+ default=None,
+ help="The output directory to put extracted images in. If not given, "
+ "images are not extracted.",
+ )
+ output_params.add_argument(
+ "--layoutmode",
+ "-Y",
+ default="normal",
+ type=str,
+ help="Type of layout to use when generating html "
+ "{normal,exact,loose}. If normal,each line is"
+ " positioned separately in the html. If exact"
+ ", each character is positioned separately in"
+ " the html. If loose, same result as normal "
+ "but with an additional newline after each "
+ "text line. Only used when output_type is html.",
+ )
+ output_params.add_argument(
+ "--scale",
+ "-s",
+ type=float,
+ default=1.0,
+ help="The amount of zoom to use when generating html file. "
+ "Only used when output_type is html.",
+ )
+ output_params.add_argument(
+ "--strip-control",
+ "-S",
+ default=False,
+ action="store_true",
+ help="Remove control statement from text. "
+ "Only used when output_type is xml.",
+ )
+
+ return parser
+
+
+def parse_args(args: Optional[List[str]]) -> argparse.Namespace:
+ parsed_args = create_parser().parse_args(args=args)
+
+ # Propagate parsed layout parameters to LAParams object
+ if parsed_args.no_laparams:
+ parsed_args.laparams = None
+ else:
+ parsed_args.laparams = LAParams(
+ line_overlap=parsed_args.line_overlap,
+ char_margin=parsed_args.char_margin,
+ line_margin=parsed_args.line_margin,
+ word_margin=parsed_args.word_margin,
+ boxes_flow=parsed_args.boxes_flow,
+ detect_vertical=parsed_args.detect_vertical,
+ all_texts=parsed_args.all_texts,
+ )
+
+ if parsed_args.page_numbers:
+ parsed_args.page_numbers = {x - 1 for x in parsed_args.page_numbers}
+
+ if parsed_args.pagenos:
+ parsed_args.page_numbers = {int(x) - 1 for x in parsed_args.pagenos.split(",")}
+
+ if parsed_args.output_type == "text" and parsed_args.outfile != "-":
+ for override, alttype in OUTPUT_TYPES:
+ if parsed_args.outfile.endswith(override):
+ parsed_args.output_type = alttype
+
+ return parsed_args
+
+
+def main(args: Optional[List[str]] = None) -> int:
+ parsed_args = parse_args(args)
+ outfp = extract_text(**vars(parsed_args))
+ outfp.close()
+ return 0
+
+
+if __name__ == "__main__":
+ sys.exit(main())
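The 1-based to 0-based page-selection normalization in `parse_args` above can be sketched in isolation. This is a standalone re-implementation for illustration; `normalize_pages` is a hypothetical helper, not a pdfminer function:

```python
def normalize_pages(page_numbers=None, pagenos=None):
    """Convert 1-based CLI page selections to the 0-based set used
    internally; the comma-separated --pagenos string, if given,
    overrides --page-numbers (mirroring the order in parse_args)."""
    pages = None
    if page_numbers:
        pages = {n - 1 for n in page_numbers}
    if pagenos:
        pages = {int(n) - 1 for n in pagenos.split(",")}
    return pages

print(normalize_pages(page_numbers=[1, 3]))  # {0, 2}
print(normalize_pages(pagenos="2,5"))        # {1, 4}
```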
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/INSTALLER b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/INSTALLER
new file mode 100644
index 00000000..a1b589e3
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/LICENSE b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/LICENSE
new file mode 100644
index 00000000..29225eee
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/LICENSE
@@ -0,0 +1,26 @@
+
+Except when otherwise stated (look for LICENSE files in directories or
+information at the beginning of each file) all software and
+documentation is licensed as follows:
+
+ The MIT License
+
+ Permission is hereby granted, free of charge, to any person
+ obtaining a copy of this software and associated documentation
+ files (the "Software"), to deal in the Software without
+ restriction, including without limitation the rights to use,
+ copy, modify, merge, publish, distribute, sublicense, and/or
+ sell copies of the Software, and to permit persons to whom the
+ Software is furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included
+ in all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS
+ OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL
+ THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
+ FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER
+ DEALINGS IN THE SOFTWARE.
+
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/METADATA b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/METADATA
new file mode 100644
index 00000000..60b0779f
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/METADATA
@@ -0,0 +1,40 @@
+Metadata-Version: 2.1
+Name: cffi
+Version: 1.17.1
+Summary: Foreign Function Interface for Python calling C code.
+Home-page: http://cffi.readthedocs.org
+Author: Armin Rigo, Maciej Fijalkowski
+Author-email: python-cffi@googlegroups.com
+License: MIT
+Project-URL: Documentation, http://cffi.readthedocs.org/
+Project-URL: Source Code, https://github.com/python-cffi/cffi
+Project-URL: Issue Tracker, https://github.com/python-cffi/cffi/issues
+Project-URL: Changelog, https://cffi.readthedocs.io/en/latest/whatsnew.html
+Project-URL: Downloads, https://github.com/python-cffi/cffi/releases
+Project-URL: Contact, https://groups.google.com/forum/#!forum/python-cffi
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
+Classifier: Programming Language :: Python :: Implementation :: CPython
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: License :: OSI Approved :: MIT License
+Requires-Python: >=3.8
+License-File: LICENSE
+Requires-Dist: pycparser
+
+
+CFFI
+====
+
+Foreign Function Interface for Python calling C code.
+Please see the `Documentation <http://cffi.readthedocs.org/>`_.
+
+Contact
+-------
+
+`Mailing list <https://groups.google.com/forum/#!forum/python-cffi>`_
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/RECORD b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/RECORD
new file mode 100644
index 00000000..b0a4eed8
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/RECORD
@@ -0,0 +1,49 @@
+_cffi_backend.cp311-win_amd64.pyd,sha256=mu6Qz3mAyP9pS7P_4Gxx-H62phMDP3PjF0pzJkjTmYA,178176
+cffi-1.17.1.dist-info/INSTALLER,sha256=zuuue4knoyJ-UwPPXg8fezS7VCrXJQrAP7zeNuwvFQg,4
+cffi-1.17.1.dist-info/LICENSE,sha256=BLgPWwd7vtaICM_rreteNSPyqMmpZJXFh72W3x6sKjM,1294
+cffi-1.17.1.dist-info/METADATA,sha256=avJrvo-kUNx6iXJEaZVjGXNy42QS-YfjNHdJdeiBlFc,1571
+cffi-1.17.1.dist-info/RECORD,,
+cffi-1.17.1.dist-info/REQUESTED,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+cffi-1.17.1.dist-info/WHEEL,sha256=gP9oq1B6BRaUd7LM9qWlox_06QqMeQKU8gW0ScfyBso,101
+cffi-1.17.1.dist-info/entry_points.txt,sha256=y6jTxnyeuLnL-XJcDv8uML3n6wyYiGRg8MTp_QGJ9Ho,75
+cffi-1.17.1.dist-info/top_level.txt,sha256=rE7WR3rZfNKxWI9-jn6hsHCAl7MDkB-FmuQbxWjFehQ,19
+cffi/__init__.py,sha256=H6t_ebva6EeHpUuItFLW1gbRp94eZRNJODLaWKdbx1I,513
+cffi/__pycache__/__init__.cpython-311.pyc,,
+cffi/__pycache__/_imp_emulation.cpython-311.pyc,,
+cffi/__pycache__/_shimmed_dist_utils.cpython-311.pyc,,
+cffi/__pycache__/api.cpython-311.pyc,,
+cffi/__pycache__/backend_ctypes.cpython-311.pyc,,
+cffi/__pycache__/cffi_opcode.cpython-311.pyc,,
+cffi/__pycache__/commontypes.cpython-311.pyc,,
+cffi/__pycache__/cparser.cpython-311.pyc,,
+cffi/__pycache__/error.cpython-311.pyc,,
+cffi/__pycache__/ffiplatform.cpython-311.pyc,,
+cffi/__pycache__/lock.cpython-311.pyc,,
+cffi/__pycache__/model.cpython-311.pyc,,
+cffi/__pycache__/pkgconfig.cpython-311.pyc,,
+cffi/__pycache__/recompiler.cpython-311.pyc,,
+cffi/__pycache__/setuptools_ext.cpython-311.pyc,,
+cffi/__pycache__/vengine_cpy.cpython-311.pyc,,
+cffi/__pycache__/vengine_gen.cpython-311.pyc,,
+cffi/__pycache__/verifier.cpython-311.pyc,,
+cffi/_cffi_errors.h,sha256=zQXt7uR_m8gUW-fI2hJg0KoSkJFwXv8RGUkEDZ177dQ,3908
+cffi/_cffi_include.h,sha256=Exhmgm9qzHWzWivjfTe0D7Xp4rPUkVxdNuwGhMTMzbw,15055
+cffi/_embedding.h,sha256=EDKw5QrLvQoe3uosXB3H1xPVTYxsn33eV3A43zsA_Fw,18787
+cffi/_imp_emulation.py,sha256=RxREG8zAbI2RPGBww90u_5fi8sWdahpdipOoPzkp7C0,2960
+cffi/_shimmed_dist_utils.py,sha256=Bjj2wm8yZbvFvWEx5AEfmqaqZyZFhYfoyLLQHkXZuao,2230
+cffi/api.py,sha256=alBv6hZQkjpmZplBphdaRn2lPO9-CORs_M7ixabvZWI,42169
+cffi/backend_ctypes.py,sha256=h5ZIzLc6BFVXnGyc9xPqZWUS7qGy7yFSDqXe68Sa8z4,42454
+cffi/cffi_opcode.py,sha256=JDV5l0R0_OadBX_uE7xPPTYtMdmpp8I9UYd6av7aiDU,5731
+cffi/commontypes.py,sha256=7N6zPtCFlvxXMWhHV08psUjdYIK2XgsN3yo5dgua_v4,2805
+cffi/cparser.py,sha256=0qI3mEzZSNVcCangoyXOoAcL-RhpQL08eG8798T024s,44789
+cffi/error.py,sha256=v6xTiS4U0kvDcy4h_BDRo5v39ZQuj-IMRYLv5ETddZs,877
+cffi/ffiplatform.py,sha256=avxFjdikYGJoEtmJO7ewVmwG_VEVl6EZ_WaNhZYCqv4,3584
+cffi/lock.py,sha256=l9TTdwMIMpi6jDkJGnQgE9cvTIR7CAntIJr8EGHt3pY,747
+cffi/model.py,sha256=W30UFQZE73jL5Mx5N81YT77us2W2iJjTm0XYfnwz1cg,21797
+cffi/parse_c_type.h,sha256=OdwQfwM9ktq6vlCB43exFQmxDBtj2MBNdK8LYl15tjw,5976
+cffi/pkgconfig.py,sha256=LP1w7vmWvmKwyqLaU1Z243FOWGNQMrgMUZrvgFuOlco,4374
+cffi/recompiler.py,sha256=sim4Tm7lamt2Jn8uzKN0wMYp6ODByk3g7of47-h9LD4,65367
+cffi/setuptools_ext.py,sha256=-ebj79lO2_AUH-kRcaja2pKY1Z_5tloGwsJgzK8P3Cc,8871
+cffi/vengine_cpy.py,sha256=8UagT6ZEOZf6Dju7_CfNulue8CnsHLEzJYhnqUhoF04,43752
+cffi/vengine_gen.py,sha256=DUlEIrDiVin1Pnhn1sfoamnS5NLqfJcOdhRoeSNeJRg,26939
+cffi/verifier.py,sha256=oX8jpaohg2Qm3aHcznidAdvrVm5N4sQYG0a3Eo5mIl4,11182
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/REQUESTED b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/REQUESTED
new file mode 100644
index 00000000..e69de29b
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/WHEEL b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/WHEEL
new file mode 100644
index 00000000..eac371e1
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/WHEEL
@@ -0,0 +1,5 @@
+Wheel-Version: 1.0
+Generator: setuptools (74.1.1)
+Root-Is-Purelib: false
+Tag: cp311-cp311-win_amd64
+
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/entry_points.txt b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/entry_points.txt
new file mode 100644
index 00000000..4b0274f2
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/entry_points.txt
@@ -0,0 +1,2 @@
+[distutils.setup_keywords]
+cffi_modules = cffi.setuptools_ext:cffi_modules
diff --git a/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/top_level.txt b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/top_level.txt
new file mode 100644
index 00000000..f6457795
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi-1.17.1.dist-info/top_level.txt
@@ -0,0 +1,2 @@
+_cffi_backend
+cffi
diff --git a/templates/skills/file_manager/dependencies/cffi/__init__.py b/templates/skills/file_manager/dependencies/cffi/__init__.py
new file mode 100644
index 00000000..2e35a38c
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/__init__.py
@@ -0,0 +1,14 @@
+__all__ = ['FFI', 'VerificationError', 'VerificationMissing', 'CDefError',
+ 'FFIError']
+
+from .api import FFI
+from .error import CDefError, FFIError, VerificationError, VerificationMissing
+from .error import PkgConfigError
+
+__version__ = "1.17.1"
+__version_info__ = (1, 17, 1)
+
+# The verifier module file names are based on the CRC32 of a string that
+# contains the following version number. It may be older than __version__
+# if nothing is clearly incompatible.
+__version_verifier_modules__ = "0.8.6"
diff --git a/templates/skills/file_manager/dependencies/cffi/_cffi_errors.h b/templates/skills/file_manager/dependencies/cffi/_cffi_errors.h
new file mode 100644
index 00000000..158e0590
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/_cffi_errors.h
@@ -0,0 +1,149 @@
+#ifndef CFFI_MESSAGEBOX
+# ifdef _MSC_VER
+# define CFFI_MESSAGEBOX 1
+# else
+# define CFFI_MESSAGEBOX 0
+# endif
+#endif
+
+
+#if CFFI_MESSAGEBOX
+/* Windows only: logic to take the Python-CFFI embedding logic
+ initialization errors and display them in a background thread
+ with MessageBox. The idea is that if the whole program closes
+ as a result of this problem, then likely it is already a console
+ program and you can read the stderr output in the console too.
+ If it is not a console program, then it will likely show its own
+ dialog to complain, or generally not abruptly close, and for this
+ case the background thread should stay alive.
+*/
+static void *volatile _cffi_bootstrap_text;
+
+static PyObject *_cffi_start_error_capture(void)
+{
+ PyObject *result = NULL;
+ PyObject *x, *m, *bi;
+
+ if (InterlockedCompareExchangePointer(&_cffi_bootstrap_text,
+ (void *)1, NULL) != NULL)
+ return (PyObject *)1;
+
+ m = PyImport_AddModule("_cffi_error_capture");
+ if (m == NULL)
+ goto error;
+
+ result = PyModule_GetDict(m);
+ if (result == NULL)
+ goto error;
+
+#if PY_MAJOR_VERSION >= 3
+ bi = PyImport_ImportModule("builtins");
+#else
+ bi = PyImport_ImportModule("__builtin__");
+#endif
+ if (bi == NULL)
+ goto error;
+ PyDict_SetItemString(result, "__builtins__", bi);
+ Py_DECREF(bi);
+
+ x = PyRun_String(
+ "import sys\n"
+ "class FileLike:\n"
+ " def write(self, x):\n"
+ " try:\n"
+ " of.write(x)\n"
+ " except: pass\n"
+ " self.buf += x\n"
+ " def flush(self):\n"
+ " pass\n"
+ "fl = FileLike()\n"
+ "fl.buf = ''\n"
+ "of = sys.stderr\n"
+ "sys.stderr = fl\n"
+ "def done():\n"
+ " sys.stderr = of\n"
+ " return fl.buf\n", /* make sure the returned value stays alive */
+ Py_file_input,
+ result, result);
+ Py_XDECREF(x);
+
+ error:
+ if (PyErr_Occurred())
+ {
+ PyErr_WriteUnraisable(Py_None);
+ PyErr_Clear();
+ }
+ return result;
+}
+
+#pragma comment(lib, "user32.lib")
+
+static DWORD WINAPI _cffi_bootstrap_dialog(LPVOID ignored)
+{
+ Sleep(666); /* may be interrupted if the whole process is closing */
+#if PY_MAJOR_VERSION >= 3
+ MessageBoxW(NULL, (wchar_t *)_cffi_bootstrap_text,
+ L"Python-CFFI error",
+ MB_OK | MB_ICONERROR);
+#else
+ MessageBoxA(NULL, (char *)_cffi_bootstrap_text,
+ "Python-CFFI error",
+ MB_OK | MB_ICONERROR);
+#endif
+ _cffi_bootstrap_text = NULL;
+ return 0;
+}
+
+static void _cffi_stop_error_capture(PyObject *ecap)
+{
+ PyObject *s;
+ void *text;
+
+ if (ecap == (PyObject *)1)
+ return;
+
+ if (ecap == NULL)
+ goto error;
+
+ s = PyRun_String("done()", Py_eval_input, ecap, ecap);
+ if (s == NULL)
+ goto error;
+
+ /* Show a dialog box, but in a background thread, and
+ never show multiple dialog boxes at once. */
+#if PY_MAJOR_VERSION >= 3
+ text = PyUnicode_AsWideCharString(s, NULL);
+#else
+ text = PyString_AsString(s);
+#endif
+
+ _cffi_bootstrap_text = text;
+
+ if (text != NULL)
+ {
+ HANDLE h;
+ h = CreateThread(NULL, 0, _cffi_bootstrap_dialog,
+ NULL, 0, NULL);
+ if (h != NULL)
+ CloseHandle(h);
+ }
+ /* decref the string, but it should stay alive as 'fl.buf'
+ in the small module above. It will really be freed only if
+ we later get another similar error. So it's a leak of at
+ most one copy of the small module. That's fine for this
+ situation which is usually a "fatal error" anyway. */
+ Py_DECREF(s);
+ PyErr_Clear();
+ return;
+
+ error:
+ _cffi_bootstrap_text = NULL;
+ PyErr_Clear();
+}
+
+#else
+
+static PyObject *_cffi_start_error_capture(void) { return NULL; }
+static void _cffi_stop_error_capture(PyObject *ecap) { }
+
+#endif
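The small Python program embedded as a string inside `_cffi_start_error_capture` above swaps `sys.stderr` for a tee-like object so the error text can later be shown in a message box. The same trick written out as plain Python, for readability (illustrative only, not part of cffi's API):

```python
import sys

class FileLike:
    """Tee writes to the original stream while accumulating a buffer,
    like the 'fl' object in the embedded code above."""
    def __init__(self, original):
        self.original = original
        self.buf = ""

    def write(self, x):
        try:
            self.original.write(x)
        except Exception:
            pass  # the original stream may already be closed
        self.buf += x

    def flush(self):
        pass

old = sys.stderr
sys.stderr = FileLike(old)
print("boom", file=sys.stderr)  # reaches the real stderr *and* the buffer
captured, sys.stderr = sys.stderr.buf, old
print(repr(captured))  # 'boom\n'
```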
diff --git a/templates/skills/file_manager/dependencies/cffi/_cffi_include.h b/templates/skills/file_manager/dependencies/cffi/_cffi_include.h
new file mode 100644
index 00000000..908a1d73
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/_cffi_include.h
@@ -0,0 +1,389 @@
+#define _CFFI_
+
+/* We try to define Py_LIMITED_API before including Python.h.
+
+ Mess: we can only define it if Py_DEBUG, Py_TRACE_REFS and
+ Py_REF_DEBUG are not defined. This is a best-effort approximation:
+ we can learn about Py_DEBUG from pyconfig.h, but it is unclear if
+ the same works for the other two macros. Py_DEBUG implies them,
+ but not the other way around.
+
+ The implementation is messy (issue #350): on Windows, with _MSC_VER,
+ we have to define Py_LIMITED_API even before including pyconfig.h.
+ In that case, we guess what pyconfig.h will do to the macros above,
+ and check our guess after the #include.
+
+ Note that on Windows, with CPython 3.x, you need >= 3.5 and virtualenv
+ version >= 16.0.0. With older versions of either, you don't get a
+ copy of PYTHON3.DLL in the virtualenv. We can't check the version of
+ CPython *before* we even include pyconfig.h. ffi.set_source() puts
+ a ``#define _CFFI_NO_LIMITED_API'' at the start of this file if it is
+ running on Windows < 3.5, as an attempt at fixing it, but that's
+ arguably wrong because it may not be the target version of Python.
+ Still better than nothing I guess. As another workaround, you can
+ remove the definition of Py_LIMITED_API here.
+
+ See also 'py_limited_api' in cffi/setuptools_ext.py.
+*/
+#if !defined(_CFFI_USE_EMBEDDING) && !defined(Py_LIMITED_API)
+# ifdef _MSC_VER
+# if !defined(_DEBUG) && !defined(Py_DEBUG) && !defined(Py_TRACE_REFS) && !defined(Py_REF_DEBUG) && !defined(_CFFI_NO_LIMITED_API)
+# define Py_LIMITED_API
+# endif
+# include <pyconfig.h>
+ /* sanity-check: Py_LIMITED_API will cause crashes if any of these
+ are also defined. Normally, the Python file PC/pyconfig.h does not
+ cause any of these to be defined, with the exception that _DEBUG
+ causes Py_DEBUG. Double-check that. */
+# ifdef Py_LIMITED_API
+# if defined(Py_DEBUG)
+# error "pyconfig.h unexpectedly defines Py_DEBUG, but Py_LIMITED_API is set"
+# endif
+# if defined(Py_TRACE_REFS)
+# error "pyconfig.h unexpectedly defines Py_TRACE_REFS, but Py_LIMITED_API is set"
+# endif
+# if defined(Py_REF_DEBUG)
+# error "pyconfig.h unexpectedly defines Py_REF_DEBUG, but Py_LIMITED_API is set"
+# endif
+# endif
+# else
+# include <pyconfig.h>
+# if !defined(Py_DEBUG) && !defined(Py_TRACE_REFS) && !defined(Py_REF_DEBUG) && !defined(_CFFI_NO_LIMITED_API)
+# define Py_LIMITED_API
+# endif
+# endif
+#endif
+
+#include <Python.h>
+#ifdef __cplusplus
+extern "C" {
+#endif
+#include <stddef.h>
+#include "parse_c_type.h"
+
+/* this block of #ifs should be kept exactly identical between
+ c/_cffi_backend.c, cffi/vengine_cpy.py, cffi/vengine_gen.py
+ and cffi/_cffi_include.h */
+#if defined(_MSC_VER)
+# include <malloc.h> /* for alloca() */
+# if _MSC_VER < 1600 /* MSVC < 2010 */
+ typedef __int8 int8_t;
+ typedef __int16 int16_t;
+ typedef __int32 int32_t;
+ typedef __int64 int64_t;
+ typedef unsigned __int8 uint8_t;
+ typedef unsigned __int16 uint16_t;
+ typedef unsigned __int32 uint32_t;
+ typedef unsigned __int64 uint64_t;
+ typedef __int8 int_least8_t;
+ typedef __int16 int_least16_t;
+ typedef __int32 int_least32_t;
+ typedef __int64 int_least64_t;
+ typedef unsigned __int8 uint_least8_t;
+ typedef unsigned __int16 uint_least16_t;
+ typedef unsigned __int32 uint_least32_t;
+ typedef unsigned __int64 uint_least64_t;
+ typedef __int8 int_fast8_t;
+ typedef __int16 int_fast16_t;
+ typedef __int32 int_fast32_t;
+ typedef __int64 int_fast64_t;
+ typedef unsigned __int8 uint_fast8_t;
+ typedef unsigned __int16 uint_fast16_t;
+ typedef unsigned __int32 uint_fast32_t;
+ typedef unsigned __int64 uint_fast64_t;
+ typedef __int64 intmax_t;
+ typedef unsigned __int64 uintmax_t;
+# else
+# include <stdint.h>
+# endif
+# if _MSC_VER < 1800 /* MSVC < 2013 */
+# ifndef __cplusplus
+ typedef unsigned char _Bool;
+# endif
+# endif
+# define _cffi_float_complex_t _Fcomplex /* include <complex.h> for it */
+# define _cffi_double_complex_t _Dcomplex /* include <complex.h> for it */
+#else
+# include <stdint.h>
+# if (defined (__SVR4) && defined (__sun)) || defined(_AIX) || defined(__hpux)
+# include <alloca.h>
+# endif
+# define _cffi_float_complex_t float _Complex
+# define _cffi_double_complex_t double _Complex
+#endif
+
+#ifdef __GNUC__
+# define _CFFI_UNUSED_FN __attribute__((unused))
+#else
+# define _CFFI_UNUSED_FN /* nothing */
+#endif
+
+#ifdef __cplusplus
+# ifndef _Bool
+ typedef bool _Bool; /* semi-hackish: C++ has no _Bool; bool is builtin */
+# endif
+#endif
+
+/********** CPython-specific section **********/
+#ifndef PYPY_VERSION
+
+
+#if PY_MAJOR_VERSION >= 3
+# define PyInt_FromLong PyLong_FromLong
+#endif
+
+#define _cffi_from_c_double PyFloat_FromDouble
+#define _cffi_from_c_float PyFloat_FromDouble
+#define _cffi_from_c_long PyInt_FromLong
+#define _cffi_from_c_ulong PyLong_FromUnsignedLong
+#define _cffi_from_c_longlong PyLong_FromLongLong
+#define _cffi_from_c_ulonglong PyLong_FromUnsignedLongLong
+#define _cffi_from_c__Bool PyBool_FromLong
+
+#define _cffi_to_c_double PyFloat_AsDouble
+#define _cffi_to_c_float PyFloat_AsDouble
+
+#define _cffi_from_c_int(x, type) \
+ (((type)-1) > 0 ? /* unsigned */ \
+ (sizeof(type) < sizeof(long) ? \
+ PyInt_FromLong((long)x) : \
+ sizeof(type) == sizeof(long) ? \
+ PyLong_FromUnsignedLong((unsigned long)x) : \
+ PyLong_FromUnsignedLongLong((unsigned long long)x)) : \
+ (sizeof(type) <= sizeof(long) ? \
+ PyInt_FromLong((long)x) : \
+ PyLong_FromLongLong((long long)x)))
+
+#define _cffi_to_c_int(o, type) \
+ ((type)( \
+ sizeof(type) == 1 ? (((type)-1) > 0 ? (type)_cffi_to_c_u8(o) \
+ : (type)_cffi_to_c_i8(o)) : \
+ sizeof(type) == 2 ? (((type)-1) > 0 ? (type)_cffi_to_c_u16(o) \
+ : (type)_cffi_to_c_i16(o)) : \
+ sizeof(type) == 4 ? (((type)-1) > 0 ? (type)_cffi_to_c_u32(o) \
+ : (type)_cffi_to_c_i32(o)) : \
+ sizeof(type) == 8 ? (((type)-1) > 0 ? (type)_cffi_to_c_u64(o) \
+ : (type)_cffi_to_c_i64(o)) : \
+ (Py_FatalError("unsupported size for type " #type), (type)0)))
+
+#define _cffi_to_c_i8 \
+ ((int(*)(PyObject *))_cffi_exports[1])
+#define _cffi_to_c_u8 \
+ ((int(*)(PyObject *))_cffi_exports[2])
+#define _cffi_to_c_i16 \
+ ((int(*)(PyObject *))_cffi_exports[3])
+#define _cffi_to_c_u16 \
+ ((int(*)(PyObject *))_cffi_exports[4])
+#define _cffi_to_c_i32 \
+ ((int(*)(PyObject *))_cffi_exports[5])
+#define _cffi_to_c_u32 \
+ ((unsigned int(*)(PyObject *))_cffi_exports[6])
+#define _cffi_to_c_i64 \
+ ((long long(*)(PyObject *))_cffi_exports[7])
+#define _cffi_to_c_u64 \
+ ((unsigned long long(*)(PyObject *))_cffi_exports[8])
+#define _cffi_to_c_char \
+ ((int(*)(PyObject *))_cffi_exports[9])
+#define _cffi_from_c_pointer \
+ ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[10])
+#define _cffi_to_c_pointer \
+ ((char *(*)(PyObject *, struct _cffi_ctypedescr *))_cffi_exports[11])
+#define _cffi_get_struct_layout \
+ not used any more
+#define _cffi_restore_errno \
+ ((void(*)(void))_cffi_exports[13])
+#define _cffi_save_errno \
+ ((void(*)(void))_cffi_exports[14])
+#define _cffi_from_c_char \
+ ((PyObject *(*)(char))_cffi_exports[15])
+#define _cffi_from_c_deref \
+ ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[16])
+#define _cffi_to_c \
+ ((int(*)(char *, struct _cffi_ctypedescr *, PyObject *))_cffi_exports[17])
+#define _cffi_from_c_struct \
+ ((PyObject *(*)(char *, struct _cffi_ctypedescr *))_cffi_exports[18])
+#define _cffi_to_c_wchar_t \
+ ((_cffi_wchar_t(*)(PyObject *))_cffi_exports[19])
+#define _cffi_from_c_wchar_t \
+ ((PyObject *(*)(_cffi_wchar_t))_cffi_exports[20])
+#define _cffi_to_c_long_double \
+ ((long double(*)(PyObject *))_cffi_exports[21])
+#define _cffi_to_c__Bool \
+ ((_Bool(*)(PyObject *))_cffi_exports[22])
+#define _cffi_prepare_pointer_call_argument \
+ ((Py_ssize_t(*)(struct _cffi_ctypedescr *, \
+ PyObject *, char **))_cffi_exports[23])
+#define _cffi_convert_array_from_object \
+ ((int(*)(char *, struct _cffi_ctypedescr *, PyObject *))_cffi_exports[24])
+#define _CFFI_CPIDX 25
+#define _cffi_call_python \
+ ((void(*)(struct _cffi_externpy_s *, char *))_cffi_exports[_CFFI_CPIDX])
+#define _cffi_to_c_wchar3216_t \
+ ((int(*)(PyObject *))_cffi_exports[26])
+#define _cffi_from_c_wchar3216_t \
+ ((PyObject *(*)(int))_cffi_exports[27])
+#define _CFFI_NUM_EXPORTS 28
+
+struct _cffi_ctypedescr;
+
+static void *_cffi_exports[_CFFI_NUM_EXPORTS];
+
+#define _cffi_type(index) ( \
+ assert((((uintptr_t)_cffi_types[index]) & 1) == 0), \
+ (struct _cffi_ctypedescr *)_cffi_types[index])
+
+static PyObject *_cffi_init(const char *module_name, Py_ssize_t version,
+ const struct _cffi_type_context_s *ctx)
+{
+ PyObject *module, *o_arg, *new_module;
+ void *raw[] = {
+ (void *)module_name,
+ (void *)version,
+ (void *)_cffi_exports,
+ (void *)ctx,
+ };
+
+ module = PyImport_ImportModule("_cffi_backend");
+ if (module == NULL)
+ goto failure;
+
+ o_arg = PyLong_FromVoidPtr((void *)raw);
+ if (o_arg == NULL)
+ goto failure;
+
+ new_module = PyObject_CallMethod(
+ module, (char *)"_init_cffi_1_0_external_module", (char *)"O", o_arg);
+
+ Py_DECREF(o_arg);
+ Py_DECREF(module);
+ return new_module;
+
+ failure:
+ Py_XDECREF(module);
+ return NULL;
+}
+
+
+#ifdef HAVE_WCHAR_H
+typedef wchar_t _cffi_wchar_t;
+#else
+typedef uint16_t _cffi_wchar_t; /* same random pick as _cffi_backend.c */
+#endif
+
+_CFFI_UNUSED_FN static uint16_t _cffi_to_c_char16_t(PyObject *o)
+{
+ if (sizeof(_cffi_wchar_t) == 2)
+ return (uint16_t)_cffi_to_c_wchar_t(o);
+ else
+ return (uint16_t)_cffi_to_c_wchar3216_t(o);
+}
+
+_CFFI_UNUSED_FN static PyObject *_cffi_from_c_char16_t(uint16_t x)
+{
+ if (sizeof(_cffi_wchar_t) == 2)
+ return _cffi_from_c_wchar_t((_cffi_wchar_t)x);
+ else
+ return _cffi_from_c_wchar3216_t((int)x);
+}
+
+_CFFI_UNUSED_FN static int _cffi_to_c_char32_t(PyObject *o)
+{
+ if (sizeof(_cffi_wchar_t) == 4)
+ return (int)_cffi_to_c_wchar_t(o);
+ else
+ return (int)_cffi_to_c_wchar3216_t(o);
+}
+
+_CFFI_UNUSED_FN static PyObject *_cffi_from_c_char32_t(unsigned int x)
+{
+ if (sizeof(_cffi_wchar_t) == 4)
+ return _cffi_from_c_wchar_t((_cffi_wchar_t)x);
+ else
+ return _cffi_from_c_wchar3216_t((int)x);
+}
+
+union _cffi_union_alignment_u {
+ unsigned char m_char;
+ unsigned short m_short;
+ unsigned int m_int;
+ unsigned long m_long;
+ unsigned long long m_longlong;
+ float m_float;
+ double m_double;
+ long double m_longdouble;
+};
+
+struct _cffi_freeme_s {
+ struct _cffi_freeme_s *next;
+ union _cffi_union_alignment_u alignment;
+};
+
+_CFFI_UNUSED_FN static int
+_cffi_convert_array_argument(struct _cffi_ctypedescr *ctptr, PyObject *arg,
+ char **output_data, Py_ssize_t datasize,
+ struct _cffi_freeme_s **freeme)
+{
+ char *p;
+ if (datasize < 0)
+ return -1;
+
+ p = *output_data;
+ if (p == NULL) {
+ struct _cffi_freeme_s *fp = (struct _cffi_freeme_s *)PyObject_Malloc(
+ offsetof(struct _cffi_freeme_s, alignment) + (size_t)datasize);
+ if (fp == NULL)
+ return -1;
+ fp->next = *freeme;
+ *freeme = fp;
+ p = *output_data = (char *)&fp->alignment;
+ }
+ memset((void *)p, 0, (size_t)datasize);
+ return _cffi_convert_array_from_object(p, ctptr, arg);
+}
+
+_CFFI_UNUSED_FN static void
+_cffi_free_array_arguments(struct _cffi_freeme_s *freeme)
+{
+ do {
+ void *p = (void *)freeme;
+ freeme = freeme->next;
+ PyObject_Free(p);
+ } while (freeme != NULL);
+}
+
+/********** end CPython-specific section **********/
+#else
+_CFFI_UNUSED_FN
+static void (*_cffi_call_python_org)(struct _cffi_externpy_s *, char *);
+# define _cffi_call_python _cffi_call_python_org
+#endif
+
+
+#define _cffi_array_len(array) (sizeof(array) / sizeof((array)[0]))
+
+#define _cffi_prim_int(size, sign) \
+ ((size) == 1 ? ((sign) ? _CFFI_PRIM_INT8 : _CFFI_PRIM_UINT8) : \
+ (size) == 2 ? ((sign) ? _CFFI_PRIM_INT16 : _CFFI_PRIM_UINT16) : \
+ (size) == 4 ? ((sign) ? _CFFI_PRIM_INT32 : _CFFI_PRIM_UINT32) : \
+ (size) == 8 ? ((sign) ? _CFFI_PRIM_INT64 : _CFFI_PRIM_UINT64) : \
+ _CFFI__UNKNOWN_PRIM)
+
+#define _cffi_prim_float(size) \
+ ((size) == sizeof(float) ? _CFFI_PRIM_FLOAT : \
+ (size) == sizeof(double) ? _CFFI_PRIM_DOUBLE : \
+ (size) == sizeof(long double) ? _CFFI__UNKNOWN_LONG_DOUBLE : \
+ _CFFI__UNKNOWN_FLOAT_PRIM)
+
+#define _cffi_check_int(got, got_nonpos, expected) \
+ ((got_nonpos) == (expected <= 0) && \
+ (got) == (unsigned long long)expected)
+
+#ifdef MS_WIN32
+# define _cffi_stdcall __stdcall
+#else
+# define _cffi_stdcall /* nothing */
+#endif
+
+#ifdef __cplusplus
+}
+#endif
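The `_cffi_prim_int` macro near the end of the header above selects a primitive-type code from a (size, signedness) pair. A table-driven Python sketch of the same dispatch (the `_CFFI_PRIM_*` strings stand in for the C enum values; `prim_int` is a hypothetical helper, not cffi's implementation):

```python
# Maps (size in bytes, signed?) to the primitive-type code name,
# mirroring the chained ternaries of the _cffi_prim_int macro.
PRIM_BY_SIZE = {
    (1, True): "_CFFI_PRIM_INT8",  (1, False): "_CFFI_PRIM_UINT8",
    (2, True): "_CFFI_PRIM_INT16", (2, False): "_CFFI_PRIM_UINT16",
    (4, True): "_CFFI_PRIM_INT32", (4, False): "_CFFI_PRIM_UINT32",
    (8, True): "_CFFI_PRIM_INT64", (8, False): "_CFFI_PRIM_UINT64",
}

def prim_int(size, signed):
    """Fall back to the 'unknown' code for unsupported sizes."""
    return PRIM_BY_SIZE.get((size, signed), "_CFFI__UNKNOWN_PRIM")

print(prim_int(4, True))   # _CFFI_PRIM_INT32
print(prim_int(3, False))  # _CFFI__UNKNOWN_PRIM
```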
diff --git a/templates/skills/file_manager/dependencies/cffi/_embedding.h b/templates/skills/file_manager/dependencies/cffi/_embedding.h
new file mode 100644
index 00000000..94d8b30a
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/_embedding.h
@@ -0,0 +1,550 @@
+
+/***** Support code for embedding *****/
+
+#ifdef __cplusplus
+extern "C" {
+#endif
+
+
+#if defined(_WIN32)
+# define CFFI_DLLEXPORT __declspec(dllexport)
+#elif defined(__GNUC__)
+# define CFFI_DLLEXPORT __attribute__((visibility("default")))
+#else
+# define CFFI_DLLEXPORT /* nothing */
+#endif
+
+
+/* There are two global variables of type _cffi_call_python_fnptr:
+
+ * _cffi_call_python, which we declare just below, is the one called
+ by ``extern "Python"`` implementations.
+
+ * _cffi_call_python_org, which on CPython is actually part of the
+ _cffi_exports[] array, is the function pointer copied from
+ _cffi_backend. If _cffi_start_python() fails, then this is set
+ to NULL; otherwise, it should never be NULL.
+
+ After initialization is complete, both are equal. However, the
+ first one remains equal to &_cffi_start_and_call_python until the
+ very end of initialization, when we are (or should be) sure that
+ concurrent threads also see a completely initialized world, and
+ only then is it changed.
+*/
+#undef _cffi_call_python
+typedef void (*_cffi_call_python_fnptr)(struct _cffi_externpy_s *, char *);
+static void _cffi_start_and_call_python(struct _cffi_externpy_s *, char *);
+static _cffi_call_python_fnptr _cffi_call_python = &_cffi_start_and_call_python;
+
+
+#ifndef _MSC_VER
+ /* --- Assuming a GCC not infinitely old --- */
+# define cffi_compare_and_swap(l,o,n) __sync_bool_compare_and_swap(l,o,n)
+# define cffi_write_barrier() __sync_synchronize()
+# if !defined(__amd64__) && !defined(__x86_64__) && \
+ !defined(__i386__) && !defined(__i386)
+# define cffi_read_barrier() __sync_synchronize()
+# else
+# define cffi_read_barrier() (void)0
+# endif
+#else
+ /* --- Windows threads version --- */
+# include <windows.h>
+# define cffi_compare_and_swap(l,o,n) \
+ (InterlockedCompareExchangePointer(l,n,o) == (o))
+# define cffi_write_barrier() InterlockedCompareExchange(&_cffi_dummy,0,0)
+# define cffi_read_barrier() (void)0
+static volatile LONG _cffi_dummy;
+#endif
+
+#ifdef WITH_THREAD
+# ifndef _MSC_VER
+# include <pthread.h>
+ static pthread_mutex_t _cffi_embed_startup_lock;
+# else
+ static CRITICAL_SECTION _cffi_embed_startup_lock;
+# endif
+ static char _cffi_embed_startup_lock_ready = 0;
+#endif
+
+static void _cffi_acquire_reentrant_mutex(void)
+{
+ static void *volatile lock = NULL;
+
+ while (!cffi_compare_and_swap(&lock, NULL, (void *)1)) {
+ /* should ideally do a spin loop instruction here, but
+ hard to do it portably and doesn't really matter I
+ think: pthread_mutex_init() should be very fast, and
+ this is only run at start-up anyway. */
+ }
+
+#ifdef WITH_THREAD
+ if (!_cffi_embed_startup_lock_ready) {
+# ifndef _MSC_VER
+ pthread_mutexattr_t attr;
+ pthread_mutexattr_init(&attr);
+ pthread_mutexattr_settype(&attr, PTHREAD_MUTEX_RECURSIVE);
+ pthread_mutex_init(&_cffi_embed_startup_lock, &attr);
+# else
+ InitializeCriticalSection(&_cffi_embed_startup_lock);
+# endif
+ _cffi_embed_startup_lock_ready = 1;
+ }
+#endif
+
+ while (!cffi_compare_and_swap(&lock, (void *)1, NULL))
+ ;
+
+#ifndef _MSC_VER
+ pthread_mutex_lock(&_cffi_embed_startup_lock);
+#else
+ EnterCriticalSection(&_cffi_embed_startup_lock);
+#endif
+}
+
+static void _cffi_release_reentrant_mutex(void)
+{
+#ifndef _MSC_VER
+ pthread_mutex_unlock(&_cffi_embed_startup_lock);
+#else
+ LeaveCriticalSection(&_cffi_embed_startup_lock);
+#endif
+}
+
+
+/********** CPython-specific section **********/
+#ifndef PYPY_VERSION
+
+#include "_cffi_errors.h"
+
+
+#define _cffi_call_python_org _cffi_exports[_CFFI_CPIDX]
+
+PyMODINIT_FUNC _CFFI_PYTHON_STARTUP_FUNC(void); /* forward */
+
+static void _cffi_py_initialize(void)
+{
+ /* XXX use initsigs=0, which "skips initialization registration of
+ signal handlers, which might be useful when Python is
+ embedded" according to the Python docs. But review and think
+ if it should be a user-controllable setting.
+
+ XXX we should also give a way to write errors to a buffer
+ instead of to stderr.
+
+ XXX if importing 'site' fails, CPython (any version) calls
+ exit(). Should we try to work around this behavior here?
+ */
+ Py_InitializeEx(0);
+}
+
+static int _cffi_initialize_python(void)
+{
+ /* This initializes Python, imports _cffi_backend, and then the
+ present .dll/.so is set up as a CPython C extension module.
+ */
+ int result;
+ PyGILState_STATE state;
+ PyObject *pycode=NULL, *global_dict=NULL, *x;
+ PyObject *builtins;
+
+ state = PyGILState_Ensure();
+
+ /* Call the initxxx() function from the present module. It will
+ create and initialize us as a CPython extension module, instead
+ of letting the startup Python code do it---it might reimport
+ the same .dll/.so and get maybe confused on some platforms.
+ It might also have troubles locating the .dll/.so again for all
+ I know.
+ */
+ (void)_CFFI_PYTHON_STARTUP_FUNC();
+ if (PyErr_Occurred())
+ goto error;
+
+ /* Now run the Python code provided to ffi.embedding_init_code().
+ */
+ pycode = Py_CompileString(_CFFI_PYTHON_STARTUP_CODE,
+ "<init code for '" _CFFI_MODULE_NAME "'>",
+ Py_file_input);
+ if (pycode == NULL)
+ goto error;
+ global_dict = PyDict_New();
+ if (global_dict == NULL)
+ goto error;
+ builtins = PyEval_GetBuiltins();
+ if (builtins == NULL)
+ goto error;
+ if (PyDict_SetItemString(global_dict, "__builtins__", builtins) < 0)
+ goto error;
+ x = PyEval_EvalCode(
+#if PY_MAJOR_VERSION < 3
+ (PyCodeObject *)
+#endif
+ pycode, global_dict, global_dict);
+ if (x == NULL)
+ goto error;
+ Py_DECREF(x);
+
+ /* Done! Now if we've been called from
+ _cffi_start_and_call_python() in an ``extern "Python"``, we can
+ only hope that the Python code did correctly set up the
+ corresponding @ffi.def_extern() function. Otherwise, the
+ general logic of ``extern "Python"`` functions (inside the
+ _cffi_backend module) will find that the reference is still
+ missing and print an error.
+ */
+ result = 0;
+ done:
+ Py_XDECREF(pycode);
+ Py_XDECREF(global_dict);
+ PyGILState_Release(state);
+ return result;
+
+ error:;
+ {
+ /* Print as much information as potentially useful.
+ Debugging load-time failures with embedding is not fun
+ */
+ PyObject *ecap;
+ PyObject *exception, *v, *tb, *f, *modules, *mod;
+ PyErr_Fetch(&exception, &v, &tb);
+ ecap = _cffi_start_error_capture();
+ f = PySys_GetObject((char *)"stderr");
+ if (f != NULL && f != Py_None) {
+ PyFile_WriteString(
+ "Failed to initialize the Python-CFFI embedding logic:\n\n", f);
+ }
+
+ if (exception != NULL) {
+ PyErr_NormalizeException(&exception, &v, &tb);
+ PyErr_Display(exception, v, tb);
+ }
+ Py_XDECREF(exception);
+ Py_XDECREF(v);
+ Py_XDECREF(tb);
+
+ if (f != NULL && f != Py_None) {
+ PyFile_WriteString("\nFrom: " _CFFI_MODULE_NAME
+ "\ncompiled with cffi version: 1.17.1"
+ "\n_cffi_backend module: ", f);
+ modules = PyImport_GetModuleDict();
+ mod = PyDict_GetItemString(modules, "_cffi_backend");
+ if (mod == NULL) {
+ PyFile_WriteString("not loaded", f);
+ }
+ else {
+ v = PyObject_GetAttrString(mod, "__file__");
+ PyFile_WriteObject(v, f, 0);
+ Py_XDECREF(v);
+ }
+ PyFile_WriteString("\nsys.path: ", f);
+ PyFile_WriteObject(PySys_GetObject((char *)"path"), f, 0);
+ PyFile_WriteString("\n\n", f);
+ }
+ _cffi_stop_error_capture(ecap);
+ }
+ result = -1;
+ goto done;
+}
+
+#if PY_VERSION_HEX < 0x03080000
+PyAPI_DATA(char *) _PyParser_TokenNames[]; /* from CPython */
+#endif
+
+static int _cffi_carefully_make_gil(void)
+{
+ /* This does the basic initialization of Python. It can be called
+ completely concurrently from unrelated threads. It assumes
+ that we don't hold the GIL before (if it exists), and we don't
+ hold it afterwards.
+
+ (What it really does used to be completely different in Python 2
+ and Python 3, with the Python 2 solution avoiding the spin-lock
+ around the Py_InitializeEx() call. However, after recent changes
+ to CPython 2.7 (issue #358) it no longer works. So we use the
+ Python 3 solution everywhere.)
+
+ This initializes Python by calling Py_InitializeEx().
+ Important: this must not be called concurrently at all.
+ So we use a global variable as a simple spin lock. This global
+ variable must be from 'libpythonX.Y.so', not from this
+ cffi-based extension module, because it must be shared from
+ different cffi-based extension modules.
+
+ In Python < 3.8, we choose
+ _PyParser_TokenNames[0] as a completely arbitrary pointer value
+ that is never written to. The default is to point to the
+ string "ENDMARKER". We change it temporarily to point to the
+ next character in that string. (Yes, I know it's REALLY
+ obscure.)
+
+ In Python >= 3.8, this string array is no longer writable, so
+ instead we pick PyCapsuleType.tp_version_tag. We can't change
+ Python < 3.8 because someone might use a mixture of cffi
+ embedded modules, some of which were compiled before this file
+ changed.
+
+ In Python >= 3.12, this stopped working because that particular
+ tp_version_tag gets modified during interpreter startup. It's
+ arguably a bad idea before 3.12 too, but again we can't change
+ that because someone might use a mixture of cffi embedded
+ modules, and no-one reported a bug so far. In Python >= 3.12
+ we go instead for PyCapsuleType.tp_as_buffer, which is supposed
+ to always be NULL. We write to it temporarily a pointer to
+ a struct full of NULLs, which is semantically the same.
+ */
+
+#ifdef WITH_THREAD
+# if PY_VERSION_HEX < 0x03080000
+ char *volatile *lock = (char *volatile *)_PyParser_TokenNames;
+ char *old_value, *locked_value;
+
+ while (1) { /* spin loop */
+ old_value = *lock;
+ locked_value = old_value + 1;
+ if (old_value[0] == 'E') {
+ assert(old_value[1] == 'N');
+ if (cffi_compare_and_swap(lock, old_value, locked_value))
+ break;
+ }
+ else {
+ assert(old_value[0] == 'N');
+ /* should ideally do a spin loop instruction here, but
+ hard to do it portably and doesn't really matter I
+ think: PyEval_InitThreads() should be very fast, and
+ this is only run at start-up anyway. */
+ }
+ }
+# else
+# if PY_VERSION_HEX < 0x030C0000
+ int volatile *lock = (int volatile *)&PyCapsule_Type.tp_version_tag;
+ int old_value, locked_value = -42;
+ assert(!(PyCapsule_Type.tp_flags & Py_TPFLAGS_HAVE_VERSION_TAG));
+# else
+ static struct ebp_s { PyBufferProcs buf; int mark; } empty_buffer_procs;
+ empty_buffer_procs.mark = -42;
+ PyBufferProcs *volatile *lock = (PyBufferProcs *volatile *)
+ &PyCapsule_Type.tp_as_buffer;
+ PyBufferProcs *old_value, *locked_value = &empty_buffer_procs.buf;
+# endif
+
+ while (1) { /* spin loop */
+ old_value = *lock;
+ if (old_value == 0) {
+ if (cffi_compare_and_swap(lock, old_value, locked_value))
+ break;
+ }
+ else {
+# if PY_VERSION_HEX < 0x030C0000
+ assert(old_value == locked_value);
+# else
+ /* The pointer should point to a possibly different
+ empty_buffer_procs from another C extension module */
+ assert(((struct ebp_s *)old_value)->mark == -42);
+# endif
+ /* should ideally do a spin loop instruction here, but
+ hard to do it portably and doesn't really matter I
+ think: PyEval_InitThreads() should be very fast, and
+ this is only run at start-up anyway. */
+ }
+ }
+# endif
+#endif
+
+ /* call Py_InitializeEx() */
+ if (!Py_IsInitialized()) {
+ _cffi_py_initialize();
+#if PY_VERSION_HEX < 0x03070000
+ PyEval_InitThreads();
+#endif
+ PyEval_SaveThread(); /* release the GIL */
+ /* the returned tstate must be the one that has been stored into the
+ autoTLSkey by _PyGILState_Init() called from Py_Initialize(). */
+ }
+ else {
+#if PY_VERSION_HEX < 0x03070000
+ /* PyEval_InitThreads() is always a no-op from CPython 3.7 */
+ PyGILState_STATE state = PyGILState_Ensure();
+ PyEval_InitThreads();
+ PyGILState_Release(state);
+#endif
+ }
+
+#ifdef WITH_THREAD
+ /* release the lock */
+ while (!cffi_compare_and_swap(lock, locked_value, old_value))
+ ;
+#endif
+
+ return 0;
+}
+
+/********** end CPython-specific section **********/
+
+
+#else
+
+
+/********** PyPy-specific section **********/
+
+PyMODINIT_FUNC _CFFI_PYTHON_STARTUP_FUNC(const void *[]); /* forward */
+
+static struct _cffi_pypy_init_s {
+ const char *name;
+ void *func; /* function pointer */
+ const char *code;
+} _cffi_pypy_init = {
+ _CFFI_MODULE_NAME,
+ _CFFI_PYTHON_STARTUP_FUNC,
+ _CFFI_PYTHON_STARTUP_CODE,
+};
+
+extern int pypy_carefully_make_gil(const char *);
+extern int pypy_init_embedded_cffi_module(int, struct _cffi_pypy_init_s *);
+
+static int _cffi_carefully_make_gil(void)
+{
+ return pypy_carefully_make_gil(_CFFI_MODULE_NAME);
+}
+
+static int _cffi_initialize_python(void)
+{
+ return pypy_init_embedded_cffi_module(0xB011, &_cffi_pypy_init);
+}
+
+/********** end PyPy-specific section **********/
+
+
+#endif
+
+
+#ifdef __GNUC__
+__attribute__((noinline))
+#endif
+static _cffi_call_python_fnptr _cffi_start_python(void)
+{
+ /* Delicate logic to initialize Python. This function can be
+ called multiple times concurrently, e.g. when the process calls
+ its first ``extern "Python"`` functions in multiple threads at
+ once. It can also be called recursively, in which case we must
+ ignore it. We also have to consider what occurs if several
+ different cffi-based extensions reach this code in parallel
+ threads---it is a different copy of the code, then, and we
+ can't have any shared global variable unless it comes from
+ 'libpythonX.Y.so'.
+
+ Idea:
+
+ * _cffi_carefully_make_gil(): "carefully" call
+ PyEval_InitThreads() (possibly with Py_InitializeEx() first).
+
+ * then we use a (local) custom lock to make sure that a call to this
+ cffi-based extension will wait if another call to the *same*
+ extension is running the initialization in another thread.
+ It is reentrant, so that a recursive call will not block, but
+ only one from a different thread.
+
+ * then we grab the GIL and (Python 2) we call Py_InitializeEx().
+ At this point, concurrent calls to Py_InitializeEx() are not
+ possible: we have the GIL.
+
+ * do the rest of the specific initialization, which may
+ temporarily release the GIL but not the custom lock.
+ Only release the custom lock when we are done.
+ */
+ static char called = 0;
+
+ if (_cffi_carefully_make_gil() != 0)
+ return NULL;
+
+ _cffi_acquire_reentrant_mutex();
+
+ /* Here the GIL exists, but we don't have it. We're only protected
+ from concurrency by the reentrant mutex. */
+
+ /* This file only initializes the embedded module once, the first
+ time this is called, even if there are subinterpreters. */
+ if (!called) {
+ called = 1; /* invoke _cffi_initialize_python() only once,
+ but don't set '_cffi_call_python' right now,
+ otherwise concurrent threads won't call
+ this function at all (we need them to wait) */
+ if (_cffi_initialize_python() == 0) {
+ /* now initialization is finished. Switch to the fast-path. */
+
+ /* We would like nobody to see the new value of
+ '_cffi_call_python' without also seeing the rest of the
+ data initialized. However, this is not possible. But
+ the new value of '_cffi_call_python' is the function
+ 'cffi_call_python()' from _cffi_backend. So: */
+ cffi_write_barrier();
+ /* ^^^ we put a write barrier here, and a corresponding
+ read barrier at the start of cffi_call_python(). This
+ ensures that after that read barrier, we see everything
+ done here before the write barrier.
+ */
+
+ assert(_cffi_call_python_org != NULL);
+ _cffi_call_python = (_cffi_call_python_fnptr)_cffi_call_python_org;
+ }
+ else {
+ /* initialization failed. Reset this to NULL, even if it was
+ already set to some other value. Future calls to
+ _cffi_start_python() are still forced to occur, and will
+ always return NULL from now on. */
+ _cffi_call_python_org = NULL;
+ }
+ }
+
+ _cffi_release_reentrant_mutex();
+
+ return (_cffi_call_python_fnptr)_cffi_call_python_org;
+}
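The initialization protocol that the comments in `_cffi_start_python()` describe — run the setup exactly once under a reentrant lock, even when called concurrently or recursively, and publish the fast-path function pointer only after setup fully completes — can be sketched in plain Python. This is an illustrative stand-in, not cffi's actual code; the names `start` and `_do_init` are hypothetical:

```python
import threading

_lock = threading.RLock()   # reentrant: a recursive call does not deadlock
_initialized = False
_fast_path = None           # published only once init has fully completed

def _do_init():
    # stand-in for the real initialization (Python startup + module setup)
    return lambda x: x + 1

def start():
    """Init exactly once; later (and concurrent) callers take the fast path."""
    global _initialized, _fast_path
    if _fast_path is not None:          # fast path: already initialized
        return _fast_path
    with _lock:
        if not _initialized:
            _initialized = True         # set first, so a recursive call
            _fast_path = _do_init()     # from _do_init() skips re-init
    return _fast_path
```

As in the C version, the "initialized" flag is set before the work is done so that a recursive call from inside the initialization does not re-enter it, while unrelated threads still block on the lock until the first caller finishes.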
+
+static
+void _cffi_start_and_call_python(struct _cffi_externpy_s *externpy, char *args)
+{
+ _cffi_call_python_fnptr fnptr;
+ int current_err = errno;
+#ifdef _MSC_VER
+ int current_lasterr = GetLastError();
+#endif
+ fnptr = _cffi_start_python();
+ if (fnptr == NULL) {
+ fprintf(stderr, "function %s() called, but initialization code "
+ "failed. Returning 0.\n", externpy->name);
+ memset(args, 0, externpy->size_of_result);
+ }
+#ifdef _MSC_VER
+ SetLastError(current_lasterr);
+#endif
+ errno = current_err;
+
+ if (fnptr != NULL)
+ fnptr(externpy, args);
+}
+
+
+/* The cffi_start_python() function makes sure Python is initialized
+ and our cffi module is set up. It can be called manually from the
+ user C code. The same effect is obtained automatically from any
+ dll-exported ``extern "Python"`` function. This function returns
+ -1 if initialization failed, 0 if all is OK. */
+_CFFI_UNUSED_FN
+static int cffi_start_python(void)
+{
+ if (_cffi_call_python == &_cffi_start_and_call_python) {
+ if (_cffi_start_python() == NULL)
+ return -1;
+ }
+ cffi_read_barrier();
+ return 0;
+}
+
+#undef cffi_compare_and_swap
+#undef cffi_write_barrier
+#undef cffi_read_barrier
+
+#ifdef __cplusplus
+}
+#endif
diff --git a/templates/skills/file_manager/dependencies/cffi/_imp_emulation.py b/templates/skills/file_manager/dependencies/cffi/_imp_emulation.py
new file mode 100644
index 00000000..136abddd
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/_imp_emulation.py
@@ -0,0 +1,83 @@
+
+try:
+ # this works on Python < 3.12
+ from imp import *
+
+except ImportError:
+ # this is a limited emulation for Python >= 3.12.
+ # Note that this is used only for tests or for the old ffi.verify().
+ # This is copied from the source code of Python 3.11.
+
+ from _imp import (acquire_lock, release_lock,
+ is_builtin, is_frozen)
+
+ from importlib._bootstrap import _load
+
+ from importlib import machinery
+ import os
+ import sys
+ import tokenize
+
+ SEARCH_ERROR = 0
+ PY_SOURCE = 1
+ PY_COMPILED = 2
+ C_EXTENSION = 3
+ PY_RESOURCE = 4
+ PKG_DIRECTORY = 5
+ C_BUILTIN = 6
+ PY_FROZEN = 7
+ PY_CODERESOURCE = 8
+ IMP_HOOK = 9
+
+ def get_suffixes():
+ extensions = [(s, 'rb', C_EXTENSION)
+ for s in machinery.EXTENSION_SUFFIXES]
+ source = [(s, 'r', PY_SOURCE) for s in machinery.SOURCE_SUFFIXES]
+ bytecode = [(s, 'rb', PY_COMPILED) for s in machinery.BYTECODE_SUFFIXES]
+ return extensions + source + bytecode
+
+ def find_module(name, path=None):
+ if not isinstance(name, str):
+ raise TypeError("'name' must be a str, not {}".format(type(name)))
+ elif not isinstance(path, (type(None), list)):
+ # Backwards-compatibility
+ raise RuntimeError("'path' must be None or a list, "
+ "not {}".format(type(path)))
+
+ if path is None:
+ if is_builtin(name):
+ return None, None, ('', '', C_BUILTIN)
+ elif is_frozen(name):
+ return None, None, ('', '', PY_FROZEN)
+ else:
+ path = sys.path
+
+ for entry in path:
+ package_directory = os.path.join(entry, name)
+ for suffix in ['.py', machinery.BYTECODE_SUFFIXES[0]]:
+ package_file_name = '__init__' + suffix
+ file_path = os.path.join(package_directory, package_file_name)
+ if os.path.isfile(file_path):
+ return None, package_directory, ('', '', PKG_DIRECTORY)
+ for suffix, mode, type_ in get_suffixes():
+ file_name = name + suffix
+ file_path = os.path.join(entry, file_name)
+ if os.path.isfile(file_path):
+ break
+ else:
+ continue
+ break # Break out of outer loop when breaking out of inner loop.
+ else:
+ raise ImportError(name, name=name)
+
+ encoding = None
+ if 'b' not in mode:
+ with open(file_path, 'rb') as file:
+ encoding = tokenize.detect_encoding(file.readline)[0]
+ file = open(file_path, mode, encoding=encoding)
+ return file, file_path, (suffix, mode, type_)
+
+ def load_dynamic(name, path, file=None):
+ loader = machinery.ExtensionFileLoader(name, path)
+ spec = machinery.ModuleSpec(name=name, loader=loader, origin=path)
+ return _load(spec)
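For context, the emulation above recreates the legacy `imp.find_module`/`load_dynamic` interface for Python >= 3.12. In modern code the equivalent lookup is done with `importlib.util.find_spec`, as in this minimal stdlib-only sketch (the module name `json` is an arbitrary example):

```python
import importlib.util

# find_spec is the importlib replacement for the legacy imp.find_module:
# it returns a ModuleSpec describing where the module lives, or None.
spec = importlib.util.find_spec("json")
assert spec is not None
print(spec.origin)  # filesystem path of the module's source file
```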
diff --git a/templates/skills/file_manager/dependencies/cffi/_shimmed_dist_utils.py b/templates/skills/file_manager/dependencies/cffi/_shimmed_dist_utils.py
new file mode 100644
index 00000000..c3d23128
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/_shimmed_dist_utils.py
@@ -0,0 +1,45 @@
+"""
+Temporary shim module to indirect the bits of distutils we need from setuptools/distutils while providing useful
+error messages beyond `No module named 'distutils'` on Python >= 3.12, or when setuptools' vendored distutils is broken.
+
+This is a compromise to avoid a hard-dep on setuptools for Python >= 3.12, since many users don't need runtime compilation support from CFFI.
+"""
+import sys
+
+try:
+ # import setuptools first; this is the most robust way to ensure its embedded distutils is available
+ # (the .pth shim should usually work, but this is even more robust)
+ import setuptools
+except Exception as ex:
+ if sys.version_info >= (3, 12):
+ # Python 3.12 has no built-in distutils to fall back on, so any import problem is fatal
+ raise Exception("This CFFI feature requires setuptools on Python >= 3.12. The setuptools module is missing or non-functional.") from ex
+
+ # silently ignore on older Pythons (support fallback to stdlib distutils where available)
+else:
+ del setuptools
+
+try:
+ # bring in just the bits of distutils we need, whether they really came from setuptools or stdlib-embedded distutils
+ from distutils import log, sysconfig
+ from distutils.ccompiler import CCompiler
+ from distutils.command.build_ext import build_ext
+ from distutils.core import Distribution, Extension
+ from distutils.dir_util import mkpath
+ from distutils.errors import DistutilsSetupError, CompileError, LinkError
+ from distutils.log import set_threshold, set_verbosity
+
+ if sys.platform == 'win32':
+ try:
+ # FUTURE: msvc9compiler module was removed in setuptools 74; consider removing, as it's only used by an ancient patch in `recompiler`
+ from distutils.msvc9compiler import MSVCCompiler
+ except ImportError:
+ MSVCCompiler = None
+except Exception as ex:
+ if sys.version_info >= (3, 12):
+ raise Exception("This CFFI feature requires setuptools on Python >= 3.12. Please install the setuptools package.") from ex
+
+ # anything older, just let the underlying distutils import error fly
+ raise Exception("This CFFI feature requires distutils. Please install the distutils or setuptools package.") from ex
+
+del sys
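The shim's error-handling strategy — attempt the import, and on failure raise a clearer exception chained to the original via `raise ... from ex` — is a general pattern. A minimal stdlib sketch, where the `require` helper and the module names are hypothetical:

```python
import sys

def require(modname, feature):
    """Import a dependency or fail with a clear, chained error,
    mirroring the shim's strategy for setuptools/distutils."""
    try:
        return __import__(modname)
    except ImportError as ex:
        # 'from ex' preserves the original ImportError as __cause__,
        # so the traceback shows both the root cause and the hint.
        raise RuntimeError(
            "%s requires the %r package on Python %d.%d" %
            (feature, modname, *sys.version_info[:2])) from ex
```

Chaining with `from ex` keeps the underlying import failure visible in the traceback while presenting an actionable top-level message, which is exactly what the shim does for Python >= 3.12 where stdlib `distutils` is gone.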
diff --git a/templates/skills/file_manager/dependencies/cffi/api.py b/templates/skills/file_manager/dependencies/cffi/api.py
new file mode 100644
index 00000000..5a474f3d
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/api.py
@@ -0,0 +1,967 @@
+import sys, types
+from .lock import allocate_lock
+from .error import CDefError
+from . import model
+
+try:
+ callable
+except NameError:
+ # Python 3.1
+ from collections import Callable
+ callable = lambda x: isinstance(x, Callable)
+
+try:
+ basestring
+except NameError:
+ # Python 3.x
+ basestring = str
+
+_unspecified = object()
+
+
+
+class FFI(object):
+ r'''
+ The main top-level class that you instantiate once, or once per module.
+
+ Example usage:
+
+ ffi = FFI()
+ ffi.cdef("""
+ int printf(const char *, ...);
+ """)
+
+ C = ffi.dlopen(None) # standard library
+ -or-
+ C = ffi.verify() # use a C compiler: verify the decl above is right
+
+ C.printf("hello, %s!\n", ffi.new("char[]", "world"))
+ '''
+
+ def __init__(self, backend=None):
+ """Create an FFI instance. The 'backend' argument is used to
+ select a non-default backend, mostly for tests.
+ """
+ if backend is None:
+ # You need PyPy (>= 2.0 beta), or a CPython (>= 2.6) with
+ # _cffi_backend.so compiled.
+ import _cffi_backend as backend
+ from . import __version__
+ if backend.__version__ != __version__:
+ # bad version! Try to be as explicit as possible.
+ if hasattr(backend, '__file__'):
+ # CPython
+ raise Exception("Version mismatch: this is the 'cffi' package version %s, located in %r. When we import the top-level '_cffi_backend' extension module, we get version %s, located in %r. The two versions should be equal; check your installation." % (
+ __version__, __file__,
+ backend.__version__, backend.__file__))
+ else:
+ # PyPy
+ raise Exception("Version mismatch: this is the 'cffi' package version %s, located in %r. This interpreter comes with a built-in '_cffi_backend' module, which is version %s. The two versions should be equal; check your installation." % (
+ __version__, __file__, backend.__version__))
+ # (If you insist you can also try to pass the option
+ # 'backend=backend_ctypes.CTypesBackend()', but don't
+ # rely on it! It's probably not going to work well.)
+
+ from . import cparser
+ self._backend = backend
+ self._lock = allocate_lock()
+ self._parser = cparser.Parser()
+ self._cached_btypes = {}
+ self._parsed_types = types.ModuleType('parsed_types').__dict__
+ self._new_types = types.ModuleType('new_types').__dict__
+ self._function_caches = []
+ self._libraries = []
+ self._cdefsources = []
+ self._included_ffis = []
+ self._windows_unicode = None
+ self._init_once_cache = {}
+ self._cdef_version = None
+ self._embedding = None
+ self._typecache = model.get_typecache(backend)
+ if hasattr(backend, 'set_ffi'):
+ backend.set_ffi(self)
+ for name in list(backend.__dict__):
+ if name.startswith('RTLD_'):
+ setattr(self, name, getattr(backend, name))
+ #
+ with self._lock:
+ self.BVoidP = self._get_cached_btype(model.voidp_type)
+ self.BCharA = self._get_cached_btype(model.char_array_type)
+ if isinstance(backend, types.ModuleType):
+ # _cffi_backend: attach these constants to the class
+ if not hasattr(FFI, 'NULL'):
+ FFI.NULL = self.cast(self.BVoidP, 0)
+ FFI.CData, FFI.CType = backend._get_types()
+ else:
+ # ctypes backend: attach these constants to the instance
+ self.NULL = self.cast(self.BVoidP, 0)
+ self.CData, self.CType = backend._get_types()
+ self.buffer = backend.buffer
+
+ def cdef(self, csource, override=False, packed=False, pack=None):
+ """Parse the given C source. This registers all declared functions,
+ types, and global variables. The functions and global variables can
+ then be accessed via either 'ffi.dlopen()' or 'ffi.verify()'.
+ The types can be used in 'ffi.new()' and other functions.
+ If 'packed' is specified as True, all structs declared inside this
+ cdef are packed, i.e. laid out without any field alignment at all.
+ Alternatively, 'pack' can be a small integer, and requests for
+ alignment greater than that are ignored (pack=1 is equivalent to
+ packed=True).
+ """
+ self._cdef(csource, override=override, packed=packed, pack=pack)
+
+ def embedding_api(self, csource, packed=False, pack=None):
+ self._cdef(csource, packed=packed, pack=pack, dllexport=True)
+ if self._embedding is None:
+ self._embedding = ''
+
+ def _cdef(self, csource, override=False, **options):
+ if not isinstance(csource, str): # unicode, on Python 2
+ if not isinstance(csource, basestring):
+ raise TypeError("cdef() argument must be a string")
+ csource = csource.encode('ascii')
+ with self._lock:
+ self._cdef_version = object()
+ self._parser.parse(csource, override=override, **options)
+ self._cdefsources.append(csource)
+ if override:
+ for cache in self._function_caches:
+ cache.clear()
+ finishlist = self._parser._recomplete
+ if finishlist:
+ self._parser._recomplete = []
+ for tp in finishlist:
+ tp.finish_backend_type(self, finishlist)
+
+ def dlopen(self, name, flags=0):
+ """Load and return a dynamic library identified by 'name'.
+ The standard C library can be loaded by passing None.
+ Note that functions and types declared by 'ffi.cdef()' are not
+ linked to a particular library, just like C headers; in the
+ library we only look for the actual (untyped) symbols.
+ """
+ if not (isinstance(name, basestring) or
+ name is None or
+ isinstance(name, self.CData)):
+ raise TypeError("dlopen(name): name must be a file name, None, "
+ "or an already-opened 'void *' handle")
+ with self._lock:
+ lib, function_cache = _make_ffi_library(self, name, flags)
+ self._function_caches.append(function_cache)
+ self._libraries.append(lib)
+ return lib
+
+ def dlclose(self, lib):
+ """Close a library obtained with ffi.dlopen(). After this call,
+ access to functions or variables from the library will fail
+ (possibly with a segmentation fault).
+ """
+ type(lib).__cffi_close__(lib)
+
+ def _typeof_locked(self, cdecl):
+ # call me with the lock!
+ key = cdecl
+ if key in self._parsed_types:
+ return self._parsed_types[key]
+ #
+ if not isinstance(cdecl, str): # unicode, on Python 2
+ cdecl = cdecl.encode('ascii')
+ #
+ type = self._parser.parse_type(cdecl)
+ really_a_function_type = type.is_raw_function
+ if really_a_function_type:
+ type = type.as_function_pointer()
+ btype = self._get_cached_btype(type)
+ result = btype, really_a_function_type
+ self._parsed_types[key] = result
+ return result
+
+ def _typeof(self, cdecl, consider_function_as_funcptr=False):
+ # string -> ctype object
+ try:
+ result = self._parsed_types[cdecl]
+ except KeyError:
+ with self._lock:
+ result = self._typeof_locked(cdecl)
+ #
+ btype, really_a_function_type = result
+ if really_a_function_type and not consider_function_as_funcptr:
+ raise CDefError("the type %r is a function type, not a "
+ "pointer-to-function type" % (cdecl,))
+ return btype
+
+ def typeof(self, cdecl):
+ """Parse the C type given as a string and return the
+ corresponding object.
+ It can also be used on a 'cdata' instance to get its C type.
+ """
+ if isinstance(cdecl, basestring):
+ return self._typeof(cdecl)
+ if isinstance(cdecl, self.CData):
+ return self._backend.typeof(cdecl)
+ if isinstance(cdecl, types.BuiltinFunctionType):
+ res = _builtin_function_type(cdecl)
+ if res is not None:
+ return res
+ if (isinstance(cdecl, types.FunctionType)
+ and hasattr(cdecl, '_cffi_base_type')):
+ with self._lock:
+ return self._get_cached_btype(cdecl._cffi_base_type)
+ raise TypeError(type(cdecl))
+
+ def sizeof(self, cdecl):
+ """Return the size in bytes of the argument. It can be a
+ string naming a C type, or a 'cdata' instance.
+ """
+ if isinstance(cdecl, basestring):
+ BType = self._typeof(cdecl)
+ return self._backend.sizeof(BType)
+ else:
+ return self._backend.sizeof(cdecl)
+
+ def alignof(self, cdecl):
+ """Return the natural alignment size in bytes of the C type
+ given as a string.
+ """
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return self._backend.alignof(cdecl)
+
+ def offsetof(self, cdecl, *fields_or_indexes):
+ """Return the offset of the named field inside the given
+ structure or array, which must be given as a C type name.
+ You can give several field names in case of nested structures.
+ You can also give numeric values which correspond to array
+ items, in case of an array type.
+ """
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return self._typeoffsetof(cdecl, *fields_or_indexes)[1]
+
+ def new(self, cdecl, init=None):
+ """Allocate an instance according to the specified C type and
+ return a pointer to it. The specified C type must be either a
+ pointer or an array: ``new('X *')`` allocates an X and returns
+ a pointer to it, whereas ``new('X[n]')`` allocates an array of
+ n X'es and returns an array referencing it (which works
+ mostly like a pointer, like in C). You can also use
+ ``new('X[]', n)`` to allocate an array of a non-constant
+ length n.
+
+ The memory is initialized following the rules of declaring a
+ global variable in C: by default it is zero-initialized, but
+ an explicit initializer can be given which can be used to
+ fill all or part of the memory.
+
+ When the returned object goes out of scope, the memory
+ is freed. In other words the returned object has
+ ownership of the value of type 'cdecl' that it points to. This
+ means that the raw data can be used as long as this object is
+ kept alive, but must not be used for a longer time. Be careful
+ about that when copying the pointer to the memory somewhere
+ else, e.g. into another structure.
+ """
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return self._backend.newp(cdecl, init)
+
+ def new_allocator(self, alloc=None, free=None,
+ should_clear_after_alloc=True):
+ """Return a new allocator, i.e. a function that behaves like ffi.new()
+ but uses the provided low-level 'alloc' and 'free' functions.
+
+ 'alloc' is called with the size as argument. If it returns NULL, a
+ MemoryError is raised. 'free' is called with the result of 'alloc'
+ as argument. Both can be either Python functions or direct C
+ functions. If 'free' is None, then no free function is called.
+ If both 'alloc' and 'free' are None, the default is used.
+
+ If 'should_clear_after_alloc' is set to False, then the memory
+ returned by 'alloc' is assumed to be already cleared (or you are
+ fine with garbage); otherwise CFFI will clear it.
+ """
+ compiled_ffi = self._backend.FFI()
+ allocator = compiled_ffi.new_allocator(alloc, free,
+ should_clear_after_alloc)
+ def allocate(cdecl, init=None):
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return allocator(cdecl, init)
+ return allocate
+
+ def cast(self, cdecl, source):
+ """Similar to a C cast: returns an instance of the named C
+ type initialized with the given 'source'. The source is
+ cast between integers or pointers of any type.
+ """
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return self._backend.cast(cdecl, source)
+
+ def string(self, cdata, maxlen=-1):
+ """Return a Python string (or unicode string) from the 'cdata'.
+ If 'cdata' is a pointer or array of characters or bytes, returns
+ the null-terminated string. The returned string extends until
+ the first null character, or at most 'maxlen' characters. If
+ 'cdata' is an array then 'maxlen' defaults to its length.
+
+ If 'cdata' is a pointer or array of wchar_t, returns a unicode
+ string following the same rules.
+
+ If 'cdata' is a single character or byte or a wchar_t, returns
+ it as a string or unicode string.
+
+ If 'cdata' is an enum, returns the value of the enumerator as a
+ string, or 'NUMBER' if the value is out of range.
+ """
+ return self._backend.string(cdata, maxlen)
+
+ def unpack(self, cdata, length):
+ """Unpack an array of C data of the given length,
+ returning a Python string/unicode/list.
+
+ If 'cdata' is a pointer to 'char', returns a byte string.
+ It does not stop at the first null. This is equivalent to:
+ ffi.buffer(cdata, length)[:]
+
+ If 'cdata' is a pointer to 'wchar_t', returns a unicode string.
+ 'length' is measured in wchar_t's; it is not the size in bytes.
+
+ If 'cdata' is a pointer to anything else, returns a list of
+ 'length' items. This is a faster equivalent to:
+ [cdata[i] for i in range(length)]
+ """
+ return self._backend.unpack(cdata, length)
+
+ #def buffer(self, cdata, size=-1):
+ # """Return a read-write buffer object that references the raw C data
+ # pointed to by the given 'cdata'. The 'cdata' must be a pointer or
+ # an array. Can be passed to functions expecting a buffer, or directly
+ # manipulated with:
+ #
+ # buf[:] get a copy of it in a regular string, or
+ # buf[idx] as a single character
+ # buf[:] = ...
+ # buf[idx] = ... change the content
+ # """
+ # note that 'buffer' is a type, set on this instance by __init__
+
+ def from_buffer(self, cdecl, python_buffer=_unspecified,
+ require_writable=False):
+ """Return a cdata of the given type pointing to the data of the
+ given Python object, which must support the buffer interface.
+ Note that this is not meant to be used on the built-in types
+ str or unicode (you can build 'char[]' arrays explicitly)
+ but only on objects containing large quantities of raw data
+ in some other format, like 'array.array' or numpy arrays.
+
+ The first argument is optional and defaults to 'char[]'.
+ """
+ if python_buffer is _unspecified:
+ cdecl, python_buffer = self.BCharA, cdecl
+ elif isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ return self._backend.from_buffer(cdecl, python_buffer,
+ require_writable)
+
+ def memmove(self, dest, src, n):
+ """ffi.memmove(dest, src, n) copies n bytes of memory from src to dest.
+
+ Like the C function memmove(), the memory areas may overlap;
+ apart from that it behaves like the C function memcpy().
+
+ 'src' can be any cdata ptr or array, or any Python buffer object.
+ 'dest' can be any cdata ptr or array, or a writable Python buffer
+ object. The size to copy, 'n', is always measured in bytes.
+
+ Unlike other methods, this one supports all Python buffers including
+ byte strings and bytearrays---but it still does not support
+ non-contiguous buffers.
+ """
+ return self._backend.memmove(dest, src, n)
+
+ def callback(self, cdecl, python_callable=None, error=None, onerror=None):
+ """Return a callback object or a decorator making such a
+ callback object. 'cdecl' must name a C function pointer type.
+ The callback invokes the specified 'python_callable' (which may
+ be provided either directly or via a decorator). Important: the
+ callback object must be manually kept alive for as long as the
+ callback may be invoked from the C level.
+ """
+ def callback_decorator_wrap(python_callable):
+ if not callable(python_callable):
+ raise TypeError("the 'python_callable' argument "
+ "is not callable")
+ return self._backend.callback(cdecl, python_callable,
+ error, onerror)
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl, consider_function_as_funcptr=True)
+ if python_callable is None:
+ return callback_decorator_wrap # decorator mode
+ else:
+ return callback_decorator_wrap(python_callable) # direct mode
+
+ def getctype(self, cdecl, replace_with=''):
+ """Return a string giving the C type 'cdecl', which may be itself
+ a string or a object. If 'replace_with' is given, it gives
+ extra text to append (or insert for more complicated C types), like
+ a variable name, or '*' to get actually the C type 'pointer-to-cdecl'.
+ """
+ if isinstance(cdecl, basestring):
+ cdecl = self._typeof(cdecl)
+ replace_with = replace_with.strip()
+ if (replace_with.startswith('*')
+ and '&[' in self._backend.getcname(cdecl, '&')):
+ replace_with = '(%s)' % replace_with
+ elif replace_with and not replace_with[0] in '[(':
+ replace_with = ' ' + replace_with
+ return self._backend.getcname(cdecl, replace_with)
+
+ def gc(self, cdata, destructor, size=0):
+ """Return a new cdata object that points to the same
+ data. Later, when this new cdata object is garbage-collected,
+ 'destructor(old_cdata_object)' will be called.
+
+ The optional 'size' gives an estimate of the size, used to
+ trigger the garbage collection more eagerly. So far only used
+ on PyPy. It tells the GC that the returned object keeps alive
+ roughly 'size' bytes of external memory.
+ """
+ return self._backend.gcp(cdata, destructor, size)
+
+ def _get_cached_btype(self, type):
+ assert self._lock.acquire(False) is False
+ # call me with the lock!
+ try:
+ BType = self._cached_btypes[type]
+ except KeyError:
+ finishlist = []
+ BType = type.get_cached_btype(self, finishlist)
+ for type in finishlist:
+ type.finish_backend_type(self, finishlist)
+ return BType
+
+ def verify(self, source='', tmpdir=None, **kwargs):
+ """Verify that the current ffi signatures compile on this
+ machine, and return a dynamic library object. The dynamic
+ library can be used to call functions and access global
+ variables declared in this 'ffi'. The library is compiled
+ by the C compiler: it gives you C-level API compatibility
+ (including calling macros). This is unlike 'ffi.dlopen()',
+ which requires binary compatibility in the signatures.
+ """
+ from .verifier import Verifier, _caller_dir_pycache
+ #
+ # If set_unicode(True) was called, insert the UNICODE and
+ # _UNICODE macro declarations
+ if self._windows_unicode:
+ self._apply_windows_unicode(kwargs)
+ #
+ # Set the tmpdir here, and not in Verifier.__init__: it picks
+ # up the caller's directory, which we want to be the caller of
+ # ffi.verify(), as opposed to the caller of Verifier().
+ tmpdir = tmpdir or _caller_dir_pycache()
+ #
+ # Make a Verifier() and use it to load the library.
+ self.verifier = Verifier(self, source, tmpdir, **kwargs)
+ lib = self.verifier.load_library()
+ #
+ # Save the loaded library for keep-alive purposes, even
+ # if the caller doesn't keep it alive itself (it should).
+ self._libraries.append(lib)
+ return lib
+
+ def _get_errno(self):
+ return self._backend.get_errno()
+ def _set_errno(self, errno):
+ self._backend.set_errno(errno)
+ errno = property(_get_errno, _set_errno, None,
+ "the value of 'errno' from/to the C calls")
+
+ def getwinerror(self, code=-1):
+ return self._backend.getwinerror(code)
+
+ def _pointer_to(self, ctype):
+ with self._lock:
+ return model.pointer_cache(self, ctype)
+
+ def addressof(self, cdata, *fields_or_indexes):
+ """Return the address of a <cdata 'struct-or-union'>.
+ If 'fields_or_indexes' are given, returns the address of that
+ field or array item in the structure or array, recursively in
+ case of nested structures.
+ """
+ try:
+ ctype = self._backend.typeof(cdata)
+ except TypeError:
+ if '__addressof__' in type(cdata).__dict__:
+ return type(cdata).__addressof__(cdata, *fields_or_indexes)
+ raise
+ if fields_or_indexes:
+ ctype, offset = self._typeoffsetof(ctype, *fields_or_indexes)
+ else:
+ if ctype.kind == "pointer":
+ raise TypeError("addressof(pointer)")
+ offset = 0
+ ctypeptr = self._pointer_to(ctype)
+ return self._backend.rawaddressof(ctypeptr, cdata, offset)
+
+ def _typeoffsetof(self, ctype, field_or_index, *fields_or_indexes):
+ ctype, offset = self._backend.typeoffsetof(ctype, field_or_index)
+ for field1 in fields_or_indexes:
+ ctype, offset1 = self._backend.typeoffsetof(ctype, field1, 1)
+ offset += offset1
+ return ctype, offset
+
+ def include(self, ffi_to_include):
+ """Includes the typedefs, structs, unions and enums defined
+ in another FFI instance. Usage is similar to a #include in C,
+ where a part of the program might include types defined in
+ another part for its own usage. Note that the include()
+ method has no effect on functions, constants and global
+ variables, which must anyway be accessed directly from the
+ lib object returned by the original FFI instance.
+ """
+ if not isinstance(ffi_to_include, FFI):
+ raise TypeError("ffi.include() expects an argument that is also of"
+ " type cffi.FFI, not %r" % (
+ type(ffi_to_include).__name__,))
+ if ffi_to_include is self:
+ raise ValueError("self.include(self)")
+ with ffi_to_include._lock:
+ with self._lock:
+ self._parser.include(ffi_to_include._parser)
+ self._cdefsources.append('[')
+ self._cdefsources.extend(ffi_to_include._cdefsources)
+ self._cdefsources.append(']')
+ self._included_ffis.append(ffi_to_include)
+
+ def new_handle(self, x):
+ return self._backend.newp_handle(self.BVoidP, x)
+
+ def from_handle(self, x):
+ return self._backend.from_handle(x)
+
+ def release(self, x):
+ self._backend.release(x)
+
+ def set_unicode(self, enabled_flag):
+ """Windows: if 'enabled_flag' is True, enable the UNICODE and
+ _UNICODE defines in C, and declare the types like TCHAR and LPTCSTR
+ to be (pointers to) wchar_t. If 'enabled_flag' is False,
+ declare these types to be (pointers to) plain 8-bit characters.
+ This is mostly for backward compatibility; you usually want True.
+ """
+ if self._windows_unicode is not None:
+ raise ValueError("set_unicode() can only be called once")
+ enabled_flag = bool(enabled_flag)
+ if enabled_flag:
+ self.cdef("typedef wchar_t TBYTE;"
+ "typedef wchar_t TCHAR;"
+ "typedef const wchar_t *LPCTSTR;"
+ "typedef const wchar_t *PCTSTR;"
+ "typedef wchar_t *LPTSTR;"
+ "typedef wchar_t *PTSTR;"
+ "typedef TBYTE *PTBYTE;"
+ "typedef TCHAR *PTCHAR;")
+ else:
+ self.cdef("typedef char TBYTE;"
+ "typedef char TCHAR;"
+ "typedef const char *LPCTSTR;"
+ "typedef const char *PCTSTR;"
+ "typedef char *LPTSTR;"
+ "typedef char *PTSTR;"
+ "typedef TBYTE *PTBYTE;"
+ "typedef TCHAR *PTCHAR;")
+ self._windows_unicode = enabled_flag
+
+ def _apply_windows_unicode(self, kwds):
+ defmacros = kwds.get('define_macros', ())
+ if not isinstance(defmacros, (list, tuple)):
+ raise TypeError("'define_macros' must be a list or tuple")
+ defmacros = list(defmacros) + [('UNICODE', '1'),
+ ('_UNICODE', '1')]
+ kwds['define_macros'] = defmacros
+
+ def _apply_embedding_fix(self, kwds):
+ # must include an argument like "-lpython2.7" for the compiler
+ def ensure(key, value):
+ lst = kwds.setdefault(key, [])
+ if value not in lst:
+ lst.append(value)
+ #
+ if '__pypy__' in sys.builtin_module_names:
+ import os
+ if sys.platform == "win32":
+ # we need 'libpypy-c.lib'. Current distributions of
+ # pypy (>= 4.1) contain it as 'libs/python27.lib'.
+ pythonlib = "python{0[0]}{0[1]}".format(sys.version_info)
+ if hasattr(sys, 'prefix'):
+ ensure('library_dirs', os.path.join(sys.prefix, 'libs'))
+ else:
+ # we need 'libpypy-c.{so,dylib}', which should be by
+ # default located in 'sys.prefix/bin' for installed
+ # systems.
+ if sys.version_info < (3,):
+ pythonlib = "pypy-c"
+ else:
+ pythonlib = "pypy3-c"
+ if hasattr(sys, 'prefix'):
+ ensure('library_dirs', os.path.join(sys.prefix, 'bin'))
+ # On uninstalled pypy's, the libpypy-c is typically found in
+ # .../pypy/goal/.
+ if hasattr(sys, 'prefix'):
+ ensure('library_dirs', os.path.join(sys.prefix, 'pypy', 'goal'))
+ else:
+ if sys.platform == "win32":
+ template = "python%d%d"
+ if hasattr(sys, 'gettotalrefcount'):
+ template += '_d'
+ else:
+ try:
+ import sysconfig
+ except ImportError: # 2.6
+ from cffi._shimmed_dist_utils import sysconfig
+ template = "python%d.%d"
+ if sysconfig.get_config_var('DEBUG_EXT'):
+ template += sysconfig.get_config_var('DEBUG_EXT')
+ pythonlib = (template %
+ (sys.hexversion >> 24, (sys.hexversion >> 16) & 0xff))
+ if hasattr(sys, 'abiflags'):
+ pythonlib += sys.abiflags
+ ensure('libraries', pythonlib)
+ if sys.platform == "win32":
+ ensure('extra_link_args', '/MANIFEST')
+
+ def set_source(self, module_name, source, source_extension='.c', **kwds):
+ import os
+ if hasattr(self, '_assigned_source'):
+ raise ValueError("set_source() cannot be called several times "
+ "per ffi object")
+ if not isinstance(module_name, basestring):
+ raise TypeError("'module_name' must be a string")
+ if os.sep in module_name or (os.altsep and os.altsep in module_name):
+ raise ValueError("'module_name' must not contain '/': use a dotted "
+ "name to make a 'package.module' location")
+ self._assigned_source = (str(module_name), source,
+ source_extension, kwds)
+
+ def set_source_pkgconfig(self, module_name, pkgconfig_libs, source,
+ source_extension='.c', **kwds):
+ from . import pkgconfig
+ if not isinstance(pkgconfig_libs, list):
+ raise TypeError("the pkgconfig_libs argument must be a list "
+ "of package names")
+ kwds2 = pkgconfig.flags_from_pkgconfig(pkgconfig_libs)
+ pkgconfig.merge_flags(kwds, kwds2)
+ self.set_source(module_name, source, source_extension, **kwds)
+
+ def distutils_extension(self, tmpdir='build', verbose=True):
+ from cffi._shimmed_dist_utils import mkpath
+ from .recompiler import recompile
+ #
+ if not hasattr(self, '_assigned_source'):
+ if hasattr(self, 'verifier'): # fallback, 'tmpdir' ignored
+ return self.verifier.get_extension()
+ raise ValueError("set_source() must be called before"
+ " distutils_extension()")
+ module_name, source, source_extension, kwds = self._assigned_source
+ if source is None:
+ raise TypeError("distutils_extension() is only for C extension "
+ "modules, not for dlopen()-style pure Python "
+ "modules")
+ mkpath(tmpdir)
+ ext, updated = recompile(self, module_name,
+ source, tmpdir=tmpdir, extradir=tmpdir,
+ source_extension=source_extension,
+ call_c_compiler=False, **kwds)
+ if verbose:
+ if updated:
+ sys.stderr.write("regenerated: %r\n" % (ext.sources[0],))
+ else:
+ sys.stderr.write("not modified: %r\n" % (ext.sources[0],))
+ return ext
+
+ def emit_c_code(self, filename):
+ from .recompiler import recompile
+ #
+ if not hasattr(self, '_assigned_source'):
+ raise ValueError("set_source() must be called before emit_c_code()")
+ module_name, source, source_extension, kwds = self._assigned_source
+ if source is None:
+ raise TypeError("emit_c_code() is only for C extension modules, "
+ "not for dlopen()-style pure Python modules")
+ recompile(self, module_name, source,
+ c_file=filename, call_c_compiler=False,
+ uses_ffiplatform=False, **kwds)
+
+ def emit_python_code(self, filename):
+ from .recompiler import recompile
+ #
+ if not hasattr(self, '_assigned_source'):
+ raise ValueError("set_source() must be called before emit_c_code()")
+ module_name, source, source_extension, kwds = self._assigned_source
+ if source is not None:
+ raise TypeError("emit_python_code() is only for dlopen()-style "
+ "pure Python modules, not for C extension modules")
+ recompile(self, module_name, source,
+ c_file=filename, call_c_compiler=False,
+ uses_ffiplatform=False, **kwds)
+
+ def compile(self, tmpdir='.', verbose=0, target=None, debug=None):
+ """The 'target' argument gives the final file name of the
+ compiled DLL. Use '*' to force distutils' choice, suitable for
+ regular CPython C API modules. Use a file name ending in '.*'
+ to ask for the system's default extension for dynamic libraries
+ (.so/.dll/.dylib).
+
+ The default is '*' when building a non-embedded C API extension,
+ and (module_name + '.*') when building an embedded library.
+ """
+ from .recompiler import recompile
+ #
+ if not hasattr(self, '_assigned_source'):
+ raise ValueError("set_source() must be called before compile()")
+ module_name, source, source_extension, kwds = self._assigned_source
+ return recompile(self, module_name, source, tmpdir=tmpdir,
+ target=target, source_extension=source_extension,
+ compiler_verbose=verbose, debug=debug, **kwds)
+
+ def init_once(self, func, tag):
+ # Read _init_once_cache[tag], which is either (False, lock) if
+ # we're calling the function now in some thread, or (True, result).
+ # Don't call setdefault() in most cases, to avoid allocating and
+ # immediately freeing a lock; but still use setdefault() to avoid
+ # races.
+ try:
+ x = self._init_once_cache[tag]
+ except KeyError:
+ x = self._init_once_cache.setdefault(tag, (False, allocate_lock()))
+ # Common case: we got (True, result), so we return the result.
+ if x[0]:
+ return x[1]
+ # Else, it's a lock. Acquire it to serialize the following tests.
+ with x[1]:
+ # Read again from _init_once_cache the current status.
+ x = self._init_once_cache[tag]
+ if x[0]:
+ return x[1]
+ # Call the function and store the result back.
+ result = func()
+ self._init_once_cache[tag] = (True, result)
+ return result
+
+ def embedding_init_code(self, pysource):
+ if self._embedding:
+ raise ValueError("embedding_init_code() can only be called once")
+ # fix 'pysource' before it gets dumped into the C file:
+ # - remove empty lines at the beginning, so it starts at "line 1"
+ # - dedent, if all non-empty lines are indented
+ # - check for SyntaxErrors
+ import re
+ match = re.match(r'\s*\n', pysource)
+ if match:
+ pysource = pysource[match.end():]
+ lines = pysource.splitlines() or ['']
+ prefix = re.match(r'\s*', lines[0]).group()
+ for i in range(1, len(lines)):
+ line = lines[i]
+ if line.rstrip():
+ while not line.startswith(prefix):
+ prefix = prefix[:-1]
+ i = len(prefix)
+ lines = [line[i:]+'\n' for line in lines]
+ pysource = ''.join(lines)
+ #
+ compile(pysource, "cffi_init", "exec")
+ #
+ self._embedding = pysource
+
+ def def_extern(self, *args, **kwds):
+ raise ValueError("ffi.def_extern() is only available on API-mode FFI "
+ "objects")
+
+ def list_types(self):
+ """Returns the user type names known to this FFI instance.
+ This returns a tuple containing three lists of names:
+ (typedef_names, names_of_structs, names_of_unions)
+ """
+ typedefs = []
+ structs = []
+ unions = []
+ for key in self._parser._declarations:
+ if key.startswith('typedef '):
+ typedefs.append(key[8:])
+ elif key.startswith('struct '):
+ structs.append(key[7:])
+ elif key.startswith('union '):
+ unions.append(key[6:])
+ typedefs.sort()
+ structs.sort()
+ unions.sort()
+ return (typedefs, structs, unions)
+
+
+def _load_backend_lib(backend, name, flags):
+ import os
+ if not isinstance(name, basestring):
+ if sys.platform != "win32" or name is not None:
+ return backend.load_library(name, flags)
+ name = "c" # Windows: load_library(None) fails, but this works
+ # on Python 2 (backward compatibility hack only)
+ first_error = None
+ if '.' in name or '/' in name or os.sep in name:
+ try:
+ return backend.load_library(name, flags)
+ except OSError as e:
+ first_error = e
+ import ctypes.util
+ path = ctypes.util.find_library(name)
+ if path is None:
+ if name == "c" and sys.platform == "win32" and sys.version_info >= (3,):
+ raise OSError("dlopen(None) cannot work on Windows for Python 3 "
+ "(see http://bugs.python.org/issue23606)")
+ msg = ("ctypes.util.find_library() did not manage "
+ "to locate a library called %r" % (name,))
+ if first_error is not None:
+ msg = "%s. Additionally, %s" % (first_error, msg)
+ raise OSError(msg)
+ return backend.load_library(path, flags)
+
+def _make_ffi_library(ffi, libname, flags):
+ backend = ffi._backend
+ backendlib = _load_backend_lib(backend, libname, flags)
+ #
+ def accessor_function(name):
+ key = 'function ' + name
+ tp, _ = ffi._parser._declarations[key]
+ BType = ffi._get_cached_btype(tp)
+ value = backendlib.load_function(BType, name)
+ library.__dict__[name] = value
+ #
+ def accessor_variable(name):
+ key = 'variable ' + name
+ tp, _ = ffi._parser._declarations[key]
+ BType = ffi._get_cached_btype(tp)
+ read_variable = backendlib.read_variable
+ write_variable = backendlib.write_variable
+ setattr(FFILibrary, name, property(
+ lambda self: read_variable(BType, name),
+ lambda self, value: write_variable(BType, name, value)))
+ #
+ def addressof_var(name):
+ try:
+ return addr_variables[name]
+ except KeyError:
+ with ffi._lock:
+ if name not in addr_variables:
+ key = 'variable ' + name
+ tp, _ = ffi._parser._declarations[key]
+ BType = ffi._get_cached_btype(tp)
+ if BType.kind != 'array':
+ BType = model.pointer_cache(ffi, BType)
+ p = backendlib.load_function(BType, name)
+ addr_variables[name] = p
+ return addr_variables[name]
+ #
+ def accessor_constant(name):
+ raise NotImplementedError("non-integer constant '%s' cannot be "
+ "accessed from a dlopen() library" % (name,))
+ #
+ def accessor_int_constant(name):
+ library.__dict__[name] = ffi._parser._int_constants[name]
+ #
+ accessors = {}
+ accessors_version = [False]
+ addr_variables = {}
+ #
+ def update_accessors():
+ if accessors_version[0] is ffi._cdef_version:
+ return
+ #
+ for key, (tp, _) in ffi._parser._declarations.items():
+ if not isinstance(tp, model.EnumType):
+ tag, name = key.split(' ', 1)
+ if tag == 'function':
+ accessors[name] = accessor_function
+ elif tag == 'variable':
+ accessors[name] = accessor_variable
+ elif tag == 'constant':
+ accessors[name] = accessor_constant
+ else:
+ for i, enumname in enumerate(tp.enumerators):
+ def accessor_enum(name, tp=tp, i=i):
+ tp.check_not_partial()
+ library.__dict__[name] = tp.enumvalues[i]
+ accessors[enumname] = accessor_enum
+ for name in ffi._parser._int_constants:
+ accessors.setdefault(name, accessor_int_constant)
+ accessors_version[0] = ffi._cdef_version
+ #
+ def make_accessor(name):
+ with ffi._lock:
+ if name in library.__dict__ or name in FFILibrary.__dict__:
+ return # added by another thread while waiting for the lock
+ if name not in accessors:
+ update_accessors()
+ if name not in accessors:
+ raise AttributeError(name)
+ accessors[name](name)
+ #
+ class FFILibrary(object):
+ def __getattr__(self, name):
+ make_accessor(name)
+ return getattr(self, name)
+ def __setattr__(self, name, value):
+ try:
+ property = getattr(self.__class__, name)
+ except AttributeError:
+ make_accessor(name)
+ setattr(self, name, value)
+ else:
+ property.__set__(self, value)
+ def __dir__(self):
+ with ffi._lock:
+ update_accessors()
+ return accessors.keys()
+ def __addressof__(self, name):
+ if name in library.__dict__:
+ return library.__dict__[name]
+ if name in FFILibrary.__dict__:
+ return addressof_var(name)
+ make_accessor(name)
+ if name in library.__dict__:
+ return library.__dict__[name]
+ if name in FFILibrary.__dict__:
+ return addressof_var(name)
+ raise AttributeError("cffi library has no function or "
+ "global variable named '%s'" % (name,))
+ def __cffi_close__(self):
+ backendlib.close_lib()
+ self.__dict__.clear()
+ #
+ if isinstance(libname, basestring):
+ try:
+ if not isinstance(libname, str): # unicode, on Python 2
+ libname = libname.encode('utf-8')
+ FFILibrary.__name__ = 'FFILibrary_%s' % libname
+ except UnicodeError:
+ pass
+ library = FFILibrary()
+ return library, library.__dict__
+
+def _builtin_function_type(func):
+ # a hack to make at least ffi.typeof(builtin_function) work,
+ # if the builtin function was obtained by 'vengine_cpy'.
+ import sys
+ try:
+ module = sys.modules[func.__module__]
+ ffi = module._cffi_original_ffi
+ types_of_builtin_funcs = module._cffi_types_of_builtin_funcs
+ tp = types_of_builtin_funcs[func]
+ except (KeyError, AttributeError, TypeError):
+ return None
+ else:
+ with ffi._lock:
+ return ffi._get_cached_btype(tp)
diff --git a/templates/skills/file_manager/dependencies/cffi/backend_ctypes.py b/templates/skills/file_manager/dependencies/cffi/backend_ctypes.py
new file mode 100644
index 00000000..e7956a79
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/backend_ctypes.py
@@ -0,0 +1,1121 @@
+import ctypes, ctypes.util, operator, sys
+from . import model
+
+if sys.version_info < (3,):
+ bytechr = chr
+else:
+ unicode = str
+ long = int
+ xrange = range
+ bytechr = lambda num: bytes([num])
+
+class CTypesType(type):
+ pass
+
+class CTypesData(object):
+ __metaclass__ = CTypesType
+ __slots__ = ['__weakref__']
+ __name__ = ''
+
+ def __init__(self, *args):
+ raise TypeError("cannot instantiate %r" % (self.__class__,))
+
+ @classmethod
+ def _newp(cls, init):
+ raise TypeError("expected a pointer or array ctype, got '%s'"
+ % (cls._get_c_name(),))
+
+ @staticmethod
+ def _to_ctypes(value):
+ raise TypeError
+
+ @classmethod
+ def _arg_to_ctypes(cls, *value):
+ try:
+ ctype = cls._ctype
+ except AttributeError:
+ raise TypeError("cannot create an instance of %r" % (cls,))
+ if value:
+ res = cls._to_ctypes(*value)
+ if not isinstance(res, ctype):
+ res = cls._ctype(res)
+ else:
+ res = cls._ctype()
+ return res
+
+ @classmethod
+ def _create_ctype_obj(cls, init):
+ if init is None:
+ return cls._arg_to_ctypes()
+ else:
+ return cls._arg_to_ctypes(init)
+
+ @staticmethod
+ def _from_ctypes(ctypes_value):
+ raise TypeError
+
+ @classmethod
+ def _get_c_name(cls, replace_with=''):
+ return cls._reftypename.replace(' &', replace_with)
+
+ @classmethod
+ def _fix_class(cls):
+ cls.__name__ = 'CData<%s>' % (cls._get_c_name(),)
+ cls.__qualname__ = 'CData<%s>' % (cls._get_c_name(),)
+ cls.__module__ = 'ffi'
+
+ def _get_own_repr(self):
+ raise NotImplementedError
+
+ def _addr_repr(self, address):
+ if address == 0:
+ return 'NULL'
+ else:
+ if address < 0:
+ address += 1 << (8*ctypes.sizeof(ctypes.c_void_p))
+ return '0x%x' % address
+
+ def __repr__(self, c_name=None):
+ own = self._get_own_repr()
+ return '<cdata %r %s>' % (c_name or self._get_c_name(), own)
+
+ def _convert_to_address(self, BClass):
+ if BClass is None:
+ raise TypeError("cannot convert %r to an address" % (
+ self._get_c_name(),))
+ else:
+ raise TypeError("cannot convert %r to %r" % (
+ self._get_c_name(), BClass._get_c_name()))
+
+ @classmethod
+ def _get_size(cls):
+ return ctypes.sizeof(cls._ctype)
+
+ def _get_size_of_instance(self):
+ return ctypes.sizeof(self._ctype)
+
+ @classmethod
+ def _cast_from(cls, source):
+ raise TypeError("cannot cast to %r" % (cls._get_c_name(),))
+
+ def _cast_to_integer(self):
+ return self._convert_to_address(None)
+
+ @classmethod
+ def _alignment(cls):
+ return ctypes.alignment(cls._ctype)
+
+ def __iter__(self):
+ raise TypeError("cdata %r does not support iteration" % (
+ self._get_c_name()),)
+
+ def _make_cmp(name):
+ cmpfunc = getattr(operator, name)
+ def cmp(self, other):
+ v_is_ptr = not isinstance(self, CTypesGenericPrimitive)
+ w_is_ptr = (isinstance(other, CTypesData) and
+ not isinstance(other, CTypesGenericPrimitive))
+ if v_is_ptr and w_is_ptr:
+ return cmpfunc(self._convert_to_address(None),
+ other._convert_to_address(None))
+ elif v_is_ptr or w_is_ptr:
+ return NotImplemented
+ else:
+ if isinstance(self, CTypesGenericPrimitive):
+ self = self._value
+ if isinstance(other, CTypesGenericPrimitive):
+ other = other._value
+ return cmpfunc(self, other)
+ cmp.func_name = name
+ return cmp
+
+ __eq__ = _make_cmp('__eq__')
+ __ne__ = _make_cmp('__ne__')
+ __lt__ = _make_cmp('__lt__')
+ __le__ = _make_cmp('__le__')
+ __gt__ = _make_cmp('__gt__')
+ __ge__ = _make_cmp('__ge__')
+
+ def __hash__(self):
+ return hash(self._convert_to_address(None))
+
+ def _to_string(self, maxlen):
+ raise TypeError("string(): %r" % (self,))
+
+
+class CTypesGenericPrimitive(CTypesData):
+ __slots__ = []
+
+ def __hash__(self):
+ return hash(self._value)
+
+ def _get_own_repr(self):
+ return repr(self._from_ctypes(self._value))
+
+
+class CTypesGenericArray(CTypesData):
+ __slots__ = []
+
+ @classmethod
+ def _newp(cls, init):
+ return cls(init)
+
+ def __iter__(self):
+ for i in xrange(len(self)):
+ yield self[i]
+
+ def _get_own_repr(self):
+ return self._addr_repr(ctypes.addressof(self._blob))
+
+
+class CTypesGenericPtr(CTypesData):
+ __slots__ = ['_address', '_as_ctype_ptr']
+ _automatic_casts = False
+ kind = "pointer"
+
+ @classmethod
+ def _newp(cls, init):
+ return cls(init)
+
+ @classmethod
+ def _cast_from(cls, source):
+ if source is None:
+ address = 0
+ elif isinstance(source, CTypesData):
+ address = source._cast_to_integer()
+ elif isinstance(source, (int, long)):
+ address = source
+ else:
+ raise TypeError("bad type for cast to %r: %r" %
+ (cls, type(source).__name__))
+ return cls._new_pointer_at(address)
+
+ @classmethod
+ def _new_pointer_at(cls, address):
+ self = cls.__new__(cls)
+ self._address = address
+ self._as_ctype_ptr = ctypes.cast(address, cls._ctype)
+ return self
+
+ def _get_own_repr(self):
+ try:
+ return self._addr_repr(self._address)
+ except AttributeError:
+ return '???'
+
+ def _cast_to_integer(self):
+ return self._address
+
+ def __nonzero__(self):
+ return bool(self._address)
+ __bool__ = __nonzero__
+
+ @classmethod
+ def _to_ctypes(cls, value):
+ if not isinstance(value, CTypesData):
+ raise TypeError("unexpected %s object" % type(value).__name__)
+ address = value._convert_to_address(cls)
+ return ctypes.cast(address, cls._ctype)
+
+ @classmethod
+ def _from_ctypes(cls, ctypes_ptr):
+ address = ctypes.cast(ctypes_ptr, ctypes.c_void_p).value or 0
+ return cls._new_pointer_at(address)
+
+ @classmethod
+ def _initialize(cls, ctypes_ptr, value):
+ if value:
+ ctypes_ptr.contents = cls._to_ctypes(value).contents
+
+ def _convert_to_address(self, BClass):
+ if (BClass in (self.__class__, None) or BClass._automatic_casts
+ or self._automatic_casts):
+ return self._address
+ else:
+ return CTypesData._convert_to_address(self, BClass)
+
+
+class CTypesBaseStructOrUnion(CTypesData):
+ __slots__ = ['_blob']
+
+ @classmethod
+ def _create_ctype_obj(cls, init):
+ # may be overridden
+ raise TypeError("cannot instantiate opaque type %s" % (cls,))
+
+ def _get_own_repr(self):
+ return self._addr_repr(ctypes.addressof(self._blob))
+
+ @classmethod
+ def _offsetof(cls, fieldname):
+ return getattr(cls._ctype, fieldname).offset
+
+ def _convert_to_address(self, BClass):
+ if getattr(BClass, '_BItem', None) is self.__class__:
+ return ctypes.addressof(self._blob)
+ else:
+ return CTypesData._convert_to_address(self, BClass)
+
+ @classmethod
+ def _from_ctypes(cls, ctypes_struct_or_union):
+ self = cls.__new__(cls)
+ self._blob = ctypes_struct_or_union
+ return self
+
+ @classmethod
+ def _to_ctypes(cls, value):
+ return value._blob
+
+ def __repr__(self, c_name=None):
+ return CTypesData.__repr__(self, c_name or self._get_c_name(' &'))
+
+
+class CTypesBackend(object):
+
+ PRIMITIVE_TYPES = {
+ 'char': ctypes.c_char,
+ 'short': ctypes.c_short,
+ 'int': ctypes.c_int,
+ 'long': ctypes.c_long,
+ 'long long': ctypes.c_longlong,
+ 'signed char': ctypes.c_byte,
+ 'unsigned char': ctypes.c_ubyte,
+ 'unsigned short': ctypes.c_ushort,
+ 'unsigned int': ctypes.c_uint,
+ 'unsigned long': ctypes.c_ulong,
+ 'unsigned long long': ctypes.c_ulonglong,
+ 'float': ctypes.c_float,
+ 'double': ctypes.c_double,
+ '_Bool': ctypes.c_bool,
+ }
+
+ for _name in ['unsigned long long', 'unsigned long',
+ 'unsigned int', 'unsigned short', 'unsigned char']:
+ _size = ctypes.sizeof(PRIMITIVE_TYPES[_name])
+ PRIMITIVE_TYPES['uint%d_t' % (8*_size)] = PRIMITIVE_TYPES[_name]
+ if _size == ctypes.sizeof(ctypes.c_void_p):
+ PRIMITIVE_TYPES['uintptr_t'] = PRIMITIVE_TYPES[_name]
+ if _size == ctypes.sizeof(ctypes.c_size_t):
+ PRIMITIVE_TYPES['size_t'] = PRIMITIVE_TYPES[_name]
+
+ for _name in ['long long', 'long', 'int', 'short', 'signed char']:
+ _size = ctypes.sizeof(PRIMITIVE_TYPES[_name])
+ PRIMITIVE_TYPES['int%d_t' % (8*_size)] = PRIMITIVE_TYPES[_name]
+ if _size == ctypes.sizeof(ctypes.c_void_p):
+ PRIMITIVE_TYPES['intptr_t'] = PRIMITIVE_TYPES[_name]
+ PRIMITIVE_TYPES['ptrdiff_t'] = PRIMITIVE_TYPES[_name]
+ if _size == ctypes.sizeof(ctypes.c_size_t):
+ PRIMITIVE_TYPES['ssize_t'] = PRIMITIVE_TYPES[_name]
+
+
+ def __init__(self):
+ self.RTLD_LAZY = 0 # not supported anyway by ctypes
+ self.RTLD_NOW = 0
+ self.RTLD_GLOBAL = ctypes.RTLD_GLOBAL
+ self.RTLD_LOCAL = ctypes.RTLD_LOCAL
+
+ def set_ffi(self, ffi):
+ self.ffi = ffi
+
+ def _get_types(self):
+ return CTypesData, CTypesType
+
+ def load_library(self, path, flags=0):
+ cdll = ctypes.CDLL(path, flags)
+ return CTypesLibrary(self, cdll)
+
+ def new_void_type(self):
+ class CTypesVoid(CTypesData):
+ __slots__ = []
+ _reftypename = 'void &'
+ @staticmethod
+ def _from_ctypes(novalue):
+ return None
+ @staticmethod
+ def _to_ctypes(novalue):
+ if novalue is not None:
+ raise TypeError("None expected, got %s object" %
+ (type(novalue).__name__,))
+ return None
+ CTypesVoid._fix_class()
+ return CTypesVoid
+
+ def new_primitive_type(self, name):
+ if name == 'wchar_t':
+ raise NotImplementedError(name)
+ ctype = self.PRIMITIVE_TYPES[name]
+ if name == 'char':
+ kind = 'char'
+ elif name in ('float', 'double'):
+ kind = 'float'
+ else:
+ if name in ('signed char', 'unsigned char'):
+ kind = 'byte'
+ elif name == '_Bool':
+ kind = 'bool'
+ else:
+ kind = 'int'
+ is_signed = (ctype(-1).value == -1)
+ #
+ def _cast_source_to_int(source):
+ if isinstance(source, (int, long, float)):
+ source = int(source)
+ elif isinstance(source, CTypesData):
+ source = source._cast_to_integer()
+ elif isinstance(source, bytes):
+ source = ord(source)
+ elif source is None:
+ source = 0
+ else:
+ raise TypeError("bad type for cast to %r: %r" %
+ (CTypesPrimitive, type(source).__name__))
+ return source
+ #
+ kind1 = kind
+ class CTypesPrimitive(CTypesGenericPrimitive):
+ __slots__ = ['_value']
+ _ctype = ctype
+ _reftypename = '%s &' % name
+ kind = kind1
+
+ def __init__(self, value):
+ self._value = value
+
+ @staticmethod
+ def _create_ctype_obj(init):
+ if init is None:
+ return ctype()
+ return ctype(CTypesPrimitive._to_ctypes(init))
+
+ if kind == 'int' or kind == 'byte':
+ @classmethod
+ def _cast_from(cls, source):
+ source = _cast_source_to_int(source)
+ source = ctype(source).value # cast within range
+ return cls(source)
+ def __int__(self):
+ return self._value
+
+ if kind == 'bool':
+ @classmethod
+ def _cast_from(cls, source):
+ if not isinstance(source, (int, long, float)):
+ source = _cast_source_to_int(source)
+ return cls(bool(source))
+ def __int__(self):
+ return int(self._value)
+
+ if kind == 'char':
+ @classmethod
+ def _cast_from(cls, source):
+ source = _cast_source_to_int(source)
+ source = bytechr(source & 0xFF)
+ return cls(source)
+ def __int__(self):
+ return ord(self._value)
+
+ if kind == 'float':
+ @classmethod
+ def _cast_from(cls, source):
+ if isinstance(source, float):
+ pass
+ elif isinstance(source, CTypesGenericPrimitive):
+ if hasattr(source, '__float__'):
+ source = float(source)
+ else:
+ source = int(source)
+ else:
+ source = _cast_source_to_int(source)
+ source = ctype(source).value # fix precision
+ return cls(source)
+ def __int__(self):
+ return int(self._value)
+ def __float__(self):
+ return self._value
+
+ _cast_to_integer = __int__
+
+ if kind == 'int' or kind == 'byte' or kind == 'bool':
+ @staticmethod
+ def _to_ctypes(x):
+ if not isinstance(x, (int, long)):
+ if isinstance(x, CTypesData):
+ x = int(x)
+ else:
+ raise TypeError("integer expected, got %s" %
+ type(x).__name__)
+ if ctype(x).value != x:
+ if not is_signed and x < 0:
+ raise OverflowError("%s: negative integer" % name)
+ else:
+ raise OverflowError("%s: integer out of bounds"
+ % name)
+ return x
+
+ if kind == 'char':
+ @staticmethod
+ def _to_ctypes(x):
+ if isinstance(x, bytes) and len(x) == 1:
+ return x
+ if isinstance(x, CTypesPrimitive): # <CData <char>>
+ return x._value
+ raise TypeError("character expected, got %s" %
+ type(x).__name__)
+ def __nonzero__(self):
+ return ord(self._value) != 0
+ else:
+ def __nonzero__(self):
+ return self._value != 0
+ __bool__ = __nonzero__
+
+ if kind == 'float':
+ @staticmethod
+ def _to_ctypes(x):
+ if not isinstance(x, (int, long, float, CTypesData)):
+ raise TypeError("float expected, got %s" %
+ type(x).__name__)
+ return ctype(x).value
+
+ @staticmethod
+ def _from_ctypes(value):
+ return getattr(value, 'value', value)
+
+ @staticmethod
+ def _initialize(blob, init):
+ blob.value = CTypesPrimitive._to_ctypes(init)
+
+ if kind == 'char':
+ def _to_string(self, maxlen):
+ return self._value
+ if kind == 'byte':
+ def _to_string(self, maxlen):
+ return chr(self._value & 0xff)
+ #
+ CTypesPrimitive._fix_class()
+ return CTypesPrimitive
+
+ def new_pointer_type(self, BItem):
+ getbtype = self.ffi._get_cached_btype
+ if BItem is getbtype(model.PrimitiveType('char')):
+ kind = 'charp'
+ elif BItem in (getbtype(model.PrimitiveType('signed char')),
+ getbtype(model.PrimitiveType('unsigned char'))):
+ kind = 'bytep'
+ elif BItem is getbtype(model.void_type):
+ kind = 'voidp'
+ else:
+ kind = 'generic'
+ #
+ class CTypesPtr(CTypesGenericPtr):
+ __slots__ = ['_own']
+ if kind == 'charp':
+ __slots__ += ['__as_strbuf']
+ _BItem = BItem
+ if hasattr(BItem, '_ctype'):
+ _ctype = ctypes.POINTER(BItem._ctype)
+ _bitem_size = ctypes.sizeof(BItem._ctype)
+ else:
+ _ctype = ctypes.c_void_p
+ if issubclass(BItem, CTypesGenericArray):
+ _reftypename = BItem._get_c_name('(* &)')
+ else:
+ _reftypename = BItem._get_c_name(' * &')
+
+ def __init__(self, init):
+ ctypeobj = BItem._create_ctype_obj(init)
+ if kind == 'charp':
+ self.__as_strbuf = ctypes.create_string_buffer(
+ ctypeobj.value + b'\x00')
+ self._as_ctype_ptr = ctypes.cast(
+ self.__as_strbuf, self._ctype)
+ else:
+ self._as_ctype_ptr = ctypes.pointer(ctypeobj)
+ self._address = ctypes.cast(self._as_ctype_ptr,
+ ctypes.c_void_p).value
+ self._own = True
+
+ def __add__(self, other):
+ if isinstance(other, (int, long)):
+ return self._new_pointer_at(self._address +
+ other * self._bitem_size)
+ else:
+ return NotImplemented
+
+ def __sub__(self, other):
+ if isinstance(other, (int, long)):
+ return self._new_pointer_at(self._address -
+ other * self._bitem_size)
+ elif type(self) is type(other):
+ return (self._address - other._address) // self._bitem_size
+ else:
+ return NotImplemented
+
+ def __getitem__(self, index):
+ if getattr(self, '_own', False) and index != 0:
+ raise IndexError
+ return BItem._from_ctypes(self._as_ctype_ptr[index])
+
+ def __setitem__(self, index, value):
+ self._as_ctype_ptr[index] = BItem._to_ctypes(value)
+
+ if kind == 'charp' or kind == 'voidp':
+ @classmethod
+ def _arg_to_ctypes(cls, *value):
+ if value and isinstance(value[0], bytes):
+ return ctypes.c_char_p(value[0])
+ else:
+ return super(CTypesPtr, cls)._arg_to_ctypes(*value)
+
+ if kind == 'charp' or kind == 'bytep':
+ def _to_string(self, maxlen):
+ if maxlen < 0:
+ maxlen = sys.maxsize
+ p = ctypes.cast(self._as_ctype_ptr,
+ ctypes.POINTER(ctypes.c_char))
+ n = 0
+ while n < maxlen and p[n] != b'\x00':
+ n += 1
+ return b''.join([p[i] for i in range(n)])
+
+ def _get_own_repr(self):
+ if getattr(self, '_own', False):
+ return 'owning %d bytes' % (
+ ctypes.sizeof(self._as_ctype_ptr.contents),)
+ return super(CTypesPtr, self)._get_own_repr()
+ #
+ if (BItem is self.ffi._get_cached_btype(model.void_type) or
+ BItem is self.ffi._get_cached_btype(model.PrimitiveType('char'))):
+ CTypesPtr._automatic_casts = True
+ #
+ CTypesPtr._fix_class()
+ return CTypesPtr
+
+ def new_array_type(self, CTypesPtr, length):
+ if length is None:
+ brackets = ' &[]'
+ else:
+ brackets = ' &[%d]' % length
+ BItem = CTypesPtr._BItem
+ getbtype = self.ffi._get_cached_btype
+ if BItem is getbtype(model.PrimitiveType('char')):
+ kind = 'char'
+ elif BItem in (getbtype(model.PrimitiveType('signed char')),
+ getbtype(model.PrimitiveType('unsigned char'))):
+ kind = 'byte'
+ else:
+ kind = 'generic'
+ #
+ class CTypesArray(CTypesGenericArray):
+ __slots__ = ['_blob', '_own']
+ if length is not None:
+ _ctype = BItem._ctype * length
+ else:
+ __slots__.append('_ctype')
+ _reftypename = BItem._get_c_name(brackets)
+ _declared_length = length
+ _CTPtr = CTypesPtr
+
+ def __init__(self, init):
+ if length is None:
+ if isinstance(init, (int, long)):
+ len1 = init
+ init = None
+ elif kind == 'char' and isinstance(init, bytes):
+ len1 = len(init) + 1 # extra null
+ else:
+ init = tuple(init)
+ len1 = len(init)
+ self._ctype = BItem._ctype * len1
+ self._blob = self._ctype()
+ self._own = True
+ if init is not None:
+ self._initialize(self._blob, init)
+
+ @staticmethod
+ def _initialize(blob, init):
+ if isinstance(init, bytes):
+ init = [init[i:i+1] for i in range(len(init))]
+ else:
+ if isinstance(init, CTypesGenericArray):
+ if (len(init) != len(blob) or
+ not isinstance(init, CTypesArray)):
+ raise TypeError("length/type mismatch: %s" % (init,))
+ init = tuple(init)
+ if len(init) > len(blob):
+ raise IndexError("too many initializers")
+ addr = ctypes.cast(blob, ctypes.c_void_p).value
+ PTR = ctypes.POINTER(BItem._ctype)
+ itemsize = ctypes.sizeof(BItem._ctype)
+ for i, value in enumerate(init):
+ p = ctypes.cast(addr + i * itemsize, PTR)
+ BItem._initialize(p.contents, value)
+
+ def __len__(self):
+ return len(self._blob)
+
+ def __getitem__(self, index):
+ if not (0 <= index < len(self._blob)):
+ raise IndexError
+ return BItem._from_ctypes(self._blob[index])
+
+ def __setitem__(self, index, value):
+ if not (0 <= index < len(self._blob)):
+ raise IndexError
+ self._blob[index] = BItem._to_ctypes(value)
+
+ if kind == 'char' or kind == 'byte':
+ def _to_string(self, maxlen):
+ if maxlen < 0:
+ maxlen = len(self._blob)
+ p = ctypes.cast(self._blob,
+ ctypes.POINTER(ctypes.c_char))
+ n = 0
+ while n < maxlen and p[n] != b'\x00':
+ n += 1
+ return b''.join([p[i] for i in range(n)])
+
+ def _get_own_repr(self):
+ if getattr(self, '_own', False):
+ return 'owning %d bytes' % (ctypes.sizeof(self._blob),)
+ return super(CTypesArray, self)._get_own_repr()
+
+ def _convert_to_address(self, BClass):
+ if BClass in (CTypesPtr, None) or BClass._automatic_casts:
+ return ctypes.addressof(self._blob)
+ else:
+ return CTypesData._convert_to_address(self, BClass)
+
+ @staticmethod
+ def _from_ctypes(ctypes_array):
+ self = CTypesArray.__new__(CTypesArray)
+ self._blob = ctypes_array
+ return self
+
+ @staticmethod
+ def _arg_to_ctypes(value):
+ return CTypesPtr._arg_to_ctypes(value)
+
+ def __add__(self, other):
+ if isinstance(other, (int, long)):
+ return CTypesPtr._new_pointer_at(
+ ctypes.addressof(self._blob) +
+ other * ctypes.sizeof(BItem._ctype))
+ else:
+ return NotImplemented
+
+ @classmethod
+ def _cast_from(cls, source):
+ raise NotImplementedError("casting to %r" % (
+ cls._get_c_name(),))
+ #
+ CTypesArray._fix_class()
+ return CTypesArray
+
+ def _new_struct_or_union(self, kind, name, base_ctypes_class):
+ #
+ class struct_or_union(base_ctypes_class):
+ pass
+ struct_or_union.__name__ = '%s_%s' % (kind, name)
+ kind1 = kind
+ #
+ class CTypesStructOrUnion(CTypesBaseStructOrUnion):
+ __slots__ = ['_blob']
+ _ctype = struct_or_union
+ _reftypename = '%s &' % (name,)
+ _kind = kind = kind1
+ #
+ CTypesStructOrUnion._fix_class()
+ return CTypesStructOrUnion
+
+ def new_struct_type(self, name):
+ return self._new_struct_or_union('struct', name, ctypes.Structure)
+
+ def new_union_type(self, name):
+ return self._new_struct_or_union('union', name, ctypes.Union)
+
+ def complete_struct_or_union(self, CTypesStructOrUnion, fields, tp,
+ totalsize=-1, totalalignment=-1, sflags=0,
+ pack=0):
+ if totalsize >= 0 or totalalignment >= 0:
+ raise NotImplementedError("the ctypes backend of CFFI does not support "
+ "structures completed by verify(); please "
+ "compile and install the _cffi_backend module.")
+ struct_or_union = CTypesStructOrUnion._ctype
+ fnames = [fname for (fname, BField, bitsize) in fields]
+ btypes = [BField for (fname, BField, bitsize) in fields]
+ bitfields = [bitsize for (fname, BField, bitsize) in fields]
+ #
+ bfield_types = {}
+ cfields = []
+ for (fname, BField, bitsize) in fields:
+ if bitsize < 0:
+ cfields.append((fname, BField._ctype))
+ bfield_types[fname] = BField
+ else:
+ cfields.append((fname, BField._ctype, bitsize))
+ bfield_types[fname] = Ellipsis
+ if sflags & 8:
+ struct_or_union._pack_ = 1
+ elif pack:
+ struct_or_union._pack_ = pack
+ struct_or_union._fields_ = cfields
+ CTypesStructOrUnion._bfield_types = bfield_types
+ #
+ @staticmethod
+ def _create_ctype_obj(init):
+ result = struct_or_union()
+ if init is not None:
+ initialize(result, init)
+ return result
+ CTypesStructOrUnion._create_ctype_obj = _create_ctype_obj
+ #
+ def initialize(blob, init):
+ if is_union:
+ if len(init) > 1:
+ raise ValueError("union initializer: %d items given, but "
+ "only one supported (use a dict if needed)"
+ % (len(init),))
+ if not isinstance(init, dict):
+ if isinstance(init, (bytes, unicode)):
+ raise TypeError("union initializer: got a str")
+ init = tuple(init)
+ if len(init) > len(fnames):
+ raise ValueError("too many values for %s initializer" %
+ CTypesStructOrUnion._get_c_name())
+ init = dict(zip(fnames, init))
+ addr = ctypes.addressof(blob)
+ for fname, value in init.items():
+ BField, bitsize = name2fieldtype[fname]
+ assert bitsize < 0, \
+ "not implemented: initializer with bit fields"
+ offset = CTypesStructOrUnion._offsetof(fname)
+ PTR = ctypes.POINTER(BField._ctype)
+ p = ctypes.cast(addr + offset, PTR)
+ BField._initialize(p.contents, value)
+ is_union = CTypesStructOrUnion._kind == 'union'
+ name2fieldtype = dict(zip(fnames, zip(btypes, bitfields)))
+ #
+ for fname, BField, bitsize in fields:
+ if fname == '':
+ raise NotImplementedError("nested anonymous structs/unions")
+ if hasattr(CTypesStructOrUnion, fname):
+ raise ValueError("the field name %r conflicts in "
+ "the ctypes backend" % fname)
+ if bitsize < 0:
+ def getter(self, fname=fname, BField=BField,
+ offset=CTypesStructOrUnion._offsetof(fname),
+ PTR=ctypes.POINTER(BField._ctype)):
+ addr = ctypes.addressof(self._blob)
+ p = ctypes.cast(addr + offset, PTR)
+ return BField._from_ctypes(p.contents)
+ def setter(self, value, fname=fname, BField=BField):
+ setattr(self._blob, fname, BField._to_ctypes(value))
+ #
+ if issubclass(BField, CTypesGenericArray):
+ setter = None
+ if BField._declared_length == 0:
+ def getter(self, fname=fname, BFieldPtr=BField._CTPtr,
+ offset=CTypesStructOrUnion._offsetof(fname),
+ PTR=ctypes.POINTER(BField._ctype)):
+ addr = ctypes.addressof(self._blob)
+ p = ctypes.cast(addr + offset, PTR)
+ return BFieldPtr._from_ctypes(p)
+ #
+ else:
+ def getter(self, fname=fname, BField=BField):
+ return BField._from_ctypes(getattr(self._blob, fname))
+ def setter(self, value, fname=fname, BField=BField):
+ # xxx obscure workaround
+ value = BField._to_ctypes(value)
+ oldvalue = getattr(self._blob, fname)
+ setattr(self._blob, fname, value)
+ if value != getattr(self._blob, fname):
+ setattr(self._blob, fname, oldvalue)
+ raise OverflowError("value too large for bitfield")
+ setattr(CTypesStructOrUnion, fname, property(getter, setter))
+ #
+ CTypesPtr = self.ffi._get_cached_btype(model.PointerType(tp))
+ for fname in fnames:
+ if hasattr(CTypesPtr, fname):
+ raise ValueError("the field name %r conflicts in "
+ "the ctypes backend" % fname)
+ def getter(self, fname=fname):
+ return getattr(self[0], fname)
+ def setter(self, value, fname=fname):
+ setattr(self[0], fname, value)
+ setattr(CTypesPtr, fname, property(getter, setter))
+
+ def new_function_type(self, BArgs, BResult, has_varargs):
+ nameargs = [BArg._get_c_name() for BArg in BArgs]
+ if has_varargs:
+ nameargs.append('...')
+ nameargs = ', '.join(nameargs)
+ #
+ class CTypesFunctionPtr(CTypesGenericPtr):
+ __slots__ = ['_own_callback', '_name']
+ _ctype = ctypes.CFUNCTYPE(getattr(BResult, '_ctype', None),
+ *[BArg._ctype for BArg in BArgs],
+ use_errno=True)
+ _reftypename = BResult._get_c_name('(* &)(%s)' % (nameargs,))
+
+ def __init__(self, init, error=None):
+ # create a callback to the Python callable init()
+ import traceback
+ assert not has_varargs, "varargs not supported for callbacks"
+ if getattr(BResult, '_ctype', None) is not None:
+ error = BResult._from_ctypes(
+ BResult._create_ctype_obj(error))
+ else:
+ error = None
+ def callback(*args):
+ args2 = []
+ for arg, BArg in zip(args, BArgs):
+ args2.append(BArg._from_ctypes(arg))
+ try:
+ res2 = init(*args2)
+ res2 = BResult._to_ctypes(res2)
+ except:
+ traceback.print_exc()
+ res2 = error
+ if issubclass(BResult, CTypesGenericPtr):
+ if res2:
+ res2 = ctypes.cast(res2, ctypes.c_void_p).value
+ # .value: http://bugs.python.org/issue1574593
+ else:
+ res2 = None
+ #print repr(res2)
+ return res2
+ if issubclass(BResult, CTypesGenericPtr):
+ # The only pointers callbacks can return are void*s:
+ # http://bugs.python.org/issue5710
+ callback_ctype = ctypes.CFUNCTYPE(
+ ctypes.c_void_p,
+ *[BArg._ctype for BArg in BArgs],
+ use_errno=True)
+ else:
+ callback_ctype = CTypesFunctionPtr._ctype
+ self._as_ctype_ptr = callback_ctype(callback)
+ self._address = ctypes.cast(self._as_ctype_ptr,
+ ctypes.c_void_p).value
+ self._own_callback = init
+
+ @staticmethod
+ def _initialize(ctypes_ptr, value):
+ if value:
+ raise NotImplementedError("ctypes backend: not supported: "
+ "initializers for function pointers")
+
+ def __repr__(self):
+ c_name = getattr(self, '_name', None)
+ if c_name:
+ i = self._reftypename.index('(* &)')
+ if self._reftypename[i-1] not in ' )*':
+ c_name = ' ' + c_name
+ c_name = self._reftypename.replace('(* &)', c_name)
+ return CTypesData.__repr__(self, c_name)
+
+ def _get_own_repr(self):
+ if getattr(self, '_own_callback', None) is not None:
+ return 'calling %r' % (self._own_callback,)
+ return super(CTypesFunctionPtr, self)._get_own_repr()
+
+ def __call__(self, *args):
+ if has_varargs:
+ assert len(args) >= len(BArgs)
+ extraargs = args[len(BArgs):]
+ args = args[:len(BArgs)]
+ else:
+ assert len(args) == len(BArgs)
+ ctypes_args = []
+ for arg, BArg in zip(args, BArgs):
+ ctypes_args.append(BArg._arg_to_ctypes(arg))
+ if has_varargs:
+ for i, arg in enumerate(extraargs):
+ if arg is None:
+ ctypes_args.append(ctypes.c_void_p(0)) # NULL
+ continue
+ if not isinstance(arg, CTypesData):
+ raise TypeError(
+ "argument %d passed in the variadic part "
+ "needs to be a cdata object (got %s)" %
+ (1 + len(BArgs) + i, type(arg).__name__))
+ ctypes_args.append(arg._arg_to_ctypes(arg))
+ result = self._as_ctype_ptr(*ctypes_args)
+ return BResult._from_ctypes(result)
+ #
+ CTypesFunctionPtr._fix_class()
+ return CTypesFunctionPtr
+
+ def new_enum_type(self, name, enumerators, enumvalues, CTypesInt):
+ assert isinstance(name, str)
+ reverse_mapping = dict(zip(reversed(enumvalues),
+ reversed(enumerators)))
+ #
+ class CTypesEnum(CTypesInt):
+ __slots__ = []
+ _reftypename = '%s &' % name
+
+ def _get_own_repr(self):
+ value = self._value
+ try:
+ return '%d: %s' % (value, reverse_mapping[value])
+ except KeyError:
+ return str(value)
+
+ def _to_string(self, maxlen):
+ value = self._value
+ try:
+ return reverse_mapping[value]
+ except KeyError:
+ return str(value)
+ #
+ CTypesEnum._fix_class()
+ return CTypesEnum
+
+ def get_errno(self):
+ return ctypes.get_errno()
+
+ def set_errno(self, value):
+ ctypes.set_errno(value)
+
+ def string(self, b, maxlen=-1):
+ return b._to_string(maxlen)
+
+ def buffer(self, bptr, size=-1):
+ raise NotImplementedError("buffer() with ctypes backend")
+
+ def sizeof(self, cdata_or_BType):
+ if isinstance(cdata_or_BType, CTypesData):
+ return cdata_or_BType._get_size_of_instance()
+ else:
+ assert issubclass(cdata_or_BType, CTypesData)
+ return cdata_or_BType._get_size()
+
+ def alignof(self, BType):
+ assert issubclass(BType, CTypesData)
+ return BType._alignment()
+
+ def newp(self, BType, source):
+ if not issubclass(BType, CTypesData):
+ raise TypeError
+ return BType._newp(source)
+
+ def cast(self, BType, source):
+ return BType._cast_from(source)
+
+ def callback(self, BType, source, error, onerror):
+ assert onerror is None # XXX not implemented
+ return BType(source, error)
+
+ _weakref_cache_ref = None
+
+ def gcp(self, cdata, destructor, size=0):
+ if self._weakref_cache_ref is None:
+ import weakref
+ class MyRef(weakref.ref):
+ def __eq__(self, other):
+ myref = self()
+ return self is other or (
+ myref is not None and myref is other())
+ def __ne__(self, other):
+ return not (self == other)
+ def __hash__(self):
+ try:
+ return self._hash
+ except AttributeError:
+ self._hash = hash(self())
+ return self._hash
+ self._weakref_cache_ref = {}, MyRef
+ weak_cache, MyRef = self._weakref_cache_ref
+
+ if destructor is None:
+ try:
+ del weak_cache[MyRef(cdata)]
+ except KeyError:
+ raise TypeError("Can remove destructor only on an object "
+ "previously returned by ffi.gc()")
+ return None
+
+ def remove(k):
+ cdata, destructor = weak_cache.pop(k, (None, None))
+ if destructor is not None:
+ destructor(cdata)
+
+ new_cdata = self.cast(self.typeof(cdata), cdata)
+ assert new_cdata is not cdata
+ weak_cache[MyRef(new_cdata, remove)] = (cdata, destructor)
+ return new_cdata
+
+ typeof = type
+
+ def getcname(self, BType, replace_with):
+ return BType._get_c_name(replace_with)
+
+ def typeoffsetof(self, BType, fieldname, num=0):
+ if isinstance(fieldname, str):
+ if num == 0 and issubclass(BType, CTypesGenericPtr):
+ BType = BType._BItem
+ if not issubclass(BType, CTypesBaseStructOrUnion):
+ raise TypeError("expected a struct or union ctype")
+ BField = BType._bfield_types[fieldname]
+ if BField is Ellipsis:
+ raise TypeError("not supported for bitfields")
+ return (BField, BType._offsetof(fieldname))
+ elif isinstance(fieldname, (int, long)):
+ if issubclass(BType, CTypesGenericArray):
+ BType = BType._CTPtr
+ if not issubclass(BType, CTypesGenericPtr):
+ raise TypeError("expected an array or ptr ctype")
+ BItem = BType._BItem
+ offset = BItem._get_size() * fieldname
+ if offset > sys.maxsize:
+ raise OverflowError
+ return (BItem, offset)
+ else:
+ raise TypeError(type(fieldname))
+
+ def rawaddressof(self, BTypePtr, cdata, offset=None):
+ if isinstance(cdata, CTypesBaseStructOrUnion):
+ ptr = ctypes.pointer(type(cdata)._to_ctypes(cdata))
+ elif isinstance(cdata, CTypesGenericPtr):
+ if offset is None or not issubclass(type(cdata)._BItem,
+ CTypesBaseStructOrUnion):
+ raise TypeError("unexpected cdata type")
+ ptr = type(cdata)._to_ctypes(cdata)
+ elif isinstance(cdata, CTypesGenericArray):
+ ptr = type(cdata)._to_ctypes(cdata)
+ else:
+ raise TypeError("expected a <cdata 'struct-or-union'>")
+ if offset:
+ ptr = ctypes.cast(
+ ctypes.c_void_p(
+ ctypes.cast(ptr, ctypes.c_void_p).value + offset),
+ type(ptr))
+ return BTypePtr._from_ctypes(ptr)
+
+
+class CTypesLibrary(object):
+
+ def __init__(self, backend, cdll):
+ self.backend = backend
+ self.cdll = cdll
+
+ def load_function(self, BType, name):
+ c_func = getattr(self.cdll, name)
+ funcobj = BType._from_ctypes(c_func)
+ funcobj._name = name
+ return funcobj
+
+ def read_variable(self, BType, name):
+ try:
+ ctypes_obj = BType._ctype.in_dll(self.cdll, name)
+ except AttributeError as e:
+ raise NotImplementedError(e)
+ return BType._from_ctypes(ctypes_obj)
+
+ def write_variable(self, BType, name, value):
+ new_ctypes_obj = BType._to_ctypes(value)
+ ctypes_obj = BType._ctype.in_dll(self.cdll, name)
+ ctypes.memmove(ctypes.addressof(ctypes_obj),
+ ctypes.addressof(new_ctypes_obj),
+ ctypes.sizeof(BType._ctype))
diff --git a/templates/skills/file_manager/dependencies/cffi/cffi_opcode.py b/templates/skills/file_manager/dependencies/cffi/cffi_opcode.py
new file mode 100644
index 00000000..6421df62
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/cffi_opcode.py
@@ -0,0 +1,187 @@
+from .error import VerificationError
+
+class CffiOp(object):
+ def __init__(self, op, arg):
+ self.op = op
+ self.arg = arg
+
+ def as_c_expr(self):
+ if self.op is None:
+ assert isinstance(self.arg, str)
+ return '(_cffi_opcode_t)(%s)' % (self.arg,)
+ classname = CLASS_NAME[self.op]
+ return '_CFFI_OP(_CFFI_OP_%s, %s)' % (classname, self.arg)
+
+ def as_python_bytes(self):
+ if self.op is None and self.arg.isdigit():
+ value = int(self.arg) # non-negative: '-' not in self.arg
+ if value >= 2**31:
+ raise OverflowError("cannot emit %r: limited to 2**31-1"
+ % (self.arg,))
+ return format_four_bytes(value)
+ if isinstance(self.arg, str):
+ raise VerificationError("cannot emit to Python: %r" % (self.arg,))
+ return format_four_bytes((self.arg << 8) | self.op)
+
+ def __str__(self):
+ classname = CLASS_NAME.get(self.op, self.op)
+ return '(%s %s)' % (classname, self.arg)
+
+def format_four_bytes(num):
+ return '\\x%02X\\x%02X\\x%02X\\x%02X' % (
+ (num >> 24) & 0xFF,
+ (num >> 16) & 0xFF,
+ (num >> 8) & 0xFF,
+ (num ) & 0xFF)
+
+OP_PRIMITIVE = 1
+OP_POINTER = 3
+OP_ARRAY = 5
+OP_OPEN_ARRAY = 7
+OP_STRUCT_UNION = 9
+OP_ENUM = 11
+OP_FUNCTION = 13
+OP_FUNCTION_END = 15
+OP_NOOP = 17
+OP_BITFIELD = 19
+OP_TYPENAME = 21
+OP_CPYTHON_BLTN_V = 23 # varargs
+OP_CPYTHON_BLTN_N = 25 # noargs
+OP_CPYTHON_BLTN_O = 27 # O (i.e. a single arg)
+OP_CONSTANT = 29
+OP_CONSTANT_INT = 31
+OP_GLOBAL_VAR = 33
+OP_DLOPEN_FUNC = 35
+OP_DLOPEN_CONST = 37
+OP_GLOBAL_VAR_F = 39
+OP_EXTERN_PYTHON = 41
+
+PRIM_VOID = 0
+PRIM_BOOL = 1
+PRIM_CHAR = 2
+PRIM_SCHAR = 3
+PRIM_UCHAR = 4
+PRIM_SHORT = 5
+PRIM_USHORT = 6
+PRIM_INT = 7
+PRIM_UINT = 8
+PRIM_LONG = 9
+PRIM_ULONG = 10
+PRIM_LONGLONG = 11
+PRIM_ULONGLONG = 12
+PRIM_FLOAT = 13
+PRIM_DOUBLE = 14
+PRIM_LONGDOUBLE = 15
+
+PRIM_WCHAR = 16
+PRIM_INT8 = 17
+PRIM_UINT8 = 18
+PRIM_INT16 = 19
+PRIM_UINT16 = 20
+PRIM_INT32 = 21
+PRIM_UINT32 = 22
+PRIM_INT64 = 23
+PRIM_UINT64 = 24
+PRIM_INTPTR = 25
+PRIM_UINTPTR = 26
+PRIM_PTRDIFF = 27
+PRIM_SIZE = 28
+PRIM_SSIZE = 29
+PRIM_INT_LEAST8 = 30
+PRIM_UINT_LEAST8 = 31
+PRIM_INT_LEAST16 = 32
+PRIM_UINT_LEAST16 = 33
+PRIM_INT_LEAST32 = 34
+PRIM_UINT_LEAST32 = 35
+PRIM_INT_LEAST64 = 36
+PRIM_UINT_LEAST64 = 37
+PRIM_INT_FAST8 = 38
+PRIM_UINT_FAST8 = 39
+PRIM_INT_FAST16 = 40
+PRIM_UINT_FAST16 = 41
+PRIM_INT_FAST32 = 42
+PRIM_UINT_FAST32 = 43
+PRIM_INT_FAST64 = 44
+PRIM_UINT_FAST64 = 45
+PRIM_INTMAX = 46
+PRIM_UINTMAX = 47
+PRIM_FLOATCOMPLEX = 48
+PRIM_DOUBLECOMPLEX = 49
+PRIM_CHAR16 = 50
+PRIM_CHAR32 = 51
+
+_NUM_PRIM = 52
+_UNKNOWN_PRIM = -1
+_UNKNOWN_FLOAT_PRIM = -2
+_UNKNOWN_LONG_DOUBLE = -3
+
+_IO_FILE_STRUCT = -1
+
+PRIMITIVE_TO_INDEX = {
+ 'char': PRIM_CHAR,
+ 'short': PRIM_SHORT,
+ 'int': PRIM_INT,
+ 'long': PRIM_LONG,
+ 'long long': PRIM_LONGLONG,
+ 'signed char': PRIM_SCHAR,
+ 'unsigned char': PRIM_UCHAR,
+ 'unsigned short': PRIM_USHORT,
+ 'unsigned int': PRIM_UINT,
+ 'unsigned long': PRIM_ULONG,
+ 'unsigned long long': PRIM_ULONGLONG,
+ 'float': PRIM_FLOAT,
+ 'double': PRIM_DOUBLE,
+ 'long double': PRIM_LONGDOUBLE,
+ '_cffi_float_complex_t': PRIM_FLOATCOMPLEX,
+ '_cffi_double_complex_t': PRIM_DOUBLECOMPLEX,
+ '_Bool': PRIM_BOOL,
+ 'wchar_t': PRIM_WCHAR,
+ 'char16_t': PRIM_CHAR16,
+ 'char32_t': PRIM_CHAR32,
+ 'int8_t': PRIM_INT8,
+ 'uint8_t': PRIM_UINT8,
+ 'int16_t': PRIM_INT16,
+ 'uint16_t': PRIM_UINT16,
+ 'int32_t': PRIM_INT32,
+ 'uint32_t': PRIM_UINT32,
+ 'int64_t': PRIM_INT64,
+ 'uint64_t': PRIM_UINT64,
+ 'intptr_t': PRIM_INTPTR,
+ 'uintptr_t': PRIM_UINTPTR,
+ 'ptrdiff_t': PRIM_PTRDIFF,
+ 'size_t': PRIM_SIZE,
+ 'ssize_t': PRIM_SSIZE,
+ 'int_least8_t': PRIM_INT_LEAST8,
+ 'uint_least8_t': PRIM_UINT_LEAST8,
+ 'int_least16_t': PRIM_INT_LEAST16,
+ 'uint_least16_t': PRIM_UINT_LEAST16,
+ 'int_least32_t': PRIM_INT_LEAST32,
+ 'uint_least32_t': PRIM_UINT_LEAST32,
+ 'int_least64_t': PRIM_INT_LEAST64,
+ 'uint_least64_t': PRIM_UINT_LEAST64,
+ 'int_fast8_t': PRIM_INT_FAST8,
+ 'uint_fast8_t': PRIM_UINT_FAST8,
+ 'int_fast16_t': PRIM_INT_FAST16,
+ 'uint_fast16_t': PRIM_UINT_FAST16,
+ 'int_fast32_t': PRIM_INT_FAST32,
+ 'uint_fast32_t': PRIM_UINT_FAST32,
+ 'int_fast64_t': PRIM_INT_FAST64,
+ 'uint_fast64_t': PRIM_UINT_FAST64,
+ 'intmax_t': PRIM_INTMAX,
+ 'uintmax_t': PRIM_UINTMAX,
+ }
+
+F_UNION = 0x01
+F_CHECK_FIELDS = 0x02
+F_PACKED = 0x04
+F_EXTERNAL = 0x08
+F_OPAQUE = 0x10
+
+G_FLAGS = dict([('_CFFI_' + _key, globals()[_key])
+ for _key in ['F_UNION', 'F_CHECK_FIELDS', 'F_PACKED',
+ 'F_EXTERNAL', 'F_OPAQUE']])
+
+CLASS_NAME = {}
+for _name, _value in list(globals().items()):
+ if _name.startswith('OP_') and isinstance(_value, int):
+ CLASS_NAME[_value] = _name[3:]
diff --git a/templates/skills/file_manager/dependencies/cffi/commontypes.py b/templates/skills/file_manager/dependencies/cffi/commontypes.py
new file mode 100644
index 00000000..d4dae351
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/commontypes.py
@@ -0,0 +1,82 @@
+import sys
+from . import model
+from .error import FFIError
+
+
+COMMON_TYPES = {}
+
+try:
+ # fetch "bool" and all simple Windows types
+ from _cffi_backend import _get_common_types
+ _get_common_types(COMMON_TYPES)
+except ImportError:
+ pass
+
+COMMON_TYPES['FILE'] = model.unknown_type('FILE', '_IO_FILE')
+COMMON_TYPES['bool'] = '_Bool' # in case we got ImportError above
+COMMON_TYPES['float _Complex'] = '_cffi_float_complex_t'
+COMMON_TYPES['double _Complex'] = '_cffi_double_complex_t'
+
+for _type in model.PrimitiveType.ALL_PRIMITIVE_TYPES:
+ if _type.endswith('_t'):
+ COMMON_TYPES[_type] = _type
+del _type
+
+_CACHE = {}
+
+def resolve_common_type(parser, commontype):
+ try:
+ return _CACHE[commontype]
+ except KeyError:
+ cdecl = COMMON_TYPES.get(commontype, commontype)
+ if not isinstance(cdecl, str):
+ result, quals = cdecl, 0 # cdecl is already a BaseType
+ elif cdecl in model.PrimitiveType.ALL_PRIMITIVE_TYPES:
+ result, quals = model.PrimitiveType(cdecl), 0
+ elif cdecl == 'set-unicode-needed':
+ raise FFIError("The Windows type %r is only available after "
+ "you call ffi.set_unicode()" % (commontype,))
+ else:
+ if commontype == cdecl:
+ raise FFIError(
+ "Unsupported type: %r. Please look at "
+ "http://cffi.readthedocs.io/en/latest/cdef.html#ffi-cdef-limitations "
+ "and file an issue if you think this type should really "
+ "be supported." % (commontype,))
+ result, quals = parser.parse_type_and_quals(cdecl) # recursive
+
+ assert isinstance(result, model.BaseTypeByIdentity)
+ _CACHE[commontype] = result, quals
+ return result, quals
+
+
+# ____________________________________________________________
+# extra types for Windows (most of them are in commontypes.c)
+
+
+def win_common_types():
+ return {
+ "UNICODE_STRING": model.StructType(
+ "_UNICODE_STRING",
+ ["Length",
+ "MaximumLength",
+ "Buffer"],
+ [model.PrimitiveType("unsigned short"),
+ model.PrimitiveType("unsigned short"),
+ model.PointerType(model.PrimitiveType("wchar_t"))],
+ [-1, -1, -1]),
+ "PUNICODE_STRING": "UNICODE_STRING *",
+ "PCUNICODE_STRING": "const UNICODE_STRING *",
+
+ "TBYTE": "set-unicode-needed",
+ "TCHAR": "set-unicode-needed",
+ "LPCTSTR": "set-unicode-needed",
+ "PCTSTR": "set-unicode-needed",
+ "LPTSTR": "set-unicode-needed",
+ "PTSTR": "set-unicode-needed",
+ "PTBYTE": "set-unicode-needed",
+ "PTCHAR": "set-unicode-needed",
+ }
+
+if sys.platform == 'win32':
+ COMMON_TYPES.update(win_common_types())
diff --git a/templates/skills/file_manager/dependencies/cffi/cparser.py b/templates/skills/file_manager/dependencies/cffi/cparser.py
new file mode 100644
index 00000000..eee83caf
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/cparser.py
@@ -0,0 +1,1015 @@
+from . import model
+from .commontypes import COMMON_TYPES, resolve_common_type
+from .error import FFIError, CDefError
+try:
+ from . import _pycparser as pycparser
+except ImportError:
+ import pycparser
+import weakref, re, sys
+
+try:
+ if sys.version_info < (3,):
+ import thread as _thread
+ else:
+ import _thread
+ lock = _thread.allocate_lock()
+except ImportError:
+ lock = None
+
+def _workaround_for_static_import_finders():
+ # Issue #392: packaging tools like cx_Freeze can not find these
+ # because pycparser uses exec dynamic import. This is an obscure
+ # workaround. This function is never called.
+ import pycparser.yacctab
+ import pycparser.lextab
+
+CDEF_SOURCE_STRING = "<cdef source string>"
+_r_comment = re.compile(r"/\*.*?\*/|//([^\n\\]|\\.)*?$",
+ re.DOTALL | re.MULTILINE)
+_r_define = re.compile(r"^\s*#\s*define\s+([A-Za-z_][A-Za-z_0-9]*)"
+ r"\b((?:[^\n\\]|\\.)*?)$",
+ re.DOTALL | re.MULTILINE)
+_r_line_directive = re.compile(r"^[ \t]*#[ \t]*(?:line|\d+)\b.*$", re.MULTILINE)
+_r_partial_enum = re.compile(r"=\s*\.\.\.\s*[,}]|\.\.\.\s*\}")
+_r_enum_dotdotdot = re.compile(r"__dotdotdot\d+__$")
+_r_partial_array = re.compile(r"\[\s*\.\.\.\s*\]")
+_r_words = re.compile(r"\w+|\S")
+_parser_cache = None
+_r_int_literal = re.compile(r"-?0?x?[0-9a-f]+[lu]*$", re.IGNORECASE)
+_r_stdcall1 = re.compile(r"\b(__stdcall|WINAPI)\b")
+_r_stdcall2 = re.compile(r"[(]\s*(__stdcall|WINAPI)\b")
+_r_cdecl = re.compile(r"\b__cdecl\b")
+_r_extern_python = re.compile(r'\bextern\s*"'
+ r'(Python|Python\s*\+\s*C|C\s*\+\s*Python)"\s*.')
+_r_star_const_space = re.compile( # matches "* const "
+ r"[*]\s*((const|volatile|restrict)\b\s*)+")
+_r_int_dotdotdot = re.compile(r"(\b(int|long|short|signed|unsigned|char)\s*)+"
+ r"\.\.\.")
+_r_float_dotdotdot = re.compile(r"\b(double|float)\s*\.\.\.")
+
+def _get_parser():
+ global _parser_cache
+ if _parser_cache is None:
+ _parser_cache = pycparser.CParser()
+ return _parser_cache
+
+def _workaround_for_old_pycparser(csource):
+ # Workaround for a pycparser issue (fixed between pycparser 2.10 and
+ # 2.14): "char*const***" gives us a wrong syntax tree, the same as
+ # for "char***(*const)". This means we can't tell the difference
+ # afterwards. But "char(*const(***))" gives us the right syntax
+ # tree. The issue only occurs if there are several stars in
+ # sequence with no parenthesis in between, just possibly qualifiers.
+ # Attempt to fix it by adding some parentheses in the source: each
+ # time we see "* const" or "* const *", we add an opening
+ # parenthesis before each star---the hard part is figuring out where
+ # to close them.
+ parts = []
+ while True:
+ match = _r_star_const_space.search(csource)
+ if not match:
+ break
+ #print repr(''.join(parts)+csource), '=>',
+ parts.append(csource[:match.start()])
+ parts.append('('); closing = ')'
+ parts.append(match.group()) # e.g. "* const "
+ endpos = match.end()
+ if csource.startswith('*', endpos):
+ parts.append('('); closing += ')'
+ level = 0
+ i = endpos
+ while i < len(csource):
+ c = csource[i]
+ if c == '(':
+ level += 1
+ elif c == ')':
+ if level == 0:
+ break
+ level -= 1
+ elif c in ',;=':
+ if level == 0:
+ break
+ i += 1
+ csource = csource[endpos:i] + closing + csource[i:]
+ #print repr(''.join(parts)+csource)
+ parts.append(csource)
+ return ''.join(parts)
+
+def _preprocess_extern_python(csource):
+ # input: `extern "Python" int foo(int);` or
+ # `extern "Python" { int foo(int); }`
+ # output:
+ # void __cffi_extern_python_start;
+ # int foo(int);
+ # void __cffi_extern_python_stop;
+ #
+ # input: `extern "Python+C" int foo(int);`
+ # output:
+ # void __cffi_extern_python_plus_c_start;
+ # int foo(int);
+ # void __cffi_extern_python_stop;
+ parts = []
+ while True:
+ match = _r_extern_python.search(csource)
+ if not match:
+ break
+ endpos = match.end() - 1
+ #print
+ #print ''.join(parts)+csource
+ #print '=>'
+ parts.append(csource[:match.start()])
+ if 'C' in match.group(1):
+ parts.append('void __cffi_extern_python_plus_c_start; ')
+ else:
+ parts.append('void __cffi_extern_python_start; ')
+ if csource[endpos] == '{':
+ # grouping variant
+ closing = csource.find('}', endpos)
+ if closing < 0:
+ raise CDefError("'extern \"Python\" {': no '}' found")
+ if csource.find('{', endpos + 1, closing) >= 0:
+ raise NotImplementedError("cannot use { } inside a block "
+ "'extern \"Python\" { ... }'")
+ parts.append(csource[endpos+1:closing])
+ csource = csource[closing+1:]
+ else:
+ # non-grouping variant
+ semicolon = csource.find(';', endpos)
+ if semicolon < 0:
+ raise CDefError("'extern \"Python\": no ';' found")
+ parts.append(csource[endpos:semicolon+1])
+ csource = csource[semicolon+1:]
+ parts.append(' void __cffi_extern_python_stop;')
+ #print ''.join(parts)+csource
+ #print
+ parts.append(csource)
+ return ''.join(parts)
+
+def _warn_for_string_literal(csource):
+ if '"' not in csource:
+ return
+ for line in csource.splitlines():
+ if '"' in line and not line.lstrip().startswith('#'):
+ import warnings
+ warnings.warn("String literal found in cdef() or type source. "
+ "String literals are ignored here, but you should "
+ "remove them anyway because some character sequences "
+ "confuse pre-parsing.")
+ break
+
+def _warn_for_non_extern_non_static_global_variable(decl):
+ if not decl.storage:
+ import warnings
+ warnings.warn("Global variable '%s' in cdef(): for consistency "
+ "with C it should have a storage class specifier "
+ "(usually 'extern')" % (decl.name,))
+
+def _remove_line_directives(csource):
+ # _r_line_directive matches whole lines, without the final \n, if they
+ # start with '#line' with some spacing allowed, or '#NUMBER'. This
+ # function stores them away and replaces them with exactly the string
+ # '#line@N', where N is the index in the list 'line_directives'.
+ line_directives = []
+ def replace(m):
+ i = len(line_directives)
+ line_directives.append(m.group())
+ return '#line@%d' % i
+ csource = _r_line_directive.sub(replace, csource)
+ return csource, line_directives
+
+def _put_back_line_directives(csource, line_directives):
+ def replace(m):
+ s = m.group()
+ if not s.startswith('#line@'):
+ raise AssertionError("unexpected #line directive "
+ "(should have been processed and removed)")
+ return line_directives[int(s[6:])]
+ return _r_line_directive.sub(replace, csource)
+
+def _preprocess(csource):
+ # First, remove the lines of the form '#line N "filename"' because
+ # the "filename" part could confuse the rest
+ csource, line_directives = _remove_line_directives(csource)
+ # Remove comments. NOTE: this only works because the cdef() section
+ # should not contain any string literals (except in line directives)!
+ def replace_keeping_newlines(m):
+ return ' ' + m.group().count('\n') * '\n'
+ csource = _r_comment.sub(replace_keeping_newlines, csource)
+ # Remove the "#define FOO x" lines
+ macros = {}
+ for match in _r_define.finditer(csource):
+ macroname, macrovalue = match.groups()
+ macrovalue = macrovalue.replace('\\\n', '').strip()
+ macros[macroname] = macrovalue
+ csource = _r_define.sub('', csource)
+ #
+ if pycparser.__version__ < '2.14':
+ csource = _workaround_for_old_pycparser(csource)
+ #
+ # BIG HACK: replace WINAPI or __stdcall with "volatile const".
+ # It doesn't make sense for the return type of a function to be
+ # "volatile volatile const", so we abuse it to detect __stdcall...
+ # Hack number 2 is that "int(volatile *fptr)();" is not valid C
+ # syntax, so we place the "volatile" before the opening parenthesis.
+ csource = _r_stdcall2.sub(' volatile volatile const(', csource)
+ csource = _r_stdcall1.sub(' volatile volatile const ', csource)
+ csource = _r_cdecl.sub(' ', csource)
+ #
+ # Replace `extern "Python"` with start/end markers
+ csource = _preprocess_extern_python(csource)
+ #
+ # Now there should not be any string literal left; warn if we get one
+ _warn_for_string_literal(csource)
+ #
+ # Replace "[...]" with "[__dotdotdotarray__]"
+ csource = _r_partial_array.sub('[__dotdotdotarray__]', csource)
+ #
+ # Replace "...}" with "__dotdotdotNUM__}". This construction should
+ # occur only at the end of enums; at the end of structs we have "...;}"
+ # and at the end of vararg functions "...);". Also replace "=...[,}]"
+ # with ",__dotdotdotNUM__[,}]": this occurs in the enums too, when
+ # giving an unknown value.
+ matches = list(_r_partial_enum.finditer(csource))
+ for number, match in enumerate(reversed(matches)):
+ p = match.start()
+ if csource[p] == '=':
+ p2 = csource.find('...', p, match.end())
+ assert p2 > p
+ csource = '%s,__dotdotdot%d__ %s' % (csource[:p], number,
+ csource[p2+3:])
+ else:
+ assert csource[p:p+3] == '...'
+ csource = '%s __dotdotdot%d__ %s' % (csource[:p], number,
+ csource[p+3:])
+ # Replace "int ..." or "unsigned long int..." with "__dotdotdotint__"
+ csource = _r_int_dotdotdot.sub(' __dotdotdotint__ ', csource)
+ # Replace "float ..." or "double..." with "__dotdotdotfloat__"
+ csource = _r_float_dotdotdot.sub(' __dotdotdotfloat__ ', csource)
+ # Replace all remaining "..." with the same name, "__dotdotdot__",
+ # which is declared with a typedef for the purpose of C parsing.
+ csource = csource.replace('...', ' __dotdotdot__ ')
+ # Finally, put back the line directives
+ csource = _put_back_line_directives(csource, line_directives)
+ return csource, macros
+
+def _common_type_names(csource):
+ # Look in the source for what looks like usages of types from the
+ # list of common types. A "usage" is approximated here as the
+ # appearance of the word, minus a "definition" of the type, which
+ # is the last word in a "typedef" statement. Approximative only
+ # but should be fine for all the common types.
+ look_for_words = set(COMMON_TYPES)
+ look_for_words.add(';')
+ look_for_words.add(',')
+ look_for_words.add('(')
+ look_for_words.add(')')
+ look_for_words.add('typedef')
+ words_used = set()
+ is_typedef = False
+ paren = 0
+ previous_word = ''
+ for word in _r_words.findall(csource):
+ if word in look_for_words:
+ if word == ';':
+ if is_typedef:
+ words_used.discard(previous_word)
+ look_for_words.discard(previous_word)
+ is_typedef = False
+ elif word == 'typedef':
+ is_typedef = True
+ paren = 0
+ elif word == '(':
+ paren += 1
+ elif word == ')':
+ paren -= 1
+ elif word == ',':
+ if is_typedef and paren == 0:
+ words_used.discard(previous_word)
+ look_for_words.discard(previous_word)
+ else: # word in COMMON_TYPES
+ words_used.add(word)
+ previous_word = word
+ return words_used
+
+
+class Parser(object):
+
+ def __init__(self):
+ self._declarations = {}
+ self._included_declarations = set()
+ self._anonymous_counter = 0
+ self._structnode2type = weakref.WeakKeyDictionary()
+ self._options = {}
+ self._int_constants = {}
+ self._recomplete = []
+ self._uses_new_feature = None
+
+ def _parse(self, csource):
+ csource, macros = _preprocess(csource)
+ # XXX: for more efficiency we would need to poke into the
+ # internals of CParser... the following registers the
+ # typedefs, because their presence or absence influences the
+ # parsing itself (but what they are typedef'ed to plays no role)
+ ctn = _common_type_names(csource)
+ typenames = []
+ for name in sorted(self._declarations):
+ if name.startswith('typedef '):
+ name = name[8:]
+ typenames.append(name)
+ ctn.discard(name)
+ typenames += sorted(ctn)
+ #
+ csourcelines = []
+ csourcelines.append('# 1 ""')
+ for typename in typenames:
+ csourcelines.append('typedef int %s;' % typename)
+ csourcelines.append('typedef int __dotdotdotint__, __dotdotdotfloat__,'
+ ' __dotdotdot__;')
+ # this forces pycparser to consider the following in the file
+ # called from line 1
+ csourcelines.append('# 1 "%s"' % (CDEF_SOURCE_STRING,))
+ csourcelines.append(csource)
+ csourcelines.append('') # see test_missing_newline_bug
+ fullcsource = '\n'.join(csourcelines)
+ if lock is not None:
+ lock.acquire() # pycparser is not thread-safe...
+ try:
+ ast = _get_parser().parse(fullcsource)
+ except pycparser.c_parser.ParseError as e:
+ self.convert_pycparser_error(e, csource)
+ finally:
+ if lock is not None:
+ lock.release()
+ # csource will be used to find buggy source text
+ return ast, macros, csource
+
+ def _convert_pycparser_error(self, e, csource):
+ # xxx look for ":NUM:" at the start of str(e)
+ # and interpret that as a line number. This will not work if
+ # the user gives explicit ``# NUM "FILE"`` directives.
+ line = None
+ msg = str(e)
+ match = re.match(r"%s:(\d+):" % (CDEF_SOURCE_STRING,), msg)
+ if match:
+ linenum = int(match.group(1), 10)
+ csourcelines = csource.splitlines()
+ if 1 <= linenum <= len(csourcelines):
+ line = csourcelines[linenum-1]
+ return line
+
+ def convert_pycparser_error(self, e, csource):
+ line = self._convert_pycparser_error(e, csource)
+
+ msg = str(e)
+ if line:
+ msg = 'cannot parse "%s"\n%s' % (line.strip(), msg)
+ else:
+ msg = 'parse error\n%s' % (msg,)
+ raise CDefError(msg)
+
+ def parse(self, csource, override=False, packed=False, pack=None,
+ dllexport=False):
+ if packed:
+ if packed != True:
+ raise ValueError("'packed' should be False or True; use "
+ "'pack' to give another value")
+ if pack:
+ raise ValueError("cannot give both 'pack' and 'packed'")
+ pack = 1
+ elif pack:
+ if pack & (pack - 1):
+ raise ValueError("'pack' must be a power of two, not %r" %
+ (pack,))
+ else:
+ pack = 0
+ prev_options = self._options
+ try:
+ self._options = {'override': override,
+ 'packed': pack,
+ 'dllexport': dllexport}
+ self._internal_parse(csource)
+ finally:
+ self._options = prev_options
+
+ def _internal_parse(self, csource):
+ ast, macros, csource = self._parse(csource)
+ # add the macros
+ self._process_macros(macros)
+ # find the first "__dotdotdot__" and use that as a separator
+ # between the repeated typedefs and the real csource
+ iterator = iter(ast.ext)
+ for decl in iterator:
+ if decl.name == '__dotdotdot__':
+ break
+ else:
+ assert 0
+ current_decl = None
+ #
+ try:
+ self._inside_extern_python = '__cffi_extern_python_stop'
+ for decl in iterator:
+ current_decl = decl
+ if isinstance(decl, pycparser.c_ast.Decl):
+ self._parse_decl(decl)
+ elif isinstance(decl, pycparser.c_ast.Typedef):
+ if not decl.name:
+ raise CDefError("typedef does not declare any name",
+ decl)
+ quals = 0
+ if (isinstance(decl.type.type, pycparser.c_ast.IdentifierType) and
+ decl.type.type.names[-1].startswith('__dotdotdot')):
+ realtype = self._get_unknown_type(decl)
+ elif (isinstance(decl.type, pycparser.c_ast.PtrDecl) and
+ isinstance(decl.type.type, pycparser.c_ast.TypeDecl) and
+ isinstance(decl.type.type.type,
+ pycparser.c_ast.IdentifierType) and
+ decl.type.type.type.names[-1].startswith('__dotdotdot')):
+ realtype = self._get_unknown_ptr_type(decl)
+ else:
+ realtype, quals = self._get_type_and_quals(
+ decl.type, name=decl.name, partial_length_ok=True,
+ typedef_example="*(%s *)0" % (decl.name,))
+ self._declare('typedef ' + decl.name, realtype, quals=quals)
+ elif decl.__class__.__name__ == 'Pragma':
+ # skip pragma, only in pycparser 2.15
+ import warnings
+ warnings.warn(
+ "#pragma in cdef() are entirely ignored. "
+ "They should be removed for now, otherwise your "
+ "code might behave differently in a future version "
+ "of CFFI if #pragma support gets added. Note that "
+ "'#pragma pack' needs to be replaced with the "
+ "'packed' keyword argument to cdef().")
+ else:
+ raise CDefError("unexpected <%s>: this construct is valid "
+ "C but not valid in cdef()" %
+ decl.__class__.__name__, decl)
+ except CDefError as e:
+ if len(e.args) == 1:
+ e.args = e.args + (current_decl,)
+ raise
+ except FFIError as e:
+ msg = self._convert_pycparser_error(e, csource)
+ if msg:
+ e.args = (e.args[0] + "\n *** Err: %s" % msg,)
+ raise
+
+ def _add_constants(self, key, val):
+ if key in self._int_constants:
+ if self._int_constants[key] == val:
+ return # ignore identical double declarations
+ raise FFIError(
+ "multiple declarations of constant: %s" % (key,))
+ self._int_constants[key] = val
+
+ def _add_integer_constant(self, name, int_str):
+ int_str = int_str.lower().rstrip("ul")
+ neg = int_str.startswith('-')
+ if neg:
+ int_str = int_str[1:]
+ # "010" is not valid oct in py3
+ if (int_str.startswith("0") and int_str != '0'
+ and not int_str.startswith("0x")):
+ int_str = "0o" + int_str[1:]
+ pyvalue = int(int_str, 0)
+ if neg:
+ pyvalue = -pyvalue
+ self._add_constants(name, pyvalue)
+ self._declare('macro ' + name, pyvalue)
+
+ def _process_macros(self, macros):
+ for key, value in macros.items():
+ value = value.strip()
+ if _r_int_literal.match(value):
+ self._add_integer_constant(key, value)
+ elif value == '...':
+ self._declare('macro ' + key, value)
+ else:
+ raise CDefError(
+ 'only supports one of the following syntax:\n'
+ ' #define %s ... (literally dot-dot-dot)\n'
+ ' #define %s NUMBER (with NUMBER an integer'
+ ' constant, decimal/hex/octal)\n'
+ 'got:\n'
+ ' #define %s %s'
+ % (key, key, key, value))
+
+ def _declare_function(self, tp, quals, decl):
+ tp = self._get_type_pointer(tp, quals)
+ if self._options.get('dllexport'):
+ tag = 'dllexport_python '
+ elif self._inside_extern_python == '__cffi_extern_python_start':
+ tag = 'extern_python '
+ elif self._inside_extern_python == '__cffi_extern_python_plus_c_start':
+ tag = 'extern_python_plus_c '
+ else:
+ tag = 'function '
+ self._declare(tag + decl.name, tp)
+
+ def _parse_decl(self, decl):
+ node = decl.type
+ if isinstance(node, pycparser.c_ast.FuncDecl):
+ tp, quals = self._get_type_and_quals(node, name=decl.name)
+ assert isinstance(tp, model.RawFunctionType)
+ self._declare_function(tp, quals, decl)
+ else:
+ if isinstance(node, pycparser.c_ast.Struct):
+ self._get_struct_union_enum_type('struct', node)
+ elif isinstance(node, pycparser.c_ast.Union):
+ self._get_struct_union_enum_type('union', node)
+ elif isinstance(node, pycparser.c_ast.Enum):
+ self._get_struct_union_enum_type('enum', node)
+ elif not decl.name:
+ raise CDefError("construct does not declare any variable",
+ decl)
+ #
+ if decl.name:
+ tp, quals = self._get_type_and_quals(node,
+ partial_length_ok=True)
+ if tp.is_raw_function:
+ self._declare_function(tp, quals, decl)
+ elif (tp.is_integer_type() and
+ hasattr(decl, 'init') and
+ hasattr(decl.init, 'value') and
+ _r_int_literal.match(decl.init.value)):
+ self._add_integer_constant(decl.name, decl.init.value)
+ elif (tp.is_integer_type() and
+ isinstance(decl.init, pycparser.c_ast.UnaryOp) and
+ decl.init.op == '-' and
+ hasattr(decl.init.expr, 'value') and
+ _r_int_literal.match(decl.init.expr.value)):
+ self._add_integer_constant(decl.name,
+ '-' + decl.init.expr.value)
+ elif (tp is model.void_type and
+ decl.name.startswith('__cffi_extern_python_')):
+ # hack: `extern "Python"` in the C source is replaced
+ # with "void __cffi_extern_python_start;" and
+ # "void __cffi_extern_python_stop;"
+ self._inside_extern_python = decl.name
+ else:
+ if self._inside_extern_python != '__cffi_extern_python_stop':
+ raise CDefError(
+ "cannot declare constants or "
+ "variables with 'extern \"Python\"'")
+ if (quals & model.Q_CONST) and not tp.is_array_type:
+ self._declare('constant ' + decl.name, tp, quals=quals)
+ else:
+ _warn_for_non_extern_non_static_global_variable(decl)
+ self._declare('variable ' + decl.name, tp, quals=quals)
+
+ def parse_type(self, cdecl):
+ return self.parse_type_and_quals(cdecl)[0]
+
+ def parse_type_and_quals(self, cdecl):
+ ast, macros = self._parse('void __dummy(\n%s\n);' % cdecl)[:2]
+ assert not macros
+ exprnode = ast.ext[-1].type.args.params[0]
+ if isinstance(exprnode, pycparser.c_ast.ID):
+ raise CDefError("unknown identifier '%s'" % (exprnode.name,))
+ return self._get_type_and_quals(exprnode.type)
+
+ def _declare(self, name, obj, included=False, quals=0):
+ if name in self._declarations:
+ prevobj, prevquals = self._declarations[name]
+ if prevobj is obj and prevquals == quals:
+ return
+ if not self._options.get('override'):
+ raise FFIError(
+ "multiple declarations of %s (for interactive usage, "
+ "try cdef(xx, override=True))" % (name,))
+ assert '__dotdotdot__' not in name.split()
+ self._declarations[name] = (obj, quals)
+ if included:
+ self._included_declarations.add(obj)
+
+ def _extract_quals(self, type):
+ quals = 0
+ if isinstance(type, (pycparser.c_ast.TypeDecl,
+ pycparser.c_ast.PtrDecl)):
+ if 'const' in type.quals:
+ quals |= model.Q_CONST
+ if 'volatile' in type.quals:
+ quals |= model.Q_VOLATILE
+ if 'restrict' in type.quals:
+ quals |= model.Q_RESTRICT
+ return quals
+
+ def _get_type_pointer(self, type, quals, declname=None):
+ if isinstance(type, model.RawFunctionType):
+ return type.as_function_pointer()
+ if (isinstance(type, model.StructOrUnionOrEnum) and
+ type.name.startswith('$') and type.name[1:].isdigit() and
+ type.forcename is None and declname is not None):
+ return model.NamedPointerType(type, declname, quals)
+ return model.PointerType(type, quals)
+
+ def _get_type_and_quals(self, typenode, name=None, partial_length_ok=False,
+ typedef_example=None):
+ # first, dereference typedefs, if we have it already parsed, we're good
+ if (isinstance(typenode, pycparser.c_ast.TypeDecl) and
+ isinstance(typenode.type, pycparser.c_ast.IdentifierType) and
+ len(typenode.type.names) == 1 and
+ ('typedef ' + typenode.type.names[0]) in self._declarations):
+ tp, quals = self._declarations['typedef ' + typenode.type.names[0]]
+ quals |= self._extract_quals(typenode)
+ return tp, quals
+ #
+ if isinstance(typenode, pycparser.c_ast.ArrayDecl):
+ # array type
+ if typenode.dim is None:
+ length = None
+ else:
+ length = self._parse_constant(
+ typenode.dim, partial_length_ok=partial_length_ok)
+ # a hack: in 'typedef int foo_t[...][...];', don't use '...' as
+ # the length but use directly the C expression that would be
+ # generated by recompiler.py. This lets the typedef be used in
+ # many more places within recompiler.py
+ if typedef_example is not None:
+ if length == '...':
+ length = '_cffi_array_len(%s)' % (typedef_example,)
+ typedef_example = "*" + typedef_example
+ #
+ tp, quals = self._get_type_and_quals(typenode.type,
+ partial_length_ok=partial_length_ok,
+ typedef_example=typedef_example)
+ return model.ArrayType(tp, length), quals
+ #
+ if isinstance(typenode, pycparser.c_ast.PtrDecl):
+ # pointer type
+ itemtype, itemquals = self._get_type_and_quals(typenode.type)
+ tp = self._get_type_pointer(itemtype, itemquals, declname=name)
+ quals = self._extract_quals(typenode)
+ return tp, quals
+ #
+ if isinstance(typenode, pycparser.c_ast.TypeDecl):
+ quals = self._extract_quals(typenode)
+ type = typenode.type
+ if isinstance(type, pycparser.c_ast.IdentifierType):
+ # assume a primitive type. get it from .names, but reduce
+ # synonyms to a single chosen combination
+ names = list(type.names)
+ if names != ['signed', 'char']: # keep this unmodified
+ prefixes = {}
+ while names:
+ name = names[0]
+ if name in ('short', 'long', 'signed', 'unsigned'):
+ prefixes[name] = prefixes.get(name, 0) + 1
+ del names[0]
+ else:
+ break
+ # ignore the 'signed' prefix below, and reorder the others
+ newnames = []
+ for prefix in ('unsigned', 'short', 'long'):
+ for i in range(prefixes.get(prefix, 0)):
+ newnames.append(prefix)
+ if not names:
+ names = ['int'] # implicitly
+ if names == ['int']: # but kill it if 'short' or 'long'
+ if 'short' in prefixes or 'long' in prefixes:
+ names = []
+ names = newnames + names
+ ident = ' '.join(names)
+ if ident == 'void':
+ return model.void_type, quals
+ if ident == '__dotdotdot__':
+ raise FFIError(':%d: bad usage of "..."' %
+ typenode.coord.line)
+ tp0, quals0 = resolve_common_type(self, ident)
+ return tp0, (quals | quals0)
+ #
+ if isinstance(type, pycparser.c_ast.Struct):
+ # 'struct foobar'
+ tp = self._get_struct_union_enum_type('struct', type, name)
+ return tp, quals
+ #
+ if isinstance(type, pycparser.c_ast.Union):
+ # 'union foobar'
+ tp = self._get_struct_union_enum_type('union', type, name)
+ return tp, quals
+ #
+ if isinstance(type, pycparser.c_ast.Enum):
+ # 'enum foobar'
+ tp = self._get_struct_union_enum_type('enum', type, name)
+ return tp, quals
+ #
+ if isinstance(typenode, pycparser.c_ast.FuncDecl):
+ # a function type
+ return self._parse_function_type(typenode, name), 0
+ #
+ # nested anonymous structs or unions end up here
+ if isinstance(typenode, pycparser.c_ast.Struct):
+ return self._get_struct_union_enum_type('struct', typenode, name,
+ nested=True), 0
+ if isinstance(typenode, pycparser.c_ast.Union):
+ return self._get_struct_union_enum_type('union', typenode, name,
+ nested=True), 0
+ #
+ raise FFIError(":%d: bad or unsupported type declaration" %
+ typenode.coord.line)
+
+ def _parse_function_type(self, typenode, funcname=None):
+ params = list(getattr(typenode.args, 'params', []))
+ for i, arg in enumerate(params):
+ if not hasattr(arg, 'type'):
+ raise CDefError("%s arg %d: unknown type '%s'"
+ " (if you meant to use the old C syntax of giving"
+ " untyped arguments, it is not supported)"
+ % (funcname or 'in expression', i + 1,
+ getattr(arg, 'name', '?')))
+ ellipsis = (
+ len(params) > 0 and
+ isinstance(params[-1].type, pycparser.c_ast.TypeDecl) and
+ isinstance(params[-1].type.type,
+ pycparser.c_ast.IdentifierType) and
+ params[-1].type.type.names == ['__dotdotdot__'])
+ if ellipsis:
+ params.pop()
+ if not params:
+ raise CDefError(
+ "%s: a function with only '(...)' as argument"
+ " is not correct C" % (funcname or 'in expression'))
+ args = [self._as_func_arg(*self._get_type_and_quals(argdeclnode.type))
+ for argdeclnode in params]
+ if not ellipsis and args == [model.void_type]:
+ args = []
+ result, quals = self._get_type_and_quals(typenode.type)
+ # the 'quals' on the result type are ignored. HACK: we abuse them
+ # to detect __stdcall functions: we textually replace "__stdcall"
+ # with "volatile volatile const" above.
+ abi = None
+ if hasattr(typenode.type, 'quals'): # else, probable syntax error anyway
+ if typenode.type.quals[-3:] == ['volatile', 'volatile', 'const']:
+ abi = '__stdcall'
+ return model.RawFunctionType(tuple(args), result, ellipsis, abi)
+
+ def _as_func_arg(self, type, quals):
+ if isinstance(type, model.ArrayType):
+ return model.PointerType(type.item, quals)
+ elif isinstance(type, model.RawFunctionType):
+ return type.as_function_pointer()
+ else:
+ return type
+
+ def _get_struct_union_enum_type(self, kind, type, name=None, nested=False):
+ # First, a level of caching on the exact 'type' node of the AST.
+ # This is obscure, but needed because pycparser "unrolls" declarations
+ # such as "typedef struct { } foo_t, *foo_p" and we end up with
+ # an AST that is not a tree, but a DAG, with the "type" node of the
+ # two branches foo_t and foo_p of the trees being the same node.
+ # It's a bit silly but detecting "DAG-ness" in the AST tree seems
+ # to be the only way to distinguish this case from two independent
+ # structs. See test_struct_with_two_usages.
+ try:
+ return self._structnode2type[type]
+ except KeyError:
+ pass
+ #
+ # Note that this must handle parsing "struct foo" any number of
+ # times and always return the same StructType object. Additionally,
+ # one of these times (not necessarily the first), the fields of
+ # the struct can be specified with "struct foo { ...fields... }".
+ # If no name is given, then we have to create a new anonymous struct
+ # with no caching; in this case, the fields are either specified
+ # right now or never.
+ #
+ force_name = name
+ name = type.name
+ #
+ # get the type or create it if needed
+ if name is None:
+ # 'force_name' is used to guess a more readable name for
+ # anonymous structs, for the common case "typedef struct { } foo".
+ if force_name is not None:
+ explicit_name = '$%s' % force_name
+ else:
+ self._anonymous_counter += 1
+ explicit_name = '$%d' % self._anonymous_counter
+ tp = None
+ else:
+ explicit_name = name
+ key = '%s %s' % (kind, name)
+ tp, _ = self._declarations.get(key, (None, None))
+ #
+ if tp is None:
+ if kind == 'struct':
+ tp = model.StructType(explicit_name, None, None, None)
+ elif kind == 'union':
+ tp = model.UnionType(explicit_name, None, None, None)
+ elif kind == 'enum':
+ if explicit_name == '__dotdotdot__':
+ raise CDefError("Enums cannot be declared with ...")
+ tp = self._build_enum_type(explicit_name, type.values)
+ else:
+ raise AssertionError("kind = %r" % (kind,))
+ if name is not None:
+ self._declare(key, tp)
+ else:
+ if kind == 'enum' and type.values is not None:
+ raise NotImplementedError(
+ "enum %s: the '{}' declaration should appear on the first "
+ "time the enum is mentioned, not later" % explicit_name)
+ if not tp.forcename:
+ tp.force_the_name(force_name)
+ if tp.forcename and '$' in tp.name:
+ self._declare('anonymous %s' % tp.forcename, tp)
+ #
+ self._structnode2type[type] = tp
+ #
+ # enums: done here
+ if kind == 'enum':
+ return tp
+ #
+ # is there a 'type.decls'? If yes, then this is the place in the
+ # C sources that declare the fields. If no, then just return the
+ # existing type, possibly still incomplete.
+ if type.decls is None:
+ return tp
+ #
+ if tp.fldnames is not None:
+ raise CDefError("duplicate declaration of struct %s" % name)
+ fldnames = []
+ fldtypes = []
+ fldbitsize = []
+ fldquals = []
+ for decl in type.decls:
+ if (isinstance(decl.type, pycparser.c_ast.IdentifierType) and
+ ''.join(decl.type.names) == '__dotdotdot__'):
+ # XXX pycparser is inconsistent: 'names' should be a list
+ # of strings, but is sometimes just one string. Use
+ # str.join() as a way to cope with both.
+ self._make_partial(tp, nested)
+ continue
+ if decl.bitsize is None:
+ bitsize = -1
+ else:
+ bitsize = self._parse_constant(decl.bitsize)
+ self._partial_length = False
+ type, fqual = self._get_type_and_quals(decl.type,
+ partial_length_ok=True)
+ if self._partial_length:
+ self._make_partial(tp, nested)
+ if isinstance(type, model.StructType) and type.partial:
+ self._make_partial(tp, nested)
+ fldnames.append(decl.name or '')
+ fldtypes.append(type)
+ fldbitsize.append(bitsize)
+ fldquals.append(fqual)
+ tp.fldnames = tuple(fldnames)
+ tp.fldtypes = tuple(fldtypes)
+ tp.fldbitsize = tuple(fldbitsize)
+ tp.fldquals = tuple(fldquals)
+ if fldbitsize != [-1] * len(fldbitsize):
+ if isinstance(tp, model.StructType) and tp.partial:
+ raise NotImplementedError("%s: using both bitfields and '...;'"
+ % (tp,))
+ tp.packed = self._options.get('packed')
+ if tp.completed: # must be re-completed: it is not opaque any more
+ tp.completed = 0
+ self._recomplete.append(tp)
+ return tp
+
+ def _make_partial(self, tp, nested):
+ if not isinstance(tp, model.StructOrUnion):
+ raise CDefError("%s cannot be partial" % (tp,))
+ if not tp.has_c_name() and not nested:
+ raise NotImplementedError("%s is partial but has no C name" % (tp,))
+ tp.partial = True
+
+ def _parse_constant(self, exprnode, partial_length_ok=False):
+ # for now, limited to expressions that are an immediate number
+ # or positive/negative number
+ if isinstance(exprnode, pycparser.c_ast.Constant):
+ s = exprnode.value
+ if '0' <= s[0] <= '9':
+ s = s.rstrip('uUlL')
+ try:
+ if s.startswith('0'):
+ return int(s, 8)
+ else:
+ return int(s, 10)
+ except ValueError:
+ if len(s) > 1:
+ if s.lower()[0:2] == '0x':
+ return int(s, 16)
+ elif s.lower()[0:2] == '0b':
+ return int(s, 2)
+ raise CDefError("invalid constant %r" % (s,))
+ elif s[0] == "'" and s[-1] == "'" and (
+ len(s) == 3 or (len(s) == 4 and s[1] == "\\")):
+ return ord(s[-2])
+ else:
+ raise CDefError("invalid constant %r" % (s,))
+ #
+ if (isinstance(exprnode, pycparser.c_ast.UnaryOp) and
+ exprnode.op == '+'):
+ return self._parse_constant(exprnode.expr)
+ #
+ if (isinstance(exprnode, pycparser.c_ast.UnaryOp) and
+ exprnode.op == '-'):
+ return -self._parse_constant(exprnode.expr)
+ # load previously defined int constant
+ if (isinstance(exprnode, pycparser.c_ast.ID) and
+ exprnode.name in self._int_constants):
+ return self._int_constants[exprnode.name]
+ #
+ if (isinstance(exprnode, pycparser.c_ast.ID) and
+ exprnode.name == '__dotdotdotarray__'):
+ if partial_length_ok:
+ self._partial_length = True
+ return '...'
+ raise FFIError(":%d: unsupported '[...]' here, cannot derive "
+ "the actual array length in this context"
+ % exprnode.coord.line)
+ #
+ if isinstance(exprnode, pycparser.c_ast.BinaryOp):
+ left = self._parse_constant(exprnode.left)
+ right = self._parse_constant(exprnode.right)
+ if exprnode.op == '+':
+ return left + right
+ elif exprnode.op == '-':
+ return left - right
+ elif exprnode.op == '*':
+ return left * right
+ elif exprnode.op == '/':
+ return self._c_div(left, right)
+ elif exprnode.op == '%':
+ return left - self._c_div(left, right) * right
+ elif exprnode.op == '<<':
+ return left << right
+ elif exprnode.op == '>>':
+ return left >> right
+ elif exprnode.op == '&':
+ return left & right
+ elif exprnode.op == '|':
+ return left | right
+ elif exprnode.op == '^':
+ return left ^ right
+ #
+ raise FFIError(":%d: unsupported expression: expected a "
+ "simple numeric constant" % exprnode.coord.line)
+
+ def _c_div(self, a, b):
+ result = a // b
+ if ((a < 0) ^ (b < 0)) and (a % b) != 0:
+ result += 1
+ return result
+
+ def _build_enum_type(self, explicit_name, decls):
+ if decls is not None:
+ partial = False
+ enumerators = []
+ enumvalues = []
+ nextenumvalue = 0
+ for enum in decls.enumerators:
+ if _r_enum_dotdotdot.match(enum.name):
+ partial = True
+ continue
+ if enum.value is not None:
+ nextenumvalue = self._parse_constant(enum.value)
+ enumerators.append(enum.name)
+ enumvalues.append(nextenumvalue)
+ self._add_constants(enum.name, nextenumvalue)
+ nextenumvalue += 1
+ enumerators = tuple(enumerators)
+ enumvalues = tuple(enumvalues)
+ tp = model.EnumType(explicit_name, enumerators, enumvalues)
+ tp.partial = partial
+ else: # opaque enum
+ tp = model.EnumType(explicit_name, (), ())
+ return tp
+
+ def include(self, other):
+ for name, (tp, quals) in other._declarations.items():
+ if name.startswith('anonymous $enum_$'):
+ continue # fix for test_anonymous_enum_include
+ kind = name.split(' ', 1)[0]
+ if kind in ('struct', 'union', 'enum', 'anonymous', 'typedef'):
+ self._declare(name, tp, included=True, quals=quals)
+ for k, v in other._int_constants.items():
+ self._add_constants(k, v)
+
+ def _get_unknown_type(self, decl):
+ typenames = decl.type.type.names
+ if typenames == ['__dotdotdot__']:
+ return model.unknown_type(decl.name)
+
+ if typenames == ['__dotdotdotint__']:
+ if self._uses_new_feature is None:
+ self._uses_new_feature = "'typedef int... %s'" % decl.name
+ return model.UnknownIntegerType(decl.name)
+
+ if typenames == ['__dotdotdotfloat__']:
+ # note: not for 'long double' so far
+ if self._uses_new_feature is None:
+ self._uses_new_feature = "'typedef float... %s'" % decl.name
+ return model.UnknownFloatType(decl.name)
+
+ raise FFIError(':%d: unsupported usage of "..." in typedef'
+ % decl.coord.line)
+
+ def _get_unknown_ptr_type(self, decl):
+ if decl.type.type.type.names == ['__dotdotdot__']:
+ return model.unknown_ptr_type(decl.name)
+ raise FFIError(':%d: unsupported usage of "..." in typedef'
+ % decl.coord.line)
diff --git a/templates/skills/file_manager/dependencies/cffi/error.py b/templates/skills/file_manager/dependencies/cffi/error.py
new file mode 100644
index 00000000..0a27247c
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/error.py
@@ -0,0 +1,31 @@
+
+class FFIError(Exception):
+ __module__ = 'cffi'
+
+class CDefError(Exception):
+ __module__ = 'cffi'
+ def __str__(self):
+ try:
+ current_decl = self.args[1]
+ filename = current_decl.coord.file
+ linenum = current_decl.coord.line
+ prefix = '%s:%d: ' % (filename, linenum)
+ except (AttributeError, TypeError, IndexError):
+ prefix = ''
+ return '%s%s' % (prefix, self.args[0])
+
+class VerificationError(Exception):
+ """ An error raised when verification fails
+ """
+ __module__ = 'cffi'
+
+class VerificationMissing(Exception):
+ """ An error raised when incomplete structures are passed into
+ cdef, but no verification has been done
+ """
+ __module__ = 'cffi'
+
+class PkgConfigError(Exception):
+ """ An error raised for missing modules in pkg-config
+ """
+ __module__ = 'cffi'
diff --git a/templates/skills/file_manager/dependencies/cffi/ffiplatform.py b/templates/skills/file_manager/dependencies/cffi/ffiplatform.py
new file mode 100644
index 00000000..adca28f1
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/ffiplatform.py
@@ -0,0 +1,113 @@
+import sys, os
+from .error import VerificationError
+
+
+LIST_OF_FILE_NAMES = ['sources', 'include_dirs', 'library_dirs',
+ 'extra_objects', 'depends']
+
+def get_extension(srcfilename, modname, sources=(), **kwds):
+ from cffi._shimmed_dist_utils import Extension
+ allsources = [srcfilename]
+ for src in sources:
+ allsources.append(os.path.normpath(src))
+ return Extension(name=modname, sources=allsources, **kwds)
+
+def compile(tmpdir, ext, compiler_verbose=0, debug=None):
+ """Compile a C extension module using distutils."""
+
+ saved_environ = os.environ.copy()
+ try:
+ outputfilename = _build(tmpdir, ext, compiler_verbose, debug)
+ outputfilename = os.path.abspath(outputfilename)
+ finally:
+ # workaround for a distutils bug where some env vars can
+ # become longer and longer every time it is used
+ for key, value in saved_environ.items():
+ if os.environ.get(key) != value:
+ os.environ[key] = value
+ return outputfilename
+
+def _build(tmpdir, ext, compiler_verbose=0, debug=None):
+ # XXX compact but horrible :-(
+ from cffi._shimmed_dist_utils import Distribution, CompileError, LinkError, set_threshold, set_verbosity
+
+ dist = Distribution({'ext_modules': [ext]})
+ dist.parse_config_files()
+ options = dist.get_option_dict('build_ext')
+ if debug is None:
+ debug = sys.flags.debug
+ options['debug'] = ('ffiplatform', debug)
+ options['force'] = ('ffiplatform', True)
+ options['build_lib'] = ('ffiplatform', tmpdir)
+ options['build_temp'] = ('ffiplatform', tmpdir)
+ #
+ try:
+ old_level = set_threshold(0) or 0
+ try:
+ set_verbosity(compiler_verbose)
+ dist.run_command('build_ext')
+ cmd_obj = dist.get_command_obj('build_ext')
+ [soname] = cmd_obj.get_outputs()
+ finally:
+ set_threshold(old_level)
+ except (CompileError, LinkError) as e:
+ raise VerificationError('%s: %s' % (e.__class__.__name__, e))
+ #
+ return soname
+
+try:
+ from os.path import samefile
+except ImportError:
+ def samefile(f1, f2):
+ return os.path.abspath(f1) == os.path.abspath(f2)
+
+def maybe_relative_path(path):
+ if not os.path.isabs(path):
+ return path # already relative
+ dir = path
+ names = []
+ while True:
+ prevdir = dir
+ dir, name = os.path.split(prevdir)
+ if dir == prevdir or not dir:
+ return path # failed to make it relative
+ names.append(name)
+ try:
+ if samefile(dir, os.curdir):
+ names.reverse()
+ return os.path.join(*names)
+ except OSError:
+ pass
+
+# ____________________________________________________________
+
+try:
+ int_or_long = (int, long)
+ import cStringIO
+except NameError:
+ int_or_long = int # Python 3
+ import io as cStringIO
+
+def _flatten(x, f):
+ if isinstance(x, str):
+ f.write('%ds%s' % (len(x), x))
+ elif isinstance(x, dict):
+ keys = sorted(x.keys())
+ f.write('%dd' % len(keys))
+ for key in keys:
+ _flatten(key, f)
+ _flatten(x[key], f)
+ elif isinstance(x, (list, tuple)):
+ f.write('%dl' % len(x))
+ for value in x:
+ _flatten(value, f)
+ elif isinstance(x, int_or_long):
+ f.write('%di' % (x,))
+ else:
+ raise TypeError(
+            "the keyword arguments to verify() contain the unsupported object %r" % (x,))
+
+def flatten(x):
+ f = cStringIO.StringIO()
+ _flatten(x, f)
+ return f.getvalue()
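The `_flatten` helper above serializes the keyword arguments passed to `verify()` into a compact length-prefixed string usable as a cache key. A Python 3 sketch of the same encoding (`<len>s` for strings, `<n>d` for dicts with sorted keys, `<n>l` for lists/tuples, `<value>i` for ints):

```python
import io

# Re-implementation sketch of the length-prefixed encoding used above.
def _flatten(x, f):
    if isinstance(x, str):
        f.write('%ds%s' % (len(x), x))       # e.g. 'abc' -> '3sabc'
    elif isinstance(x, dict):
        keys = sorted(x.keys())              # sorted keys make the key stable
        f.write('%dd' % len(keys))
        for key in keys:
            _flatten(key, f)
            _flatten(x[key], f)
    elif isinstance(x, (list, tuple)):
        f.write('%dl' % len(x))
        for value in x:
            _flatten(value, f)
    elif isinstance(x, int):
        f.write('%di' % (x,))
    else:
        raise TypeError("unsupported object %r" % (x,))

def flatten(x):
    f = io.StringIO()
    _flatten(x, f)
    return f.getvalue()

print(flatten({'libraries': ['m']}))  # 1d9slibraries1l1sm
```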
diff --git a/templates/skills/file_manager/dependencies/cffi/lock.py b/templates/skills/file_manager/dependencies/cffi/lock.py
new file mode 100644
index 00000000..db91b715
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/lock.py
@@ -0,0 +1,30 @@
+import sys
+
+if sys.version_info < (3,):
+ try:
+ from thread import allocate_lock
+ except ImportError:
+ from dummy_thread import allocate_lock
+else:
+ try:
+ from _thread import allocate_lock
+ except ImportError:
+ from _dummy_thread import allocate_lock
+
+
+##import sys
+##l1 = allocate_lock
+
+##class allocate_lock(object):
+## def __init__(self):
+## self._real = l1()
+## def __enter__(self):
+## for i in range(4, 0, -1):
+## print sys._getframe(i).f_code
+## print
+## return self._real.__enter__()
+## def __exit__(self, *args):
+## return self._real.__exit__(*args)
+## def acquire(self, f):
+## assert f is False
+## return self._real.acquire(f)
diff --git a/templates/skills/file_manager/dependencies/cffi/model.py b/templates/skills/file_manager/dependencies/cffi/model.py
new file mode 100644
index 00000000..e5f4cae3
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/model.py
@@ -0,0 +1,618 @@
+import types
+import weakref
+
+from .lock import allocate_lock
+from .error import CDefError, VerificationError, VerificationMissing
+
+# type qualifiers
+Q_CONST = 0x01
+Q_RESTRICT = 0x02
+Q_VOLATILE = 0x04
+
+def qualify(quals, replace_with):
+ if quals & Q_CONST:
+ replace_with = ' const ' + replace_with.lstrip()
+ if quals & Q_VOLATILE:
+ replace_with = ' volatile ' + replace_with.lstrip()
+ if quals & Q_RESTRICT:
+ # It seems that __restrict is supported by gcc and msvc.
+ # If you hit some different compiler, add a #define in
+ # _cffi_include.h for it (and in its copies, documented there)
+ replace_with = ' __restrict ' + replace_with.lstrip()
+ return replace_with
+
+
+class BaseTypeByIdentity(object):
+ is_array_type = False
+ is_raw_function = False
+
+ def get_c_name(self, replace_with='', context='a C file', quals=0):
+ result = self.c_name_with_marker
+ assert result.count('&') == 1
+ # some logic duplication with ffi.getctype()... :-(
+ replace_with = replace_with.strip()
+ if replace_with:
+ if replace_with.startswith('*') and '&[' in result:
+ replace_with = '(%s)' % replace_with
+ elif not replace_with[0] in '[(':
+ replace_with = ' ' + replace_with
+ replace_with = qualify(quals, replace_with)
+ result = result.replace('&', replace_with)
+ if '$' in result:
+ raise VerificationError(
+ "cannot generate '%s' in %s: unknown type name"
+ % (self._get_c_name(), context))
+ return result
+
+ def _get_c_name(self):
+ return self.c_name_with_marker.replace('&', '')
+
+ def has_c_name(self):
+ return '$' not in self._get_c_name()
+
+ def is_integer_type(self):
+ return False
+
+ def get_cached_btype(self, ffi, finishlist, can_delay=False):
+ try:
+ BType = ffi._cached_btypes[self]
+ except KeyError:
+ BType = self.build_backend_type(ffi, finishlist)
+ BType2 = ffi._cached_btypes.setdefault(self, BType)
+ assert BType2 is BType
+ return BType
+
+ def __repr__(self):
+ return '<%s>' % (self._get_c_name(),)
+
+ def _get_items(self):
+ return [(name, getattr(self, name)) for name in self._attrs_]
+
+
+class BaseType(BaseTypeByIdentity):
+
+ def __eq__(self, other):
+ return (self.__class__ == other.__class__ and
+ self._get_items() == other._get_items())
+
+ def __ne__(self, other):
+ return not self == other
+
+ def __hash__(self):
+ return hash((self.__class__, tuple(self._get_items())))
+
+
+class VoidType(BaseType):
+ _attrs_ = ()
+
+ def __init__(self):
+ self.c_name_with_marker = 'void&'
+
+ def build_backend_type(self, ffi, finishlist):
+ return global_cache(self, ffi, 'new_void_type')
+
+void_type = VoidType()
+
+
+class BasePrimitiveType(BaseType):
+ def is_complex_type(self):
+ return False
+
+
+class PrimitiveType(BasePrimitiveType):
+ _attrs_ = ('name',)
+
+ ALL_PRIMITIVE_TYPES = {
+ 'char': 'c',
+ 'short': 'i',
+ 'int': 'i',
+ 'long': 'i',
+ 'long long': 'i',
+ 'signed char': 'i',
+ 'unsigned char': 'i',
+ 'unsigned short': 'i',
+ 'unsigned int': 'i',
+ 'unsigned long': 'i',
+ 'unsigned long long': 'i',
+ 'float': 'f',
+ 'double': 'f',
+ 'long double': 'f',
+ '_cffi_float_complex_t': 'j',
+ '_cffi_double_complex_t': 'j',
+ '_Bool': 'i',
+ # the following types are not primitive in the C sense
+ 'wchar_t': 'c',
+ 'char16_t': 'c',
+ 'char32_t': 'c',
+ 'int8_t': 'i',
+ 'uint8_t': 'i',
+ 'int16_t': 'i',
+ 'uint16_t': 'i',
+ 'int32_t': 'i',
+ 'uint32_t': 'i',
+ 'int64_t': 'i',
+ 'uint64_t': 'i',
+ 'int_least8_t': 'i',
+ 'uint_least8_t': 'i',
+ 'int_least16_t': 'i',
+ 'uint_least16_t': 'i',
+ 'int_least32_t': 'i',
+ 'uint_least32_t': 'i',
+ 'int_least64_t': 'i',
+ 'uint_least64_t': 'i',
+ 'int_fast8_t': 'i',
+ 'uint_fast8_t': 'i',
+ 'int_fast16_t': 'i',
+ 'uint_fast16_t': 'i',
+ 'int_fast32_t': 'i',
+ 'uint_fast32_t': 'i',
+ 'int_fast64_t': 'i',
+ 'uint_fast64_t': 'i',
+ 'intptr_t': 'i',
+ 'uintptr_t': 'i',
+ 'intmax_t': 'i',
+ 'uintmax_t': 'i',
+ 'ptrdiff_t': 'i',
+ 'size_t': 'i',
+ 'ssize_t': 'i',
+ }
+
+ def __init__(self, name):
+ assert name in self.ALL_PRIMITIVE_TYPES
+ self.name = name
+ self.c_name_with_marker = name + '&'
+
+ def is_char_type(self):
+ return self.ALL_PRIMITIVE_TYPES[self.name] == 'c'
+ def is_integer_type(self):
+ return self.ALL_PRIMITIVE_TYPES[self.name] == 'i'
+ def is_float_type(self):
+ return self.ALL_PRIMITIVE_TYPES[self.name] == 'f'
+ def is_complex_type(self):
+ return self.ALL_PRIMITIVE_TYPES[self.name] == 'j'
+
+ def build_backend_type(self, ffi, finishlist):
+ return global_cache(self, ffi, 'new_primitive_type', self.name)
+
+
+class UnknownIntegerType(BasePrimitiveType):
+ _attrs_ = ('name',)
+
+ def __init__(self, name):
+ self.name = name
+ self.c_name_with_marker = name + '&'
+
+ def is_integer_type(self):
+ return True
+
+ def build_backend_type(self, ffi, finishlist):
+ raise NotImplementedError("integer type '%s' can only be used after "
+ "compilation" % self.name)
+
+class UnknownFloatType(BasePrimitiveType):
+ _attrs_ = ('name', )
+
+ def __init__(self, name):
+ self.name = name
+ self.c_name_with_marker = name + '&'
+
+ def build_backend_type(self, ffi, finishlist):
+ raise NotImplementedError("float type '%s' can only be used after "
+ "compilation" % self.name)
+
+
+class BaseFunctionType(BaseType):
+ _attrs_ = ('args', 'result', 'ellipsis', 'abi')
+
+ def __init__(self, args, result, ellipsis, abi=None):
+ self.args = args
+ self.result = result
+ self.ellipsis = ellipsis
+ self.abi = abi
+ #
+ reprargs = [arg._get_c_name() for arg in self.args]
+ if self.ellipsis:
+ reprargs.append('...')
+ reprargs = reprargs or ['void']
+ replace_with = self._base_pattern % (', '.join(reprargs),)
+ if abi is not None:
+ replace_with = replace_with[:1] + abi + ' ' + replace_with[1:]
+ self.c_name_with_marker = (
+ self.result.c_name_with_marker.replace('&', replace_with))
+
+
+class RawFunctionType(BaseFunctionType):
+ # Corresponds to a C type like 'int(int)', which is the C type of
+ # a function, but not a pointer-to-function. The backend has no
+    # notion of such a type; it's used temporarily during parsing.
+ _base_pattern = '(&)(%s)'
+ is_raw_function = True
+
+ def build_backend_type(self, ffi, finishlist):
+ raise CDefError("cannot render the type %r: it is a function "
+ "type, not a pointer-to-function type" % (self,))
+
+ def as_function_pointer(self):
+ return FunctionPtrType(self.args, self.result, self.ellipsis, self.abi)
+
+
+class FunctionPtrType(BaseFunctionType):
+ _base_pattern = '(*&)(%s)'
+
+ def build_backend_type(self, ffi, finishlist):
+ result = self.result.get_cached_btype(ffi, finishlist)
+ args = []
+ for tp in self.args:
+ args.append(tp.get_cached_btype(ffi, finishlist))
+ abi_args = ()
+ if self.abi == "__stdcall":
+ if not self.ellipsis: # __stdcall ignored for variadic funcs
+ try:
+ abi_args = (ffi._backend.FFI_STDCALL,)
+ except AttributeError:
+ pass
+ return global_cache(self, ffi, 'new_function_type',
+ tuple(args), result, self.ellipsis, *abi_args)
+
+ def as_raw_function(self):
+ return RawFunctionType(self.args, self.result, self.ellipsis, self.abi)
+
+
+class PointerType(BaseType):
+ _attrs_ = ('totype', 'quals')
+
+ def __init__(self, totype, quals=0):
+ self.totype = totype
+ self.quals = quals
+ extra = " *&"
+ if totype.is_array_type:
+ extra = "(%s)" % (extra.lstrip(),)
+ extra = qualify(quals, extra)
+ self.c_name_with_marker = totype.c_name_with_marker.replace('&', extra)
+
+ def build_backend_type(self, ffi, finishlist):
+ BItem = self.totype.get_cached_btype(ffi, finishlist, can_delay=True)
+ return global_cache(self, ffi, 'new_pointer_type', BItem)
+
+voidp_type = PointerType(void_type)
+
+def ConstPointerType(totype):
+ return PointerType(totype, Q_CONST)
+
+const_voidp_type = ConstPointerType(void_type)
+
+
+class NamedPointerType(PointerType):
+ _attrs_ = ('totype', 'name')
+
+ def __init__(self, totype, name, quals=0):
+ PointerType.__init__(self, totype, quals)
+ self.name = name
+ self.c_name_with_marker = name + '&'
+
+
+class ArrayType(BaseType):
+ _attrs_ = ('item', 'length')
+ is_array_type = True
+
+ def __init__(self, item, length):
+ self.item = item
+ self.length = length
+ #
+ if length is None:
+ brackets = '&[]'
+ elif length == '...':
+ brackets = '&[/*...*/]'
+ else:
+ brackets = '&[%s]' % length
+ self.c_name_with_marker = (
+ self.item.c_name_with_marker.replace('&', brackets))
+
+ def length_is_unknown(self):
+ return isinstance(self.length, str)
+
+ def resolve_length(self, newlength):
+ return ArrayType(self.item, newlength)
+
+ def build_backend_type(self, ffi, finishlist):
+ if self.length_is_unknown():
+ raise CDefError("cannot render the type %r: unknown length" %
+ (self,))
+ self.item.get_cached_btype(ffi, finishlist) # force the item BType
+ BPtrItem = PointerType(self.item).get_cached_btype(ffi, finishlist)
+ return global_cache(self, ffi, 'new_array_type', BPtrItem, self.length)
+
+char_array_type = ArrayType(PrimitiveType('char'), None)
+
+
+class StructOrUnionOrEnum(BaseTypeByIdentity):
+ _attrs_ = ('name',)
+ forcename = None
+
+ def build_c_name_with_marker(self):
+ name = self.forcename or '%s %s' % (self.kind, self.name)
+ self.c_name_with_marker = name + '&'
+
+ def force_the_name(self, forcename):
+ self.forcename = forcename
+ self.build_c_name_with_marker()
+
+ def get_official_name(self):
+ assert self.c_name_with_marker.endswith('&')
+ return self.c_name_with_marker[:-1]
+
+
+class StructOrUnion(StructOrUnionOrEnum):
+ fixedlayout = None
+ completed = 0
+ partial = False
+ packed = 0
+
+ def __init__(self, name, fldnames, fldtypes, fldbitsize, fldquals=None):
+ self.name = name
+ self.fldnames = fldnames
+ self.fldtypes = fldtypes
+ self.fldbitsize = fldbitsize
+ self.fldquals = fldquals
+ self.build_c_name_with_marker()
+
+ def anonymous_struct_fields(self):
+ if self.fldtypes is not None:
+ for name, type in zip(self.fldnames, self.fldtypes):
+ if name == '' and isinstance(type, StructOrUnion):
+ yield type
+
+ def enumfields(self, expand_anonymous_struct_union=True):
+ fldquals = self.fldquals
+ if fldquals is None:
+ fldquals = (0,) * len(self.fldnames)
+ for name, type, bitsize, quals in zip(self.fldnames, self.fldtypes,
+ self.fldbitsize, fldquals):
+ if (name == '' and isinstance(type, StructOrUnion)
+ and expand_anonymous_struct_union):
+ # nested anonymous struct/union
+ for result in type.enumfields():
+ yield result
+ else:
+ yield (name, type, bitsize, quals)
+
+ def force_flatten(self):
+ # force the struct or union to have a declaration that lists
+ # directly all fields returned by enumfields(), flattening
+ # nested anonymous structs/unions.
+ names = []
+ types = []
+ bitsizes = []
+ fldquals = []
+ for name, type, bitsize, quals in self.enumfields():
+ names.append(name)
+ types.append(type)
+ bitsizes.append(bitsize)
+ fldquals.append(quals)
+ self.fldnames = tuple(names)
+ self.fldtypes = tuple(types)
+ self.fldbitsize = tuple(bitsizes)
+ self.fldquals = tuple(fldquals)
+
+ def get_cached_btype(self, ffi, finishlist, can_delay=False):
+ BType = StructOrUnionOrEnum.get_cached_btype(self, ffi, finishlist,
+ can_delay)
+ if not can_delay:
+ self.finish_backend_type(ffi, finishlist)
+ return BType
+
+ def finish_backend_type(self, ffi, finishlist):
+ if self.completed:
+ if self.completed != 2:
+ raise NotImplementedError("recursive structure declaration "
+ "for '%s'" % (self.name,))
+ return
+ BType = ffi._cached_btypes[self]
+ #
+ self.completed = 1
+ #
+ if self.fldtypes is None:
+ pass # not completing it: it's an opaque struct
+ #
+ elif self.fixedlayout is None:
+ fldtypes = [tp.get_cached_btype(ffi, finishlist)
+ for tp in self.fldtypes]
+ lst = list(zip(self.fldnames, fldtypes, self.fldbitsize))
+ extra_flags = ()
+ if self.packed:
+ if self.packed == 1:
+ extra_flags = (8,) # SF_PACKED
+ else:
+ extra_flags = (0, self.packed)
+ ffi._backend.complete_struct_or_union(BType, lst, self,
+ -1, -1, *extra_flags)
+ #
+ else:
+ fldtypes = []
+ fieldofs, fieldsize, totalsize, totalalignment = self.fixedlayout
+ for i in range(len(self.fldnames)):
+ fsize = fieldsize[i]
+ ftype = self.fldtypes[i]
+ #
+ if isinstance(ftype, ArrayType) and ftype.length_is_unknown():
+ # fix the length to match the total size
+ BItemType = ftype.item.get_cached_btype(ffi, finishlist)
+ nlen, nrest = divmod(fsize, ffi.sizeof(BItemType))
+ if nrest != 0:
+ self._verification_error(
+ "field '%s.%s' has a bogus size?" % (
+ self.name, self.fldnames[i] or '{}'))
+ ftype = ftype.resolve_length(nlen)
+ self.fldtypes = (self.fldtypes[:i] + (ftype,) +
+ self.fldtypes[i+1:])
+ #
+ BFieldType = ftype.get_cached_btype(ffi, finishlist)
+ if isinstance(ftype, ArrayType) and ftype.length is None:
+ assert fsize == 0
+ else:
+ bitemsize = ffi.sizeof(BFieldType)
+ if bitemsize != fsize:
+ self._verification_error(
+ "field '%s.%s' is declared as %d bytes, but is "
+ "really %d bytes" % (self.name,
+ self.fldnames[i] or '{}',
+ bitemsize, fsize))
+ fldtypes.append(BFieldType)
+ #
+ lst = list(zip(self.fldnames, fldtypes, self.fldbitsize, fieldofs))
+ ffi._backend.complete_struct_or_union(BType, lst, self,
+ totalsize, totalalignment)
+ self.completed = 2
+
+ def _verification_error(self, msg):
+ raise VerificationError(msg)
+
+ def check_not_partial(self):
+ if self.partial and self.fixedlayout is None:
+ raise VerificationMissing(self._get_c_name())
+
+ def build_backend_type(self, ffi, finishlist):
+ self.check_not_partial()
+ finishlist.append(self)
+ #
+ return global_cache(self, ffi, 'new_%s_type' % self.kind,
+ self.get_official_name(), key=self)
+
+
+class StructType(StructOrUnion):
+ kind = 'struct'
+
+
+class UnionType(StructOrUnion):
+ kind = 'union'
+
+
+class EnumType(StructOrUnionOrEnum):
+ kind = 'enum'
+ partial = False
+ partial_resolved = False
+
+ def __init__(self, name, enumerators, enumvalues, baseinttype=None):
+ self.name = name
+ self.enumerators = enumerators
+ self.enumvalues = enumvalues
+ self.baseinttype = baseinttype
+ self.build_c_name_with_marker()
+
+ def force_the_name(self, forcename):
+ StructOrUnionOrEnum.force_the_name(self, forcename)
+ if self.forcename is None:
+ name = self.get_official_name()
+ self.forcename = '$' + name.replace(' ', '_')
+
+ def check_not_partial(self):
+ if self.partial and not self.partial_resolved:
+ raise VerificationMissing(self._get_c_name())
+
+ def build_backend_type(self, ffi, finishlist):
+ self.check_not_partial()
+ base_btype = self.build_baseinttype(ffi, finishlist)
+ return global_cache(self, ffi, 'new_enum_type',
+ self.get_official_name(),
+ self.enumerators, self.enumvalues,
+ base_btype, key=self)
+
+ def build_baseinttype(self, ffi, finishlist):
+ if self.baseinttype is not None:
+ return self.baseinttype.get_cached_btype(ffi, finishlist)
+ #
+ if self.enumvalues:
+ smallest_value = min(self.enumvalues)
+ largest_value = max(self.enumvalues)
+ else:
+ import warnings
+ try:
+ # XXX! The goal is to ensure that the warnings.warn()
+ # will not suppress the warning. We want to get it
+ # several times if we reach this point several times.
+ __warningregistry__.clear()
+ except NameError:
+ pass
+ warnings.warn("%r has no values explicitly defined; "
+ "guessing that it is equivalent to 'unsigned int'"
+ % self._get_c_name())
+ smallest_value = largest_value = 0
+ if smallest_value < 0: # needs a signed type
+ sign = 1
+ candidate1 = PrimitiveType("int")
+ candidate2 = PrimitiveType("long")
+ else:
+ sign = 0
+ candidate1 = PrimitiveType("unsigned int")
+ candidate2 = PrimitiveType("unsigned long")
+ btype1 = candidate1.get_cached_btype(ffi, finishlist)
+ btype2 = candidate2.get_cached_btype(ffi, finishlist)
+ size1 = ffi.sizeof(btype1)
+ size2 = ffi.sizeof(btype2)
+ if (smallest_value >= ((-1) << (8*size1-1)) and
+ largest_value < (1 << (8*size1-sign))):
+ return btype1
+ if (smallest_value >= ((-1) << (8*size2-1)) and
+ largest_value < (1 << (8*size2-sign))):
+ return btype2
+ raise CDefError("%s values don't all fit into either 'long' "
+ "or 'unsigned long'" % self._get_c_name())
+
+def unknown_type(name, structname=None):
+ if structname is None:
+ structname = '$%s' % name
+ tp = StructType(structname, None, None, None)
+ tp.force_the_name(name)
+ tp.origin = "unknown_type"
+ return tp
+
+def unknown_ptr_type(name, structname=None):
+ if structname is None:
+ structname = '$$%s' % name
+ tp = StructType(structname, None, None, None)
+ return NamedPointerType(tp, name)
+
+
+global_lock = allocate_lock()
+_typecache_cffi_backend = weakref.WeakValueDictionary()
+
+def get_typecache(backend):
+ # returns _typecache_cffi_backend if backend is the _cffi_backend
+ # module, or type(backend).__typecache if backend is an instance of
+ # CTypesBackend (or some FakeBackend class during tests)
+ if isinstance(backend, types.ModuleType):
+ return _typecache_cffi_backend
+ with global_lock:
+ if not hasattr(type(backend), '__typecache'):
+ type(backend).__typecache = weakref.WeakValueDictionary()
+ return type(backend).__typecache
+
+def global_cache(srctype, ffi, funcname, *args, **kwds):
+ key = kwds.pop('key', (funcname, args))
+ assert not kwds
+ try:
+ return ffi._typecache[key]
+ except KeyError:
+ pass
+ try:
+ res = getattr(ffi._backend, funcname)(*args)
+ except NotImplementedError as e:
+ raise NotImplementedError("%s: %r: %s" % (funcname, srctype, e))
+ # note that setdefault() on WeakValueDictionary is not atomic
+ # and contains a rare bug (http://bugs.python.org/issue19542);
+ # we have to use a lock and do it ourselves
+ cache = ffi._typecache
+ with global_lock:
+ res1 = cache.get(key)
+ if res1 is None:
+ cache[key] = res
+ return res
+ else:
+ return res1
+
+def pointer_cache(ffi, BType):
+ return global_cache('?', ffi, 'new_pointer_type', BType)
+
+def attach_exception_info(e, name):
+ if e.args and type(e.args[0]) is str:
+ e.args = ('%s: %s' % (name, e.args[0]),) + e.args[1:]
diff --git a/templates/skills/file_manager/dependencies/cffi/parse_c_type.h b/templates/skills/file_manager/dependencies/cffi/parse_c_type.h
new file mode 100644
index 00000000..84e4ef85
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/parse_c_type.h
@@ -0,0 +1,181 @@
+
+/* This part is from file 'cffi/parse_c_type.h'. It is copied at the
+ beginning of C sources generated by CFFI's ffi.set_source(). */
+
+typedef void *_cffi_opcode_t;
+
+#define _CFFI_OP(opcode, arg) (_cffi_opcode_t)(opcode | (((uintptr_t)(arg)) << 8))
+#define _CFFI_GETOP(cffi_opcode) ((unsigned char)(uintptr_t)cffi_opcode)
+#define _CFFI_GETARG(cffi_opcode) (((intptr_t)cffi_opcode) >> 8)
+
+#define _CFFI_OP_PRIMITIVE 1
+#define _CFFI_OP_POINTER 3
+#define _CFFI_OP_ARRAY 5
+#define _CFFI_OP_OPEN_ARRAY 7
+#define _CFFI_OP_STRUCT_UNION 9
+#define _CFFI_OP_ENUM 11
+#define _CFFI_OP_FUNCTION 13
+#define _CFFI_OP_FUNCTION_END 15
+#define _CFFI_OP_NOOP 17
+#define _CFFI_OP_BITFIELD 19
+#define _CFFI_OP_TYPENAME 21
+#define _CFFI_OP_CPYTHON_BLTN_V 23 // varargs
+#define _CFFI_OP_CPYTHON_BLTN_N 25 // noargs
+#define _CFFI_OP_CPYTHON_BLTN_O 27 // O (i.e. a single arg)
+#define _CFFI_OP_CONSTANT 29
+#define _CFFI_OP_CONSTANT_INT 31
+#define _CFFI_OP_GLOBAL_VAR 33
+#define _CFFI_OP_DLOPEN_FUNC 35
+#define _CFFI_OP_DLOPEN_CONST 37
+#define _CFFI_OP_GLOBAL_VAR_F 39
+#define _CFFI_OP_EXTERN_PYTHON 41
+
+#define _CFFI_PRIM_VOID 0
+#define _CFFI_PRIM_BOOL 1
+#define _CFFI_PRIM_CHAR 2
+#define _CFFI_PRIM_SCHAR 3
+#define _CFFI_PRIM_UCHAR 4
+#define _CFFI_PRIM_SHORT 5
+#define _CFFI_PRIM_USHORT 6
+#define _CFFI_PRIM_INT 7
+#define _CFFI_PRIM_UINT 8
+#define _CFFI_PRIM_LONG 9
+#define _CFFI_PRIM_ULONG 10
+#define _CFFI_PRIM_LONGLONG 11
+#define _CFFI_PRIM_ULONGLONG 12
+#define _CFFI_PRIM_FLOAT 13
+#define _CFFI_PRIM_DOUBLE 14
+#define _CFFI_PRIM_LONGDOUBLE 15
+
+#define _CFFI_PRIM_WCHAR 16
+#define _CFFI_PRIM_INT8 17
+#define _CFFI_PRIM_UINT8 18
+#define _CFFI_PRIM_INT16 19
+#define _CFFI_PRIM_UINT16 20
+#define _CFFI_PRIM_INT32 21
+#define _CFFI_PRIM_UINT32 22
+#define _CFFI_PRIM_INT64 23
+#define _CFFI_PRIM_UINT64 24
+#define _CFFI_PRIM_INTPTR 25
+#define _CFFI_PRIM_UINTPTR 26
+#define _CFFI_PRIM_PTRDIFF 27
+#define _CFFI_PRIM_SIZE 28
+#define _CFFI_PRIM_SSIZE 29
+#define _CFFI_PRIM_INT_LEAST8 30
+#define _CFFI_PRIM_UINT_LEAST8 31
+#define _CFFI_PRIM_INT_LEAST16 32
+#define _CFFI_PRIM_UINT_LEAST16 33
+#define _CFFI_PRIM_INT_LEAST32 34
+#define _CFFI_PRIM_UINT_LEAST32 35
+#define _CFFI_PRIM_INT_LEAST64 36
+#define _CFFI_PRIM_UINT_LEAST64 37
+#define _CFFI_PRIM_INT_FAST8 38
+#define _CFFI_PRIM_UINT_FAST8 39
+#define _CFFI_PRIM_INT_FAST16 40
+#define _CFFI_PRIM_UINT_FAST16 41
+#define _CFFI_PRIM_INT_FAST32 42
+#define _CFFI_PRIM_UINT_FAST32 43
+#define _CFFI_PRIM_INT_FAST64 44
+#define _CFFI_PRIM_UINT_FAST64 45
+#define _CFFI_PRIM_INTMAX 46
+#define _CFFI_PRIM_UINTMAX 47
+#define _CFFI_PRIM_FLOATCOMPLEX 48
+#define _CFFI_PRIM_DOUBLECOMPLEX 49
+#define _CFFI_PRIM_CHAR16 50
+#define _CFFI_PRIM_CHAR32 51
+
+#define _CFFI__NUM_PRIM 52
+#define _CFFI__UNKNOWN_PRIM (-1)
+#define _CFFI__UNKNOWN_FLOAT_PRIM (-2)
+#define _CFFI__UNKNOWN_LONG_DOUBLE (-3)
+
+#define _CFFI__IO_FILE_STRUCT (-1)
+
+
+struct _cffi_global_s {
+ const char *name;
+ void *address;
+ _cffi_opcode_t type_op;
+ void *size_or_direct_fn; // OP_GLOBAL_VAR: size, or 0 if unknown
+ // OP_CPYTHON_BLTN_*: addr of direct function
+};
+
+struct _cffi_getconst_s {
+ unsigned long long value;
+ const struct _cffi_type_context_s *ctx;
+ int gindex;
+};
+
+struct _cffi_struct_union_s {
+ const char *name;
+ int type_index; // -> _cffi_types, on a OP_STRUCT_UNION
+ int flags; // _CFFI_F_* flags below
+ size_t size;
+ int alignment;
+ int first_field_index; // -> _cffi_fields array
+ int num_fields;
+};
+#define _CFFI_F_UNION 0x01 // is a union, not a struct
+#define _CFFI_F_CHECK_FIELDS 0x02 // complain if fields are not in the
+ // "standard layout" or if some are missing
+#define _CFFI_F_PACKED 0x04 // for CHECK_FIELDS, assume a packed struct
+#define _CFFI_F_EXTERNAL 0x08 // in some other ffi.include()
+#define _CFFI_F_OPAQUE 0x10 // opaque
+
+struct _cffi_field_s {
+ const char *name;
+ size_t field_offset;
+ size_t field_size;
+ _cffi_opcode_t field_type_op;
+};
+
+struct _cffi_enum_s {
+ const char *name;
+ int type_index; // -> _cffi_types, on a OP_ENUM
+ int type_prim; // _CFFI_PRIM_xxx
+ const char *enumerators; // comma-delimited string
+};
+
+struct _cffi_typename_s {
+ const char *name;
+ int type_index; /* if opaque, points to a possibly artificial
+ OP_STRUCT which is itself opaque */
+};
+
+struct _cffi_type_context_s {
+ _cffi_opcode_t *types;
+ const struct _cffi_global_s *globals;
+ const struct _cffi_field_s *fields;
+ const struct _cffi_struct_union_s *struct_unions;
+ const struct _cffi_enum_s *enums;
+ const struct _cffi_typename_s *typenames;
+ int num_globals;
+ int num_struct_unions;
+ int num_enums;
+ int num_typenames;
+ const char *const *includes;
+ int num_types;
+ int flags; /* future extension */
+};
+
+struct _cffi_parse_info_s {
+ const struct _cffi_type_context_s *ctx;
+ _cffi_opcode_t *output;
+ unsigned int output_size;
+ size_t error_location;
+ const char *error_message;
+};
+
+struct _cffi_externpy_s {
+ const char *name;
+ size_t size_of_result;
+ void *reserved1, *reserved2;
+};
+
+#ifdef _CFFI_INTERNAL
+static int parse_c_type(struct _cffi_parse_info_s *info, const char *input);
+static int search_in_globals(const struct _cffi_type_context_s *ctx,
+ const char *search, size_t search_len);
+static int search_in_struct_unions(const struct _cffi_type_context_s *ctx,
+ const char *search, size_t search_len);
+#endif
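The `_CFFI_OP` family of macros above packs an opcode and its argument into one pointer-sized word: the low 8 bits hold the opcode, the rest hold the argument. The same packing, sketched in Python for illustration (the C `_CFFI_GETARG` uses a signed shift; for non-negative arguments the result is the same):

```python
# Python sketch of _CFFI_OP / _CFFI_GETOP / _CFFI_GETARG.
OP_POINTER = 3    # _CFFI_OP_POINTER
OP_FUNCTION = 13  # _CFFI_OP_FUNCTION

def op(opcode, arg):
    return opcode | (arg << 8)   # opcode in low byte, arg above it

def getop(word):
    return word & 0xFF

def getarg(word):
    return word >> 8

word = op(OP_POINTER, 42)
print(getop(word), getarg(word))  # 3 42
```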
diff --git a/templates/skills/file_manager/dependencies/cffi/pkgconfig.py b/templates/skills/file_manager/dependencies/cffi/pkgconfig.py
new file mode 100644
index 00000000..5c93f15a
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/pkgconfig.py
@@ -0,0 +1,121 @@
+# pkg-config, https://www.freedesktop.org/wiki/Software/pkg-config/ integration for cffi
+import sys, os, subprocess
+
+from .error import PkgConfigError
+
+
+def merge_flags(cfg1, cfg2):
+ """Merge values from cffi config flags cfg2 to cf1
+
+ Example:
+ merge_flags({"libraries": ["one"]}, {"libraries": ["two"]})
+ {"libraries": ["one", "two"]}
+ """
+ for key, value in cfg2.items():
+ if key not in cfg1:
+ cfg1[key] = value
+ else:
+ if not isinstance(cfg1[key], list):
+ raise TypeError("cfg1[%r] should be a list of strings" % (key,))
+ if not isinstance(value, list):
+ raise TypeError("cfg2[%r] should be a list of strings" % (key,))
+ cfg1[key].extend(value)
+ return cfg1
+
+
+def call(libname, flag, encoding=sys.getfilesystemencoding()):
+ """Calls pkg-config and returns the output if found
+ """
+ a = ["pkg-config", "--print-errors"]
+ a.append(flag)
+ a.append(libname)
+ try:
+ pc = subprocess.Popen(a, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
+ except EnvironmentError as e:
+ raise PkgConfigError("cannot run pkg-config: %s" % (str(e).strip(),))
+
+ bout, berr = pc.communicate()
+ if pc.returncode != 0:
+ try:
+ berr = berr.decode(encoding)
+ except Exception:
+ pass
+ raise PkgConfigError(berr.strip())
+
+ if sys.version_info >= (3,) and not isinstance(bout, str): # Python 3.x
+ try:
+ bout = bout.decode(encoding)
+ except UnicodeDecodeError:
+ raise PkgConfigError("pkg-config %s %s returned bytes that cannot "
+ "be decoded with encoding %r:\n%r" %
+ (flag, libname, encoding, bout))
+
+ if os.altsep != '\\' and '\\' in bout:
+ raise PkgConfigError("pkg-config %s %s returned an unsupported "
+ "backslash-escaped output:\n%r" %
+ (flag, libname, bout))
+ return bout
+
+
+def flags_from_pkgconfig(libs):
+ r"""Return compiler line flags for FFI.set_source based on pkg-config output
+
+    Usage:
+        ...
+        ffibuilder.set_source("_foo", pkgconfig = ["libfoo", "libbar >= 1.8.3"])
+
+    If pkg-config is installed on the build machine, then the arguments
+    include_dirs, library_dirs, libraries, define_macros, extra_compile_args
+    and extra_link_args are extended with the output of pkg-config for libfoo
+    and libbar.
+
+ Raises PkgConfigError in case the pkg-config call fails.
+ """
+
+ def get_include_dirs(string):
+ return [x[2:] for x in string.split() if x.startswith("-I")]
+
+ def get_library_dirs(string):
+ return [x[2:] for x in string.split() if x.startswith("-L")]
+
+ def get_libraries(string):
+ return [x[2:] for x in string.split() if x.startswith("-l")]
+
+ # convert -Dfoo=bar to list of tuples [("foo", "bar")] expected by distutils
+ def get_macros(string):
+ def _macro(x):
+ x = x[2:] # drop "-D"
+ if '=' in x:
+ return tuple(x.split("=", 1)) # "-Dfoo=bar" => ("foo", "bar")
+ else:
+ return (x, None) # "-Dfoo" => ("foo", None)
+ return [_macro(x) for x in string.split() if x.startswith("-D")]
+
+ def get_other_cflags(string):
+ return [x for x in string.split() if not x.startswith("-I") and
+ not x.startswith("-D")]
+
+ def get_other_libs(string):
+ return [x for x in string.split() if not x.startswith("-L") and
+ not x.startswith("-l")]
+
+ # return kwargs for given libname
+ def kwargs(libname):
+ fse = sys.getfilesystemencoding()
+ all_cflags = call(libname, "--cflags")
+ all_libs = call(libname, "--libs")
+ return {
+ "include_dirs": get_include_dirs(all_cflags),
+ "library_dirs": get_library_dirs(all_libs),
+ "libraries": get_libraries(all_libs),
+ "define_macros": get_macros(all_cflags),
+ "extra_compile_args": get_other_cflags(all_cflags),
+ "extra_link_args": get_other_libs(all_libs),
+ }
+
+ # merge all arguments together
+ ret = {}
+ for libname in libs:
+ lib_flags = kwargs(libname)
+ merge_flags(ret, lib_flags)
+ return ret
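The small parsers inside `flags_from_pkgconfig` split pkg-config's output by flag prefix. The same splitting logic extracted for illustration, applied to a sample output string (the flags shown are illustrative, not from a real pkg-config run):

```python
# Standalone copies of the flag parsers used above.
def get_include_dirs(s):
    return [x[2:] for x in s.split() if x.startswith("-I")]

def get_libraries(s):
    return [x[2:] for x in s.split() if x.startswith("-l")]

def get_macros(s):
    # "-Dfoo=bar" -> ("foo", "bar"), "-Dfoo" -> ("foo", None),
    # the (name, value) tuple shape distutils expects for define_macros
    def _macro(x):
        x = x[2:]  # drop "-D"
        if '=' in x:
            return tuple(x.split("=", 1))
        return (x, None)
    return [_macro(x) for x in s.split() if x.startswith("-D")]

cflags = "-I/usr/include/png -DPNG_DEBUG=1 -DSTATIC"
libs = "-L/usr/lib -lpng16 -lz"
print(get_include_dirs(cflags))  # ['/usr/include/png']
print(get_macros(cflags))        # [('PNG_DEBUG', '1'), ('STATIC', None)]
print(get_libraries(libs))       # ['png16', 'z']
```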
diff --git a/templates/skills/file_manager/dependencies/cffi/recompiler.py b/templates/skills/file_manager/dependencies/cffi/recompiler.py
new file mode 100644
index 00000000..57781a3c
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/recompiler.py
@@ -0,0 +1,1598 @@
+import os, sys, io
+from . import ffiplatform, model
+from .error import VerificationError
+from .cffi_opcode import *
+
+VERSION_BASE = 0x2601
+VERSION_EMBEDDED = 0x2701
+VERSION_CHAR16CHAR32 = 0x2801
+
+USE_LIMITED_API = (sys.platform != 'win32' or sys.version_info < (3, 0) or
+ sys.version_info >= (3, 5))
+
+
+class GlobalExpr:
+ def __init__(self, name, address, type_op, size=0, check_value=0):
+ self.name = name
+ self.address = address
+ self.type_op = type_op
+ self.size = size
+ self.check_value = check_value
+
+ def as_c_expr(self):
+ return ' { "%s", (void *)%s, %s, (void *)%s },' % (
+ self.name, self.address, self.type_op.as_c_expr(), self.size)
+
+ def as_python_expr(self):
+ return "b'%s%s',%d" % (self.type_op.as_python_bytes(), self.name,
+ self.check_value)
+
+class FieldExpr:
+ def __init__(self, name, field_offset, field_size, fbitsize, field_type_op):
+ self.name = name
+ self.field_offset = field_offset
+ self.field_size = field_size
+ self.fbitsize = fbitsize
+ self.field_type_op = field_type_op
+
+ def as_c_expr(self):
+ spaces = " " * len(self.name)
+ return (' { "%s", %s,\n' % (self.name, self.field_offset) +
+ ' %s %s,\n' % (spaces, self.field_size) +
+ ' %s %s },' % (spaces, self.field_type_op.as_c_expr()))
+
+ def as_python_expr(self):
+ raise NotImplementedError
+
+ def as_field_python_expr(self):
+ if self.field_type_op.op == OP_NOOP:
+ size_expr = ''
+ elif self.field_type_op.op == OP_BITFIELD:
+ size_expr = format_four_bytes(self.fbitsize)
+ else:
+ raise NotImplementedError
+ return "b'%s%s%s'" % (self.field_type_op.as_python_bytes(),
+ size_expr,
+ self.name)
+
+class StructUnionExpr:
+ def __init__(self, name, type_index, flags, size, alignment, comment,
+ first_field_index, c_fields):
+ self.name = name
+ self.type_index = type_index
+ self.flags = flags
+ self.size = size
+ self.alignment = alignment
+ self.comment = comment
+ self.first_field_index = first_field_index
+ self.c_fields = c_fields
+
+ def as_c_expr(self):
+ return (' { "%s", %d, %s,' % (self.name, self.type_index, self.flags)
+ + '\n %s, %s, ' % (self.size, self.alignment)
+ + '%d, %d ' % (self.first_field_index, len(self.c_fields))
+ + ('/* %s */ ' % self.comment if self.comment else '')
+ + '},')
+
+ def as_python_expr(self):
+ flags = eval(self.flags, G_FLAGS)
+ fields_expr = [c_field.as_field_python_expr()
+ for c_field in self.c_fields]
+ return "(b'%s%s%s',%s)" % (
+ format_four_bytes(self.type_index),
+ format_four_bytes(flags),
+ self.name,
+ ','.join(fields_expr))
+
+class EnumExpr:
+ def __init__(self, name, type_index, size, signed, allenums):
+ self.name = name
+ self.type_index = type_index
+ self.size = size
+ self.signed = signed
+ self.allenums = allenums
+
+ def as_c_expr(self):
+ return (' { "%s", %d, _cffi_prim_int(%s, %s),\n'
+ ' "%s" },' % (self.name, self.type_index,
+ self.size, self.signed, self.allenums))
+
+ def as_python_expr(self):
+ prim_index = {
+ (1, 0): PRIM_UINT8, (1, 1): PRIM_INT8,
+ (2, 0): PRIM_UINT16, (2, 1): PRIM_INT16,
+ (4, 0): PRIM_UINT32, (4, 1): PRIM_INT32,
+ (8, 0): PRIM_UINT64, (8, 1): PRIM_INT64,
+ }[self.size, self.signed]
+ return "b'%s%s%s\\x00%s'" % (format_four_bytes(self.type_index),
+ format_four_bytes(prim_index),
+ self.name, self.allenums)
+
+class TypenameExpr:
+ def __init__(self, name, type_index):
+ self.name = name
+ self.type_index = type_index
+
+ def as_c_expr(self):
+ return ' { "%s", %d },' % (self.name, self.type_index)
+
+ def as_python_expr(self):
+ return "b'%s%s'" % (format_four_bytes(self.type_index), self.name)
+
+
+# ____________________________________________________________
+
+
+class Recompiler:
+ _num_externpy = 0
+
+ def __init__(self, ffi, module_name, target_is_python=False):
+ self.ffi = ffi
+ self.module_name = module_name
+ self.target_is_python = target_is_python
+ self._version = VERSION_BASE
+
+ def needs_version(self, ver):
+ self._version = max(self._version, ver)
+
+ def collect_type_table(self):
+ self._typesdict = {}
+ self._generate("collecttype")
+ #
+ all_decls = sorted(self._typesdict, key=str)
+ #
+ # prepare all FUNCTION bytecode sequences first
+ self.cffi_types = []
+ for tp in all_decls:
+ if tp.is_raw_function:
+ assert self._typesdict[tp] is None
+ self._typesdict[tp] = len(self.cffi_types)
+ self.cffi_types.append(tp) # placeholder
+ for tp1 in tp.args:
+ assert isinstance(tp1, (model.VoidType,
+ model.BasePrimitiveType,
+ model.PointerType,
+ model.StructOrUnionOrEnum,
+ model.FunctionPtrType))
+ if self._typesdict[tp1] is None:
+ self._typesdict[tp1] = len(self.cffi_types)
+ self.cffi_types.append(tp1) # placeholder
+ self.cffi_types.append('END') # placeholder
+ #
+ # prepare all OTHER bytecode sequences
+ for tp in all_decls:
+ if not tp.is_raw_function and self._typesdict[tp] is None:
+ self._typesdict[tp] = len(self.cffi_types)
+ self.cffi_types.append(tp) # placeholder
+ if tp.is_array_type and tp.length is not None:
+ self.cffi_types.append('LEN') # placeholder
+ assert None not in self._typesdict.values()
+ #
+ # collect all structs and unions and enums
+ self._struct_unions = {}
+ self._enums = {}
+ for tp in all_decls:
+ if isinstance(tp, model.StructOrUnion):
+ self._struct_unions[tp] = None
+ elif isinstance(tp, model.EnumType):
+ self._enums[tp] = None
+ for i, tp in enumerate(sorted(self._struct_unions,
+ key=lambda tp: tp.name)):
+ self._struct_unions[tp] = i
+ for i, tp in enumerate(sorted(self._enums,
+ key=lambda tp: tp.name)):
+ self._enums[tp] = i
+ #
+ # emit all bytecode sequences now
+ for tp in all_decls:
+ method = getattr(self, '_emit_bytecode_' + tp.__class__.__name__)
+ method(tp, self._typesdict[tp])
+ #
+ # consistency check
+ for op in self.cffi_types:
+ assert isinstance(op, CffiOp)
+ self.cffi_types = tuple(self.cffi_types) # don't change any more
+
+ def _enum_fields(self, tp):
+ # When producing C, expand all anonymous struct/union fields.
+ # That's necessary to have C code checking the offsets of the
+ # individual fields contained in them. When producing Python,
+ # don't do it and instead write it like it is, with the
+ # corresponding fields having an empty name. Empty names are
+ # recognized at runtime when we import the generated Python
+ # file.
+ expand_anonymous_struct_union = not self.target_is_python
+ return tp.enumfields(expand_anonymous_struct_union)
+
+ def _do_collect_type(self, tp):
+ if not isinstance(tp, model.BaseTypeByIdentity):
+ if isinstance(tp, tuple):
+ for x in tp:
+ self._do_collect_type(x)
+ return
+ if tp not in self._typesdict:
+ self._typesdict[tp] = None
+ if isinstance(tp, model.FunctionPtrType):
+ self._do_collect_type(tp.as_raw_function())
+ elif isinstance(tp, model.StructOrUnion):
+ if tp.fldtypes is not None and (
+ tp not in self.ffi._parser._included_declarations):
+ for name1, tp1, _, _ in self._enum_fields(tp):
+ self._do_collect_type(self._field_type(tp, name1, tp1))
+ else:
+ for _, x in tp._get_items():
+ self._do_collect_type(x)
+
+ def _generate(self, step_name):
+ lst = self.ffi._parser._declarations.items()
+ for name, (tp, quals) in sorted(lst):
+ kind, realname = name.split(' ', 1)
+ try:
+ method = getattr(self, '_generate_cpy_%s_%s' % (kind,
+ step_name))
+ except AttributeError:
+ raise VerificationError(
+ "not implemented in recompile(): %r" % name)
+ try:
+ self._current_quals = quals
+ method(tp, realname)
+ except Exception as e:
+ model.attach_exception_info(e, name)
+ raise
+
+ # ----------
+
+ ALL_STEPS = ["global", "field", "struct_union", "enum", "typename"]
+
+ def collect_step_tables(self):
+ # collect the declarations for '_cffi_globals', '_cffi_typenames', etc.
+ self._lsts = {}
+ for step_name in self.ALL_STEPS:
+ self._lsts[step_name] = []
+ self._seen_struct_unions = set()
+ self._generate("ctx")
+ self._add_missing_struct_unions()
+ #
+ for step_name in self.ALL_STEPS:
+ lst = self._lsts[step_name]
+ if step_name != "field":
+ lst.sort(key=lambda entry: entry.name)
+ self._lsts[step_name] = tuple(lst) # don't change any more
+ #
+ # check for a possible internal inconsistency: _cffi_struct_unions
+ # should have been generated with exactly self._struct_unions
+ lst = self._lsts["struct_union"]
+ for tp, i in self._struct_unions.items():
+ assert i < len(lst)
+ assert lst[i].name == tp.name
+ assert len(lst) == len(self._struct_unions)
+ # same with enums
+ lst = self._lsts["enum"]
+ for tp, i in self._enums.items():
+ assert i < len(lst)
+ assert lst[i].name == tp.name
+ assert len(lst) == len(self._enums)
+
+ # ----------
+
+ def _prnt(self, what=''):
+ self._f.write(what + '\n')
+
+ def write_source_to_f(self, f, preamble):
+ if self.target_is_python:
+ assert preamble is None
+ self.write_py_source_to_f(f)
+ else:
+ assert preamble is not None
+ self.write_c_source_to_f(f, preamble)
+
+ def _rel_readlines(self, filename):
+ g = open(os.path.join(os.path.dirname(__file__), filename), 'r')
+ lines = g.readlines()
+ g.close()
+ return lines
+
+ def write_c_source_to_f(self, f, preamble):
+ self._f = f
+ prnt = self._prnt
+ if self.ffi._embedding is not None:
+ prnt('#define _CFFI_USE_EMBEDDING')
+ if not USE_LIMITED_API:
+ prnt('#define _CFFI_NO_LIMITED_API')
+ #
+ # first the '#include' (actually done by inlining the file's content)
+ lines = self._rel_readlines('_cffi_include.h')
+ i = lines.index('#include "parse_c_type.h"\n')
+ lines[i:i+1] = self._rel_readlines('parse_c_type.h')
+ prnt(''.join(lines))
+ #
+ # if we have ffi._embedding != None, we give it here as a macro
+ # and include an extra file
+ base_module_name = self.module_name.split('.')[-1]
+ if self.ffi._embedding is not None:
+ prnt('#define _CFFI_MODULE_NAME "%s"' % (self.module_name,))
+ prnt('static const char _CFFI_PYTHON_STARTUP_CODE[] = {')
+ self._print_string_literal_in_array(self.ffi._embedding)
+ prnt('0 };')
+ prnt('#ifdef PYPY_VERSION')
+ prnt('# define _CFFI_PYTHON_STARTUP_FUNC _cffi_pypyinit_%s' % (
+ base_module_name,))
+ prnt('#elif PY_MAJOR_VERSION >= 3')
+ prnt('# define _CFFI_PYTHON_STARTUP_FUNC PyInit_%s' % (
+ base_module_name,))
+ prnt('#else')
+ prnt('# define _CFFI_PYTHON_STARTUP_FUNC init%s' % (
+ base_module_name,))
+ prnt('#endif')
+ lines = self._rel_readlines('_embedding.h')
+ i = lines.index('#include "_cffi_errors.h"\n')
+ lines[i:i+1] = self._rel_readlines('_cffi_errors.h')
+ prnt(''.join(lines))
+ self.needs_version(VERSION_EMBEDDED)
+ #
+ # then paste the C source given by the user, verbatim.
+ prnt('/************************************************************/')
+ prnt()
+ prnt(preamble)
+ prnt()
+ prnt('/************************************************************/')
+ prnt()
+ #
+ # the declaration of '_cffi_types'
+ prnt('static void *_cffi_types[] = {')
+ typeindex2type = dict([(i, tp) for (tp, i) in self._typesdict.items()])
+ for i, op in enumerate(self.cffi_types):
+ comment = ''
+ if i in typeindex2type:
+ comment = ' // ' + typeindex2type[i]._get_c_name()
+ prnt('/* %2d */ %s,%s' % (i, op.as_c_expr(), comment))
+ if not self.cffi_types:
+ prnt(' 0')
+ prnt('};')
+ prnt()
+ #
+ # call generate_cpy_xxx_decl(), for every xxx found from
+ # ffi._parser._declarations. This generates all the functions.
+ self._seen_constants = set()
+ self._generate("decl")
+ #
+ # the declaration of '_cffi_globals' and '_cffi_typenames'
+ nums = {}
+ for step_name in self.ALL_STEPS:
+ lst = self._lsts[step_name]
+ nums[step_name] = len(lst)
+ if nums[step_name] > 0:
+ prnt('static const struct _cffi_%s_s _cffi_%ss[] = {' % (
+ step_name, step_name))
+ for entry in lst:
+ prnt(entry.as_c_expr())
+ prnt('};')
+ prnt()
+ #
+ # the declaration of '_cffi_includes'
+ if self.ffi._included_ffis:
+ prnt('static const char * const _cffi_includes[] = {')
+ for ffi_to_include in self.ffi._included_ffis:
+ try:
+ included_module_name, included_source = (
+ ffi_to_include._assigned_source[:2])
+ except AttributeError:
+ raise VerificationError(
+ "ffi object %r includes %r, but the latter has not "
+ "been prepared with set_source()" % (
+ self.ffi, ffi_to_include,))
+ if included_source is None:
+ raise VerificationError(
+ "not implemented yet: ffi.include() of a Python-based "
+ "ffi inside a C-based ffi")
+ prnt(' "%s",' % (included_module_name,))
+ prnt(' NULL')
+ prnt('};')
+ prnt()
+ #
+ # the declaration of '_cffi_type_context'
+ prnt('static const struct _cffi_type_context_s _cffi_type_context = {')
+ prnt(' _cffi_types,')
+ for step_name in self.ALL_STEPS:
+ if nums[step_name] > 0:
+ prnt(' _cffi_%ss,' % step_name)
+ else:
+ prnt(' NULL, /* no %ss */' % step_name)
+ for step_name in self.ALL_STEPS:
+ if step_name != "field":
+ prnt(' %d, /* num_%ss */' % (nums[step_name], step_name))
+ if self.ffi._included_ffis:
+ prnt(' _cffi_includes,')
+ else:
+ prnt(' NULL, /* no includes */')
+ prnt(' %d, /* num_types */' % (len(self.cffi_types),))
+ flags = 0
+ if self._num_externpy > 0 or self.ffi._embedding is not None:
+ flags |= 1 # set to mean that we use extern "Python"
+ prnt(' %d, /* flags */' % flags)
+ prnt('};')
+ prnt()
+ #
+ # the init function
+ prnt('#ifdef __GNUC__')
+ prnt('# pragma GCC visibility push(default) /* for -fvisibility= */')
+ prnt('#endif')
+ prnt()
+ prnt('#ifdef PYPY_VERSION')
+ prnt('PyMODINIT_FUNC')
+ prnt('_cffi_pypyinit_%s(const void *p[])' % (base_module_name,))
+ prnt('{')
+ if flags & 1:
+ prnt(' if (((intptr_t)p[0]) >= 0x0A03) {')
+ prnt(' _cffi_call_python_org = '
+ '(void(*)(struct _cffi_externpy_s *, char *))p[1];')
+ prnt(' }')
+ prnt(' p[0] = (const void *)0x%x;' % self._version)
+ prnt(' p[1] = &_cffi_type_context;')
+ prnt('#if PY_MAJOR_VERSION >= 3')
+ prnt(' return NULL;')
+ prnt('#endif')
+ prnt('}')
+ # on Windows, distutils insists on putting init_cffi_xyz in
+ # 'export_symbols', so instead of fighting it, just give up and
+ # give it one
+ prnt('# ifdef _MSC_VER')
+ prnt(' PyMODINIT_FUNC')
+ prnt('# if PY_MAJOR_VERSION >= 3')
+ prnt(' PyInit_%s(void) { return NULL; }' % (base_module_name,))
+ prnt('# else')
+ prnt(' init%s(void) { }' % (base_module_name,))
+ prnt('# endif')
+ prnt('# endif')
+ prnt('#elif PY_MAJOR_VERSION >= 3')
+ prnt('PyMODINIT_FUNC')
+ prnt('PyInit_%s(void)' % (base_module_name,))
+ prnt('{')
+ prnt(' return _cffi_init("%s", 0x%x, &_cffi_type_context);' % (
+ self.module_name, self._version))
+ prnt('}')
+ prnt('#else')
+ prnt('PyMODINIT_FUNC')
+ prnt('init%s(void)' % (base_module_name,))
+ prnt('{')
+ prnt(' _cffi_init("%s", 0x%x, &_cffi_type_context);' % (
+ self.module_name, self._version))
+ prnt('}')
+ prnt('#endif')
+ prnt()
+ prnt('#ifdef __GNUC__')
+ prnt('# pragma GCC visibility pop')
+ prnt('#endif')
+ self._version = None
+
+ def _to_py(self, x):
+ if isinstance(x, str):
+ return "b'%s'" % (x,)
+ if isinstance(x, (list, tuple)):
+ rep = [self._to_py(item) for item in x]
+ if len(rep) == 1:
+ rep.append('')
+ return "(%s)" % (','.join(rep),)
+ return x.as_python_expr()  # Py2: unicode unexpected; Py3: bytes unexpected
+
+ def write_py_source_to_f(self, f):
+ self._f = f
+ prnt = self._prnt
+ #
+ # header
+ prnt("# auto-generated file")
+ prnt("import _cffi_backend")
+ #
+ # the 'import' of the included ffis
+ num_includes = len(self.ffi._included_ffis or ())
+ for i in range(num_includes):
+ ffi_to_include = self.ffi._included_ffis[i]
+ try:
+ included_module_name, included_source = (
+ ffi_to_include._assigned_source[:2])
+ except AttributeError:
+ raise VerificationError(
+ "ffi object %r includes %r, but the latter has not "
+ "been prepared with set_source()" % (
+ self.ffi, ffi_to_include,))
+ if included_source is not None:
+ raise VerificationError(
+ "not implemented yet: ffi.include() of a C-based "
+ "ffi inside a Python-based ffi")
+ prnt('from %s import ffi as _ffi%d' % (included_module_name, i))
+ prnt()
+ prnt("ffi = _cffi_backend.FFI('%s'," % (self.module_name,))
+ prnt(" _version = 0x%x," % (self._version,))
+ self._version = None
+ #
+ # the '_types' keyword argument
+ self.cffi_types = tuple(self.cffi_types) # don't change any more
+ types_lst = [op.as_python_bytes() for op in self.cffi_types]
+ prnt(' _types = %s,' % (self._to_py(''.join(types_lst)),))
+ typeindex2type = dict([(i, tp) for (tp, i) in self._typesdict.items()])
+ #
+ # the keyword arguments from ALL_STEPS
+ for step_name in self.ALL_STEPS:
+ lst = self._lsts[step_name]
+ if len(lst) > 0 and step_name != "field":
+ prnt(' _%ss = %s,' % (step_name, self._to_py(lst)))
+ #
+ # the '_includes' keyword argument
+ if num_includes > 0:
+ prnt(' _includes = (%s,),' % (
+ ', '.join(['_ffi%d' % i for i in range(num_includes)]),))
+ #
+ # the footer
+ prnt(')')
+
+ # ----------
+
+ def _gettypenum(self, type):
+ # a KeyError here is a bug. please report it! :-)
+ return self._typesdict[type]
+
+ def _convert_funcarg_to_c(self, tp, fromvar, tovar, errcode):
+ extraarg = ''
+ if isinstance(tp, model.BasePrimitiveType) and not tp.is_complex_type():
+ if tp.is_integer_type() and tp.name != '_Bool':
+ converter = '_cffi_to_c_int'
+ extraarg = ', %s' % tp.name
+ elif isinstance(tp, model.UnknownFloatType):
+ # don't check with is_float_type(): it may be a 'long
+ # double' here, and _cffi_to_c_double would lose precision
+ converter = '(%s)_cffi_to_c_double' % (tp.get_c_name(''),)
+ else:
+ cname = tp.get_c_name('')
+ converter = '(%s)_cffi_to_c_%s' % (cname,
+ tp.name.replace(' ', '_'))
+ if cname in ('char16_t', 'char32_t'):
+ self.needs_version(VERSION_CHAR16CHAR32)
+ errvalue = '-1'
+ #
+ elif isinstance(tp, model.PointerType):
+ self._convert_funcarg_to_c_ptr_or_array(tp, fromvar,
+ tovar, errcode)
+ return
+ #
+ elif (isinstance(tp, model.StructOrUnionOrEnum) or
+ isinstance(tp, model.BasePrimitiveType)):
+ # a struct (not a struct pointer) as a function argument;
+ # or, a complex (the same code works)
+ self._prnt(' if (_cffi_to_c((char *)&%s, _cffi_type(%d), %s) < 0)'
+ % (tovar, self._gettypenum(tp), fromvar))
+ self._prnt(' %s;' % errcode)
+ return
+ #
+ elif isinstance(tp, model.FunctionPtrType):
+ converter = '(%s)_cffi_to_c_pointer' % tp.get_c_name('')
+ extraarg = ', _cffi_type(%d)' % self._gettypenum(tp)
+ errvalue = 'NULL'
+ #
+ else:
+ raise NotImplementedError(tp)
+ #
+ self._prnt(' %s = %s(%s%s);' % (tovar, converter, fromvar, extraarg))
+ self._prnt(' if (%s == (%s)%s && PyErr_Occurred())' % (
+ tovar, tp.get_c_name(''), errvalue))
+ self._prnt(' %s;' % errcode)
+
+ def _extra_local_variables(self, tp, localvars, freelines):
+ if isinstance(tp, model.PointerType):
+ localvars.add('Py_ssize_t datasize')
+ localvars.add('struct _cffi_freeme_s *large_args_free = NULL')
+ freelines.add('if (large_args_free != NULL)'
+ ' _cffi_free_array_arguments(large_args_free);')
+
+ def _convert_funcarg_to_c_ptr_or_array(self, tp, fromvar, tovar, errcode):
+ self._prnt(' datasize = _cffi_prepare_pointer_call_argument(')
+ self._prnt(' _cffi_type(%d), %s, (char **)&%s);' % (
+ self._gettypenum(tp), fromvar, tovar))
+ self._prnt(' if (datasize != 0) {')
+ self._prnt(' %s = ((size_t)datasize) <= 640 ? '
+ '(%s)alloca((size_t)datasize) : NULL;' % (
+ tovar, tp.get_c_name('')))
+ self._prnt(' if (_cffi_convert_array_argument(_cffi_type(%d), %s, '
+ '(char **)&%s,' % (self._gettypenum(tp), fromvar, tovar))
+ self._prnt(' datasize, &large_args_free) < 0)')
+ self._prnt(' %s;' % errcode)
+ self._prnt(' }')
+
+ def _convert_expr_from_c(self, tp, var, context):
+ if isinstance(tp, model.BasePrimitiveType):
+ if tp.is_integer_type() and tp.name != '_Bool':
+ return '_cffi_from_c_int(%s, %s)' % (var, tp.name)
+ elif isinstance(tp, model.UnknownFloatType):
+ return '_cffi_from_c_double(%s)' % (var,)
+ elif tp.name != 'long double' and not tp.is_complex_type():
+ cname = tp.name.replace(' ', '_')
+ if cname in ('char16_t', 'char32_t'):
+ self.needs_version(VERSION_CHAR16CHAR32)
+ return '_cffi_from_c_%s(%s)' % (cname, var)
+ else:
+ return '_cffi_from_c_deref((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, (model.PointerType, model.FunctionPtrType)):
+ return '_cffi_from_c_pointer((char *)%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, model.ArrayType):
+ return '_cffi_from_c_pointer((char *)%s, _cffi_type(%d))' % (
+ var, self._gettypenum(model.PointerType(tp.item)))
+ elif isinstance(tp, model.StructOrUnion):
+ if tp.fldnames is None:
+ raise TypeError("'%s' is used as %s, but is opaque" % (
+ tp._get_c_name(), context))
+ return '_cffi_from_c_struct((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, model.EnumType):
+ return '_cffi_from_c_deref((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ else:
+ raise NotImplementedError(tp)
+
+ # ----------
+ # typedefs
+
+ def _typedef_type(self, tp, name):
+ return self._global_type(tp, "(*(%s *)0)" % (name,))
+
+ def _generate_cpy_typedef_collecttype(self, tp, name):
+ self._do_collect_type(self._typedef_type(tp, name))
+
+ def _generate_cpy_typedef_decl(self, tp, name):
+ pass
+
+ def _typedef_ctx(self, tp, name):
+ type_index = self._typesdict[tp]
+ self._lsts["typename"].append(TypenameExpr(name, type_index))
+
+ def _generate_cpy_typedef_ctx(self, tp, name):
+ tp = self._typedef_type(tp, name)
+ self._typedef_ctx(tp, name)
+ if getattr(tp, "origin", None) == "unknown_type":
+ self._struct_ctx(tp, tp.name, approxname=None)
+ elif isinstance(tp, model.NamedPointerType):
+ self._struct_ctx(tp.totype, tp.totype.name, approxname=tp.name,
+ named_ptr=tp)
+
+ # ----------
+ # function declarations
+
+ def _generate_cpy_function_collecttype(self, tp, name):
+ self._do_collect_type(tp.as_raw_function())
+ if tp.ellipsis and not self.target_is_python:
+ self._do_collect_type(tp)
+
+ def _generate_cpy_function_decl(self, tp, name):
+ assert not self.target_is_python
+ assert isinstance(tp, model.FunctionPtrType)
+ if tp.ellipsis:
+ # cannot support vararg functions better than this: check for its
+ # exact type (including the fixed arguments), and build it as a
+ # constant function pointer (no CPython wrapper)
+ self._generate_cpy_constant_decl(tp, name)
+ return
+ prnt = self._prnt
+ numargs = len(tp.args)
+ if numargs == 0:
+ argname = 'noarg'
+ elif numargs == 1:
+ argname = 'arg0'
+ else:
+ argname = 'args'
+ #
+ # ------------------------------
+ # the 'd' version of the function, only for addressof(lib, 'func')
+ arguments = []
+ call_arguments = []
+ context = 'argument of %s' % name
+ for i, type in enumerate(tp.args):
+ arguments.append(type.get_c_name(' x%d' % i, context))
+ call_arguments.append('x%d' % i)
+ repr_arguments = ', '.join(arguments)
+ repr_arguments = repr_arguments or 'void'
+ if tp.abi:
+ abi = tp.abi + ' '
+ else:
+ abi = ''
+ name_and_arguments = '%s_cffi_d_%s(%s)' % (abi, name, repr_arguments)
+ prnt('static %s' % (tp.result.get_c_name(name_and_arguments),))
+ prnt('{')
+ call_arguments = ', '.join(call_arguments)
+ result_code = 'return '
+ if isinstance(tp.result, model.VoidType):
+ result_code = ''
+ prnt(' %s%s(%s);' % (result_code, name, call_arguments))
+ prnt('}')
+ #
+ prnt('#ifndef PYPY_VERSION') # ------------------------------
+ #
+ prnt('static PyObject *')
+ prnt('_cffi_f_%s(PyObject *self, PyObject *%s)' % (name, argname))
+ prnt('{')
+ #
+ context = 'argument of %s' % name
+ for i, type in enumerate(tp.args):
+ arg = type.get_c_name(' x%d' % i, context)
+ prnt(' %s;' % arg)
+ #
+ localvars = set()
+ freelines = set()
+ for type in tp.args:
+ self._extra_local_variables(type, localvars, freelines)
+ for decl in sorted(localvars):
+ prnt(' %s;' % (decl,))
+ #
+ if not isinstance(tp.result, model.VoidType):
+ result_code = 'result = '
+ context = 'result of %s' % name
+ result_decl = ' %s;' % tp.result.get_c_name(' result', context)
+ prnt(result_decl)
+ prnt(' PyObject *pyresult;')
+ else:
+ result_decl = None
+ result_code = ''
+ #
+ if len(tp.args) > 1:
+ rng = range(len(tp.args))
+ for i in rng:
+ prnt(' PyObject *arg%d;' % i)
+ prnt()
+ prnt(' if (!PyArg_UnpackTuple(args, "%s", %d, %d, %s))' % (
+ name, len(rng), len(rng),
+ ', '.join(['&arg%d' % i for i in rng])))
+ prnt(' return NULL;')
+ prnt()
+ #
+ for i, type in enumerate(tp.args):
+ self._convert_funcarg_to_c(type, 'arg%d' % i, 'x%d' % i,
+ 'return NULL')
+ prnt()
+ #
+ prnt(' Py_BEGIN_ALLOW_THREADS')
+ prnt(' _cffi_restore_errno();')
+ call_arguments = ['x%d' % i for i in range(len(tp.args))]
+ call_arguments = ', '.join(call_arguments)
+ prnt(' { %s%s(%s); }' % (result_code, name, call_arguments))
+ prnt(' _cffi_save_errno();')
+ prnt(' Py_END_ALLOW_THREADS')
+ prnt()
+ #
+ prnt(' (void)self; /* unused */')
+ if numargs == 0:
+ prnt(' (void)noarg; /* unused */')
+ if result_code:
+ prnt(' pyresult = %s;' %
+ self._convert_expr_from_c(tp.result, 'result', 'result type'))
+ for freeline in freelines:
+ prnt(' ' + freeline)
+ prnt(' return pyresult;')
+ else:
+ for freeline in freelines:
+ prnt(' ' + freeline)
+ prnt(' Py_INCREF(Py_None);')
+ prnt(' return Py_None;')
+ prnt('}')
+ #
+ prnt('#else') # ------------------------------
+ #
+ # the PyPy version: need to replace struct/union arguments with
+ # pointers, and if the result is a struct/union, insert a first
+ # arg that is a pointer to the result. We also do that for
+ # complex args and return type.
+ def need_indirection(type):
+ return (isinstance(type, model.StructOrUnion) or
+ (isinstance(type, model.PrimitiveType) and
+ type.is_complex_type()))
+ difference = False
+ arguments = []
+ call_arguments = []
+ context = 'argument of %s' % name
+ for i, type in enumerate(tp.args):
+ indirection = ''
+ if need_indirection(type):
+ indirection = '*'
+ difference = True
+ arg = type.get_c_name(' %sx%d' % (indirection, i), context)
+ arguments.append(arg)
+ call_arguments.append('%sx%d' % (indirection, i))
+ tp_result = tp.result
+ if need_indirection(tp_result):
+ context = 'result of %s' % name
+ arg = tp_result.get_c_name(' *result', context)
+ arguments.insert(0, arg)
+ tp_result = model.void_type
+ result_decl = None
+ result_code = '*result = '
+ difference = True
+ if difference:
+ repr_arguments = ', '.join(arguments)
+ repr_arguments = repr_arguments or 'void'
+ name_and_arguments = '%s_cffi_f_%s(%s)' % (abi, name,
+ repr_arguments)
+ prnt('static %s' % (tp_result.get_c_name(name_and_arguments),))
+ prnt('{')
+ if result_decl:
+ prnt(result_decl)
+ call_arguments = ', '.join(call_arguments)
+ prnt(' { %s%s(%s); }' % (result_code, name, call_arguments))
+ if result_decl:
+ prnt(' return result;')
+ prnt('}')
+ else:
+ prnt('# define _cffi_f_%s _cffi_d_%s' % (name, name))
+ #
+ prnt('#endif') # ------------------------------
+ prnt()
+
+ def _generate_cpy_function_ctx(self, tp, name):
+ if tp.ellipsis and not self.target_is_python:
+ self._generate_cpy_constant_ctx(tp, name)
+ return
+ type_index = self._typesdict[tp.as_raw_function()]
+ numargs = len(tp.args)
+ if self.target_is_python:
+ meth_kind = OP_DLOPEN_FUNC
+ elif numargs == 0:
+ meth_kind = OP_CPYTHON_BLTN_N # 'METH_NOARGS'
+ elif numargs == 1:
+ meth_kind = OP_CPYTHON_BLTN_O # 'METH_O'
+ else:
+ meth_kind = OP_CPYTHON_BLTN_V # 'METH_VARARGS'
+ self._lsts["global"].append(
+ GlobalExpr(name, '_cffi_f_%s' % name,
+ CffiOp(meth_kind, type_index),
+ size='_cffi_d_%s' % name))
+
+ # ----------
+ # named structs or unions
+
+ def _field_type(self, tp_struct, field_name, tp_field):
+ if isinstance(tp_field, model.ArrayType):
+ actual_length = tp_field.length
+ if actual_length == '...':
+ ptr_struct_name = tp_struct.get_c_name('*')
+ actual_length = '_cffi_array_len(((%s)0)->%s)' % (
+ ptr_struct_name, field_name)
+ tp_item = self._field_type(tp_struct, '%s[0]' % field_name,
+ tp_field.item)
+ tp_field = model.ArrayType(tp_item, actual_length)
+ return tp_field
+
+ def _struct_collecttype(self, tp):
+ self._do_collect_type(tp)
+ if self.target_is_python:
+ # also requires nested anon struct/unions in ABI mode, recursively
+ for fldtype in tp.anonymous_struct_fields():
+ self._struct_collecttype(fldtype)
+
+ def _struct_decl(self, tp, cname, approxname):
+ if tp.fldtypes is None:
+ return
+ prnt = self._prnt
+ checkfuncname = '_cffi_checkfld_%s' % (approxname,)
+ prnt('_CFFI_UNUSED_FN')
+ prnt('static void %s(%s *p)' % (checkfuncname, cname))
+ prnt('{')
+ prnt(' /* only to generate compile-time warnings or errors */')
+ prnt(' (void)p;')
+ for fname, ftype, fbitsize, fqual in self._enum_fields(tp):
+ try:
+ if ftype.is_integer_type() or fbitsize >= 0:
+ # accept all integers, but complain on float or double
+ if fname != '':
+ prnt(" (void)((p->%s) | 0); /* check that '%s.%s' is "
+ "an integer */" % (fname, cname, fname))
+ continue
+ # only accept exactly the type declared, except that '[]'
+ # is interpreted as a '*' and so will match any array length.
+ # (It would also match '*', but that's harder to detect...)
+ while (isinstance(ftype, model.ArrayType)
+ and (ftype.length is None or ftype.length == '...')):
+ ftype = ftype.item
+ fname = fname + '[0]'
+ prnt(' { %s = &p->%s; (void)tmp; }' % (
+ ftype.get_c_name('*tmp', 'field %r'%fname, quals=fqual),
+ fname))
+ except VerificationError as e:
+ prnt(' /* %s */' % str(e)) # cannot verify it, ignore
+ prnt('}')
+ prnt('struct _cffi_align_%s { char x; %s y; };' % (approxname, cname))
+ prnt()
+
+ def _struct_ctx(self, tp, cname, approxname, named_ptr=None):
+ type_index = self._typesdict[tp]
+ reason_for_not_expanding = None
+ flags = []
+ if isinstance(tp, model.UnionType):
+ flags.append("_CFFI_F_UNION")
+ if tp.fldtypes is None:
+ flags.append("_CFFI_F_OPAQUE")
+ reason_for_not_expanding = "opaque"
+ if (tp not in self.ffi._parser._included_declarations and
+ (named_ptr is None or
+ named_ptr not in self.ffi._parser._included_declarations)):
+ if tp.fldtypes is None:
+ pass # opaque
+ elif tp.partial or any(tp.anonymous_struct_fields()):
+ pass # field layout obtained silently from the C compiler
+ else:
+ flags.append("_CFFI_F_CHECK_FIELDS")
+ if tp.packed:
+ if tp.packed > 1:
+ raise NotImplementedError(
+ "%r is declared with 'pack=%r'; only 0 or 1 are "
+ "supported in API mode (try to use \"...;\", which "
+ "does not require a 'pack' declaration)" %
+ (tp, tp.packed))
+ flags.append("_CFFI_F_PACKED")
+ else:
+ flags.append("_CFFI_F_EXTERNAL")
+ reason_for_not_expanding = "external"
+ flags = '|'.join(flags) or '0'
+ c_fields = []
+ if reason_for_not_expanding is None:
+ enumfields = list(self._enum_fields(tp))
+ for fldname, fldtype, fbitsize, fqual in enumfields:
+ fldtype = self._field_type(tp, fldname, fldtype)
+ self._check_not_opaque(fldtype,
+ "field '%s.%s'" % (tp.name, fldname))
+ # cname is None for _add_missing_struct_unions() only
+ op = OP_NOOP
+ if fbitsize >= 0:
+ op = OP_BITFIELD
+ size = '%d /* bits */' % fbitsize
+ elif cname is None or (
+ isinstance(fldtype, model.ArrayType) and
+ fldtype.length is None):
+ size = '(size_t)-1'
+ else:
+ size = 'sizeof(((%s)0)->%s)' % (
+ tp.get_c_name('*') if named_ptr is None
+ else named_ptr.name,
+ fldname)
+ if cname is None or fbitsize >= 0:
+ offset = '(size_t)-1'
+ elif named_ptr is not None:
+ offset = '((char *)&((%s)4096)->%s) - (char *)4096' % (
+ named_ptr.name, fldname)
+ else:
+ offset = 'offsetof(%s, %s)' % (tp.get_c_name(''), fldname)
+ c_fields.append(
+ FieldExpr(fldname, offset, size, fbitsize,
+ CffiOp(op, self._typesdict[fldtype])))
+ first_field_index = len(self._lsts["field"])
+ self._lsts["field"].extend(c_fields)
+ #
+ if cname is None: # unknown name, for _add_missing_struct_unions
+ size = '(size_t)-2'
+ align = -2
+ comment = "unnamed"
+ else:
+ if named_ptr is not None:
+ size = 'sizeof(*(%s)0)' % (named_ptr.name,)
+ align = '-1 /* unknown alignment */'
+ else:
+ size = 'sizeof(%s)' % (cname,)
+ align = 'offsetof(struct _cffi_align_%s, y)' % (approxname,)
+ comment = None
+ else:
+ size = '(size_t)-1'
+ align = -1
+ first_field_index = -1
+ comment = reason_for_not_expanding
+ self._lsts["struct_union"].append(
+ StructUnionExpr(tp.name, type_index, flags, size, align, comment,
+ first_field_index, c_fields))
+ self._seen_struct_unions.add(tp)
+
+ def _check_not_opaque(self, tp, location):
+ while isinstance(tp, model.ArrayType):
+ tp = tp.item
+ if isinstance(tp, model.StructOrUnion) and tp.fldtypes is None:
+ raise TypeError(
+ "%s is of an opaque type (not declared in cdef())" % location)
+
+ def _add_missing_struct_unions(self):
+ # not very nice, but some struct declarations might be missing
+ # because they don't have any known C name. Check that they are
+ # not partial (we can't complete or verify them!) and emit them
+ # anonymously.
+ lst = list(self._struct_unions.items())
+ lst.sort(key=lambda tp_order: tp_order[1])
+ for tp, order in lst:
+ if tp not in self._seen_struct_unions:
+ if tp.partial:
+ raise NotImplementedError("internal inconsistency: %r is "
+ "partial but was not seen at "
+ "this point" % (tp,))
+ if tp.name.startswith('$') and tp.name[1:].isdigit():
+ approxname = tp.name[1:]
+ elif tp.name == '_IO_FILE' and tp.forcename == 'FILE':
+ approxname = 'FILE'
+ self._typedef_ctx(tp, 'FILE')
+ else:
+ raise NotImplementedError("internal inconsistency: %r" %
+ (tp,))
+ self._struct_ctx(tp, None, approxname)
+
+ def _generate_cpy_struct_collecttype(self, tp, name):
+ self._struct_collecttype(tp)
+ _generate_cpy_union_collecttype = _generate_cpy_struct_collecttype
+
+ def _struct_names(self, tp):
+ cname = tp.get_c_name('')
+ if ' ' in cname:
+ return cname, cname.replace(' ', '_')
+ else:
+ return cname, '_' + cname
+
+ def _generate_cpy_struct_decl(self, tp, name):
+ self._struct_decl(tp, *self._struct_names(tp))
+ _generate_cpy_union_decl = _generate_cpy_struct_decl
+
+ def _generate_cpy_struct_ctx(self, tp, name):
+ self._struct_ctx(tp, *self._struct_names(tp))
+ _generate_cpy_union_ctx = _generate_cpy_struct_ctx
+
+ # ----------
+ # 'anonymous' declarations. These are produced for anonymous structs
+ # or unions; the 'name' is obtained by a typedef.
+
+ def _generate_cpy_anonymous_collecttype(self, tp, name):
+ if isinstance(tp, model.EnumType):
+ self._generate_cpy_enum_collecttype(tp, name)
+ else:
+ self._struct_collecttype(tp)
+
+ def _generate_cpy_anonymous_decl(self, tp, name):
+ if isinstance(tp, model.EnumType):
+ self._generate_cpy_enum_decl(tp)
+ else:
+ self._struct_decl(tp, name, 'typedef_' + name)
+
+ def _generate_cpy_anonymous_ctx(self, tp, name):
+ if isinstance(tp, model.EnumType):
+ self._enum_ctx(tp, name)
+ else:
+ self._struct_ctx(tp, name, 'typedef_' + name)
+
+ # ----------
+ # constants, declared with "static const ..."
+
+ def _generate_cpy_const(self, is_int, name, tp=None, category='const',
+ check_value=None):
+ if (category, name) in self._seen_constants:
+ raise VerificationError(
+ "duplicate declaration of %s '%s'" % (category, name))
+ self._seen_constants.add((category, name))
+ #
+ prnt = self._prnt
+ funcname = '_cffi_%s_%s' % (category, name)
+ if is_int:
+ prnt('static int %s(unsigned long long *o)' % funcname)
+ prnt('{')
+ prnt(' int n = (%s) <= 0;' % (name,))
+ prnt(' *o = (unsigned long long)((%s) | 0);'
+ ' /* check that %s is an integer */' % (name, name))
+ if check_value is not None:
+ if check_value > 0:
+ check_value = '%dU' % (check_value,)
+ prnt(' if (!_cffi_check_int(*o, n, %s))' % (check_value,))
+ prnt(' n |= 2;')
+ prnt(' return n;')
+ prnt('}')
+ else:
+ assert check_value is None
+ prnt('static void %s(char *o)' % funcname)
+ prnt('{')
+ prnt(' *(%s)o = %s;' % (tp.get_c_name('*'), name))
+ prnt('}')
+ prnt()
+
+ def _generate_cpy_constant_collecttype(self, tp, name):
+ is_int = tp.is_integer_type()
+ if not is_int or self.target_is_python:
+ self._do_collect_type(tp)
+
+ def _generate_cpy_constant_decl(self, tp, name):
+ is_int = tp.is_integer_type()
+ self._generate_cpy_const(is_int, name, tp)
+
+ def _generate_cpy_constant_ctx(self, tp, name):
+ if not self.target_is_python and tp.is_integer_type():
+ type_op = CffiOp(OP_CONSTANT_INT, -1)
+ else:
+ if self.target_is_python:
+ const_kind = OP_DLOPEN_CONST
+ else:
+ const_kind = OP_CONSTANT
+ type_index = self._typesdict[tp]
+ type_op = CffiOp(const_kind, type_index)
+ self._lsts["global"].append(
+ GlobalExpr(name, '_cffi_const_%s' % name, type_op))
+
+ # ----------
+ # enums
+
+ def _generate_cpy_enum_collecttype(self, tp, name):
+ self._do_collect_type(tp)
+
+ def _generate_cpy_enum_decl(self, tp, name=None):
+ for enumerator in tp.enumerators:
+ self._generate_cpy_const(True, enumerator)
+
+ def _enum_ctx(self, tp, cname):
+ type_index = self._typesdict[tp]
+ type_op = CffiOp(OP_ENUM, -1)
+ if self.target_is_python:
+ tp.check_not_partial()
+ for enumerator, enumvalue in zip(tp.enumerators, tp.enumvalues):
+ self._lsts["global"].append(
+ GlobalExpr(enumerator, '_cffi_const_%s' % enumerator, type_op,
+ check_value=enumvalue))
+ #
+ if cname is not None and '$' not in cname and not self.target_is_python:
+ size = "sizeof(%s)" % cname
+ signed = "((%s)-1) <= 0" % cname
+ else:
+ basetp = tp.build_baseinttype(self.ffi, [])
+ size = self.ffi.sizeof(basetp)
+ signed = int(int(self.ffi.cast(basetp, -1)) < 0)
+ allenums = ",".join(tp.enumerators)
+ self._lsts["enum"].append(
+ EnumExpr(tp.name, type_index, size, signed, allenums))
+
+ def _generate_cpy_enum_ctx(self, tp, name):
+ self._enum_ctx(tp, tp._get_c_name())
+
+ # ----------
+ # macros: for now only for integers
+
+ def _generate_cpy_macro_collecttype(self, tp, name):
+ pass
+
+ def _generate_cpy_macro_decl(self, tp, name):
+ if tp == '...':
+ check_value = None
+ else:
+ check_value = tp # an integer
+ self._generate_cpy_const(True, name, check_value=check_value)
+
+ def _generate_cpy_macro_ctx(self, tp, name):
+ if tp == '...':
+ if self.target_is_python:
+ raise VerificationError(
+ "cannot use the syntax '...' in '#define %s ...' when "
+ "using the ABI mode" % (name,))
+ check_value = None
+ else:
+ check_value = tp # an integer
+ type_op = CffiOp(OP_CONSTANT_INT, -1)
+ self._lsts["global"].append(
+ GlobalExpr(name, '_cffi_const_%s' % name, type_op,
+ check_value=check_value))
+
+ # ----------
+ # global variables
+
+ def _global_type(self, tp, global_name):
+ if isinstance(tp, model.ArrayType):
+ actual_length = tp.length
+ if actual_length == '...':
+ actual_length = '_cffi_array_len(%s)' % (global_name,)
+ tp_item = self._global_type(tp.item, '%s[0]' % global_name)
+ tp = model.ArrayType(tp_item, actual_length)
+ return tp
+
+ def _generate_cpy_variable_collecttype(self, tp, name):
+ self._do_collect_type(self._global_type(tp, name))
+
+ def _generate_cpy_variable_decl(self, tp, name):
+ prnt = self._prnt
+ tp = self._global_type(tp, name)
+ if isinstance(tp, model.ArrayType) and tp.length is None:
+ tp = tp.item
+ ampersand = ''
+ else:
+ ampersand = '&'
+ # This code assumes that a cast from "tp *" to "void *" is a
+ # no-op, i.e. a function that returns a "tp *" can be called
+ # as if it returned a "void *". This should be generally true
+ # on any modern machine. The only exception to that rule (on
+ # uncommon architectures, and as far as I can tell) might be
+ # if 'tp' were a function type, but that is not possible here.
+ # (If 'tp' is a function _pointer_ type, then casts from "fn_t
+ # **" to "void *" are again no-ops, as far as I can tell.)
+ decl = '*_cffi_var_%s(void)' % (name,)
+ prnt('static ' + tp.get_c_name(decl, quals=self._current_quals))
+ prnt('{')
+ prnt(' return %s(%s);' % (ampersand, name))
+ prnt('}')
+ prnt()
+
+ def _generate_cpy_variable_ctx(self, tp, name):
+ tp = self._global_type(tp, name)
+ type_index = self._typesdict[tp]
+ if self.target_is_python:
+ op = OP_GLOBAL_VAR
+ else:
+ op = OP_GLOBAL_VAR_F
+ self._lsts["global"].append(
+ GlobalExpr(name, '_cffi_var_%s' % name, CffiOp(op, type_index)))
+
+ # ----------
+ # extern "Python"
+
+ def _generate_cpy_extern_python_collecttype(self, tp, name):
+ assert isinstance(tp, model.FunctionPtrType)
+ self._do_collect_type(tp)
+ _generate_cpy_dllexport_python_collecttype = \
+ _generate_cpy_extern_python_plus_c_collecttype = \
+ _generate_cpy_extern_python_collecttype
+
+ def _extern_python_decl(self, tp, name, tag_and_space):
+ prnt = self._prnt
+ if isinstance(tp.result, model.VoidType):
+ size_of_result = '0'
+ else:
+ context = 'result of %s' % name
+ size_of_result = '(int)sizeof(%s)' % (
+ tp.result.get_c_name('', context),)
+ prnt('static struct _cffi_externpy_s _cffi_externpy__%s =' % name)
+ prnt(' { "%s.%s", %s, 0, 0 };' % (
+ self.module_name, name, size_of_result))
+ prnt()
+ #
+ arguments = []
+ context = 'argument of %s' % name
+ for i, type in enumerate(tp.args):
+ arg = type.get_c_name(' a%d' % i, context)
+ arguments.append(arg)
+ #
+ repr_arguments = ', '.join(arguments)
+ repr_arguments = repr_arguments or 'void'
+ name_and_arguments = '%s(%s)' % (name, repr_arguments)
+ if tp.abi == "__stdcall":
+ name_and_arguments = '_cffi_stdcall ' + name_and_arguments
+ #
+ def may_need_128_bits(tp):
+ return (isinstance(tp, model.PrimitiveType) and
+ tp.name == 'long double')
+ #
+ size_of_a = max(len(tp.args)*8, 8)
+ if may_need_128_bits(tp.result):
+ size_of_a = max(size_of_a, 16)
+ if isinstance(tp.result, model.StructOrUnion):
+ size_of_a = 'sizeof(%s) > %d ? sizeof(%s) : %d' % (
+ tp.result.get_c_name(''), size_of_a,
+ tp.result.get_c_name(''), size_of_a)
+ prnt('%s%s' % (tag_and_space, tp.result.get_c_name(name_and_arguments)))
+ prnt('{')
+ prnt(' char a[%s];' % size_of_a)
+ prnt(' char *p = a;')
+ for i, type in enumerate(tp.args):
+ arg = 'a%d' % i
+ if (isinstance(type, model.StructOrUnion) or
+ may_need_128_bits(type)):
+ arg = '&' + arg
+ type = model.PointerType(type)
+ prnt(' *(%s)(p + %d) = %s;' % (type.get_c_name('*'), i*8, arg))
+ prnt(' _cffi_call_python(&_cffi_externpy__%s, p);' % name)
+ if not isinstance(tp.result, model.VoidType):
+ prnt(' return *(%s)p;' % (tp.result.get_c_name('*'),))
+ prnt('}')
+ prnt()
+ self._num_externpy += 1
+
+ def _generate_cpy_extern_python_decl(self, tp, name):
+ self._extern_python_decl(tp, name, 'static ')
+
+ def _generate_cpy_dllexport_python_decl(self, tp, name):
+ self._extern_python_decl(tp, name, 'CFFI_DLLEXPORT ')
+
+ def _generate_cpy_extern_python_plus_c_decl(self, tp, name):
+ self._extern_python_decl(tp, name, '')
+
+ def _generate_cpy_extern_python_ctx(self, tp, name):
+ if self.target_is_python:
+ raise VerificationError(
+ "cannot use 'extern \"Python\"' in the ABI mode")
+ if tp.ellipsis:
+ raise NotImplementedError("a vararg function is extern \"Python\"")
+ type_index = self._typesdict[tp]
+ type_op = CffiOp(OP_EXTERN_PYTHON, type_index)
+ self._lsts["global"].append(
+ GlobalExpr(name, '&_cffi_externpy__%s' % name, type_op, name))
+
+ _generate_cpy_dllexport_python_ctx = \
+ _generate_cpy_extern_python_plus_c_ctx = \
+ _generate_cpy_extern_python_ctx
+
+ def _print_string_literal_in_array(self, s):
+ prnt = self._prnt
+ prnt('// # NB. this is not a string because of a size limit in MSVC')
+ if not isinstance(s, bytes): # unicode
+ s = s.encode('utf-8') # -> bytes
+ else:
+ s.decode('utf-8') # got bytes, check for valid utf-8
+ try:
+ s.decode('ascii')
+ except UnicodeDecodeError:
+ s = b'# -*- encoding: utf8 -*-\n' + s
+ for line in s.splitlines(True):
+ comment = line
+ if type('//') is bytes: # python2
+ line = map(ord, line) # make a list of integers
+ else: # python3
+ # type(line) is bytes, which enumerates like a list of integers
+ comment = ascii(comment)[1:-1]
+ prnt(('// ' + comment).rstrip())
+ printed_line = ''
+ for c in line:
+ if len(printed_line) >= 76:
+ prnt(printed_line)
+ printed_line = ''
+ printed_line += '%d,' % (c,)
+ prnt(printed_line)
+
+ # ----------
+ # emitting the opcodes for individual types
+
+ def _emit_bytecode_VoidType(self, tp, index):
+ self.cffi_types[index] = CffiOp(OP_PRIMITIVE, PRIM_VOID)
+
+ def _emit_bytecode_PrimitiveType(self, tp, index):
+ prim_index = PRIMITIVE_TO_INDEX[tp.name]
+ self.cffi_types[index] = CffiOp(OP_PRIMITIVE, prim_index)
+
+ def _emit_bytecode_UnknownIntegerType(self, tp, index):
+ s = ('_cffi_prim_int(sizeof(%s), (\n'
+ ' ((%s)-1) | 0 /* check that %s is an integer type */\n'
+ ' ) <= 0)' % (tp.name, tp.name, tp.name))
+ self.cffi_types[index] = CffiOp(OP_PRIMITIVE, s)
+
+ def _emit_bytecode_UnknownFloatType(self, tp, index):
+ s = ('_cffi_prim_float(sizeof(%s) *\n'
+ ' (((%s)1) / 2) * 2 /* integer => 0, float => 1 */\n'
+ ' )' % (tp.name, tp.name))
+ self.cffi_types[index] = CffiOp(OP_PRIMITIVE, s)
+
+ def _emit_bytecode_RawFunctionType(self, tp, index):
+ self.cffi_types[index] = CffiOp(OP_FUNCTION, self._typesdict[tp.result])
+ index += 1
+ for tp1 in tp.args:
+ realindex = self._typesdict[tp1]
+ if index != realindex:
+ if isinstance(tp1, model.PrimitiveType):
+ self._emit_bytecode_PrimitiveType(tp1, index)
+ else:
+ self.cffi_types[index] = CffiOp(OP_NOOP, realindex)
+ index += 1
+ flags = int(tp.ellipsis)
+ if tp.abi is not None:
+ if tp.abi == '__stdcall':
+ flags |= 2
+ else:
+ raise NotImplementedError("abi=%r" % (tp.abi,))
+ self.cffi_types[index] = CffiOp(OP_FUNCTION_END, flags)
+
+ def _emit_bytecode_PointerType(self, tp, index):
+ self.cffi_types[index] = CffiOp(OP_POINTER, self._typesdict[tp.totype])
+
+ _emit_bytecode_ConstPointerType = _emit_bytecode_PointerType
+ _emit_bytecode_NamedPointerType = _emit_bytecode_PointerType
+
+ def _emit_bytecode_FunctionPtrType(self, tp, index):
+ raw = tp.as_raw_function()
+ self.cffi_types[index] = CffiOp(OP_POINTER, self._typesdict[raw])
+
+ def _emit_bytecode_ArrayType(self, tp, index):
+ item_index = self._typesdict[tp.item]
+ if tp.length is None:
+ self.cffi_types[index] = CffiOp(OP_OPEN_ARRAY, item_index)
+ elif tp.length == '...':
+ raise VerificationError(
+ "type %s badly placed: the '...' array length can only be "
+ "used on global arrays or on fields of structures" % (
+ str(tp).replace('/*...*/', '...'),))
+ else:
+ assert self.cffi_types[index + 1] == 'LEN'
+ self.cffi_types[index] = CffiOp(OP_ARRAY, item_index)
+ self.cffi_types[index + 1] = CffiOp(None, str(tp.length))
+
+ def _emit_bytecode_StructType(self, tp, index):
+ struct_index = self._struct_unions[tp]
+ self.cffi_types[index] = CffiOp(OP_STRUCT_UNION, struct_index)
+ _emit_bytecode_UnionType = _emit_bytecode_StructType
+
+ def _emit_bytecode_EnumType(self, tp, index):
+ enum_index = self._enums[tp]
+ self.cffi_types[index] = CffiOp(OP_ENUM, enum_index)
+
+
+if sys.version_info >= (3,):
+ NativeIO = io.StringIO
+else:
+ class NativeIO(io.BytesIO):
+ def write(self, s):
+ if isinstance(s, unicode):
+ s = s.encode('ascii')
+ super(NativeIO, self).write(s)
+
+def _is_file_like(maybefile):
+ # compare to xml.etree.ElementTree._get_writer
+ return hasattr(maybefile, 'write')
+
+def _make_c_or_py_source(ffi, module_name, preamble, target_file, verbose):
+ if verbose:
+ print("generating %s" % (target_file,))
+ recompiler = Recompiler(ffi, module_name,
+ target_is_python=(preamble is None))
+ recompiler.collect_type_table()
+ recompiler.collect_step_tables()
+ if _is_file_like(target_file):
+ recompiler.write_source_to_f(target_file, preamble)
+ return True
+ f = NativeIO()
+ recompiler.write_source_to_f(f, preamble)
+ output = f.getvalue()
+ try:
+ with open(target_file, 'r') as f1:
+ if f1.read(len(output) + 1) != output:
+ raise IOError
+ if verbose:
+ print("(already up-to-date)")
+ return False # already up-to-date
+ except IOError:
+ tmp_file = '%s.~%d' % (target_file, os.getpid())
+ with open(tmp_file, 'w') as f1:
+ f1.write(output)
+ try:
+ os.rename(tmp_file, target_file)
+ except OSError:
+ os.unlink(target_file)
+ os.rename(tmp_file, target_file)
+ return True
+
+def make_c_source(ffi, module_name, preamble, target_c_file, verbose=False):
+ assert preamble is not None
+ return _make_c_or_py_source(ffi, module_name, preamble, target_c_file,
+ verbose)
+
+def make_py_source(ffi, module_name, target_py_file, verbose=False):
+ return _make_c_or_py_source(ffi, module_name, None, target_py_file,
+ verbose)
+
+def _modname_to_file(outputdir, modname, extension):
+ parts = modname.split('.')
+ try:
+ os.makedirs(os.path.join(outputdir, *parts[:-1]))
+ except OSError:
+ pass
+ parts[-1] += extension
+ return os.path.join(outputdir, *parts), parts
+
+
+# Aaargh. Distutils is not tested at all for the purpose of compiling
+# DLLs that are not extension modules. Here are some hacks to work
+# around that, in the _patch_for_*() functions...
+
+def _patch_meth(patchlist, cls, name, new_meth):
+ old = getattr(cls, name)
+ patchlist.append((cls, name, old))
+ setattr(cls, name, new_meth)
+ return old
+
+def _unpatch_meths(patchlist):
+ for cls, name, old_meth in reversed(patchlist):
+ setattr(cls, name, old_meth)
+
+def _patch_for_embedding(patchlist):
+ if sys.platform == 'win32':
+ # we must not remove the manifest when building for embedding!
+ # FUTURE: this module was removed in setuptools 74; this is likely dead code and should be removed,
+ # since the toolchain it supports (VS2005-2008) is also long dead.
+ from cffi._shimmed_dist_utils import MSVCCompiler
+ if MSVCCompiler is not None:
+ _patch_meth(patchlist, MSVCCompiler, '_remove_visual_c_ref',
+ lambda self, manifest_file: manifest_file)
+
+ if sys.platform == 'darwin':
+ # we must not make a '-bundle', but a '-dynamiclib' instead
+ from cffi._shimmed_dist_utils import CCompiler
+ def my_link_shared_object(self, *args, **kwds):
+ if '-bundle' in self.linker_so:
+ self.linker_so = list(self.linker_so)
+ i = self.linker_so.index('-bundle')
+ self.linker_so[i] = '-dynamiclib'
+ return old_link_shared_object(self, *args, **kwds)
+ old_link_shared_object = _patch_meth(patchlist, CCompiler,
+ 'link_shared_object',
+ my_link_shared_object)
+
+def _patch_for_target(patchlist, target):
+ from cffi._shimmed_dist_utils import build_ext
+ # if 'target' is different from '*', we need to patch some internal
+ # method to just return this 'target' value, instead of having it
+ # built from module_name
+ if target.endswith('.*'):
+ target = target[:-2]
+ if sys.platform == 'win32':
+ target += '.dll'
+ elif sys.platform == 'darwin':
+ target += '.dylib'
+ else:
+ target += '.so'
+ _patch_meth(patchlist, build_ext, 'get_ext_filename',
+ lambda self, ext_name: target)
+
+
+def recompile(ffi, module_name, preamble, tmpdir='.', call_c_compiler=True,
+ c_file=None, source_extension='.c', extradir=None,
+ compiler_verbose=1, target=None, debug=None,
+ uses_ffiplatform=True, **kwds):
+ if not isinstance(module_name, str):
+ module_name = module_name.encode('ascii')
+ if ffi._windows_unicode:
+ ffi._apply_windows_unicode(kwds)
+ if preamble is not None:
+ if call_c_compiler and _is_file_like(c_file):
+ raise TypeError("Writing to file-like objects is not supported "
+ "with call_c_compiler=True")
+ embedding = (ffi._embedding is not None)
+ if embedding:
+ ffi._apply_embedding_fix(kwds)
+ if c_file is None:
+ c_file, parts = _modname_to_file(tmpdir, module_name,
+ source_extension)
+ if extradir:
+ parts = [extradir] + parts
+ ext_c_file = os.path.join(*parts)
+ else:
+ ext_c_file = c_file
+ #
+ if target is None:
+ if embedding:
+ target = '%s.*' % module_name
+ else:
+ target = '*'
+ #
+ if uses_ffiplatform:
+ ext = ffiplatform.get_extension(ext_c_file, module_name, **kwds)
+ else:
+ ext = None
+ updated = make_c_source(ffi, module_name, preamble, c_file,
+ verbose=compiler_verbose)
+ if call_c_compiler:
+ patchlist = []
+ cwd = os.getcwd()
+ try:
+ if embedding:
+ _patch_for_embedding(patchlist)
+ if target != '*':
+ _patch_for_target(patchlist, target)
+ if compiler_verbose:
+ if tmpdir == '.':
+ msg = 'the current directory is'
+ else:
+ msg = 'setting the current directory to'
+ print('%s %r' % (msg, os.path.abspath(tmpdir)))
+ os.chdir(tmpdir)
+ outputfilename = ffiplatform.compile('.', ext,
+ compiler_verbose, debug)
+ finally:
+ os.chdir(cwd)
+ _unpatch_meths(patchlist)
+ return outputfilename
+ else:
+ return ext, updated
+ else:
+ if c_file is None:
+ c_file, _ = _modname_to_file(tmpdir, module_name, '.py')
+ updated = make_py_source(ffi, module_name, c_file,
+ verbose=compiler_verbose)
+ if call_c_compiler:
+ return c_file
+ else:
+ return None, updated
+
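The up-to-date check and atomic rewrite in `_make_c_or_py_source()` above can be sketched in isolation. This is a minimal standalone version of the same pattern; the function name `write_if_changed` is hypothetical, and POSIX `os.rename` semantics (atomic replace of an existing target) are assumed:

```python
import os

def write_if_changed(target, output):
    """Sketch of the pattern in _make_c_or_py_source(): leave 'target'
    untouched if it already holds 'output', otherwise replace it
    atomically via a temp file."""
    try:
        with open(target, 'r') as f:
            # read one extra byte so a longer existing file also mismatches
            if f.read(len(output) + 1) == output:
                return False          # already up-to-date
    except IOError:                   # missing file: fall through and write
        pass
    tmp_file = '%s.~%d' % (target, os.getpid())
    with open(tmp_file, 'w') as f:
        f.write(output)
    os.rename(tmp_file, target)       # atomic replace on POSIX
    return True
```

Rewriting only when the content changed keeps the generated file's mtime stable, so a later `build_ext` step can skip recompiling an unchanged C source.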
diff --git a/templates/skills/file_manager/dependencies/cffi/setuptools_ext.py b/templates/skills/file_manager/dependencies/cffi/setuptools_ext.py
new file mode 100644
index 00000000..681b49d7
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/setuptools_ext.py
@@ -0,0 +1,216 @@
+import os
+import sys
+
+try:
+ basestring
+except NameError:
+ # Python 3.x
+ basestring = str
+
+def error(msg):
+ from cffi._shimmed_dist_utils import DistutilsSetupError
+ raise DistutilsSetupError(msg)
+
+
+def execfile(filename, glob):
+ # We use execfile() (here rewritten for Python 3) instead of
+ # __import__() to load the build script. The problem with
+ # a normal import is that in some packages, the intermediate
+ # __init__.py files may already try to import the file that
+ # we are generating.
+ with open(filename) as f:
+ src = f.read()
+ src += '\n' # Python 2.6 compatibility
+ code = compile(src, filename, 'exec')
+ exec(code, glob, glob)
+
+
+def add_cffi_module(dist, mod_spec):
+ from cffi.api import FFI
+
+ if not isinstance(mod_spec, basestring):
+ error("argument to 'cffi_modules=...' must be a str or a list of str,"
+ " not %r" % (type(mod_spec).__name__,))
+ mod_spec = str(mod_spec)
+ try:
+ build_file_name, ffi_var_name = mod_spec.split(':')
+ except ValueError:
+ error("%r must be of the form 'path/build.py:ffi_variable'" %
+ (mod_spec,))
+ if not os.path.exists(build_file_name):
+ ext = ''
+ rewritten = build_file_name.replace('.', '/') + '.py'
+ if os.path.exists(rewritten):
+ ext = ' (rewrite cffi_modules to [%r])' % (
+ rewritten + ':' + ffi_var_name,)
+ error("%r does not name an existing file%s" % (build_file_name, ext))
+
+ mod_vars = {'__name__': '__cffi__', '__file__': build_file_name}
+ execfile(build_file_name, mod_vars)
+
+ try:
+ ffi = mod_vars[ffi_var_name]
+ except KeyError:
+ error("%r: object %r not found in module" % (mod_spec,
+ ffi_var_name))
+ if not isinstance(ffi, FFI):
+ ffi = ffi() # maybe it's a function instead of directly an ffi
+ if not isinstance(ffi, FFI):
+ error("%r is not an FFI instance (got %r)" % (mod_spec,
+ type(ffi).__name__))
+ if not hasattr(ffi, '_assigned_source'):
+ error("%r: the set_source() method was not called" % (mod_spec,))
+ module_name, source, source_extension, kwds = ffi._assigned_source
+ if ffi._windows_unicode:
+ kwds = kwds.copy()
+ ffi._apply_windows_unicode(kwds)
+
+ if source is None:
+ _add_py_module(dist, ffi, module_name)
+ else:
+ _add_c_module(dist, ffi, module_name, source, source_extension, kwds)
+
+def _set_py_limited_api(Extension, kwds):
+ """
+ Add py_limited_api to kwds if setuptools >= 26 is in use.
+ Do not alter the setting if it already exists.
+ Setuptools takes care of ignoring the flag on Python 2 and PyPy.
+
+ CPython itself should ignore the flag in a debugging version
+ (by not listing .abi3.so in the extensions it supports), but
+ it doesn't so far, creating troubles. That's why we check
+ for "not hasattr(sys, 'gettotalrefcount')" (the 2.7 compatible equivalent
+ of 'd' not in sys.abiflags). (http://bugs.python.org/issue28401)
+
+ On Windows, with CPython <= 3.4, it's better not to use py_limited_api
+ because virtualenv *still* doesn't copy PYTHON3.DLL on these versions.
+ Recently (2020) we started shipping only >= 3.5 wheels, though. So
+ we'll give it another try and set py_limited_api on Windows >= 3.5.
+ """
+ from cffi import recompiler
+
+ if ('py_limited_api' not in kwds and not hasattr(sys, 'gettotalrefcount')
+ and recompiler.USE_LIMITED_API):
+ import setuptools
+ try:
+ setuptools_major_version = int(setuptools.__version__.partition('.')[0])
+ if setuptools_major_version >= 26:
+ kwds['py_limited_api'] = True
+ except ValueError: # certain development versions of setuptools
+ # If we don't know the version number of setuptools, we
+ # try to set 'py_limited_api' anyway. At worst, we get a
+ # warning.
+ kwds['py_limited_api'] = True
+ return kwds
+
+def _add_c_module(dist, ffi, module_name, source, source_extension, kwds):
+ # We are a setuptools extension. Need this build_ext for py_limited_api.
+ from setuptools.command.build_ext import build_ext
+ from cffi._shimmed_dist_utils import Extension, log, mkpath
+ from cffi import recompiler
+
+ allsources = ['$PLACEHOLDER']
+ allsources.extend(kwds.pop('sources', []))
+ kwds = _set_py_limited_api(Extension, kwds)
+ ext = Extension(name=module_name, sources=allsources, **kwds)
+
+ def make_mod(tmpdir, pre_run=None):
+ c_file = os.path.join(tmpdir, module_name + source_extension)
+ log.info("generating cffi module %r" % c_file)
+ mkpath(tmpdir)
+ # a setuptools-only, API-only hook: called with the "ext" and "ffi"
+ # arguments just before we turn the ffi into C code. To use it,
+ # subclass the 'distutils.command.build_ext.build_ext' class and
+ # add a method 'def pre_run(self, ext, ffi)'.
+ if pre_run is not None:
+ pre_run(ext, ffi)
+ updated = recompiler.make_c_source(ffi, module_name, source, c_file)
+ if not updated:
+ log.info("already up-to-date")
+ return c_file
+
+ if dist.ext_modules is None:
+ dist.ext_modules = []
+ dist.ext_modules.append(ext)
+
+ base_class = dist.cmdclass.get('build_ext', build_ext)
+ class build_ext_make_mod(base_class):
+ def run(self):
+ if ext.sources[0] == '$PLACEHOLDER':
+ pre_run = getattr(self, 'pre_run', None)
+ ext.sources[0] = make_mod(self.build_temp, pre_run)
+ base_class.run(self)
+ dist.cmdclass['build_ext'] = build_ext_make_mod
+ # NB. multiple runs here will create multiple 'build_ext_make_mod'
+ # classes. Even in this case the 'build_ext' command should be
+ # run once; but just in case, the logic above does nothing if
+ # called again.
+
+
+def _add_py_module(dist, ffi, module_name):
+ from setuptools.command.build_py import build_py
+ from setuptools.command.build_ext import build_ext
+ from cffi._shimmed_dist_utils import log, mkpath
+ from cffi import recompiler
+
+ def generate_mod(py_file):
+ log.info("generating cffi module %r" % py_file)
+ mkpath(os.path.dirname(py_file))
+ updated = recompiler.make_py_source(ffi, module_name, py_file)
+ if not updated:
+ log.info("already up-to-date")
+
+ base_class = dist.cmdclass.get('build_py', build_py)
+ class build_py_make_mod(base_class):
+ def run(self):
+ base_class.run(self)
+ module_path = module_name.split('.')
+ module_path[-1] += '.py'
+ generate_mod(os.path.join(self.build_lib, *module_path))
+ def get_source_files(self):
+ # This is called from 'setup.py sdist' only. Exclude
+ # the generated .py module in this case.
+ saved_py_modules = self.py_modules
+ try:
+ if saved_py_modules:
+ self.py_modules = [m for m in saved_py_modules
+ if m != module_name]
+ return base_class.get_source_files(self)
+ finally:
+ self.py_modules = saved_py_modules
+ dist.cmdclass['build_py'] = build_py_make_mod
+
+ # distutils and setuptools have no notion I could find of a
+ # generated python module. If we don't add module_name to
+ # dist.py_modules, then things mostly work but there are some
+ # combination of options (--root and --record) that will miss
+ # the module. So we add it here, which gives a few apparently
+ # harmless warnings about not finding the file outside the
+ # build directory.
+ # Then we need to hack more in get_source_files(); see above.
+ if dist.py_modules is None:
+ dist.py_modules = []
+ dist.py_modules.append(module_name)
+
+ # the following is only for "build_ext -i"
+ base_class_2 = dist.cmdclass.get('build_ext', build_ext)
+ class build_ext_make_mod(base_class_2):
+ def run(self):
+ base_class_2.run(self)
+ if self.inplace:
+ # from get_ext_fullpath() in distutils/command/build_ext.py
+ module_path = module_name.split('.')
+ package = '.'.join(module_path[:-1])
+ build_py = self.get_finalized_command('build_py')
+ package_dir = build_py.get_package_dir(package)
+ file_name = module_path[-1] + '.py'
+ generate_mod(os.path.join(package_dir, file_name))
+ dist.cmdclass['build_ext'] = build_ext_make_mod
+
+def cffi_modules(dist, attr, value):
+ assert attr == 'cffi_modules'
+ if isinstance(value, basestring):
+ value = [value]
+
+ for cffi_module in value:
+ add_cffi_module(dist, cffi_module)
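The core of `add_cffi_module()` above — splitting the `'path/build.py:ffi_variable'` spec, exec'ing the build script, and pulling out the ffi object — can be sketched standalone. The helper name `load_ffi_from_spec` is hypothetical and error handling is trimmed:

```python
def load_ffi_from_spec(mod_spec):
    # spec has the form 'path/build.py:ffi_variable', as accepted
    # by the cffi_modules=[...] setup keyword
    build_file_name, ffi_var_name = mod_spec.split(':')
    # exec the script rather than importing it, so intermediate
    # package __init__.py files cannot accidentally import the
    # very module that is about to be generated
    mod_vars = {'__name__': '__cffi__', '__file__': build_file_name}
    with open(build_file_name) as f:
        code = compile(f.read() + '\n', build_file_name, 'exec')
    exec(code, mod_vars, mod_vars)
    return mod_vars[ffi_var_name]
```

`add_cffi_module()` then additionally verifies that the result is an `FFI` instance (calling it first if it turns out to be a factory function) and that `set_source()` was applied to it.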
diff --git a/templates/skills/file_manager/dependencies/cffi/vengine_cpy.py b/templates/skills/file_manager/dependencies/cffi/vengine_cpy.py
new file mode 100644
index 00000000..eb0b6f70
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/vengine_cpy.py
@@ -0,0 +1,1084 @@
+#
+# DEPRECATED: implementation for ffi.verify()
+#
+import sys
+from . import model
+from .error import VerificationError
+from . import _imp_emulation as imp
+
+
+class VCPythonEngine(object):
+ _class_key = 'x'
+ _gen_python_module = True
+
+ def __init__(self, verifier):
+ self.verifier = verifier
+ self.ffi = verifier.ffi
+ self._struct_pending_verification = {}
+ self._types_of_builtin_functions = {}
+
+ def patch_extension_kwds(self, kwds):
+ pass
+
+ def find_module(self, module_name, path, so_suffixes):
+ try:
+ f, filename, descr = imp.find_module(module_name, path)
+ except ImportError:
+ return None
+ if f is not None:
+ f.close()
+ # Note that after a setuptools installation, there are both .py
+ # and .so files with the same basename. The code here relies on
+ # imp.find_module() locating the .so in priority.
+ if descr[0] not in so_suffixes:
+ return None
+ return filename
+
+ def collect_types(self):
+ self._typesdict = {}
+ self._generate("collecttype")
+
+ def _prnt(self, what=''):
+ self._f.write(what + '\n')
+
+ def _gettypenum(self, type):
+ # a KeyError here is a bug. please report it! :-)
+ return self._typesdict[type]
+
+ def _do_collect_type(self, tp):
+ if ((not isinstance(tp, model.PrimitiveType)
+ or tp.name == 'long double')
+ and tp not in self._typesdict):
+ num = len(self._typesdict)
+ self._typesdict[tp] = num
+
+ def write_source_to_f(self):
+ self.collect_types()
+ #
+ # The new module will have a _cffi_setup() function that receives
+ # objects from the ffi world, and that calls some setup code in
+ # the module. This setup code is split in several independent
+ # functions, e.g. one per constant. The functions are "chained"
+ # by ending in a tail call to each other.
+ #
+ # This is further split in two chained lists, depending on if we
+ # can do it at import-time or if we must wait for _cffi_setup() to
+ # provide us with the objects. This is needed because we
+ # need the values of the enum constants in order to build the
+ # <ctype 'enum xxx'> that we may have to pass to _cffi_setup().
+ #
+ # The following two 'chained_list_constants' items contain
+ # the head of these two chained lists, as a string that gives the
+ # call to do, if any.
+ self._chained_list_constants = ['((void)lib,0)', '((void)lib,0)']
+ #
+ prnt = self._prnt
+ # first paste some standard set of lines that are mostly '#define'
+ prnt(cffimod_header)
+ prnt()
+ # then paste the C source given by the user, verbatim.
+ prnt(self.verifier.preamble)
+ prnt()
+ #
+ # call generate_cpy_xxx_decl(), for every xxx found from
+ # ffi._parser._declarations. This generates all the functions.
+ self._generate("decl")
+ #
+ # implement the function _cffi_setup_custom() as calling the
+ # head of the chained list.
+ self._generate_setup_custom()
+ prnt()
+ #
+ # produce the method table, including the entries for the
+ # generated Python->C function wrappers, which are done
+ # by generate_cpy_function_method().
+ prnt('static PyMethodDef _cffi_methods[] = {')
+ self._generate("method")
+ prnt(' {"_cffi_setup", _cffi_setup, METH_VARARGS, NULL},')
+ prnt(' {NULL, NULL, 0, NULL} /* Sentinel */')
+ prnt('};')
+ prnt()
+ #
+ # standard init.
+ modname = self.verifier.get_module_name()
+ constants = self._chained_list_constants[False]
+ prnt('#if PY_MAJOR_VERSION >= 3')
+ prnt()
+ prnt('static struct PyModuleDef _cffi_module_def = {')
+ prnt(' PyModuleDef_HEAD_INIT,')
+ prnt(' "%s",' % modname)
+ prnt(' NULL,')
+ prnt(' -1,')
+ prnt(' _cffi_methods,')
+ prnt(' NULL, NULL, NULL, NULL')
+ prnt('};')
+ prnt()
+ prnt('PyMODINIT_FUNC')
+ prnt('PyInit_%s(void)' % modname)
+ prnt('{')
+ prnt(' PyObject *lib;')
+ prnt(' lib = PyModule_Create(&_cffi_module_def);')
+ prnt(' if (lib == NULL)')
+ prnt(' return NULL;')
+ prnt(' if (%s < 0 || _cffi_init() < 0) {' % (constants,))
+ prnt(' Py_DECREF(lib);')
+ prnt(' return NULL;')
+ prnt(' }')
+ prnt(' return lib;')
+ prnt('}')
+ prnt()
+ prnt('#else')
+ prnt()
+ prnt('PyMODINIT_FUNC')
+ prnt('init%s(void)' % modname)
+ prnt('{')
+ prnt(' PyObject *lib;')
+ prnt(' lib = Py_InitModule("%s", _cffi_methods);' % modname)
+ prnt(' if (lib == NULL)')
+ prnt(' return;')
+ prnt(' if (%s < 0 || _cffi_init() < 0)' % (constants,))
+ prnt(' return;')
+ prnt(' return;')
+ prnt('}')
+ prnt()
+ prnt('#endif')
+
+ def load_library(self, flags=None):
+ # XXX review all usages of 'self' here!
+ # import it as a new extension module
+ imp.acquire_lock()
+ try:
+ if hasattr(sys, "getdlopenflags"):
+ previous_flags = sys.getdlopenflags()
+ try:
+ if hasattr(sys, "setdlopenflags") and flags is not None:
+ sys.setdlopenflags(flags)
+ module = imp.load_dynamic(self.verifier.get_module_name(),
+ self.verifier.modulefilename)
+ except ImportError as e:
+ error = "importing %r: %s" % (self.verifier.modulefilename, e)
+ raise VerificationError(error)
+ finally:
+ if hasattr(sys, "setdlopenflags"):
+ sys.setdlopenflags(previous_flags)
+ finally:
+ imp.release_lock()
+ #
+ # call loading_cpy_struct() to get the struct layout inferred by
+ # the C compiler
+ self._load(module, 'loading')
+ #
+ # the C code will need the objects. Collect them in
+ # order in a list.
+ revmapping = dict([(value, key)
+ for (key, value) in self._typesdict.items()])
+ lst = [revmapping[i] for i in range(len(revmapping))]
+ lst = list(map(self.ffi._get_cached_btype, lst))
+ #
+ # build the FFILibrary class and instance and call _cffi_setup().
+ # this will set up some fields like '_cffi_types', and only then
+ # it will invoke the chained list of functions that will really
+ # build (notably) the constant objects, as if they are
+ # pointers, and store them as attributes on the 'library' object.
+ class FFILibrary(object):
+ _cffi_python_module = module
+ _cffi_ffi = self.ffi
+ _cffi_dir = []
+ def __dir__(self):
+ return FFILibrary._cffi_dir + list(self.__dict__)
+ library = FFILibrary()
+ if module._cffi_setup(lst, VerificationError, library):
+ import warnings
+ warnings.warn("reimporting %r might overwrite older definitions"
+ % (self.verifier.get_module_name()))
+ #
+ # finally, call the loaded_cpy_xxx() functions. This will perform
+ # the final adjustments, like copying the Python->C wrapper
+ # functions from the module to the 'library' object, and setting
+ # up the FFILibrary class with properties for the global C variables.
+ self._load(module, 'loaded', library=library)
+ module._cffi_original_ffi = self.ffi
+ module._cffi_types_of_builtin_funcs = self._types_of_builtin_functions
+ return library
+
+ def _get_declarations(self):
+ lst = [(key, tp) for (key, (tp, qual)) in
+ self.ffi._parser._declarations.items()]
+ lst.sort()
+ return lst
+
+ def _generate(self, step_name):
+ for name, tp in self._get_declarations():
+ kind, realname = name.split(' ', 1)
+ try:
+ method = getattr(self, '_generate_cpy_%s_%s' % (kind,
+ step_name))
+ except AttributeError:
+ raise VerificationError(
+ "not implemented in verify(): %r" % name)
+ try:
+ method(tp, realname)
+ except Exception as e:
+ model.attach_exception_info(e, name)
+ raise
+
+ def _load(self, module, step_name, **kwds):
+ for name, tp in self._get_declarations():
+ kind, realname = name.split(' ', 1)
+ method = getattr(self, '_%s_cpy_%s' % (step_name, kind))
+ try:
+ method(tp, realname, module, **kwds)
+ except Exception as e:
+ model.attach_exception_info(e, name)
+ raise
+
+ def _generate_nothing(self, tp, name):
+ pass
+
+ def _loaded_noop(self, tp, name, module, **kwds):
+ pass
+
+ # ----------
+
+ def _convert_funcarg_to_c(self, tp, fromvar, tovar, errcode):
+ extraarg = ''
+ if isinstance(tp, model.PrimitiveType):
+ if tp.is_integer_type() and tp.name != '_Bool':
+ converter = '_cffi_to_c_int'
+ extraarg = ', %s' % tp.name
+ elif tp.is_complex_type():
+ raise VerificationError(
+ "not implemented in verify(): complex types")
+ else:
+ converter = '(%s)_cffi_to_c_%s' % (tp.get_c_name(''),
+ tp.name.replace(' ', '_'))
+ errvalue = '-1'
+ #
+ elif isinstance(tp, model.PointerType):
+ self._convert_funcarg_to_c_ptr_or_array(tp, fromvar,
+ tovar, errcode)
+ return
+ #
+ elif isinstance(tp, (model.StructOrUnion, model.EnumType)):
+ # a struct (not a struct pointer) as a function argument
+ self._prnt(' if (_cffi_to_c((char *)&%s, _cffi_type(%d), %s) < 0)'
+ % (tovar, self._gettypenum(tp), fromvar))
+ self._prnt(' %s;' % errcode)
+ return
+ #
+ elif isinstance(tp, model.FunctionPtrType):
+ converter = '(%s)_cffi_to_c_pointer' % tp.get_c_name('')
+ extraarg = ', _cffi_type(%d)' % self._gettypenum(tp)
+ errvalue = 'NULL'
+ #
+ else:
+ raise NotImplementedError(tp)
+ #
+ self._prnt(' %s = %s(%s%s);' % (tovar, converter, fromvar, extraarg))
+ self._prnt(' if (%s == (%s)%s && PyErr_Occurred())' % (
+ tovar, tp.get_c_name(''), errvalue))
+ self._prnt(' %s;' % errcode)
+
+ def _extra_local_variables(self, tp, localvars, freelines):
+ if isinstance(tp, model.PointerType):
+ localvars.add('Py_ssize_t datasize')
+ localvars.add('struct _cffi_freeme_s *large_args_free = NULL')
+ freelines.add('if (large_args_free != NULL)'
+ ' _cffi_free_array_arguments(large_args_free);')
+
+ def _convert_funcarg_to_c_ptr_or_array(self, tp, fromvar, tovar, errcode):
+ self._prnt(' datasize = _cffi_prepare_pointer_call_argument(')
+ self._prnt(' _cffi_type(%d), %s, (char **)&%s);' % (
+ self._gettypenum(tp), fromvar, tovar))
+ self._prnt(' if (datasize != 0) {')
+ self._prnt(' %s = ((size_t)datasize) <= 640 ? '
+ 'alloca((size_t)datasize) : NULL;' % (tovar,))
+ self._prnt(' if (_cffi_convert_array_argument(_cffi_type(%d), %s, '
+ '(char **)&%s,' % (self._gettypenum(tp), fromvar, tovar))
+ self._prnt(' datasize, &large_args_free) < 0)')
+ self._prnt(' %s;' % errcode)
+ self._prnt(' }')
+
+ def _convert_expr_from_c(self, tp, var, context):
+ if isinstance(tp, model.PrimitiveType):
+ if tp.is_integer_type() and tp.name != '_Bool':
+ return '_cffi_from_c_int(%s, %s)' % (var, tp.name)
+ elif tp.name != 'long double':
+ return '_cffi_from_c_%s(%s)' % (tp.name.replace(' ', '_'), var)
+ else:
+ return '_cffi_from_c_deref((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, (model.PointerType, model.FunctionPtrType)):
+ return '_cffi_from_c_pointer((char *)%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, model.ArrayType):
+ return '_cffi_from_c_pointer((char *)%s, _cffi_type(%d))' % (
+ var, self._gettypenum(model.PointerType(tp.item)))
+ elif isinstance(tp, model.StructOrUnion):
+ if tp.fldnames is None:
+ raise TypeError("'%s' is used as %s, but is opaque" % (
+ tp._get_c_name(), context))
+ return '_cffi_from_c_struct((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ elif isinstance(tp, model.EnumType):
+ return '_cffi_from_c_deref((char *)&%s, _cffi_type(%d))' % (
+ var, self._gettypenum(tp))
+ else:
+ raise NotImplementedError(tp)
+
+ # ----------
+ # typedefs: generates no code so far
+
+ _generate_cpy_typedef_collecttype = _generate_nothing
+ _generate_cpy_typedef_decl = _generate_nothing
+ _generate_cpy_typedef_method = _generate_nothing
+ _loading_cpy_typedef = _loaded_noop
+ _loaded_cpy_typedef = _loaded_noop
+
+ # ----------
+ # function declarations
+
+ def _generate_cpy_function_collecttype(self, tp, name):
+ assert isinstance(tp, model.FunctionPtrType)
+ if tp.ellipsis:
+ self._do_collect_type(tp)
+ else:
+ # don't call _do_collect_type(tp) in this common case,
+ # otherwise test_autofilled_struct_as_argument fails
+ for type in tp.args:
+ self._do_collect_type(type)
+ self._do_collect_type(tp.result)
+
+ def _generate_cpy_function_decl(self, tp, name):
+ assert isinstance(tp, model.FunctionPtrType)
+ if tp.ellipsis:
+ # cannot support vararg functions better than this: check for its
+ # exact type (including the fixed arguments), and build it as a
+ # constant function pointer (no CPython wrapper)
+ self._generate_cpy_const(False, name, tp)
+ return
+ prnt = self._prnt
+ numargs = len(tp.args)
+ if numargs == 0:
+ argname = 'noarg'
+ elif numargs == 1:
+ argname = 'arg0'
+ else:
+ argname = 'args'
+ prnt('static PyObject *')
+ prnt('_cffi_f_%s(PyObject *self, PyObject *%s)' % (name, argname))
+ prnt('{')
+ #
+ context = 'argument of %s' % name
+ for i, type in enumerate(tp.args):
+ prnt(' %s;' % type.get_c_name(' x%d' % i, context))
+ #
+ localvars = set()
+ freelines = set()
+ for type in tp.args:
+ self._extra_local_variables(type, localvars, freelines)
+ for decl in sorted(localvars):
+ prnt(' %s;' % (decl,))
+ #
+ if not isinstance(tp.result, model.VoidType):
+ result_code = 'result = '
+ context = 'result of %s' % name
+ prnt(' %s;' % tp.result.get_c_name(' result', context))
+ prnt(' PyObject *pyresult;')
+ else:
+ result_code = ''
+ #
+ if len(tp.args) > 1:
+ rng = range(len(tp.args))
+ for i in rng:
+ prnt(' PyObject *arg%d;' % i)
+ prnt()
+ prnt(' if (!PyArg_ParseTuple(args, "%s:%s", %s))' % (
+ 'O' * numargs, name, ', '.join(['&arg%d' % i for i in rng])))
+ prnt(' return NULL;')
+ prnt()
+ #
+ for i, type in enumerate(tp.args):
+ self._convert_funcarg_to_c(type, 'arg%d' % i, 'x%d' % i,
+ 'return NULL')
+ prnt()
+ #
+ prnt(' Py_BEGIN_ALLOW_THREADS')
+ prnt(' _cffi_restore_errno();')
+ prnt(' { %s%s(%s); }' % (
+ result_code, name,
+ ', '.join(['x%d' % i for i in range(len(tp.args))])))
+ prnt(' _cffi_save_errno();')
+ prnt(' Py_END_ALLOW_THREADS')
+ prnt()
+ #
+ prnt(' (void)self; /* unused */')
+ if numargs == 0:
+ prnt(' (void)noarg; /* unused */')
+ if result_code:
+ prnt(' pyresult = %s;' %
+ self._convert_expr_from_c(tp.result, 'result', 'result type'))
+ for freeline in freelines:
+ prnt(' ' + freeline)
+ prnt(' return pyresult;')
+ else:
+ for freeline in freelines:
+ prnt(' ' + freeline)
+ prnt(' Py_INCREF(Py_None);')
+ prnt(' return Py_None;')
+ prnt('}')
+ prnt()
+
+ def _generate_cpy_function_method(self, tp, name):
+ if tp.ellipsis:
+ return
+ numargs = len(tp.args)
+ if numargs == 0:
+ meth = 'METH_NOARGS'
+ elif numargs == 1:
+ meth = 'METH_O'
+ else:
+ meth = 'METH_VARARGS'
+ self._prnt(' {"%s", _cffi_f_%s, %s, NULL},' % (name, name, meth))
+
+ _loading_cpy_function = _loaded_noop
+
+ def _loaded_cpy_function(self, tp, name, module, library):
+ if tp.ellipsis:
+ return
+ func = getattr(module, name)
+ setattr(library, name, func)
+ self._types_of_builtin_functions[func] = tp
+
+ # ----------
+ # named structs
+
+ _generate_cpy_struct_collecttype = _generate_nothing
+ def _generate_cpy_struct_decl(self, tp, name):
+ assert name == tp.name
+ self._generate_struct_or_union_decl(tp, 'struct', name)
+ def _generate_cpy_struct_method(self, tp, name):
+ self._generate_struct_or_union_method(tp, 'struct', name)
+ def _loading_cpy_struct(self, tp, name, module):
+ self._loading_struct_or_union(tp, 'struct', name, module)
+ def _loaded_cpy_struct(self, tp, name, module, **kwds):
+ self._loaded_struct_or_union(tp)
+
+ _generate_cpy_union_collecttype = _generate_nothing
+ def _generate_cpy_union_decl(self, tp, name):
+ assert name == tp.name
+ self._generate_struct_or_union_decl(tp, 'union', name)
+ def _generate_cpy_union_method(self, tp, name):
+ self._generate_struct_or_union_method(tp, 'union', name)
+ def _loading_cpy_union(self, tp, name, module):
+ self._loading_struct_or_union(tp, 'union', name, module)
+ def _loaded_cpy_union(self, tp, name, module, **kwds):
+ self._loaded_struct_or_union(tp)
+
+ def _generate_struct_or_union_decl(self, tp, prefix, name):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ checkfuncname = '_cffi_check_%s_%s' % (prefix, name)
+ layoutfuncname = '_cffi_layout_%s_%s' % (prefix, name)
+ cname = ('%s %s' % (prefix, name)).strip()
+ #
+ prnt = self._prnt
+ prnt('static void %s(%s *p)' % (checkfuncname, cname))
+ prnt('{')
+ prnt(' /* only to generate compile-time warnings or errors */')
+ prnt(' (void)p;')
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if (isinstance(ftype, model.PrimitiveType)
+ and ftype.is_integer_type()) or fbitsize >= 0:
+ # accept all integers, but complain on float or double
+ prnt(' (void)((p->%s) << 1);' % fname)
+ else:
+ # only accept exactly the type declared.
+ try:
+ prnt(' { %s = &p->%s; (void)tmp; }' % (
+ ftype.get_c_name('*tmp', 'field %r'%fname, quals=fqual),
+ fname))
+ except VerificationError as e:
+ prnt(' /* %s */' % str(e)) # cannot verify it, ignore
+ prnt('}')
+ prnt('static PyObject *')
+ prnt('%s(PyObject *self, PyObject *noarg)' % (layoutfuncname,))
+ prnt('{')
+ prnt(' struct _cffi_aligncheck { char x; %s y; };' % cname)
+ prnt(' static Py_ssize_t nums[] = {')
+ prnt(' sizeof(%s),' % cname)
+ prnt(' offsetof(struct _cffi_aligncheck, y),')
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if fbitsize >= 0:
+ continue # xxx ignore fbitsize for now
+ prnt(' offsetof(%s, %s),' % (cname, fname))
+ if isinstance(ftype, model.ArrayType) and ftype.length is None:
+ prnt(' 0, /* %s */' % ftype._get_c_name())
+ else:
+ prnt(' sizeof(((%s *)0)->%s),' % (cname, fname))
+ prnt(' -1')
+ prnt(' };')
+ prnt(' (void)self; /* unused */')
+ prnt(' (void)noarg; /* unused */')
+ prnt(' return _cffi_get_struct_layout(nums);')
+ prnt(' /* the next line is not executed, but compiled */')
+ prnt(' %s(0);' % (checkfuncname,))
+ prnt('}')
+ prnt()
+
+ def _generate_struct_or_union_method(self, tp, prefix, name):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ layoutfuncname = '_cffi_layout_%s_%s' % (prefix, name)
+ self._prnt(' {"%s", %s, METH_NOARGS, NULL},' % (layoutfuncname,
+ layoutfuncname))
+
+ def _loading_struct_or_union(self, tp, prefix, name, module):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ layoutfuncname = '_cffi_layout_%s_%s' % (prefix, name)
+ #
+ function = getattr(module, layoutfuncname)
+ layout = function()
+ if isinstance(tp, model.StructOrUnion) and tp.partial:
+ # use the function()'s sizes and offsets to guide the
+ # layout of the struct
+ totalsize = layout[0]
+ totalalignment = layout[1]
+ fieldofs = layout[2::2]
+ fieldsize = layout[3::2]
+ tp.force_flatten()
+ assert len(fieldofs) == len(fieldsize) == len(tp.fldnames)
+ tp.fixedlayout = fieldofs, fieldsize, totalsize, totalalignment
+ else:
+ cname = ('%s %s' % (prefix, name)).strip()
+ self._struct_pending_verification[tp] = layout, cname
+
+ def _loaded_struct_or_union(self, tp):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ self.ffi._get_cached_btype(tp) # force 'fixedlayout' to be considered
+
+ if tp in self._struct_pending_verification:
+ # check that the layout sizes and offsets match the real ones
+ def check(realvalue, expectedvalue, msg):
+ if realvalue != expectedvalue:
+ raise VerificationError(
+ "%s (we have %d, but C compiler says %d)"
+ % (msg, expectedvalue, realvalue))
+ ffi = self.ffi
+ BStruct = ffi._get_cached_btype(tp)
+ layout, cname = self._struct_pending_verification.pop(tp)
+ check(layout[0], ffi.sizeof(BStruct), "wrong total size")
+ check(layout[1], ffi.alignof(BStruct), "wrong total alignment")
+ i = 2
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if fbitsize >= 0:
+ continue # xxx ignore fbitsize for now
+ check(layout[i], ffi.offsetof(BStruct, fname),
+ "wrong offset for field %r" % (fname,))
+ if layout[i+1] != 0:
+ BField = ffi._get_cached_btype(ftype)
+ check(layout[i+1], ffi.sizeof(BField),
+ "wrong size for field %r" % (fname,))
+ i += 2
+ assert i == len(layout)
+
+ # ----------
+ # 'anonymous' declarations. These are produced for anonymous structs
+ # or unions; the 'name' is obtained by a typedef.
+
+ _generate_cpy_anonymous_collecttype = _generate_nothing
+
+ def _generate_cpy_anonymous_decl(self, tp, name):
+ if isinstance(tp, model.EnumType):
+ self._generate_cpy_enum_decl(tp, name, '')
+ else:
+ self._generate_struct_or_union_decl(tp, '', name)
+
+ def _generate_cpy_anonymous_method(self, tp, name):
+ if not isinstance(tp, model.EnumType):
+ self._generate_struct_or_union_method(tp, '', name)
+
+ def _loading_cpy_anonymous(self, tp, name, module):
+ if isinstance(tp, model.EnumType):
+ self._loading_cpy_enum(tp, name, module)
+ else:
+ self._loading_struct_or_union(tp, '', name, module)
+
+ def _loaded_cpy_anonymous(self, tp, name, module, **kwds):
+ if isinstance(tp, model.EnumType):
+ self._loaded_cpy_enum(tp, name, module, **kwds)
+ else:
+ self._loaded_struct_or_union(tp)
+
+ # ----------
+ # constants, likely declared with '#define'
+
+ def _generate_cpy_const(self, is_int, name, tp=None, category='const',
+ vartp=None, delayed=True, size_too=False,
+ check_value=None):
+ prnt = self._prnt
+ funcname = '_cffi_%s_%s' % (category, name)
+ prnt('static int %s(PyObject *lib)' % funcname)
+ prnt('{')
+ prnt(' PyObject *o;')
+ prnt(' int res;')
+ if not is_int:
+ prnt(' %s;' % (vartp or tp).get_c_name(' i', name))
+ else:
+ assert category == 'const'
+ #
+ if check_value is not None:
+ self._check_int_constant_value(name, check_value)
+ #
+ if not is_int:
+ if category == 'var':
+ realexpr = '&' + name
+ else:
+ realexpr = name
+ prnt(' i = (%s);' % (realexpr,))
+ prnt(' o = %s;' % (self._convert_expr_from_c(tp, 'i',
+ 'variable type'),))
+ assert delayed
+ else:
+ prnt(' o = _cffi_from_c_int_const(%s);' % name)
+ prnt(' if (o == NULL)')
+ prnt(' return -1;')
+ if size_too:
+ prnt(' {')
+ prnt(' PyObject *o1 = o;')
+ prnt(' o = Py_BuildValue("On", o1, (Py_ssize_t)sizeof(%s));'
+ % (name,))
+ prnt(' Py_DECREF(o1);')
+ prnt(' if (o == NULL)')
+ prnt(' return -1;')
+ prnt(' }')
+ prnt(' res = PyObject_SetAttrString(lib, "%s", o);' % name)
+ prnt(' Py_DECREF(o);')
+ prnt(' if (res < 0)')
+ prnt(' return -1;')
+ prnt(' return %s;' % self._chained_list_constants[delayed])
+ self._chained_list_constants[delayed] = funcname + '(lib)'
+ prnt('}')
+ prnt()
+
+ def _generate_cpy_constant_collecttype(self, tp, name):
+ is_int = isinstance(tp, model.PrimitiveType) and tp.is_integer_type()
+ if not is_int:
+ self._do_collect_type(tp)
+
+ def _generate_cpy_constant_decl(self, tp, name):
+ is_int = isinstance(tp, model.PrimitiveType) and tp.is_integer_type()
+ self._generate_cpy_const(is_int, name, tp)
+
+ _generate_cpy_constant_method = _generate_nothing
+ _loading_cpy_constant = _loaded_noop
+ _loaded_cpy_constant = _loaded_noop
+
+ # ----------
+ # enums
+
+ def _check_int_constant_value(self, name, value, err_prefix=''):
+ prnt = self._prnt
+ if value <= 0:
+ prnt(' if ((%s) > 0 || (long)(%s) != %dL) {' % (
+ name, name, value))
+ else:
+ prnt(' if ((%s) <= 0 || (unsigned long)(%s) != %dUL) {' % (
+ name, name, value))
+ prnt(' char buf[64];')
+ prnt(' if ((%s) <= 0)' % name)
+ prnt(' snprintf(buf, 63, "%%ld", (long)(%s));' % name)
+ prnt(' else')
+ prnt(' snprintf(buf, 63, "%%lu", (unsigned long)(%s));' %
+ name)
+ prnt(' PyErr_Format(_cffi_VerificationError,')
+ prnt(' "%s%s has the real value %s, not %s",')
+ prnt(' "%s", "%s", buf, "%d");' % (
+ err_prefix, name, value))
+ prnt(' return -1;')
+ prnt(' }')
+
+ def _enum_funcname(self, prefix, name):
+ # "$enum_$1" => "___D_enum____D_1"
+ name = name.replace('$', '___D_')
+ return '_cffi_e_%s_%s' % (prefix, name)
+
+ def _generate_cpy_enum_decl(self, tp, name, prefix='enum'):
+ if tp.partial:
+ for enumerator in tp.enumerators:
+ self._generate_cpy_const(True, enumerator, delayed=False)
+ return
+ #
+ funcname = self._enum_funcname(prefix, name)
+ prnt = self._prnt
+ prnt('static int %s(PyObject *lib)' % funcname)
+ prnt('{')
+ for enumerator, enumvalue in zip(tp.enumerators, tp.enumvalues):
+ self._check_int_constant_value(enumerator, enumvalue,
+ "enum %s: " % name)
+ prnt(' return %s;' % self._chained_list_constants[True])
+ self._chained_list_constants[True] = funcname + '(lib)'
+ prnt('}')
+ prnt()
+
+ _generate_cpy_enum_collecttype = _generate_nothing
+ _generate_cpy_enum_method = _generate_nothing
+
+ def _loading_cpy_enum(self, tp, name, module):
+ if tp.partial:
+ enumvalues = [getattr(module, enumerator)
+ for enumerator in tp.enumerators]
+ tp.enumvalues = tuple(enumvalues)
+ tp.partial_resolved = True
+
+ def _loaded_cpy_enum(self, tp, name, module, library):
+ for enumerator, enumvalue in zip(tp.enumerators, tp.enumvalues):
+ setattr(library, enumerator, enumvalue)
+
+ # ----------
+ # macros: for now only for integers
+
+ def _generate_cpy_macro_decl(self, tp, name):
+ if tp == '...':
+ check_value = None
+ else:
+ check_value = tp # an integer
+ self._generate_cpy_const(True, name, check_value=check_value)
+
+ _generate_cpy_macro_collecttype = _generate_nothing
+ _generate_cpy_macro_method = _generate_nothing
+ _loading_cpy_macro = _loaded_noop
+ _loaded_cpy_macro = _loaded_noop
+
+ # ----------
+ # global variables
+
+ def _generate_cpy_variable_collecttype(self, tp, name):
+ if isinstance(tp, model.ArrayType):
+ tp_ptr = model.PointerType(tp.item)
+ else:
+ tp_ptr = model.PointerType(tp)
+ self._do_collect_type(tp_ptr)
+
+ def _generate_cpy_variable_decl(self, tp, name):
+ if isinstance(tp, model.ArrayType):
+ tp_ptr = model.PointerType(tp.item)
+ self._generate_cpy_const(False, name, tp, vartp=tp_ptr,
+ size_too = tp.length_is_unknown())
+ else:
+ tp_ptr = model.PointerType(tp)
+ self._generate_cpy_const(False, name, tp_ptr, category='var')
+
+ _generate_cpy_variable_method = _generate_nothing
+ _loading_cpy_variable = _loaded_noop
+
+ def _loaded_cpy_variable(self, tp, name, module, library):
+ value = getattr(library, name)
+ if isinstance(tp, model.ArrayType): # int a[5] is "constant" in the
+ # sense that "a=..." is forbidden
+ if tp.length_is_unknown():
+ assert isinstance(value, tuple)
+ (value, size) = value
+ BItemType = self.ffi._get_cached_btype(tp.item)
+ length, rest = divmod(size, self.ffi.sizeof(BItemType))
+ if rest != 0:
+ raise VerificationError(
+ "bad size: %r does not seem to be an array of %s" %
+ (name, tp.item))
+ tp = tp.resolve_length(length)
+ # 'value' is a <cdata 'type *'> which we have to replace with
+ # a <cdata 'type[N]'> if the N is actually known
+ if tp.length is not None:
+ BArray = self.ffi._get_cached_btype(tp)
+ value = self.ffi.cast(BArray, value)
+ setattr(library, name, value)
+ return
+ # remove ptr= from the library instance, and replace
+ # it by a property on the class, which reads/writes into ptr[0].
+ ptr = value
+ delattr(library, name)
+ def getter(library):
+ return ptr[0]
+ def setter(library, value):
+ ptr[0] = value
+ setattr(type(library), name, property(getter, setter))
+ type(library)._cffi_dir.append(name)
+
+ # ----------
+
+ def _generate_setup_custom(self):
+ prnt = self._prnt
+ prnt('static int _cffi_setup_custom(PyObject *lib)')
+ prnt('{')
+ prnt(' return %s;' % self._chained_list_constants[True])
+ prnt('}')
+
+cffimod_header = r'''
+#include <Python.h>
+#include <stddef.h>
+
+/* this block of #ifs should be kept exactly identical between
+ c/_cffi_backend.c, cffi/vengine_cpy.py, cffi/vengine_gen.py
+ and cffi/_cffi_include.h */
+#if defined(_MSC_VER)
+# include <malloc.h>   /* for alloca() */
+# if _MSC_VER < 1600 /* MSVC < 2010 */
+ typedef __int8 int8_t;
+ typedef __int16 int16_t;
+ typedef __int32 int32_t;
+ typedef __int64 int64_t;
+ typedef unsigned __int8 uint8_t;
+ typedef unsigned __int16 uint16_t;
+ typedef unsigned __int32 uint32_t;
+ typedef unsigned __int64 uint64_t;
+ typedef __int8 int_least8_t;
+ typedef __int16 int_least16_t;
+ typedef __int32 int_least32_t;
+ typedef __int64 int_least64_t;
+ typedef unsigned __int8 uint_least8_t;
+ typedef unsigned __int16 uint_least16_t;
+ typedef unsigned __int32 uint_least32_t;
+ typedef unsigned __int64 uint_least64_t;
+ typedef __int8 int_fast8_t;
+ typedef __int16 int_fast16_t;
+ typedef __int32 int_fast32_t;
+ typedef __int64 int_fast64_t;
+ typedef unsigned __int8 uint_fast8_t;
+ typedef unsigned __int16 uint_fast16_t;
+ typedef unsigned __int32 uint_fast32_t;
+ typedef unsigned __int64 uint_fast64_t;
+ typedef __int64 intmax_t;
+ typedef unsigned __int64 uintmax_t;
+# else
+#  include <stdint.h>
+# endif
+# if _MSC_VER < 1800 /* MSVC < 2013 */
+# ifndef __cplusplus
+ typedef unsigned char _Bool;
+# endif
+# endif
+# define _cffi_float_complex_t _Fcomplex /* include <complex.h> for it */
+# define _cffi_double_complex_t _Dcomplex /* include <complex.h> for it */
+#else
+# include <stdint.h>
+# if (defined (__SVR4) && defined (__sun)) || defined(_AIX) || defined(__hpux)
+#  include <alloca.h>
+# endif
+# define _cffi_float_complex_t float _Complex
+# define _cffi_double_complex_t double _Complex
+#endif
+
+#if PY_MAJOR_VERSION < 3
+# undef PyCapsule_CheckExact
+# undef PyCapsule_GetPointer
+# define PyCapsule_CheckExact(capsule) (PyCObject_Check(capsule))
+# define PyCapsule_GetPointer(capsule, name) \
+ (PyCObject_AsVoidPtr(capsule))
+#endif
+
+#if PY_MAJOR_VERSION >= 3
+# define PyInt_FromLong PyLong_FromLong
+#endif
+
+#define _cffi_from_c_double PyFloat_FromDouble
+#define _cffi_from_c_float PyFloat_FromDouble
+#define _cffi_from_c_long PyInt_FromLong
+#define _cffi_from_c_ulong PyLong_FromUnsignedLong
+#define _cffi_from_c_longlong PyLong_FromLongLong
+#define _cffi_from_c_ulonglong PyLong_FromUnsignedLongLong
+#define _cffi_from_c__Bool PyBool_FromLong
+
+#define _cffi_to_c_double PyFloat_AsDouble
+#define _cffi_to_c_float PyFloat_AsDouble
+
+#define _cffi_from_c_int_const(x) \
+ (((x) > 0) ? \
+ ((unsigned long long)(x) <= (unsigned long long)LONG_MAX) ? \
+ PyInt_FromLong((long)(x)) : \
+ PyLong_FromUnsignedLongLong((unsigned long long)(x)) : \
+ ((long long)(x) >= (long long)LONG_MIN) ? \
+ PyInt_FromLong((long)(x)) : \
+ PyLong_FromLongLong((long long)(x)))
+
+#define _cffi_from_c_int(x, type) \
+ (((type)-1) > 0 ? /* unsigned */ \
+ (sizeof(type) < sizeof(long) ? \
+ PyInt_FromLong((long)x) : \
+ sizeof(type) == sizeof(long) ? \
+ PyLong_FromUnsignedLong((unsigned long)x) : \
+ PyLong_FromUnsignedLongLong((unsigned long long)x)) : \
+ (sizeof(type) <= sizeof(long) ? \
+ PyInt_FromLong((long)x) : \
+ PyLong_FromLongLong((long long)x)))
+
+#define _cffi_to_c_int(o, type) \
+ ((type)( \
+ sizeof(type) == 1 ? (((type)-1) > 0 ? (type)_cffi_to_c_u8(o) \
+ : (type)_cffi_to_c_i8(o)) : \
+ sizeof(type) == 2 ? (((type)-1) > 0 ? (type)_cffi_to_c_u16(o) \
+ : (type)_cffi_to_c_i16(o)) : \
+ sizeof(type) == 4 ? (((type)-1) > 0 ? (type)_cffi_to_c_u32(o) \
+ : (type)_cffi_to_c_i32(o)) : \
+ sizeof(type) == 8 ? (((type)-1) > 0 ? (type)_cffi_to_c_u64(o) \
+ : (type)_cffi_to_c_i64(o)) : \
+ (Py_FatalError("unsupported size for type " #type), (type)0)))
+
+#define _cffi_to_c_i8 \
+ ((int(*)(PyObject *))_cffi_exports[1])
+#define _cffi_to_c_u8 \
+ ((int(*)(PyObject *))_cffi_exports[2])
+#define _cffi_to_c_i16 \
+ ((int(*)(PyObject *))_cffi_exports[3])
+#define _cffi_to_c_u16 \
+ ((int(*)(PyObject *))_cffi_exports[4])
+#define _cffi_to_c_i32 \
+ ((int(*)(PyObject *))_cffi_exports[5])
+#define _cffi_to_c_u32 \
+ ((unsigned int(*)(PyObject *))_cffi_exports[6])
+#define _cffi_to_c_i64 \
+ ((long long(*)(PyObject *))_cffi_exports[7])
+#define _cffi_to_c_u64 \
+ ((unsigned long long(*)(PyObject *))_cffi_exports[8])
+#define _cffi_to_c_char \
+ ((int(*)(PyObject *))_cffi_exports[9])
+#define _cffi_from_c_pointer \
+ ((PyObject *(*)(char *, CTypeDescrObject *))_cffi_exports[10])
+#define _cffi_to_c_pointer \
+ ((char *(*)(PyObject *, CTypeDescrObject *))_cffi_exports[11])
+#define _cffi_get_struct_layout \
+ ((PyObject *(*)(Py_ssize_t[]))_cffi_exports[12])
+#define _cffi_restore_errno \
+ ((void(*)(void))_cffi_exports[13])
+#define _cffi_save_errno \
+ ((void(*)(void))_cffi_exports[14])
+#define _cffi_from_c_char \
+ ((PyObject *(*)(char))_cffi_exports[15])
+#define _cffi_from_c_deref \
+ ((PyObject *(*)(char *, CTypeDescrObject *))_cffi_exports[16])
+#define _cffi_to_c \
+ ((int(*)(char *, CTypeDescrObject *, PyObject *))_cffi_exports[17])
+#define _cffi_from_c_struct \
+ ((PyObject *(*)(char *, CTypeDescrObject *))_cffi_exports[18])
+#define _cffi_to_c_wchar_t \
+ ((wchar_t(*)(PyObject *))_cffi_exports[19])
+#define _cffi_from_c_wchar_t \
+ ((PyObject *(*)(wchar_t))_cffi_exports[20])
+#define _cffi_to_c_long_double \
+ ((long double(*)(PyObject *))_cffi_exports[21])
+#define _cffi_to_c__Bool \
+ ((_Bool(*)(PyObject *))_cffi_exports[22])
+#define _cffi_prepare_pointer_call_argument \
+ ((Py_ssize_t(*)(CTypeDescrObject *, PyObject *, char **))_cffi_exports[23])
+#define _cffi_convert_array_from_object \
+ ((int(*)(char *, CTypeDescrObject *, PyObject *))_cffi_exports[24])
+#define _CFFI_NUM_EXPORTS 25
+
+typedef struct _ctypedescr CTypeDescrObject;
+
+static void *_cffi_exports[_CFFI_NUM_EXPORTS];
+static PyObject *_cffi_types, *_cffi_VerificationError;
+
+static int _cffi_setup_custom(PyObject *lib); /* forward */
+
+static PyObject *_cffi_setup(PyObject *self, PyObject *args)
+{
+ PyObject *library;
+ int was_alive = (_cffi_types != NULL);
+ (void)self; /* unused */
+ if (!PyArg_ParseTuple(args, "OOO", &_cffi_types, &_cffi_VerificationError,
+ &library))
+ return NULL;
+ Py_INCREF(_cffi_types);
+ Py_INCREF(_cffi_VerificationError);
+ if (_cffi_setup_custom(library) < 0)
+ return NULL;
+ return PyBool_FromLong(was_alive);
+}
+
+union _cffi_union_alignment_u {
+ unsigned char m_char;
+ unsigned short m_short;
+ unsigned int m_int;
+ unsigned long m_long;
+ unsigned long long m_longlong;
+ float m_float;
+ double m_double;
+ long double m_longdouble;
+};
+
+struct _cffi_freeme_s {
+ struct _cffi_freeme_s *next;
+ union _cffi_union_alignment_u alignment;
+};
+
+#ifdef __GNUC__
+ __attribute__((unused))
+#endif
+static int _cffi_convert_array_argument(CTypeDescrObject *ctptr, PyObject *arg,
+ char **output_data, Py_ssize_t datasize,
+ struct _cffi_freeme_s **freeme)
+{
+ char *p;
+ if (datasize < 0)
+ return -1;
+
+ p = *output_data;
+ if (p == NULL) {
+ struct _cffi_freeme_s *fp = (struct _cffi_freeme_s *)PyObject_Malloc(
+ offsetof(struct _cffi_freeme_s, alignment) + (size_t)datasize);
+ if (fp == NULL)
+ return -1;
+ fp->next = *freeme;
+ *freeme = fp;
+ p = *output_data = (char *)&fp->alignment;
+ }
+ memset((void *)p, 0, (size_t)datasize);
+ return _cffi_convert_array_from_object(p, ctptr, arg);
+}
+
+#ifdef __GNUC__
+ __attribute__((unused))
+#endif
+static void _cffi_free_array_arguments(struct _cffi_freeme_s *freeme)
+{
+ do {
+ void *p = (void *)freeme;
+ freeme = freeme->next;
+ PyObject_Free(p);
+ } while (freeme != NULL);
+}
+
+static int _cffi_init(void)
+{
+ PyObject *module, *c_api_object = NULL;
+
+ module = PyImport_ImportModule("_cffi_backend");
+ if (module == NULL)
+ goto failure;
+
+ c_api_object = PyObject_GetAttrString(module, "_C_API");
+ if (c_api_object == NULL)
+ goto failure;
+ if (!PyCapsule_CheckExact(c_api_object)) {
+ PyErr_SetNone(PyExc_ImportError);
+ goto failure;
+ }
+ memcpy(_cffi_exports, PyCapsule_GetPointer(c_api_object, "cffi"),
+ _CFFI_NUM_EXPORTS * sizeof(void *));
+
+ Py_DECREF(module);
+ Py_DECREF(c_api_object);
+ return 0;
+
+ failure:
+ Py_XDECREF(module);
+ Py_XDECREF(c_api_object);
+ return -1;
+}
+
+#define _cffi_type(num) ((CTypeDescrObject *)PyList_GET_ITEM(_cffi_types, num))
+
+/**********/
+'''
diff --git a/templates/skills/file_manager/dependencies/cffi/vengine_gen.py b/templates/skills/file_manager/dependencies/cffi/vengine_gen.py
new file mode 100644
index 00000000..bffc8212
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/vengine_gen.py
@@ -0,0 +1,679 @@
+#
+# DEPRECATED: implementation for ffi.verify()
+#
+import sys, os
+import types
+
+from . import model
+from .error import VerificationError
+
+
+class VGenericEngine(object):
+ _class_key = 'g'
+ _gen_python_module = False
+
+ def __init__(self, verifier):
+ self.verifier = verifier
+ self.ffi = verifier.ffi
+ self.export_symbols = []
+ self._struct_pending_verification = {}
+
+ def patch_extension_kwds(self, kwds):
+ # add 'export_symbols' to the dictionary. Note that we add the
+ # list before filling it. When we fill it, it will thus also show
+ # up in kwds['export_symbols'].
+ kwds.setdefault('export_symbols', self.export_symbols)
+
+ def find_module(self, module_name, path, so_suffixes):
+ for so_suffix in so_suffixes:
+ basename = module_name + so_suffix
+ if path is None:
+ path = sys.path
+ for dirname in path:
+ filename = os.path.join(dirname, basename)
+ if os.path.isfile(filename):
+ return filename
+
+ def collect_types(self):
+ pass # not needed in the generic engine
+
+ def _prnt(self, what=''):
+ self._f.write(what + '\n')
+
+ def write_source_to_f(self):
+ prnt = self._prnt
+ # first paste some standard set of lines that are mostly '#include'
+ prnt(cffimod_header)
+ # then paste the C source given by the user, verbatim.
+ prnt(self.verifier.preamble)
+ #
+ # call generate_gen_xxx_decl(), for every xxx found from
+ # ffi._parser._declarations. This generates all the functions.
+ self._generate('decl')
+ #
+ # on Windows, distutils insists on putting init_cffi_xyz in
+ # 'export_symbols', so instead of fighting it, just give up and
+ # give it one
+ if sys.platform == 'win32':
+ if sys.version_info >= (3,):
+ prefix = 'PyInit_'
+ else:
+ prefix = 'init'
+ modname = self.verifier.get_module_name()
+ prnt("void %s%s(void) { }\n" % (prefix, modname))
+
+ def load_library(self, flags=0):
+ # import it with the CFFI backend
+ backend = self.ffi._backend
+ # needs to make a path that contains '/', on Posix
+ filename = os.path.join(os.curdir, self.verifier.modulefilename)
+ module = backend.load_library(filename, flags)
+ #
+ # call loading_gen_struct() to get the struct layout inferred by
+ # the C compiler
+ self._load(module, 'loading')
+
+ # build the FFILibrary class and instance, this is a module subclass
+ # because modules are expected to have usually-constant-attributes and
+ # in PyPy this means the JIT is able to treat attributes as constant,
+ # which we want.
+ class FFILibrary(types.ModuleType):
+ _cffi_generic_module = module
+ _cffi_ffi = self.ffi
+ _cffi_dir = []
+ def __dir__(self):
+ return FFILibrary._cffi_dir
+ library = FFILibrary("")
+ #
+ # finally, call the loaded_gen_xxx() functions. This will set
+ # up the 'library' object.
+ self._load(module, 'loaded', library=library)
+ return library
+
+ def _get_declarations(self):
+ lst = [(key, tp) for (key, (tp, qual)) in
+ self.ffi._parser._declarations.items()]
+ lst.sort()
+ return lst
+
+ def _generate(self, step_name):
+ for name, tp in self._get_declarations():
+ kind, realname = name.split(' ', 1)
+ try:
+ method = getattr(self, '_generate_gen_%s_%s' % (kind,
+ step_name))
+ except AttributeError:
+ raise VerificationError(
+ "not implemented in verify(): %r" % name)
+ try:
+ method(tp, realname)
+ except Exception as e:
+ model.attach_exception_info(e, name)
+ raise
+
+ def _load(self, module, step_name, **kwds):
+ for name, tp in self._get_declarations():
+ kind, realname = name.split(' ', 1)
+ method = getattr(self, '_%s_gen_%s' % (step_name, kind))
+ try:
+ method(tp, realname, module, **kwds)
+ except Exception as e:
+ model.attach_exception_info(e, name)
+ raise
+
+ def _generate_nothing(self, tp, name):
+ pass
+
+ def _loaded_noop(self, tp, name, module, **kwds):
+ pass
+
+ # ----------
+ # typedefs: generates no code so far
+
+ _generate_gen_typedef_decl = _generate_nothing
+ _loading_gen_typedef = _loaded_noop
+ _loaded_gen_typedef = _loaded_noop
+
+ # ----------
+ # function declarations
+
+ def _generate_gen_function_decl(self, tp, name):
+ assert isinstance(tp, model.FunctionPtrType)
+ if tp.ellipsis:
+ # cannot support vararg functions better than this: check for its
+ # exact type (including the fixed arguments), and build it as a
+ # constant function pointer (no _cffi_f_%s wrapper)
+ self._generate_gen_const(False, name, tp)
+ return
+ prnt = self._prnt
+ numargs = len(tp.args)
+ argnames = []
+ for i, type in enumerate(tp.args):
+ indirection = ''
+ if isinstance(type, model.StructOrUnion):
+ indirection = '*'
+ argnames.append('%sx%d' % (indirection, i))
+ context = 'argument of %s' % name
+ arglist = [type.get_c_name(' %s' % arg, context)
+ for type, arg in zip(tp.args, argnames)]
+ tpresult = tp.result
+ if isinstance(tpresult, model.StructOrUnion):
+ arglist.insert(0, tpresult.get_c_name(' *r', context))
+ tpresult = model.void_type
+ arglist = ', '.join(arglist) or 'void'
+ wrappername = '_cffi_f_%s' % name
+ self.export_symbols.append(wrappername)
+ if tp.abi:
+ abi = tp.abi + ' '
+ else:
+ abi = ''
+ funcdecl = ' %s%s(%s)' % (abi, wrappername, arglist)
+ context = 'result of %s' % name
+ prnt(tpresult.get_c_name(funcdecl, context))
+ prnt('{')
+ #
+ if isinstance(tp.result, model.StructOrUnion):
+ result_code = '*r = '
+ elif not isinstance(tp.result, model.VoidType):
+ result_code = 'return '
+ else:
+ result_code = ''
+ prnt(' %s%s(%s);' % (result_code, name, ', '.join(argnames)))
+ prnt('}')
+ prnt()
+
+ _loading_gen_function = _loaded_noop
+
+ def _loaded_gen_function(self, tp, name, module, library):
+ assert isinstance(tp, model.FunctionPtrType)
+ if tp.ellipsis:
+ newfunction = self._load_constant(False, tp, name, module)
+ else:
+ indirections = []
+ base_tp = tp
+ if (any(isinstance(typ, model.StructOrUnion) for typ in tp.args)
+ or isinstance(tp.result, model.StructOrUnion)):
+ indirect_args = []
+ for i, typ in enumerate(tp.args):
+ if isinstance(typ, model.StructOrUnion):
+ typ = model.PointerType(typ)
+ indirections.append((i, typ))
+ indirect_args.append(typ)
+ indirect_result = tp.result
+ if isinstance(indirect_result, model.StructOrUnion):
+ if indirect_result.fldtypes is None:
+ raise TypeError("'%s' is used as result type, "
+ "but is opaque" % (
+ indirect_result._get_c_name(),))
+ indirect_result = model.PointerType(indirect_result)
+ indirect_args.insert(0, indirect_result)
+ indirections.insert(0, ("result", indirect_result))
+ indirect_result = model.void_type
+ tp = model.FunctionPtrType(tuple(indirect_args),
+ indirect_result, tp.ellipsis)
+ BFunc = self.ffi._get_cached_btype(tp)
+ wrappername = '_cffi_f_%s' % name
+ newfunction = module.load_function(BFunc, wrappername)
+ for i, typ in indirections:
+ newfunction = self._make_struct_wrapper(newfunction, i, typ,
+ base_tp)
+ setattr(library, name, newfunction)
+ type(library)._cffi_dir.append(name)
+
+ def _make_struct_wrapper(self, oldfunc, i, tp, base_tp):
+ backend = self.ffi._backend
+ BType = self.ffi._get_cached_btype(tp)
+ if i == "result":
+ ffi = self.ffi
+ def newfunc(*args):
+ res = ffi.new(BType)
+ oldfunc(res, *args)
+ return res[0]
+ else:
+ def newfunc(*args):
+ args = args[:i] + (backend.newp(BType, args[i]),) + args[i+1:]
+ return oldfunc(*args)
+ newfunc._cffi_base_type = base_tp
+ return newfunc
+
+ # ----------
+ # named structs
+
+ def _generate_gen_struct_decl(self, tp, name):
+ assert name == tp.name
+ self._generate_struct_or_union_decl(tp, 'struct', name)
+
+ def _loading_gen_struct(self, tp, name, module):
+ self._loading_struct_or_union(tp, 'struct', name, module)
+
+ def _loaded_gen_struct(self, tp, name, module, **kwds):
+ self._loaded_struct_or_union(tp)
+
+ def _generate_gen_union_decl(self, tp, name):
+ assert name == tp.name
+ self._generate_struct_or_union_decl(tp, 'union', name)
+
+ def _loading_gen_union(self, tp, name, module):
+ self._loading_struct_or_union(tp, 'union', name, module)
+
+ def _loaded_gen_union(self, tp, name, module, **kwds):
+ self._loaded_struct_or_union(tp)
+
+ def _generate_struct_or_union_decl(self, tp, prefix, name):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ checkfuncname = '_cffi_check_%s_%s' % (prefix, name)
+ layoutfuncname = '_cffi_layout_%s_%s' % (prefix, name)
+ cname = ('%s %s' % (prefix, name)).strip()
+ #
+ prnt = self._prnt
+ prnt('static void %s(%s *p)' % (checkfuncname, cname))
+ prnt('{')
+ prnt(' /* only to generate compile-time warnings or errors */')
+ prnt(' (void)p;')
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if (isinstance(ftype, model.PrimitiveType)
+ and ftype.is_integer_type()) or fbitsize >= 0:
+ # accept all integers, but complain on float or double
+ prnt(' (void)((p->%s) << 1);' % fname)
+ else:
+ # only accept exactly the type declared.
+ try:
+ prnt(' { %s = &p->%s; (void)tmp; }' % (
+ ftype.get_c_name('*tmp', 'field %r'%fname, quals=fqual),
+ fname))
+ except VerificationError as e:
+ prnt(' /* %s */' % str(e)) # cannot verify it, ignore
+ prnt('}')
+ self.export_symbols.append(layoutfuncname)
+ prnt('intptr_t %s(intptr_t i)' % (layoutfuncname,))
+ prnt('{')
+ prnt(' struct _cffi_aligncheck { char x; %s y; };' % cname)
+ prnt(' static intptr_t nums[] = {')
+ prnt(' sizeof(%s),' % cname)
+ prnt(' offsetof(struct _cffi_aligncheck, y),')
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if fbitsize >= 0:
+ continue # xxx ignore fbitsize for now
+ prnt(' offsetof(%s, %s),' % (cname, fname))
+ if isinstance(ftype, model.ArrayType) and ftype.length is None:
+ prnt(' 0, /* %s */' % ftype._get_c_name())
+ else:
+ prnt(' sizeof(((%s *)0)->%s),' % (cname, fname))
+ prnt(' -1')
+ prnt(' };')
+ prnt(' return nums[i];')
+ prnt(' /* the next line is not executed, but compiled */')
+ prnt(' %s(0);' % (checkfuncname,))
+ prnt('}')
+ prnt()
+
+ def _loading_struct_or_union(self, tp, prefix, name, module):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ layoutfuncname = '_cffi_layout_%s_%s' % (prefix, name)
+ #
+ BFunc = self.ffi._typeof_locked("intptr_t(*)(intptr_t)")[0]
+ function = module.load_function(BFunc, layoutfuncname)
+ layout = []
+ num = 0
+ while True:
+ x = function(num)
+ if x < 0: break
+ layout.append(x)
+ num += 1
+ if isinstance(tp, model.StructOrUnion) and tp.partial:
+ # use the function()'s sizes and offsets to guide the
+ # layout of the struct
+ totalsize = layout[0]
+ totalalignment = layout[1]
+ fieldofs = layout[2::2]
+ fieldsize = layout[3::2]
+ tp.force_flatten()
+ assert len(fieldofs) == len(fieldsize) == len(tp.fldnames)
+ tp.fixedlayout = fieldofs, fieldsize, totalsize, totalalignment
+ else:
+ cname = ('%s %s' % (prefix, name)).strip()
+ self._struct_pending_verification[tp] = layout, cname
+
+ def _loaded_struct_or_union(self, tp):
+ if tp.fldnames is None:
+ return # nothing to do with opaque structs
+ self.ffi._get_cached_btype(tp) # force 'fixedlayout' to be considered
+
+ if tp in self._struct_pending_verification:
+ # check that the layout sizes and offsets match the real ones
+ def check(realvalue, expectedvalue, msg):
+ if realvalue != expectedvalue:
+ raise VerificationError(
+ "%s (we have %d, but C compiler says %d)"
+ % (msg, expectedvalue, realvalue))
+ ffi = self.ffi
+ BStruct = ffi._get_cached_btype(tp)
+ layout, cname = self._struct_pending_verification.pop(tp)
+ check(layout[0], ffi.sizeof(BStruct), "wrong total size")
+ check(layout[1], ffi.alignof(BStruct), "wrong total alignment")
+ i = 2
+ for fname, ftype, fbitsize, fqual in tp.enumfields():
+ if fbitsize >= 0:
+ continue # xxx ignore fbitsize for now
+ check(layout[i], ffi.offsetof(BStruct, fname),
+ "wrong offset for field %r" % (fname,))
+ if layout[i+1] != 0:
+ BField = ffi._get_cached_btype(ftype)
+ check(layout[i+1], ffi.sizeof(BField),
+ "wrong size for field %r" % (fname,))
+ i += 2
+ assert i == len(layout)
+
+ # ----------
+ # 'anonymous' declarations. These are produced for anonymous structs
+ # or unions; the 'name' is obtained by a typedef.
+
+ def _generate_gen_anonymous_decl(self, tp, name):
+ if isinstance(tp, model.EnumType):
+ self._generate_gen_enum_decl(tp, name, '')
+ else:
+ self._generate_struct_or_union_decl(tp, '', name)
+
+ def _loading_gen_anonymous(self, tp, name, module):
+ if isinstance(tp, model.EnumType):
+ self._loading_gen_enum(tp, name, module, '')
+ else:
+ self._loading_struct_or_union(tp, '', name, module)
+
+ def _loaded_gen_anonymous(self, tp, name, module, **kwds):
+ if isinstance(tp, model.EnumType):
+ self._loaded_gen_enum(tp, name, module, **kwds)
+ else:
+ self._loaded_struct_or_union(tp)
+
+ # ----------
+ # constants, likely declared with '#define'
+
+ def _generate_gen_const(self, is_int, name, tp=None, category='const',
+ check_value=None):
+ prnt = self._prnt
+ funcname = '_cffi_%s_%s' % (category, name)
+ self.export_symbols.append(funcname)
+ if check_value is not None:
+ assert is_int
+ assert category == 'const'
+ prnt('int %s(char *out_error)' % funcname)
+ prnt('{')
+ self._check_int_constant_value(name, check_value)
+ prnt(' return 0;')
+ prnt('}')
+ elif is_int:
+ assert category == 'const'
+ prnt('int %s(long long *out_value)' % funcname)
+ prnt('{')
+ prnt(' *out_value = (long long)(%s);' % (name,))
+ prnt(' return (%s) <= 0;' % (name,))
+ prnt('}')
+ else:
+ assert tp is not None
+ assert check_value is None
+ if category == 'var':
+ ampersand = '&'
+ else:
+ ampersand = ''
+ extra = ''
+ if category == 'const' and isinstance(tp, model.StructOrUnion):
+ extra = 'const *'
+ ampersand = '&'
+ prnt(tp.get_c_name(' %s%s(void)' % (extra, funcname), name))
+ prnt('{')
+ prnt(' return (%s%s);' % (ampersand, name))
+ prnt('}')
+ prnt()
+
+ def _generate_gen_constant_decl(self, tp, name):
+ is_int = isinstance(tp, model.PrimitiveType) and tp.is_integer_type()
+ self._generate_gen_const(is_int, name, tp)
+
+ _loading_gen_constant = _loaded_noop
+
+ def _load_constant(self, is_int, tp, name, module, check_value=None):
+ funcname = '_cffi_const_%s' % name
+ if check_value is not None:
+ assert is_int
+ self._load_known_int_constant(module, funcname)
+ value = check_value
+ elif is_int:
+ BType = self.ffi._typeof_locked("long long*")[0]
+ BFunc = self.ffi._typeof_locked("int(*)(long long*)")[0]
+ function = module.load_function(BFunc, funcname)
+ p = self.ffi.new(BType)
+ negative = function(p)
+ value = int(p[0])
+ if value < 0 and not negative:
+ BLongLong = self.ffi._typeof_locked("long long")[0]
+ value += (1 << (8*self.ffi.sizeof(BLongLong)))
+ else:
+ assert check_value is None
+ fntypeextra = '(*)(void)'
+ if isinstance(tp, model.StructOrUnion):
+ fntypeextra = '*' + fntypeextra
+ BFunc = self.ffi._typeof_locked(tp.get_c_name(fntypeextra, name))[0]
+ function = module.load_function(BFunc, funcname)
+ value = function()
+ if isinstance(tp, model.StructOrUnion):
+ value = value[0]
+ return value
+
+ def _loaded_gen_constant(self, tp, name, module, library):
+ is_int = isinstance(tp, model.PrimitiveType) and tp.is_integer_type()
+ value = self._load_constant(is_int, tp, name, module)
+ setattr(library, name, value)
+ type(library)._cffi_dir.append(name)
+
+ # ----------
+ # enums
+
+ def _check_int_constant_value(self, name, value):
+ prnt = self._prnt
+ if value <= 0:
+ prnt(' if ((%s) > 0 || (long)(%s) != %dL) {' % (
+ name, name, value))
+ else:
+ prnt(' if ((%s) <= 0 || (unsigned long)(%s) != %dUL) {' % (
+ name, name, value))
+ prnt(' char buf[64];')
+ prnt(' if ((%s) <= 0)' % name)
+ prnt(' sprintf(buf, "%%ld", (long)(%s));' % name)
+ prnt(' else')
+ prnt(' sprintf(buf, "%%lu", (unsigned long)(%s));' %
+ name)
+ prnt(' sprintf(out_error, "%s has the real value %s, not %s",')
+ prnt(' "%s", buf, "%d");' % (name[:100], value))
+ prnt(' return -1;')
+ prnt(' }')
+
+ def _load_known_int_constant(self, module, funcname):
+ BType = self.ffi._typeof_locked("char[]")[0]
+ BFunc = self.ffi._typeof_locked("int(*)(char*)")[0]
+ function = module.load_function(BFunc, funcname)
+ p = self.ffi.new(BType, 256)
+ if function(p) < 0:
+ error = self.ffi.string(p)
+ if sys.version_info >= (3,):
+ error = str(error, 'utf-8')
+ raise VerificationError(error)
+
+ def _enum_funcname(self, prefix, name):
+ # "$enum_$1" => "___D_enum____D_1"
+ name = name.replace('$', '___D_')
+ return '_cffi_e_%s_%s' % (prefix, name)
+
+ def _generate_gen_enum_decl(self, tp, name, prefix='enum'):
+ if tp.partial:
+ for enumerator in tp.enumerators:
+ self._generate_gen_const(True, enumerator)
+ return
+ #
+ funcname = self._enum_funcname(prefix, name)
+ self.export_symbols.append(funcname)
+ prnt = self._prnt
+ prnt('int %s(char *out_error)' % funcname)
+ prnt('{')
+ for enumerator, enumvalue in zip(tp.enumerators, tp.enumvalues):
+ self._check_int_constant_value(enumerator, enumvalue)
+ prnt(' return 0;')
+ prnt('}')
+ prnt()
+
+ def _loading_gen_enum(self, tp, name, module, prefix='enum'):
+ if tp.partial:
+ enumvalues = [self._load_constant(True, tp, enumerator, module)
+ for enumerator in tp.enumerators]
+ tp.enumvalues = tuple(enumvalues)
+ tp.partial_resolved = True
+ else:
+ funcname = self._enum_funcname(prefix, name)
+ self._load_known_int_constant(module, funcname)
+
+ def _loaded_gen_enum(self, tp, name, module, library):
+ for enumerator, enumvalue in zip(tp.enumerators, tp.enumvalues):
+ setattr(library, enumerator, enumvalue)
+ type(library)._cffi_dir.append(enumerator)
+
+ # ----------
+ # macros: for now only for integers
+
+ def _generate_gen_macro_decl(self, tp, name):
+ if tp == '...':
+ check_value = None
+ else:
+ check_value = tp # an integer
+ self._generate_gen_const(True, name, check_value=check_value)
+
+ _loading_gen_macro = _loaded_noop
+
+ def _loaded_gen_macro(self, tp, name, module, library):
+ if tp == '...':
+ check_value = None
+ else:
+ check_value = tp # an integer
+ value = self._load_constant(True, tp, name, module,
+ check_value=check_value)
+ setattr(library, name, value)
+ type(library)._cffi_dir.append(name)
+
+ # ----------
+ # global variables
+
+ def _generate_gen_variable_decl(self, tp, name):
+ if isinstance(tp, model.ArrayType):
+ if tp.length_is_unknown():
+ prnt = self._prnt
+ funcname = '_cffi_sizeof_%s' % (name,)
+ self.export_symbols.append(funcname)
+ prnt("size_t %s(void)" % funcname)
+ prnt("{")
+ prnt(" return sizeof(%s);" % (name,))
+ prnt("}")
+ tp_ptr = model.PointerType(tp.item)
+ self._generate_gen_const(False, name, tp_ptr)
+ else:
+ tp_ptr = model.PointerType(tp)
+ self._generate_gen_const(False, name, tp_ptr, category='var')
+
+ _loading_gen_variable = _loaded_noop
+
+ def _loaded_gen_variable(self, tp, name, module, library):
+ if isinstance(tp, model.ArrayType): # int a[5] is "constant" in the
+ # sense that "a=..." is forbidden
+ if tp.length_is_unknown():
+ funcname = '_cffi_sizeof_%s' % (name,)
+ BFunc = self.ffi._typeof_locked('size_t(*)(void)')[0]
+ function = module.load_function(BFunc, funcname)
+ size = function()
+ BItemType = self.ffi._get_cached_btype(tp.item)
+ length, rest = divmod(size, self.ffi.sizeof(BItemType))
+ if rest != 0:
+ raise VerificationError(
+ "bad size: %r does not seem to be an array of %s" %
+ (name, tp.item))
+ tp = tp.resolve_length(length)
+ tp_ptr = model.PointerType(tp.item)
+ value = self._load_constant(False, tp_ptr, name, module)
+        # 'value' is a <cdata 'type (*)(void)'> which we have to replace with
+        # a <cdata 'type[N]'> if the N is actually known
+ if tp.length is not None:
+ BArray = self.ffi._get_cached_btype(tp)
+ value = self.ffi.cast(BArray, value)
+ setattr(library, name, value)
+ type(library)._cffi_dir.append(name)
+ return
+ # remove ptr= from the library instance, and replace
+ # it by a property on the class, which reads/writes into ptr[0].
+ funcname = '_cffi_var_%s' % name
+ BFunc = self.ffi._typeof_locked(tp.get_c_name('*(*)(void)', name))[0]
+ function = module.load_function(BFunc, funcname)
+ ptr = function()
+ def getter(library):
+ return ptr[0]
+ def setter(library, value):
+ ptr[0] = value
+ setattr(type(library), name, property(getter, setter))
+ type(library)._cffi_dir.append(name)
+
+cffimod_header = r'''
+#include <stdio.h>
+#include <stddef.h>
+#include <stdarg.h>
+#include <errno.h>
+#include <sys/types.h>   /* XXX for ssize_t on some platforms */
+
+/* this block of #ifs should be kept exactly identical between
+ c/_cffi_backend.c, cffi/vengine_cpy.py, cffi/vengine_gen.py
+ and cffi/_cffi_include.h */
+#if defined(_MSC_VER)
+#  include <malloc.h>   /* for alloca() */
+# if _MSC_VER < 1600 /* MSVC < 2010 */
+ typedef __int8 int8_t;
+ typedef __int16 int16_t;
+ typedef __int32 int32_t;
+ typedef __int64 int64_t;
+ typedef unsigned __int8 uint8_t;
+ typedef unsigned __int16 uint16_t;
+ typedef unsigned __int32 uint32_t;
+ typedef unsigned __int64 uint64_t;
+ typedef __int8 int_least8_t;
+ typedef __int16 int_least16_t;
+ typedef __int32 int_least32_t;
+ typedef __int64 int_least64_t;
+ typedef unsigned __int8 uint_least8_t;
+ typedef unsigned __int16 uint_least16_t;
+ typedef unsigned __int32 uint_least32_t;
+ typedef unsigned __int64 uint_least64_t;
+ typedef __int8 int_fast8_t;
+ typedef __int16 int_fast16_t;
+ typedef __int32 int_fast32_t;
+ typedef __int64 int_fast64_t;
+ typedef unsigned __int8 uint_fast8_t;
+ typedef unsigned __int16 uint_fast16_t;
+ typedef unsigned __int32 uint_fast32_t;
+ typedef unsigned __int64 uint_fast64_t;
+ typedef __int64 intmax_t;
+ typedef unsigned __int64 uintmax_t;
+# else
+#  include <stdint.h>
+# endif
+# if _MSC_VER < 1800 /* MSVC < 2013 */
+# ifndef __cplusplus
+ typedef unsigned char _Bool;
+# endif
+# endif
+#  define _cffi_float_complex_t   _Fcomplex    /* include <complex.h> for it */
+#  define _cffi_double_complex_t  _Dcomplex    /* include <complex.h> for it */
+#else
+#  include <stdint.h>
+# if (defined (__SVR4) && defined (__sun)) || defined(_AIX) || defined(__hpux)
+#  include <alloca.h>
+# endif
+# define _cffi_float_complex_t float _Complex
+# define _cffi_double_complex_t double _Complex
+#endif
+'''
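The generic engine above probes struct layouts through a compiled helper of the form `intptr_t _cffi_layout_xxx(intptr_t i)`: index 0 returns `sizeof`, index 1 the alignment, then alternating field offsets and sizes until a `-1` sentinel. A minimal pure-Python sketch of the reading side (mirroring `_loading_struct_or_union`; the `fake_layout_func` stand-in replaces the real compiled function and is hypothetical):

```python
def fake_layout_func(nums):
    """Hypothetical stand-in for the compiled _cffi_layout_* function."""
    def f(i):
        return nums[i]
    return f

def read_layout(function):
    # Mirrors VGenericEngine._loading_struct_or_union: call the layout
    # function with increasing indices until the -1 sentinel appears.
    layout, num = [], 0
    while True:
        x = function(num)
        if x < 0:
            break
        layout.append(x)
        num += 1
    totalsize, totalalignment = layout[0], layout[1]
    fieldofs = layout[2::2]   # field offsets at even indices from 2
    fieldsize = layout[3::2]  # field sizes at odd indices from 3
    return totalsize, totalalignment, fieldofs, fieldsize

# Example: a 16-byte, 8-byte-aligned struct with two fields.
print(read_layout(fake_layout_func([16, 8, 0, 8, 8, 4, -1])))
# → (16, 8, [0, 8], [8, 4])
```

The sentinel-terminated protocol lets the Python side read an arbitrary number of fields through a single `intptr_t(*)(intptr_t)` function pointer, avoiding any struct marshalling at load time.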
diff --git a/templates/skills/file_manager/dependencies/cffi/verifier.py b/templates/skills/file_manager/dependencies/cffi/verifier.py
new file mode 100644
index 00000000..e392a2b7
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/cffi/verifier.py
@@ -0,0 +1,306 @@
+#
+# DEPRECATED: implementation for ffi.verify()
+#
+import sys, os, binascii, shutil, io
+from . import __version_verifier_modules__
+from . import ffiplatform
+from .error import VerificationError
+
+if sys.version_info >= (3, 3):
+ import importlib.machinery
+ def _extension_suffixes():
+ return importlib.machinery.EXTENSION_SUFFIXES[:]
+else:
+ import imp
+ def _extension_suffixes():
+ return [suffix for suffix, _, type in imp.get_suffixes()
+ if type == imp.C_EXTENSION]
+
+
+if sys.version_info >= (3,):
+ NativeIO = io.StringIO
+else:
+ class NativeIO(io.BytesIO):
+ def write(self, s):
+ if isinstance(s, unicode):
+ s = s.encode('ascii')
+ super(NativeIO, self).write(s)
+
+
+class Verifier(object):
+
+ def __init__(self, ffi, preamble, tmpdir=None, modulename=None,
+ ext_package=None, tag='', force_generic_engine=False,
+ source_extension='.c', flags=None, relative_to=None, **kwds):
+ if ffi._parser._uses_new_feature:
+ raise VerificationError(
+ "feature not supported with ffi.verify(), but only "
+ "with ffi.set_source(): %s" % (ffi._parser._uses_new_feature,))
+ self.ffi = ffi
+ self.preamble = preamble
+ if not modulename:
+ flattened_kwds = ffiplatform.flatten(kwds)
+ vengine_class = _locate_engine_class(ffi, force_generic_engine)
+ self._vengine = vengine_class(self)
+ self._vengine.patch_extension_kwds(kwds)
+ self.flags = flags
+ self.kwds = self.make_relative_to(kwds, relative_to)
+ #
+ if modulename:
+ if tag:
+ raise TypeError("can't specify both 'modulename' and 'tag'")
+ else:
+ key = '\x00'.join(['%d.%d' % sys.version_info[:2],
+ __version_verifier_modules__,
+ preamble, flattened_kwds] +
+ ffi._cdefsources)
+ if sys.version_info >= (3,):
+ key = key.encode('utf-8')
+ k1 = hex(binascii.crc32(key[0::2]) & 0xffffffff)
+ k1 = k1.lstrip('0x').rstrip('L')
+ k2 = hex(binascii.crc32(key[1::2]) & 0xffffffff)
+ k2 = k2.lstrip('0').rstrip('L')
+ modulename = '_cffi_%s_%s%s%s' % (tag, self._vengine._class_key,
+ k1, k2)
+ suffix = _get_so_suffixes()[0]
+ self.tmpdir = tmpdir or _caller_dir_pycache()
+ self.sourcefilename = os.path.join(self.tmpdir, modulename + source_extension)
+ self.modulefilename = os.path.join(self.tmpdir, modulename + suffix)
+ self.ext_package = ext_package
+ self._has_source = False
+ self._has_module = False
+
+ def write_source(self, file=None):
+ """Write the C source code. It is produced in 'self.sourcefilename',
+ which can be tweaked beforehand."""
+ with self.ffi._lock:
+ if self._has_source and file is None:
+ raise VerificationError(
+ "source code already written")
+ self._write_source(file)
+
+ def compile_module(self):
+ """Write the C source code (if not done already) and compile it.
+ This produces a dynamic link library in 'self.modulefilename'."""
+ with self.ffi._lock:
+ if self._has_module:
+ raise VerificationError("module already compiled")
+ if not self._has_source:
+ self._write_source()
+ self._compile_module()
+
+ def load_library(self):
+ """Get a C module from this Verifier instance.
+ Returns an instance of a FFILibrary class that behaves like the
+ objects returned by ffi.dlopen(), but that delegates all
+ operations to the C module. If necessary, the C code is written
+ and compiled first.
+ """
+ with self.ffi._lock:
+ if not self._has_module:
+ self._locate_module()
+ if not self._has_module:
+ if not self._has_source:
+ self._write_source()
+ self._compile_module()
+ return self._load_library()
+
+ def get_module_name(self):
+ basename = os.path.basename(self.modulefilename)
+ # kill both the .so extension and the other .'s, as introduced
+ # by Python 3: 'basename.cpython-33m.so'
+ basename = basename.split('.', 1)[0]
+ # and the _d added in Python 2 debug builds --- but try to be
+ # conservative and not kill a legitimate _d
+ if basename.endswith('_d') and hasattr(sys, 'gettotalrefcount'):
+ basename = basename[:-2]
+ return basename
+
+ def get_extension(self):
+ if not self._has_source:
+ with self.ffi._lock:
+ if not self._has_source:
+ self._write_source()
+ sourcename = ffiplatform.maybe_relative_path(self.sourcefilename)
+ modname = self.get_module_name()
+ return ffiplatform.get_extension(sourcename, modname, **self.kwds)
+
+ def generates_python_module(self):
+ return self._vengine._gen_python_module
+
+ def make_relative_to(self, kwds, relative_to):
+ if relative_to and os.path.dirname(relative_to):
+ dirname = os.path.dirname(relative_to)
+ kwds = kwds.copy()
+ for key in ffiplatform.LIST_OF_FILE_NAMES:
+ if key in kwds:
+ lst = kwds[key]
+ if not isinstance(lst, (list, tuple)):
+ raise TypeError("keyword '%s' should be a list or tuple"
+ % (key,))
+ lst = [os.path.join(dirname, fn) for fn in lst]
+ kwds[key] = lst
+ return kwds
+
+ # ----------
+
+ def _locate_module(self):
+ if not os.path.isfile(self.modulefilename):
+ if self.ext_package:
+ try:
+ pkg = __import__(self.ext_package, None, None, ['__doc__'])
+ except ImportError:
+ return # cannot import the package itself, give up
+ # (e.g. it might be called differently before installation)
+ path = pkg.__path__
+ else:
+ path = None
+ filename = self._vengine.find_module(self.get_module_name(), path,
+ _get_so_suffixes())
+ if filename is None:
+ return
+ self.modulefilename = filename
+ self._vengine.collect_types()
+ self._has_module = True
+
+ def _write_source_to(self, file):
+ self._vengine._f = file
+ try:
+ self._vengine.write_source_to_f()
+ finally:
+ del self._vengine._f
+
+ def _write_source(self, file=None):
+ if file is not None:
+ self._write_source_to(file)
+ else:
+ # Write our source file to an in memory file.
+ f = NativeIO()
+ self._write_source_to(f)
+ source_data = f.getvalue()
+
+ # Determine if this matches the current file
+ if os.path.exists(self.sourcefilename):
+ with open(self.sourcefilename, "r") as fp:
+ needs_written = not (fp.read() == source_data)
+ else:
+ needs_written = True
+
+ # Actually write the file out if it doesn't match
+ if needs_written:
+ _ensure_dir(self.sourcefilename)
+ with open(self.sourcefilename, "w") as fp:
+ fp.write(source_data)
+
+ # Set this flag
+ self._has_source = True
+
+ def _compile_module(self):
+ # compile this C source
+ tmpdir = os.path.dirname(self.sourcefilename)
+ outputfilename = ffiplatform.compile(tmpdir, self.get_extension())
+ try:
+ same = ffiplatform.samefile(outputfilename, self.modulefilename)
+ except OSError:
+ same = False
+ if not same:
+ _ensure_dir(self.modulefilename)
+ shutil.move(outputfilename, self.modulefilename)
+ self._has_module = True
+
+ def _load_library(self):
+ assert self._has_module
+ if self.flags is not None:
+ return self._vengine.load_library(self.flags)
+ else:
+ return self._vengine.load_library()
+
+# ____________________________________________________________
+
+_FORCE_GENERIC_ENGINE = False # for tests
+
+def _locate_engine_class(ffi, force_generic_engine):
+ if _FORCE_GENERIC_ENGINE:
+ force_generic_engine = True
+ if not force_generic_engine:
+ if '__pypy__' in sys.builtin_module_names:
+ force_generic_engine = True
+ else:
+ try:
+ import _cffi_backend
+ except ImportError:
+ _cffi_backend = '?'
+ if ffi._backend is not _cffi_backend:
+ force_generic_engine = True
+ if force_generic_engine:
+ from . import vengine_gen
+ return vengine_gen.VGenericEngine
+ else:
+ from . import vengine_cpy
+ return vengine_cpy.VCPythonEngine
+
+# ____________________________________________________________
+
+_TMPDIR = None
+
+def _caller_dir_pycache():
+ if _TMPDIR:
+ return _TMPDIR
+ result = os.environ.get('CFFI_TMPDIR')
+ if result:
+ return result
+ filename = sys._getframe(2).f_code.co_filename
+ return os.path.abspath(os.path.join(os.path.dirname(filename),
+ '__pycache__'))
+
+def set_tmpdir(dirname):
+ """Set the temporary directory to use instead of __pycache__."""
+ global _TMPDIR
+ _TMPDIR = dirname
+
+def cleanup_tmpdir(tmpdir=None, keep_so=False):
+ """Clean up the temporary directory by removing all files in it
+ called `_cffi_*.{c,so}` as well as the `build` subdirectory."""
+ tmpdir = tmpdir or _caller_dir_pycache()
+ try:
+ filelist = os.listdir(tmpdir)
+ except OSError:
+ return
+ if keep_so:
+ suffix = '.c' # only remove .c files
+ else:
+ suffix = _get_so_suffixes()[0].lower()
+ for fn in filelist:
+ if fn.lower().startswith('_cffi_') and (
+ fn.lower().endswith(suffix) or fn.lower().endswith('.c')):
+ try:
+ os.unlink(os.path.join(tmpdir, fn))
+ except OSError:
+ pass
+ clean_dir = [os.path.join(tmpdir, 'build')]
+ for dir in clean_dir:
+ try:
+ for fn in os.listdir(dir):
+ fn = os.path.join(dir, fn)
+ if os.path.isdir(fn):
+ clean_dir.append(fn)
+ else:
+ os.unlink(fn)
+ except OSError:
+ pass
+
+def _get_so_suffixes():
+ suffixes = _extension_suffixes()
+ if not suffixes:
+ # bah, no C_EXTENSION available. Occurs on pypy without cpyext
+ if sys.platform == 'win32':
+ suffixes = [".pyd"]
+ else:
+ suffixes = [".so"]
+
+ return suffixes
+
+def _ensure_dir(filename):
+ dirname = os.path.dirname(filename)
+ if dirname and not os.path.isdir(dirname):
+ os.makedirs(dirname)
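When no `modulename` is given, `Verifier.__init__` derives a cache-friendly module name by hashing the Python version, the cffi verifier version, the preamble, the flattened keywords, and the cdef sources. A small sketch of just the hashing step, assuming the joined key has already been built as a single string (`key_string` and `module_key` are illustrative names, not part of cffi's API):

```python
import binascii

def module_key(tag, class_key, key_string):
    # Mirrors the hashing in Verifier.__init__: two CRC32 checksums,
    # one over the even bytes and one over the odd bytes of the key,
    # hex-encoded into the generated module name.
    key = key_string.encode('utf-8')
    k1 = hex(binascii.crc32(key[0::2]) & 0xffffffff)
    k1 = k1.lstrip('0x').rstrip('L')
    k2 = hex(binascii.crc32(key[1::2]) & 0xffffffff)
    k2 = k2.lstrip('0').rstrip('L')
    return '_cffi_%s_%s%s%s' % (tag, class_key, k1, k2)

print(module_key('', 'g', 'int foo(int);'))
```

Because the name is a pure function of the inputs, an unchanged `ffi.verify()` call maps to the same module file in `__pycache__`, so `_locate_module()` can reuse the previously compiled extension instead of recompiling.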
diff --git a/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/INSTALLER b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/INSTALLER
new file mode 100644
index 00000000..a1b589e3
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/INSTALLER
@@ -0,0 +1 @@
+pip
diff --git a/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/LICENSE b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/LICENSE
new file mode 100644
index 00000000..ad82355b
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/LICENSE
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2019 TAHRI Ahmed R.
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
\ No newline at end of file
diff --git a/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/METADATA b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/METADATA
new file mode 100644
index 00000000..822550e3
--- /dev/null
+++ b/templates/skills/file_manager/dependencies/charset_normalizer-3.3.2.dist-info/METADATA
@@ -0,0 +1,683 @@
+Metadata-Version: 2.1
+Name: charset-normalizer
+Version: 3.3.2
+Summary: The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet.
+Home-page: https://github.com/Ousret/charset_normalizer
+Author: Ahmed TAHRI
+Author-email: ahmed.tahri@cloudnursery.dev
+License: MIT
+Project-URL: Bug Reports, https://github.com/Ousret/charset_normalizer/issues
+Project-URL: Documentation, https://charset-normalizer.readthedocs.io/en/latest
+Keywords: encoding,charset,charset-detector,detector,normalization,unicode,chardet,detect
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: License :: OSI Approved :: MIT License
+Classifier: Intended Audience :: Developers
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: Implementation :: PyPy
+Classifier: Topic :: Text Processing :: Linguistic
+Classifier: Topic :: Utilities
+Classifier: Typing :: Typed
+Requires-Python: >=3.7.0
+Description-Content-Type: text/markdown
+License-File: LICENSE
+Provides-Extra: unicode_backport
+
+
+Charset Detection, for Everyone 👋
+
+
+ The Real First Universal Charset Detector
+
+
+
+
+
+
+
+
+
+
+ In other language (unofficial port - by the community)
+
+
+
+
+
+> A library that helps you read text from an unknown charset encoding. Motivated by `chardet`,
+> I'm trying to resolve the issue by taking a new approach.
+> All IANA character set names for which the Python core library provides codecs are supported.
+
+