
Commit

Simplify list of features
jackmpcollins committed Dec 1, 2024
1 parent 1f18fc3 commit 1f5f193
Showing 2 changed files with 16 additions and 16 deletions.
README.md (16 changes: 8 additions & 8 deletions)
```diff
@@ -1,19 +1,16 @@
 # magentic
 
-Easily integrate Large Language Models into your Python code. Simply use the `@prompt` and `@chatprompt` decorators to create functions that return structured output from the LLM. Mix LLM queries and function calling with regular Python code to create complex logic.
+Seamlessly integrate Large Language Models into Python code. Use the `@prompt` and `@chatprompt` decorators to create functions that return structured output from an LLM. Combine LLM queries and tool use with traditional Python code to build complex agentic systems.
 
 ## Features
 
 - [Structured Outputs] using pydantic models and built-in python types.
-- [Chat Prompting] to enable few-shot prompting with structured examples.
-- [Function Calling] and [Parallel Function Calling] via the `FunctionCall` and `ParallelFunctionCall` return types.
-- [Formatting] to naturally insert python objects into prompts.
-- [Asyncio]. Simply use `async def` when defining a magentic function.
-- [Streaming] structured outputs to use them as they are being generated.
-- [Vision] to easily get structured outputs from images.
+- [Streaming] of structured outputs and function calls, to use them while being generated.
 - [LLM-Assisted Retries] to improve LLM adherence to complex output schemas.
-- Multiple LLM providers including OpenAI and Anthropic. See [Configuration].
+- [Observability] using OpenTelemetry, with native [Pydantic Logfire integration].
 - [Type Annotations] to work nicely with linters and IDEs.
+- [Configuration] options for multiple LLM providers including OpenAI, Anthropic, and Ollama.
+- Many more features: [Chat Prompting], [Parallel Function Calling], [Vision], [Formatting], [Asyncio]...
 
 ## Installation
 
```
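
For context on the description changed above: a minimal sketch of the `@prompt` decorator pattern it refers to, adapted from magentic's documented usage (the model output shown is illustrative):

```python
from magentic import prompt
from pydantic import BaseModel

class Superhero(BaseModel):
    name: str
    age: int
    power: str
    enemies: list[str]

# The return annotation tells magentic to parse the LLM response
# into a Superhero instance rather than returning raw text.
@prompt("Create a Superhero named {name}.")
def create_superhero(name: str) -> Superhero: ...

hero = create_superhero("Garden Man")
# e.g. Superhero(name='Garden Man', age=30, power='Plant control', enemies=['Pollution Man'])
```
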
```diff
@@ -184,6 +181,8 @@ LLM-powered functions created using `@prompt`, `@chatprompt` and `@prompt_chain`
 [Chat Prompting]: https://magentic.dev/chat-prompting
 [Function Calling]: https://magentic.dev/function-calling
 [Parallel Function Calling]: https://magentic.dev/function-calling/#parallelfunctioncall
+[Observability]: https://magentic.dev/logging-and-tracing
+[Pydantic Logfire integration]: https://logfire.pydantic.dev/docs/integrations/third-party/magentic/
 [Formatting]: https://magentic.dev/formatting
 [Asyncio]: https://magentic.dev/asyncio
 [Streaming]: https://magentic.dev/streaming
@@ -192,6 +191,7 @@ LLM-powered functions created using `@prompt`, `@chatprompt` and `@prompt_chain`
 [Configuration]: https://magentic.dev/configuration
 [Type Annotations]: https://magentic.dev/type-checking
 
+
 ### Streaming
 
 The `StreamedStr` (and `AsyncStreamedStr`) class can be used to stream the output of the LLM. This allows you to process the text while it is being generated, rather than receiving the whole output at once.
```
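
The Streaming section appearing as context above is the feature the reworded bullet points to. A minimal sketch of `StreamedStr`, based on magentic's documented usage:

```python
from magentic import prompt, StreamedStr

@prompt("Tell me about {country}.")
def describe_country(country: str) -> StreamedStr: ...

# Print each chunk as it is generated, instead of
# waiting for the complete response.
for chunk in describe_country("Brazil"):
    print(chunk, end="")
```
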
docs/index.md (16 changes: 8 additions & 8 deletions)
```diff
@@ -1,19 +1,17 @@
 # Overview
 
-Easily integrate Large Language Models into your Python code. Simply use the `@prompt` and `@chatprompt` decorators to create functions that return structured output from the LLM. Mix LLM queries and function calling with regular Python code to create complex logic.
+Seamlessly integrate Large Language Models into Python code. Use the `@prompt` and `@chatprompt` decorators to create functions that return structured output from an LLM. Combine LLM queries and tool use with traditional Python code to build complex agentic systems.
 
 ## Features
 
 - [Structured Outputs] using pydantic models and built-in python types.
-- [Chat Prompting] to enable few-shot prompting with structured examples.
-- [Function Calling] and [Parallel Function Calling] via the `FunctionCall` and `ParallelFunctionCall` return types.
-- [Formatting] to naturally insert python objects into prompts.
-- [Asyncio]. Simply use `async def` when defining a magentic function.
-- [Streaming] structured outputs to use them as they are being generated.
-- [Vision] to easily get structured outputs from images.
+- [Streaming] of structured outputs and function calls, to use them while being generated.
 - [LLM-Assisted Retries] to improve LLM adherence to complex output schemas.
-- Multiple LLM providers including OpenAI and Anthropic. See [Configuration].
+- [Observability] using OpenTelemetry, with native [Pydantic Logfire integration].
 - [Type Annotations] to work nicely with linters and IDEs.
+- [Configuration] options for multiple LLM providers including OpenAI, Anthropic, and Ollama.
+- Many more features: [Chat Prompting], [Parallel Function Calling], [Vision], [Formatting], [Asyncio]...
 
+
 ## Installation
 
```
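
Several bullets folded into "Many more features" still merit a concrete picture. A minimal sketch of the `FunctionCall` return type named in the removed [Function Calling] bullet, adapted from magentic's documented usage (the arguments shown as chosen by the LLM are illustrative):

```python
from magentic import prompt, FunctionCall

def activate_oven(temperature: int, mode: str) -> str:
    """Turn the oven on with the provided settings."""
    return f"Preheating to {temperature} F with mode {mode}"

# Returning FunctionCall[str] has the LLM pick the function and
# arguments without executing the call immediately.
@prompt(
    "Prepare the oven so I can make {food}",
    functions=[activate_oven],
)
def prepare_oven(food: str) -> FunctionCall[str]: ...

call = prepare_oven("cookies!")
# e.g. FunctionCall(activate_oven, temperature=350, mode='bake')
result = call()  # executes activate_oven with the chosen arguments
```
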
```diff
@@ -184,6 +182,8 @@ LLM-powered functions created using `@prompt`, `@chatprompt` and `@prompt_chain`
 [Chat Prompting]: chat-prompting.md
 [Function Calling]: function-calling.md
 [Parallel Function Calling]: function-calling.md#parallelfunctioncall
+[Observability]: logging-and-tracing.md
+[Pydantic Logfire integration]: https://logfire.pydantic.dev/docs/integrations/third-party/magentic/
 [Formatting]: formatting.md
 [Asyncio]: asyncio.md
 [Streaming]: streaming.md
```
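
The new [Configuration] bullet covers switching between LLM providers. A hedged sketch of the two documented mechanisms, the `model` argument and `MAGENTIC_*` environment variables (the model names here are assumptions):

```python
from magentic import OpenaiChatModel, prompt

# Select the model per function via the `model` argument...
@prompt("Say hello to {name}.", model=OpenaiChatModel("gpt-4o-mini"))
def greet(name: str) -> str: ...

# ...or globally via environment variables, e.g.
#   MAGENTIC_BACKEND=openai
#   MAGENTIC_OPENAI_MODEL=gpt-4o
```
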
