Describe how to install the runtime from a release #18173
Conversation
> You can download the runtime from a release or build from source.
>
> #### :octicons-package-16: Download the runtime from a release
The other "deployment configuration" pages also have a "Get the IREE runtime" section, like https://iree.dev/guides/deployment-configurations/gpu-vulkan/#get-the-iree-runtime. I think we should update them all together.
> Python packages are regularly published to
> [PyPI](https://pypi.org/user/google-iree-pypi-deploy/). See the
> [Python Bindings](../../reference/bindings/python.md) page for more details.
> The `iree-runtime` package includes CPU support via the `local-sync` and
> `local-task` drivers.
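The packages named in the quoted snippet can be installed straight from PyPI; a minimal sketch, assuming the `iree-runtime` and `iree-compiler` package names used in this thread:

```shell
# Install the stable IREE runtime and compiler packages from PyPI.
# (Package names as discussed in this thread; they may change over time.)
python -m pip install iree-runtime iree-compiler
```

Nightly builds are published as GitHub release assets rather than to PyPI, as noted later in this discussion.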
Rather than duplicate the install instructions, how about we mention the `iree-runtime` and `iree-compiler` packages together?

What we want to communicate efficiently is:

- Packages for the compiler and runtime are available on PyPI (stable) and GitHub (nightly)
- We expect/encourage users to use the compiler packages, not build from source, for both the Python compiler APIs and the `iree-compile` tool
- We expect/encourage users to use the runtime Python bindings from packages
- Users can use `iree-run-module` and `iree-benchmark-module` from packages, but they will need to build the runtime from source for any C/C++ usage from a downstream application

I figured these "deployment configurations" would be focused on C/C++ linkage into an application, where a source build is needed. We do have projects like https://github.com/nod-ai/SHARK that use Python APIs from application code though.

Thoughts?
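The packaged-tools workflow described above can be sketched roughly as follows; the flag values, the `llvm-cpu` target choice, and the `input.mlir`/`model.vmfb` filenames are illustrative placeholders, so check `iree-compile --help` and `iree-run-module --help` for the current interface:

```shell
# Compile a program with the packaged compiler, then execute it with the
# packaged runtime tools. File names and flag values here are placeholders.
iree-compile --iree-hal-target-backends=llvm-cpu input.mlir -o model.vmfb
iree-run-module --device=local-task --module=model.vmfb \
    --function=main --input="2xf32=1.0 2.0"
```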
Sounds good to me, and a combined "Get the IREE compiler and runtime" section works for me. In that case we could restructure from

- Get the IREE compiler
    - Download the compiler from a release
    - Build the compiler from source
- Get the IREE runtime
    - Download the runtime from a release
    - Build the runtime from source

to

- Get the IREE compiler and runtime
    - Download a release
    - Build from source
        - Build the compiler
        - Build the runtime

for example.
Yeah, that's the high level view. The specifics can take a few iterations to get clear enough. I usually look at similar documentation for similar projects (PyTorch, ONNX, ONNX Runtime, TensorFlow, TensorFlow Lite, ncnn, etc.) for inspiration.
This updates and harmonizes the guides for the different deployment configurations. Several identical sections are provided as snippets.

- Updates docs regarding ROCm support
- Adds a subsection on downloading the runtime from a release
- Adds instructions on checking for CUDA and ROCm support

This supersedes iree-org#18173 and iree-org#18655.

skip-ci: Doc updates only
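The "checking for CUDA and ROCm support" instructions could be sketched via the IREE Python bindings, which expose `query_available_drivers()`. This example assumes the `iree-runtime` pip package discussed above and degrades gracefully when it is absent:

```python
def available_iree_drivers():
    """Return the list of HAL driver names if iree-runtime is installed, else None."""
    try:
        from iree import runtime as ireert  # provided by the iree-runtime pip package
    except ImportError:
        return None
    # Drivers such as 'cuda' appear here only when the installed build supports them.
    return list(ireert.query_available_drivers())

drivers = available_iree_drivers()
if drivers is None:
    print("iree-runtime is not installed")
else:
    print("available drivers:", drivers)
```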