
Fix version compatibility issue with transformers>4.34.0 for flash-attention2 patch #2288

Triggered via pull request December 9, 2023 05:32
Status: Success
Total duration: 32s

python-package.yml

on: pull_request
Matrix: build
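The run came from the repository's `python-package.yml` workflow, triggered on `pull_request` with a `build` matrix job. The actual workflow contents are not shown on this page; the fragment below is only a sketch of what such a matrix workflow commonly looks like, with the Python versions, action versions, and steps all assumed rather than taken from the repository.

```yaml
# Sketch of a python-package.yml matrix workflow (contents assumed, not from the repo).
name: python-package
on: pull_request
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10"]
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - run: pip install .
      - run: pytest
```

A 32s total duration is consistent with a lightweight lint/build check rather than a full test suite.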