Update chatqna.py to support vLLM embeddings #2235

Triggered via pull request (synchronize) by lvliang-intel on December 18, 2024 02:26 (run #1237)

Status: Success
Total duration: 17m 6s
Artifacts: 3
pr-docker-compose-e2e.yml
on: pull_request_target

Jobs:
- get-test-matrix / Get-test-matrix (6s)
- Matrix: example-test
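The run comes from a matrix workflow: one job computes a test matrix, and a fan-out job runs the per-hardware compose test for each entry. A minimal sketch of what such a `pr-docker-compose-e2e.yml` could look like, where only the job and step names are taken from the run above and everything else (trigger types, matrix contents, script paths) is an assumption:

```yaml
# Hypothetical sketch; job/step names mirror the run summary,
# matrix contents and script paths are assumed for illustration.
name: pr-docker-compose-e2e

on:
  pull_request_target:
    types: [opened, reopened, ready_for_review, synchronize]

jobs:
  get-test-matrix:
    runs-on: ubuntu-latest
    outputs:
      run_matrix: ${{ steps.get-matrix.outputs.run_matrix }}
    steps:
      - uses: actions/checkout@v4
      - name: Get-test-matrix
        id: get-matrix
        run: |
          # Assumed: emit a JSON matrix of {example, hardware} pairs
          echo 'run_matrix={"include":[{"example":"ChatQnA","hardware":"gaudi"},{"example":"ChatQnA","hardware":"xeon"},{"example":"ChatQnA","hardware":"rocm"}]}' >> "$GITHUB_OUTPUT"

  example-test:
    needs: get-test-matrix
    strategy:
      fail-fast: false
      matrix: ${{ fromJSON(needs.get-test-matrix.outputs.run_matrix) }}
    runs-on: ${{ matrix.hardware }}
    steps:
      - uses: actions/checkout@v4
      - name: run-test
        run: bash tests/test_compose_on_${{ matrix.hardware }}.sh
```

With `fail-fast: false`, a failure on one hardware target (as happened on rocm here) does not cancel the sibling xeon and gaudi jobs.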
Annotations
1 error and 4 warnings

Error: example-test (ChatQnA, rocm) / run-test (test_compose_on_rocm.sh)
Process completed with exit code 1.
Warnings: the same runner-image notice was raised on four jobs:
- get-test-matrix / Get-test-matrix
- example-test (ChatQnA, xeon) / get-test-case
- example-test (ChatQnA, rocm) / get-test-case
- example-test (ChatQnA, gaudi) / get-test-case

ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
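The warnings are about the runner image, not the project code: jobs that select `ubuntu-latest` will silently move to ubuntu-24.04. A common way to silence them is to pin the image explicitly; a one-line sketch, assuming the job names from this run:

```yaml
jobs:
  get-test-matrix:
    # Pin the image so the ubuntu-latest -> ubuntu-24.04 migration
    # cannot change tool versions underneath the workflow.
    runs-on: ubuntu-24.04
```

Pinning trades automatic updates for reproducibility; either choice is defensible, but the pin makes the warning go away deliberately rather than by waiting out the migration.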
Artifacts
Produced during runtime

Name | Size
---|---
ChatQnA_test_compose_on_gaudi.sh | 85.3 KB
ChatQnA_test_compose_on_rocm.sh | 70.3 KB
ChatQnA_test_compose_on_xeon.sh | 81.8 KB