Update chatqna.py to support vLLM embeddings #1005
Triggered via pull request #1237 (synchronize) by lvliang-intel on December 18, 2024 02:26
Workflow: pr-manifest-e2e.yml (on: pull_request_target)
Status: Success
Total duration: 1h 37m 33s
Artifacts: –
job1 / Get-test-matrix (6s)
Matrix: run-example
Annotations
1 error and 3 warnings

Error: run-example (ChatQnA, gaudi) / test-k8s-manifest / manifest-test (test_manifest_vllm_on_gaudi.sh)
Process completed with exit code 1.

Warning: job1 / Get-test-matrix
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636

Warning: run-example (ChatQnA, xeon) / test-k8s-manifest / get-test-case
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636

Warning: run-example (ChatQnA, gaudi) / test-k8s-manifest / get-test-case
ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
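For context on the change under test: the PR title indicates chatqna.py gains support for vLLM-served embeddings. The snippet below is only a minimal sketch of what calling a vLLM embedding server through its OpenAI-compatible API can look like; the endpoint URL, port, and model name are illustrative assumptions and are not taken from the PR or the failing manifest test.

    # Minimal sketch: query a vLLM server's OpenAI-compatible /v1/embeddings route.
    # Host, port, and model name below are placeholders, not values from the PR.
    import requests

    VLLM_EMBEDDING_ENDPOINT = "http://localhost:8090/v1/embeddings"  # assumed service URL
    EMBEDDING_MODEL = "BAAI/bge-base-en-v1.5"  # assumed embedding model

    def embed_texts(texts):
        """Request embeddings for a list of strings and return them in input order."""
        response = requests.post(
            VLLM_EMBEDDING_ENDPOINT,
            json={"model": EMBEDDING_MODEL, "input": texts},
            timeout=30,
        )
        response.raise_for_status()
        # The OpenAI-style response carries one embedding per input item under "data".
        return [item["embedding"] for item in response.json()["data"]]

    if __name__ == "__main__":
        vectors = embed_texts(["What is OPEA?", "How does ChatQnA use vLLM?"])
        print(len(vectors), "embeddings of dimension", len(vectors[0]))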