
Update chatqna.py to support vLLM embeddings #1005

Triggered via pull request on December 18, 2024 at 02:26 by @lvliang-intel (synchronize, PR #1237)
Status: Success
Total duration: 1h 37m 33s
Artifacts

Workflow: pr-manifest-e2e.yml (on: pull_request_target)
job1 / Get-test-matrix (6s)
Matrix: run-example
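The run shows a two-stage layout: a `Get-test-matrix` job computes the test combinations, and a `run-example` matrix job fans out over them (e.g. ChatQnA on xeon and gaudi, as seen in the annotations below). The actual `pr-manifest-e2e.yml` in the repository may differ; the following is only a hypothetical sketch of that pattern, with all job names and matrix values inferred from this run page:

```yaml
# Hypothetical sketch of pr-manifest-e2e.yml, reconstructed from the job
# names on this run page -- not the real workflow file.
name: pr-manifest-e2e
on:
  pull_request_target:
    types: [opened, synchronize]

jobs:
  job1:
    name: Get-test-matrix
    runs-on: ubuntu-latest
    outputs:
      matrix: ${{ steps.matrix.outputs.matrix }}
    steps:
      - id: matrix
        # Emits a JSON array of example/hardware pairs for the fan-out below.
        run: >
          echo 'matrix=[{"example":"ChatQnA","hardware":"xeon"},
          {"example":"ChatQnA","hardware":"gaudi"}]' >> "$GITHUB_OUTPUT"

  run-example:
    needs: job1
    strategy:
      matrix:
        include: ${{ fromJSON(needs.job1.outputs.matrix) }}
    name: run-example (${{ matrix.example }}, ${{ matrix.hardware }})
    runs-on: ubuntu-latest
    steps:
      - run: echo "Testing ${{ matrix.example }} on ${{ matrix.hardware }}"
```

Computing the matrix in a first job and consuming it via `fromJSON` is a common way to keep the set of tested examples data-driven rather than hard-coded in the workflow.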

Annotations

1 error and 3 warnings
- job1 / Get-test-matrix: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
- run-example (ChatQnA, xeon) / test-k8s-manifest / get-test-case: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
- run-example (ChatQnA, gaudi) / test-k8s-manifest / get-test-case: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636