
Commit

Update src/content/docs/ai-gateway/evaluations/add-human-feedback.mdx
Co-authored-by: Jun Lee <[email protected]>
daisyfaithauma and Oxyjun authored Oct 10, 2024
1 parent fdedd84 commit 033e1b2
Showing 1 changed file with 1 addition and 1 deletion.
@@ -60,6 +60,6 @@ You need to select human feedback as an evaluator to receive its metrics.
After running the evaluation, review the results on the Evaluations tab.
You will be able to see the performance of the model based on cost, speed, and now human feedback, represented as the percentage of positive feedback (thumbs up).

- The human feedback score is displayed as a percentage, showing how manyy of the dataset's responses were rated positively.
+ The human feedback score is displayed as a percentage, showing the distribution of positively rated responses from the database.

For more information on running evaluations, refer to the documentation [Set Up Evaluations](/ai-gateway/evaluations/set-up-evaluations/).
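The score the changed line describes is a simple positive-feedback ratio. A minimal sketch of that calculation, using hypothetical ratings data (this is illustrative only, not AI Gateway's actual implementation):

```python
# Hypothetical thumbs-up/thumbs-down ratings for a dataset's responses
# (1 = thumbs up, 0 = thumbs down); not AI Gateway's actual code.
ratings = [1, 1, 0, 1, 0, 1, 1, 1]

# Human feedback score: percentage of responses rated positively.
score = 100 * sum(ratings) / len(ratings)
print(f"{score:.1f}%")  # → 75.0%
```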
