Reviewer-Provided Feedback
Confident AI enables reviewers to leave feedback on each response. This feature is crucial for identifying failing LLM responses in production.
Feedback can be accessed, reviewed, and filtered on the Human Feedback page by data annotators, labelers, and domain experts for further discussion, including the potential inclusion of a response in the dataset.
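The Human Feedback page handles this filtering for you, but the idea can be pictured as a simple query over feedback records. The sketch below is illustrative only: the record shape (`response_id`, `rating`, `explanation`) is an assumption for demonstration, not Confident AI's actual data model.

```python
# Illustrative sketch: filtering reviewer feedback to surface failing responses.
# The record fields used here are assumptions, not Confident AI's schema.

def failing_responses(feedback_records, max_rating=2):
    """Return the IDs of responses rated at or below max_rating."""
    return sorted(
        {r["response_id"] for r in feedback_records if r["rating"] <= max_rating}
    )

records = [
    {"response_id": "resp-1", "rating": 5, "explanation": "Accurate answer"},
    {"response_id": "resp-2", "rating": 1, "explanation": "Hallucinated a citation"},
    {"response_id": "resp-3", "rating": 2, "explanation": "Incomplete response"},
]

print(failing_responses(records))  # ['resp-2', 'resp-3']
```

A domain expert reviewing the filtered list can then decide which of these responses (and their ideal expected responses) belong in the dataset.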
Leaving Feedback
1. Selecting A Response
Navigate to the response you wish to review (you can filter for responses based on LLM parameters, hyperparameters, and logged custom data on the observatory page) and click on the inspect button.
2. Leaving The Feedback
Rate the response out of 5, and optionally provide an explanation for the rating and/or suggest the ideal expected response. Click "Leave Feedback" to submit your review.
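The shape of a piece of feedback can be summarized as a small data structure: a required 1–5 rating plus two optional fields. This is a hedged sketch, not Confident AI's API; the `build_feedback` helper and its field names are hypothetical and exist only to show which parts are required and which are optional.

```python
# Hypothetical sketch of a feedback submission. Only the rating (1-5) is
# required; explanation and expected_response are optional. Field names are
# assumptions for illustration, not Confident AI's actual API.

def build_feedback(rating, explanation=None, expected_response=None):
    """Validate the rating and assemble a feedback payload."""
    if not 1 <= rating <= 5:
        raise ValueError("rating must be between 1 and 5")
    payload = {"rating": rating}
    if explanation is not None:
        payload["explanation"] = explanation
    if expected_response is not None:
        payload["expected_response"] = expected_response
    return payload

# A minimal review: just a rating.
print(build_feedback(5))
# A fuller review: rating, explanation, and the ideal expected response.
print(build_feedback(2, explanation="Cited a non-existent source",
                     expected_response="A grounded answer with real citations"))
```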
3. Viewing The Feedback
The feedback you've submitted will now be available in the feedback tab within the response details panel. You can submit multiple pieces of feedback for a single response, all of which will be displayed here.