Create Evaluation


Add Evaluation Details

Enter the evaluation name, description (optional), and tags (optional), then select a dataset.

Add Contributor Evaluation

Contributor Evaluation lets you set up an evaluation that is labeled by humans.

Configure Contributor Evaluation

Configure the Contributor Evaluation.

Annotation Layout Configuration

The first step is to configure the data your contributors will see. You can select which columns appear and configure how the data in each column is displayed.
The configuration you choose here determines the layout annotators see when they label.
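As a rough illustration only (not the platform's actual schema), a column layout configuration of this kind could be represented as a list of columns, each with a display mode; the field names below are assumptions:

```json
{
  "columns": [
    { "name": "prompt",   "display": "markdown" },
    { "name": "response", "display": "text" },
    { "name": "metadata", "display": "hidden" }
  ]
}
```

In this sketch, hidden columns stay in the dataset but are not shown to annotators, keeping the labeling view focused on the fields that matter.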

Select Questions

Next, select the questions you want your contributors to answer. You can choose a pre-existing question or add a new one within this workflow.
The questions you select appear to the right of the data that contributors see.

Create Evaluation

Select the dataset rows you want to run the evaluation on, then click Create Evaluation.

Annotate Evaluations

After you create the evaluation, navigate to it to see an Annotation Tasks tab. You or the assigned labelers can then start labeling these tasks.

View Evaluation Results

Completed Annotations

Once all tasks have been annotated, they appear as completed in the Annotation Tasks table.

Data

The results appear in the data table, with one column for each question.
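Because the results come back as one column per question, they are easy to summarize once exported. A minimal sketch with pandas, assuming a hypothetical export where each row is a task and the question columns are named after the questions (all names here are illustrative, not the platform's API):

```python
import pandas as pd

# Hypothetical export of the evaluation's data table: one row per task,
# one column per question (column names are assumptions for illustration).
df = pd.DataFrame({
    "task_id": [1, 2, 3, 4],
    "Is the response helpful?": ["Yes", "Yes", "No", "Yes"],
    "Rate fluency (1-5)": [5, 4, 2, 4],
})

# Summarize each question column: answer counts for the categorical
# question, and a mean for the numeric rating.
helpful_counts = df["Is the response helpful?"].value_counts().to_dict()
mean_fluency = df["Rate fluency (1-5)"].mean()

print(helpful_counts)  # {'Yes': 3, 'No': 1}
print(mean_fluency)    # 3.75
```

The same per-column pattern extends to any number of questions, since each question is just another column in the table.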

Overview

The results also appear as a graph on the Overview page, giving a visual summary of the evaluation results.