Evaluate NERSchemas

== Operation ==
You can apply an evaluation operation comparing two NERSchemas to calculate the annotation recall, precision and f-score between a NERSchema taken as a gold standard (typically from a corpus with manual annotations) and a NERSchema to compare (typically produced by running a specific algorithm or pipeline).

To start the evaluation process, right-click on the NERSchema Datatype that will act as the gold standard and select "Evaluate -> NER Schema".
[[File:NERSchema Evaluate.png|center|1500px]]
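The measures reported by the evaluation can be sketched as follows. This is an illustrative Python snippet, not the tool's actual implementation; `evaluate_ner` is a hypothetical helper, and it assumes each annotation is modelled as a (start, end, entity class) triple scored by exact match:

```python
# Illustrative sketch only: recall, precision and f-score between a
# gold-standard annotation set and a predicted one.
# Assumption: an annotation is a (start, end, entity_class) triple and
# counts as correct only on an exact match of all three fields.

def evaluate_ner(gold, predicted):
    gold_set, pred_set = set(gold), set(predicted)
    true_positives = len(gold_set & pred_set)
    precision = true_positives / len(pred_set) if pred_set else 0.0
    recall = true_positives / len(gold_set) if gold_set else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

gold = [(0, 5, "Protein"), (10, 14, "Gene"), (20, 27, "Protein")]
pred = [(0, 5, "Protein"), (10, 14, "Protein"), (30, 34, "Gene")]
# Each measure is 1/3 here: only one of three annotations matches exactly.
p, r, f = evaluate_ner(gold, pred)
print(round(p, 2), round(r, 2), round(f, 2))
```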
 
== Select NERSchema to compare ==
 
A GUI will be launched to choose the NERSchema Datatype (in blue) to be compared with the gold standard selected on the clipboard.
 
[[File:NERSchema Evaluate GUI.png|center|500px]]
 
NOTE: only NERSchemas with the same Normalization process as the gold-standard NERSchema will appear.
  
 
== Result ==
  
An NER schema evaluation report will appear after the evaluation process, with two tabs: overall scores and scores per class type.

In the overall scores tab, the annotation recall, precision and f-score measures are presented for all annotated entities.

[[File:NERchema Evaluate Result.png|center|500px]]

In the scores per class type tab, the annotation recall, precision and f-score measures are presented for each annotated entity class.

[[File:RESchema Evaluate Result b.png|center|500px]]
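The per-class breakdown can be sketched in the same spirit. Again an illustrative Python snippet under the same assumptions (annotations as (start, end, entity class) triples, exact-match scoring), with `evaluate_per_class` a hypothetical helper rather than the tool's actual code:

```python
# Illustrative sketch only: per-entity-class recall, precision and f-score,
# computed by restricting the gold and predicted annotation sets to one
# entity class at a time.
# Assumption: annotations are (start, end, entity_class) triples.

def evaluate_per_class(gold, predicted):
    scores = {}
    classes = {a[2] for a in gold} | {a[2] for a in predicted}
    for cls in sorted(classes):
        g = {a for a in gold if a[2] == cls}
        p = {a for a in predicted if a[2] == cls}
        tp = len(g & p)
        precision = tp / len(p) if p else 0.0
        recall = tp / len(g) if g else 0.0
        f_score = (2 * precision * recall / (precision + recall)
                   if precision + recall else 0.0)
        scores[cls] = (precision, recall, f_score)
    return scores

for cls, (p, r, f) in evaluate_per_class(
        [(0, 5, "Gene"), (6, 9, "Protein")],
        [(0, 5, "Gene"), (6, 9, "Gene")]).items():
    print(cls, round(p, 2), round(r, 2), round(f, 2))
```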

Latest revision as of 17:32, 16 January 2015
