Visit our virtual booth on April 16.
On April 17 at 12:45pm, Gil Irizarry, VP of Engineering, will be presenting. To get valuable information on how to evaluate NLP tools, listen to his talk live online at ODSC.
[Apple | Organization] and [Oranges | Fruit]: How to Evaluate NLP Tools for Entity Extraction
You have some documents and you want to extract information from them, but which NLP tool or library should you use? There are many to choose from, and evaluating which is best is not a straightforward exercise: differences in output between tools often prevent direct comparison. NLP tools based on different underlying technologies will tokenize and tag text differently, extract different sets of entities, and classify those entities differently. Basis Technology often performs evaluations of disparate NLP tools and encounters these challenges. Learn how we handle them to produce meaningful scoring of tools.
This talk will also include best practices for annotating a test data set and selecting a gold standard, as well as common ways to measure both the accuracy of the annotation and the extraction.
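One common way to measure extraction accuracy against a gold standard is entity-level exact-match precision, recall, and F1, where a prediction counts only if its span boundaries and label both agree with the annotation. A minimal sketch, using hypothetical spans and labels (the function name and data are illustrative, not from the talk):

```python
# Entity-level scoring against a gold standard.
# Each entity is a (start, end, label) span; all data below is hypothetical.

def entity_scores(gold, predicted):
    """Compute precision, recall, and F1 over exact-match entity spans."""
    gold_set, pred_set = set(gold), set(predicted)
    true_positives = len(gold_set & pred_set)
    precision = true_positives / len(pred_set) if pred_set else 0.0
    recall = true_positives / len(gold_set) if gold_set else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

gold = [(0, 5, "ORG"), (10, 17, "FRUIT"), (20, 26, "PERSON")]
pred = [(0, 5, "ORG"), (10, 17, "PRODUCT")]  # second span has a label mismatch

p, r, f = entity_scores(gold, pred)
print(f"precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```

Note that tools which segment or label entities differently (e.g. "PRODUCT" vs. "FRUIT" above) are penalized under exact matching, which is one of the comparison challenges the talk addresses; relaxed matching schemes score span overlap and label agreement separately.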