- class docarray.array.mixins.evaluation.EvaluationMixin#
A mixin that provides ranking evaluation functionality to DocumentArrayLike objects.
- evaluate(other, metric, hash_fn=None, metric_name=None, strict=True, **kwargs)#
Compute ranking evaluation metrics for a given DocumentArray when compared with a groundtruth.
This method expects a groundtruth DocumentArray that is structurally identical to self. The evaluation is based on comparing the matches of the Documents inside the `DocumentArray`.
This method fills the evaluations field of the Documents inside this DocumentArray and returns the average of the computed values.
- Parameters:
other (DocumentArray) – The groundtruth `DocumentArray` that this DocumentArray is compared to.
metric (Union[str, Callable[..., float]]) – The name of the metric, or multiple metrics, to be computed.
hash_fn (Optional[Callable[[Document], str]]) – The function used for identifying the uniqueness of Documents. If not given, then `Document.id` is used.
metric_name (Optional[str]) – If provided, the results of the metrics computation will be stored in the evaluations field of each Document under this name. If not provided, the name is derived from the metric's name.
strict (bool) – If set, the left and right sides are required to be fully aligned: in length, and in the semantics of that length. This prevents you from accidentally evaluating on irrelevant matches.
kwargs – Additional keyword arguments to be passed to `metric_fn`.
- Returns:
The average evaluation computed, or a list of values if multiple metrics are requested
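To illustrate the computation described above, here is a minimal, self-contained sketch of what `evaluate` conceptually does for a single metric: for each Document, a metric is computed over its match ids against the matches of the structurally identical Document in the groundtruth, and the per-Document results are averaged. The `precision` function and the example match ids below are hypothetical stand-ins, not part of the docarray API.

```python
def precision(pred_ids, gt_ids):
    """Fraction of predicted match ids that also appear in the groundtruth matches."""
    if not pred_ids:
        return 0.0
    gt = set(gt_ids)
    return sum(1 for i in pred_ids if i in gt) / len(pred_ids)

# Each pair holds (matches of a Document in self, matches of the
# structurally identical Document in the groundtruth `other`).
pairs = [
    (['a', 'b', 'c'], ['a', 'c', 'd']),  # 2 of 3 predicted matches are relevant
    (['x', 'y'], ['y', 'z']),            # 1 of 2 predicted matches are relevant
]

# evaluate() stores each per-Document value in that Document's
# evaluations field and returns the average over all Documents.
per_doc = [precision(pred, gt) for pred, gt in pairs]
average = sum(per_doc) / len(per_doc)
```

With a strict alignment check enabled, mismatched lengths between the two sides would raise an error instead of silently averaging over unrelated matches.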