docarray.array.mixins.evaluation module#

class docarray.array.mixins.evaluation.EvaluationMixin[source]#

Bases: object

A mixin that provides ranking evaluation functionality to DocumentArrayLike objects.

evaluate(other, metric, hash_fn=None, metric_name=None, strict=True, **kwargs)[source]#

Compute ranking evaluation metrics for a given DocumentArray when compared with a groundtruth.

This implementation expects a groundtruth DocumentArray that is structurally identical to self. It is based on comparing the matches of documents inside the DocumentArray.

This method fills the evaluations field of the Documents inside this DocumentArray and returns the average of the computations.

Parameters:

  • other (DocumentArray) – The groundtruth DocumentArray that this DocumentArray is compared against.

  • metric (Union[str, Callable[..., float]]) – The name of the metric to compute, or a callable implementing it. Multiple metrics can be requested to compute several at once.

  • hash_fn (Optional[Callable[[Document], str]]) – The function used for identifying the uniqueness of Documents. If not given, a default hash function is used.

  • metric_name (Optional[str]) – If provided, the name under which the results of the metric computation are stored in the evaluations field of each Document. If not provided, the name is derived from the metric's name.

  • strict (bool) – If set, the left and right sides are required to be fully aligned, both in length and in what that length represents. This prevents you from accidentally evaluating on irrelevant matches.

  • kwargs – Additional keyword arguments to be passed to metric_fn
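Since metric can also be a callable, a custom metric is simply a function that maps a binary relevance list (1 = relevant match, 0 = irrelevant) to a float score. The exact signature docarray expects is an assumption here, and `precision_at_k` below is a hypothetical illustration, not a docarray built-in:

```python
# Hypothetical custom metric: fraction of the top-k matches that are
# relevant. The binary-relevance-list signature is an assumption about
# how docarray invokes metric callables, shown for illustration only.
def precision_at_k(binary_relevance, k=3):
    top_k = binary_relevance[:k]
    return sum(top_k) / len(top_k) if top_k else 0.0

score = precision_at_k([1, 0, 1, 1], k=3)  # 2 of the top 3 are relevant
print(score)
```

Extra keyword arguments such as `k` above would be forwarded to the metric via **kwargs.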

Returns:

The average evaluation computed, or a list of averages if multiple metrics are required.
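Conceptually, the evaluation pairs each document with its groundtruth counterpart, marks each of its matches as relevant or not via the hash function, applies the metric, and averages the per-document scores. The sketch below illustrates this with plain dictionaries; all names are simplified stand-ins, not the actual docarray internals:

```python
# Minimal sketch of the evaluation loop, assuming documents and their
# matches are plain dicts and the hash function defaults to the doc id.
def evaluate(docs, groundtruth, metric_fn, hash_fn=lambda d: d['id']):
    scores = []
    for doc, gt_doc in zip(docs, groundtruth):
        # Matches of the groundtruth document define what is relevant.
        relevant = {hash_fn(m) for m in gt_doc['matches']}
        binary_relevance = [1 if hash_fn(m) in relevant else 0
                            for m in doc['matches']]
        scores.append(metric_fn(binary_relevance))
    # The averaged evaluation over all documents.
    return sum(scores) / len(scores)

docs = [{'id': 'q1', 'matches': [{'id': 'a'}, {'id': 'x'}]}]
gt   = [{'id': 'q1', 'matches': [{'id': 'a'}, {'id': 'b'}]}]
avg = evaluate(docs, gt, lambda rel: sum(rel) / len(rel))
print(avg)  # one of two retrieved matches is relevant: 0.5
```

With multiple metrics, the same loop would run once per metric, yielding a list of averages.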