[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2023-08-18 UTC."],[],[],null,["# Module: tfr.keras.metrics\n\n\u003cbr /\u003e\n\n|-----------------------------------------------------------------------------------------------------------------------|\n| [View source on GitHub](https://github.com/tensorflow/ranking/blob/v0.5.3/tensorflow_ranking/python/keras/metrics.py) |\n\nKeras metrics in TF-Ranking.\n| **Note:** For metrics that compute a ranking, ties are broken randomly. This means that metrics may be stochastic if items with equal scores are provided.\n| **Warning:** Some metrics (e.g. Recall or MRR) are not well-defined when there are no relevant items (e.g. if `y_true` has a row of only zeroes). For these cases, the TF-Ranking metrics will evaluate to `0`.\n\nClasses\n-------\n\n[`class ARPMetric`](../../tfr/keras/metrics/ARPMetric): Average relevance position (ARP).\n\n[`class AlphaDCGMetric`](../../tfr/keras/metrics/AlphaDCGMetric): Alpha discounted cumulative gain (alphaDCG).\n\n[`class DCGMetric`](../../tfr/keras/metrics/DCGMetric): Discounted cumulative gain (DCG).\n\n[`class HitsMetric`](../../tfr/keras/metrics/HitsMetric): Hits@k metric.\n\n[`class MRRMetric`](../../tfr/keras/metrics/MRRMetric): Mean reciprocal rank (MRR).\n\n[`class MeanAveragePrecisionMetric`](../../tfr/keras/metrics/MeanAveragePrecisionMetric): Mean average precision (MAP).\n\n[`class NDCGMetric`](../../tfr/keras/metrics/NDCGMetric): Normalized discounted cumulative gain (NDCG).\n\n[`class OPAMetric`](../../tfr/keras/metrics/OPAMetric): Ordered pair accuracy (OPA).\n\n[`class PrecisionIAMetric`](../../tfr/keras/metrics/PrecisionIAMetric): Precision-IA@k (Pre-IA@k).\n\n[`class PrecisionMetric`](../../tfr/keras/metrics/PrecisionMetric): Precision@k (P@k).\n\n[`class RankingMetricKey`](../../tfr/keras/metrics/RankingMetricKey): Ranking metric key strings.\n\n[`class RecallMetric`](../../tfr/keras/metrics/RecallMetric): Recall@k (R@k).\n\nFunctions\n---------\n\n[`default_keras_metrics(...)`](../../tfr/keras/metrics/default_keras_metrics): Returns a list of ranking metrics.\n\n[`get(...)`](../../tfr/keras/metrics/get): Factory method to get a list of ranking metrics.\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n\u003cbr /\u003e\n\n| Other Members ------------- ||\n|-----------------|-----------------------------------|\n| absolute_import | Instance of `__future__._Feature` |\n| division | Instance of `__future__._Feature` |\n| print_function | Instance of `__future__._Feature` |\n\n\u003cbr /\u003e"]]