Precision in Watson OpenScale quality metrics
Last updated: Jun 15, 2023

In Watson OpenScale, precision gives the proportion of correct predictions among all positive-class predictions.

Precision at a glance

  • Description: Proportion of correct predictions among all positive-class predictions
  • Default thresholds: Lower limit = 80%
  • Default recommendation:
    • Upward trend: An upward trend indicates that the metric is improving. This means that model retraining is effective.
    • Downward trend: A downward trend indicates that the metric is deteriorating. The feedback data is becoming significantly different from the training data.
    • Erratic or irregular variation: An erratic or irregular variation indicates that the feedback data is not consistent between evaluations. Increase the minimum sample size for the Quality monitor.
  • Problem type: Binary classification
  • Chart values: Last value in the timeframe
  • Metrics details available: Confusion matrix

Do the math

Precision (P) is defined as the number of true positives (Tp) over the number of true positives plus the number of false positives (Fp).

Precision = number of true positives / (number of true positives + number of false positives)
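
For illustration, here is a minimal sketch of that calculation in plain Python. The function name, the positive-label convention, and the sample labels are hypothetical and are not part of the Watson OpenScale API.

import typing

def precision(y_true: typing.Sequence[int], y_pred: typing.Sequence[int], positive_label: int = 1) -> float:
    """Compute precision: true positives / (true positives + false positives)."""
    true_positives = sum(1 for t, p in zip(y_true, y_pred)
                         if p == positive_label and t == positive_label)
    false_positives = sum(1 for t, p in zip(y_true, y_pred)
                          if p == positive_label and t != positive_label)
    if true_positives + false_positives == 0:
        return 0.0  # no positive predictions were made
    return true_positives / (true_positives + false_positives)

# Hypothetical binary-classification labels: 1 = positive class, 0 = negative class
y_true = [1, 0, 1, 1, 0, 1, 0, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0]

print(precision(y_true, y_pred))  # 3 true positives / (3 + 1 false positive) = 0.75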

Learn more

Reviewing quality results

Parent topic: Quality metrics overview
