Calculates the precision of predictions and returns the result. Precision is the proportion of positive predictions that were correct. It answers: “Of all the items we predicted as positive, how many were actually positive?” Precision = True Positives / (True Positives + False Positives).
Scores range from 0.0 to 1.0, with 1.0 indicating perfect precision.
Parameters

| Name | Type | Description | Default |
|--------|-------|---------------------------------------------|----------|
| y_true | array | The actual observed values (ground truth). | required |
| y_pred | array | The model's predicted values. | required |
Returns

| Name | Type | Description |
|------|-------|-----------------------------------------------------------|
| | float | The calculated precision score, ranging from 0.0 to 1.0. |
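As a rough sketch of what this metric computes (the function name `precision` and NumPy-based implementation here are assumptions for illustration, not the library's actual code), precision can be derived directly from the true-positive and false-positive counts:

```python
import numpy as np

def precision(y_true, y_pred):
    """Illustrative sketch: precision = TP / (TP + FP) for binary labels (1 = positive)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == 1) & (y_true == 1))  # predicted positive, actually positive
    fp = np.sum((y_pred == 1) & (y_true == 0))  # predicted positive, actually negative
    # Guard against division by zero when no positives were predicted.
    return tp / (tp + fp) if (tp + fp) > 0 else 0.0

# 2 true positives and 1 false positive -> precision = 2 / (2 + 1)
print(precision([1, 0, 1, 1, 0], [1, 1, 1, 0, 0]))  # → 0.6666666666666666
```

Here the model predicted three positives, of which two were correct, so precision is 2/3.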