# Evaluate Prediction - Binary

Returns a data frame with evaluation scores for binary classification, including the following.

* AUC
* f\_score
* accuracy
* misclassification\_rate
* precision
* recall
* specificity
* true\_positive - Number of positive predictions that actually are positive.
* false\_positive - Number of positive predictions that actually are negative.
* true\_negative - Number of negative predictions that actually are negative.
* false\_negative - Number of negative predictions that actually are positive.
* test\_size - The number of rows in the evaluated data.
* threshold - The threshold value used to decide the predicted label.
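
As a rough sketch of how these scores relate to one another (in Python, not Exploratory's actual implementation), the metrics above can be computed from a column of predicted probabilities and a column of actual 0/1 labels; the function name and the 0.5 default are illustrative:

```python
def evaluate_binary(probs, actuals, threshold=0.5):
    """Compute the metrics listed above from predicted probabilities
    and actual 0/1 labels. A sketch only, not Exploratory's code."""
    preds = [1 if p >= threshold else 0 for p in probs]
    tp = sum(1 for p, a in zip(preds, actuals) if p == 1 and a == 1)
    fp = sum(1 for p, a in zip(preds, actuals) if p == 1 and a == 0)
    tn = sum(1 for p, a in zip(preds, actuals) if p == 0 and a == 0)
    fn = sum(1 for p, a in zip(preds, actuals) if p == 0 and a == 1)
    accuracy = (tp + tn) / len(actuals)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    # AUC via the rank-sum (Mann-Whitney U) formulation:
    # the probability a random positive is scored above a random negative.
    pos = [p for p, a in zip(probs, actuals) if a == 1]
    neg = [p for p, a in zip(probs, actuals) if a == 0]
    wins = sum((pp > pn) + 0.5 * (pp == pn) for pp in pos for pn in neg)
    auc = wins / (len(pos) * len(neg)) if pos and neg else float("nan")
    return {
        "AUC": auc, "f_score": f_score, "accuracy": accuracy,
        "misclassification_rate": 1 - accuracy,
        "precision": precision, "recall": recall,
        "specificity": specificity,
        "true_positive": tp, "false_positive": fp,
        "true_negative": tn, "false_negative": fn,
        "test_size": len(actuals), "threshold": threshold,
    }
```

Note that misclassification\_rate is simply `1 - accuracy`, and AUC is the only score here that does not depend on the threshold.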

## How to Access This Feature

### From + (plus) Button

There are two ways to access this feature. One is from the '+' (Plus) button. ![](https://2850417076-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M4HLCK3olgduYoe3RVS%2F-M4oMvCUDQwHTJ0eWi_f%2F-M4oNBw_BcjYewlKErwU%2Fevaluate_binary_add.png?generation=1586795479083887\&alt=media)

The other is from a column header menu. ![](https://2850417076-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M4HLCK3olgduYoe3RVS%2F-M4oMvCUDQwHTJ0eWi_f%2F-M4oNBwbFbqxhzyrYkIo%2Fevaluate_binary_col.png?generation=1586795479187580\&alt=media)

## How to Use

![](https://2850417076-files.gitbook.io/~/files/v0/b/gitbook-legacy-files/o/assets%2F-M4HLCK3olgduYoe3RVS%2F-M4oMvCUDQwHTJ0eWi_f%2F-M4oNBwdmzMNKp_QGWoX%2Fevaluate_binary_param.png?generation=1586795479101844\&alt=media)

* Predicted Probability Column - The column with predicted probabilities. In Exploratory, this is usually the predicted\_probability column.
* Actual Value Column - The column with the actual values.
* Threshold Value to Decide Predicted Label - You can choose how the threshold for the predicted label is decided.
  * Use Optimized Value - This searches for the threshold that optimizes the chosen metric, which can be one of the following:
    * F Score
    * Accuracy
    * Precision
    * Recall
    * Specificity
  * Enter Manually - Set the threshold value manually.
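
The 'Use Optimized Value' option can be pictured as sweeping candidate thresholds and keeping the one that maximizes the chosen metric. A minimal Python sketch under that assumption (the function name and the choice of candidate thresholds are illustrative, not Exploratory's actual search):

```python
def best_threshold(probs, actuals, metric="f_score"):
    """Return the threshold (among the distinct predicted probabilities)
    that maximizes the chosen metric. A sketch of the idea behind
    'Use Optimized Value', not Exploratory's implementation."""
    def score(threshold):
        preds = [1 if p >= threshold else 0 for p in probs]
        tp = sum(p and a for p, a in zip(preds, actuals))
        fp = sum(p and not a for p, a in zip(preds, actuals))
        fn = sum((not p) and a for p, a in zip(preds, actuals))
        tn = sum((not p) and (not a) for p, a in zip(preds, actuals))
        if metric == "accuracy":
            return (tp + tn) / len(actuals)
        if metric == "precision":
            return tp / (tp + fp) if tp + fp else 0.0
        if metric == "recall":
            return tp / (tp + fn) if tp + fn else 0.0
        if metric == "specificity":
            return tn / (tn + fp) if tn + fp else 0.0
        # Default: F score, the harmonic mean of precision and recall.
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        return 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return max(sorted(set(probs)), key=score)
```

For example, with perfectly separated predictions `[0.9, 0.8, 0.3, 0.2]` and actuals `[1, 1, 0, 0]`, any threshold between 0.3 (exclusive) and 0.8 (inclusive) gives a perfect F score, so the sweep settles on 0.8.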
