Confusion Matrix: Is it really that confusing?

A confusion matrix is a simple 2x2 table that compares a classifier's predictions against what actually happened, splitting the results into four counts: true positives, true negatives, false positives and false negatives.

What's it used for?

  1. True Positive (TP): The model predicts an attack on the server and an attack is actually happening. This is the model doing exactly what we want.
  2. True Negative (TN): The model predicts that no attack is happening and there really is no attack. This is also a correct prediction, and there is no harm to the business.
  3. False Positive (FP): The model raises an attack alert when no attack is actually taking place. This is a false alarm; it costs the team time spent investigating, but it causes no direct damage.
  4. False Negative (FN): The model predicts that nothing is wrong while an attack is actually underway. This is the serious concern: every false negative is a real attack that goes undetected, so a high FN count makes the model unsuitable for production. A short code sketch after this list shows how these four counts are read off a confusion matrix in practice.
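To make the four counts concrete, here is a minimal sketch using scikit-learn's confusion_matrix. The toy labels y_true and y_pred, and the convention 1 = attack, 0 = no attack, are assumptions made purely for illustration.

```python
from sklearn.metrics import confusion_matrix

# Toy ground truth and predictions for an attack detector.
# Assumed convention for this sketch: 1 = attack, 0 = no attack.
y_true = [1, 0, 1, 1, 0, 0, 0, 1, 0, 0]   # what actually happened
y_pred = [1, 0, 0, 1, 0, 1, 0, 1, 0, 0]   # what the model predicted

# With labels=[0, 1], scikit-learn lays the matrix out as
# [[TN, FP],
#  [FN, TP]]   (rows = actual class, columns = predicted class)
tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()

print(f"TP={tp}  TN={tn}  FP={fp}  FN={fn}")
# TP=3  TN=5  FP=1  FN=1
```

In a deployed detector, the FN cell is the one to watch most closely: each false negative is an attack the model let through, while a false positive is only a wasted investigation.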
