

Artificial Intelligence Class X - Chapter 7

Evaluation Methods

Now that we have gone through all the possible combinations of Prediction and Reality, let us see how we can use these conditions to evaluate the model.
Let us go back to the Forest Fire example:

• Assume that the model always predicts that there is no fire.
• But in reality, there is a 2% chance of a forest fire breaking out.
• In this case, for 98 cases the model will be right, but for the 2 cases in which there actually was a forest fire, the model still predicted no fire.
• Here,

True Positives = 0
True Negatives = 98
Total cases = 100
Therefore, accuracy becomes: (98 + 0) / 100 = 98%
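The accuracy calculation above can be sketched in a few lines of Python. This is a minimal illustration of the same 100-case example, not part of the chapter itself:

```python
# Forest Fire accuracy example: 2 of 100 days had a real fire,
# but the model always predicts "no fire".
reality = ["fire"] * 2 + ["no fire"] * 98
prediction = ["no fire"] * 100  # the model never predicts a fire

# Count the outcomes the text describes
true_positives = sum(1 for r, p in zip(reality, prediction)
                     if r == "fire" and p == "fire")        # 0
true_negatives = sum(1 for r, p in zip(reality, prediction)
                     if r == "no fire" and p == "no fire")  # 98

# Accuracy = (TP + TN) / Total cases
accuracy = (true_positives + true_negatives) / len(reality)
print(accuracy)  # 0.98, i.e. 98%
```

Notice that the model looks highly accurate even though it missed every real fire, which is exactly why accuracy alone can be misleading.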
Precision
Going back to the Forest Fire example, assume this time that the model always predicts that there is a forest fire, irrespective of the reality. Precision takes all the Positive predictions into account, that is, True Positives (Prediction = Yes and Reality = Yes) and False Positives (Prediction = Yes and Reality = No). In this case, the firefighters will go and check for a fire every time, to see whether the alarm was True or False.

You might recall the story of the boy who falsely cried wolf: he raised false alarms so often that when the wolves actually arrived, no one came to his rescue. Similarly, if the Precision is low (which means there are more false alarms than actual ones), the firefighters would grow complacent and might not go and check every time, assuming it could be another false alarm.
• Let us consider that a model has 100% precision.
• This means that whenever the machine says there's a fire, there actually is a fire (True Positive).
• In the same model, there can be a rare exceptional case where there was an actual fire but the system could not detect it.
• This is the case of a False Negative condition.
• But the precision value would not be affected by it, because precision does not take False Negatives into account.
• Is precision then a good parameter for model performance?
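The "always predicts fire" scenario can be sketched the same way. Precision is defined as TP / (TP + FP); the counts below reuse the 2%-fire example from earlier and are illustrative:

```python
# The model always raises the alarm; only 2 of 100 days had a real fire.
reality = ["fire"] * 2 + ["no fire"] * 98
prediction = ["fire"] * 100  # the model always predicts a fire

tp = sum(1 for r, p in zip(reality, prediction)
         if r == "fire" and p == "fire")     # 2 true alarms
fp = sum(1 for r, p in zip(reality, prediction)
         if r == "no fire" and p == "fire")  # 98 false alarms

# Precision = TP / (TP + FP)
precision = tp / (tp + fp)
print(precision)  # 0.02 -> only 2% of the alarms were real fires
```

A precision of 0.02 captures the "boy who cried wolf" problem: almost all the alarms are false, even though this model never misses a fire.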
Recall

• Another parameter for evaluating the model's performance is Recall.
• It can be defined as the fraction of positive cases that are correctly identified.
• It mainly takes into account the cases where, in reality, there was a fire, whether the machine detected it correctly or not.
• That is, it considers True Positives (there was a forest fire in reality and the model predicted a forest fire) and False Negatives (there was a forest fire and the model didn't predict it).
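The definition above translates directly into code. Recall is TP / (TP + FN); the counts below are hypothetical numbers chosen for illustration, not from the text:

```python
# Recall asks: of all the real fires, how many did the model catch?
tp = 7  # real fires the model correctly predicted (hypothetical)
fn = 3  # real fires the model failed to predict (hypothetical)

# Recall = TP / (TP + FN)
recall = tp / (tp + fn)
print(recall)  # 0.7 -> the model caught 70% of the real fires
```

Note the symmetry with precision: precision ignores False Negatives, while recall ignores False Positives, which is why the two are usually reported together.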
F1 Score
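The excerpt ends at this heading, so the details that follow are not from the text. As a hedged sketch, the F1 Score is conventionally defined as the harmonic mean of Precision and Recall, which balances the two metrics discussed above:

```python
# F1 = 2 * (Precision * Recall) / (Precision + Recall)
# (standard definition; the numbers below are purely illustrative)
precision = 0.5
recall = 0.8

f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.6154
```

Because it is a harmonic mean, F1 is high only when precision and recall are both high, so a model that cries wolf (low precision) or misses fires (low recall) is penalized either way.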
