Hi,
Can someone please let me know if you have faced the following problem while training?
When we use a sigmoid activation on the final output neuron, it gives a value between 0 and 1. But we want a value of either 0 (when the image is not in the class we are training for) or 1 (when the image belongs to that class).
So, we run the output through a threshold to get a hard 0 or 1 and compare it to the actual y value to measure accuracy.
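For context, the thresholding step we mean looks roughly like this (the probabilities and labels below are made-up illustrative values, not our actual outputs):

```python
import numpy as np

# Hypothetical sigmoid outputs for 5 images and their true labels
probs = np.array([0.48, 0.52, 0.49, 0.51, 0.47])
y_true = np.array([0, 1, 0, 1, 0])

# Threshold at 0.5 to turn probabilities into hard 0/1 predictions
y_pred = (probs >= 0.5).astype(int)

# Accuracy = fraction of predictions that match the true labels
accuracy = (y_pred == y_true).mean()
print(list(y_pred), accuracy)
```

In our case the printed probabilities are all clustered on the same side of 0.5, so `y_pred` comes out as all zeros.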
The problem is that when we print the sigmoid outputs, the values are all more or less the same. As a result, we get a neural network that only ever outputs 0.
When testing on previously unseen data, if 80% of the test images do not belong to the class and 20% do, the accuracy of our model is 80%, because the model always predicts 0, which is bound to be correct 80% of the time.
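To make the 80% figure concrete, here is a small sketch using the hypothetical 80/20 test split from above: a degenerate model that always predicts 0 still scores 80% accuracy.

```python
# Hypothetical test distribution: 80 negatives, 20 positives
n_neg, n_pos = 80, 20
y_true = [0] * n_neg + [1] * n_pos

# Degenerate model: predicts 0 for every image
y_pred = [0] * (n_neg + n_pos)

# Plain accuracy rewards the majority-class guess
accuracy = sum(p == t for p, t in zip(y_pred, y_true)) / len(y_true)
print(accuracy)  # 0.8
```

This is why accuracy alone is misleading here; metrics such as precision/recall on the positive class would show the model has learned nothing.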
We tried various loss functions and still get the same result.
Please let us know if you have faced a similar problem.