In the last post, we covered Gradient Descent as an optimization technique for finding minima. Now it's time to understand how to tell whether what a machine learning algorithm predicts is good enough, or even what we actually need.
Just pause for a second and think. If you ask a machine to predict something and it does, how will you know whether the prediction is correct or not? Or better still, how far is the prediction from the desired value?
My guess is you will try to find the difference between what the algorithm predicts and what was desired. And to improve the algorithm, you will try to minimize that difference. Gradient Descent was one approach to minimizing it; there are many more, which we will cover at some point. This difference between what is desired and what is predicted is what the cost function represents.
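To make this concrete, here is a minimal sketch of one common cost function, mean squared error, which averages the squared differences between predictions and desired values. The function name and sample numbers below are illustrative, not from any specific library.

```python
# Illustrative sketch: mean squared error as a cost function.
# Squaring keeps every difference positive and penalizes large errors more.

def mse_cost(predicted, desired):
    """Average squared difference between predictions and desired values."""
    n = len(predicted)
    return sum((p - d) ** 2 for p, d in zip(predicted, desired)) / n

# Predictions close to the desired values give a small cost.
predictions = [2.5, 0.0, 2.0, 8.0]
targets = [3.0, -0.5, 2.0, 7.0]
print(mse_cost(predictions, targets))  # 0.375
```

Minimizing this single number (for example, with Gradient Descent) is how the algorithm's predictions are pushed closer to the desired values.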