List of cost functions used in Neural Networks

A cost function is a quantitative measure of the quality of a fit: how good the model is at reproducing the data. It is a single value, computed as the sum of the deviations between the model's predictions and the true values over all points in the dataset.

1. Quadratic Cost Function: Regression

$$C = \frac{1}{2N} \sum_{i=1}^{N} \left(y_i - \hat{y}_i\right)^2$$

where $y_i$ and $\hat{y}_i$ are the true target value of point $i$ and the predicted target value, respectively.

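A minimal NumPy sketch of this cost; the array names `y_true` and `y_pred` are placeholders of this example, not from the original:

```python
import numpy as np

def quadratic_cost(y_true, y_pred):
    # Sum of squared deviations over all N points, with the usual 1/2 factor.
    return np.sum((y_true - y_pred) ** 2) / (2 * len(y_true))
```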

2. Cross Entropy Cost: Classification

$$C = -\frac{1}{N} \sum_{i=1}^{N} \left[ y_i \ln \hat{y}_i + (1 - y_i) \ln (1 - \hat{y}_i) \right]$$

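A possible implementation, assuming `y_pred` holds predicted probabilities; the `eps` clipping is an addition of this sketch to keep the logarithms finite:

```python
import numpy as np

def cross_entropy_cost(y_true, y_pred, eps=1e-12):
    # Clip predictions away from 0 and 1 so log() never sees an endpoint.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))
```
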
3. Exponential Cost

$$C = \tau \exp\left(\frac{1}{\tau} \sum_{i=1}^{N} \left(y_i - \hat{y}_i\right)^2\right)$$

where $\tau$ is a hyper-parameter.

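A sketch under the same array conventions; `tau` stands for the hyper-parameter $\tau$:

```python
import numpy as np

def exponential_cost(y_true, y_pred, tau=1.0):
    # tau scales both the exponent and the overall cost.
    return tau * np.exp(np.sum((y_true - y_pred) ** 2) / tau)
```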

4. Hellinger Distance

$$C = \frac{1}{\sqrt{2}} \sum_{i=1}^{N} \left(\sqrt{y_i} - \sqrt{\hat{y}_i}\right)^2$$

It requires $y_i$ and $\hat{y}_i$ to be non-negative, ideally values in $[0, 1]$, as for probabilities.

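One way this could look in NumPy, assuming both arrays already lie in $[0, 1]$:

```python
import numpy as np

def hellinger_cost(y_true, y_pred):
    # Defined for non-negative inputs, ideally probabilities in [0, 1].
    return np.sum((np.sqrt(y_true) - np.sqrt(y_pred)) ** 2) / np.sqrt(2)
```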

5. Kullback-Leibler Divergence

Kullback-Leibler Divergence is also known as Information Divergence, Information Gain, Relative Entropy, or KLIC, and is defined as:

$$D_{\mathrm{KL}}(P \,\|\, Q) = \sum_{i} P(i) \ln \frac{P(i)}{Q(i)}$$

where $D_{\mathrm{KL}}(P \,\|\, Q)$ is a measure of the information lost when $Q$ is used to approximate $P$.

The cost function using KL Divergence is:

$$C = \sum_{i} y_i \ln \frac{y_i}{\hat{y}_i}$$

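A sketch of the KL cost; the clipping to `eps` is an assumption added here to keep the ratio and logarithm well defined:

```python
import numpy as np

def kl_divergence_cost(y_true, y_pred, eps=1e-12):
    # Clip both arrays away from zero before dividing and taking the log.
    y_true = np.clip(y_true, eps, None)
    y_pred = np.clip(y_pred, eps, None)
    return np.sum(y_true * np.log(y_true / y_pred))
```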

6. Generalized Kullback-Leibler Divergence

$$C = \sum_{i} y_i \ln \frac{y_i}{\hat{y}_i} - \sum_{i} y_i + \sum_{i} \hat{y}_i$$

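The same conventions extend to the generalized form, for example:

```python
import numpy as np

def generalized_kl_cost(y_true, y_pred, eps=1e-12):
    # KL term plus the linear correction terms; inputs need not sum to 1.
    y_true = np.clip(y_true, eps, None)
    y_pred = np.clip(y_pred, eps, None)
    kl = np.sum(y_true * np.log(y_true / y_pred))
    return kl - np.sum(y_true) + np.sum(y_pred)
```
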
7. Itakura-Saito Distance

$$C = \sum_{i} \left(\frac{y_i}{\hat{y}_i} - \ln \frac{y_i}{\hat{y}_i} - 1\right)$$
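
And a sketch of the Itakura-Saito cost, again clipping so the ratio and logarithm stay defined:

```python
import numpy as np

def itakura_saito_cost(y_true, y_pred, eps=1e-12):
    # Inputs must be strictly positive for the ratio and log to be defined.
    ratio = np.clip(y_true, eps, None) / np.clip(y_pred, eps, None)
    return np.sum(ratio - np.log(ratio) - 1)
```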