“SoftTarget: a new form of regularization that guides the learning problem in a way that reduces over-fitting without sacrificing the capacity of the model.”
“Deep neural networks are learning models with a very high capacity and therefore prone to over-fitting. Many regularization techniques such as Dropout, DropConnect, and weight decay all attempt to solve the problem of over-fitting by reducing the capacity of their respective models. SoftTarget regularization proved to be an effective tool in various neural network architectures.”
Paper: http://arxiv.org/abs/1609.06693
PDF: http://arxiv.org/pdf/1609.06693
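A minimal sketch of the soft-target idea, assuming (per my reading of the paper) that the regularizer keeps an exponential moving average of the network's own past predictions and blends it with the one-hot labels to form the training targets. The class name `SoftTargetLoss`, the hyperparameters `beta`/`gamma`, and the per-batch update schedule below are illustrative assumptions, not the paper's exact recipe:

```python
import torch
import torch.nn.functional as F


class SoftTargetLoss:
    """Sketch of a soft-target regularizer: cross-entropy against a blend of
    the hard one-hot labels and an EMA of the model's own past predictions."""

    def __init__(self, num_classes, dataset_size, beta=0.9, gamma=0.5):
        self.num_classes = num_classes
        self.dataset_size = dataset_size
        self.beta = beta    # EMA decay for accumulated predictions (assumed value)
        self.gamma = gamma  # weight on soft targets vs. hard labels (assumed value)
        self.ema = None     # lazily allocated per-example running average of predictions

    def __call__(self, logits, targets, indices):
        # Allocate the EMA buffer on first use, starting from a uniform distribution.
        if self.ema is None:
            self.ema = torch.full((self.dataset_size, self.num_classes),
                                  1.0 / self.num_classes, device=logits.device)

        # Update the running average of this batch's predictions (no gradient through it).
        probs = F.softmax(logits.detach(), dim=1)
        self.ema[indices] = self.beta * self.ema[indices] + (1.0 - self.beta) * probs

        # Blend the accumulated soft predictions with the hard one-hot labels.
        hard = F.one_hot(targets, self.num_classes).float()
        soft = self.gamma * self.ema[indices] + (1.0 - self.gamma) * hard

        # Cross-entropy of the model's output against the blended targets.
        log_probs = F.log_softmax(logits, dim=1)
        return -(soft * log_probs).sum(dim=1).mean()
```

In use, the dataloader would need to yield each example's dataset index alongside the inputs and labels so the EMA rows can be addressed, e.g. `loss = criterion(model(x), y, idx)`.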