
Date: [2022-09-08 Thu]

Soft Max

Softmax is a normalization function popular in neural networks.

It can be thought of as taking the logits, exponentiating them to get counts, then dividing each count by the sum of all counts. The outputs of softmax therefore lie between 0 and 1 and sum to 1, so softmax converts logits to probabilities.
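The steps above can be sketched in plain Python (the function name `softmax` is illustrative; subtracting the max before exponentiating is a common numerical-stability trick and does not change the result):

```python
import math

def softmax(logits):
    # Shift by the max so exp() never overflows; the result is unchanged.
    m = max(logits)
    # Exponentiate the shifted logits to get positive "counts".
    counts = [math.exp(x - m) for x in logits]
    # Normalize by the sum of counts so the outputs sum to 1.
    total = sum(counts)
    return [c / total for c in counts]

probs = softmax([2.0, 1.0, 0.1])
# Each probability is between 0 and 1, and together they sum to 1.
```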


