Softmax Activation Function Implementation. The softmax function is one of the most common activation functions in neural networks. Its formula is:
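
\[
\mathrm{softmax}(x_i) = \frac{e^{x_i}}{\sum_{j=1}^{n} e^{x_j}}
\]

where each input score \(x_i\) is exponentiated and then normalized by the sum of all exponentiated scores, so the outputs are non-negative and sum to 1.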

The reference implementation is as follows:

import math

def softmax(scores: list[float]) -> list[float]:
    # Exponentiate each score, then normalize so the outputs sum to 1
    exp_scores = [math.exp(score) for score in scores]
    sum_exp_scores = sum(exp_scores)
    # Round each probability to 4 decimal places for readability
    probabilities = [round(score / sum_exp_scores, 4) for score in exp_scores]
    return probabilities
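
As a quick sanity check, calling the function on a small score vector (the values below are illustrative, not from the original text) returns probabilities that sum to 1:

if __name__ == "__main__":
    # Example input; expected output is approximately [0.09, 0.2447, 0.6652]
    print(softmax([1.0, 2.0, 3.0]))

One practical note: for large scores, math.exp can overflow. A common remedy is to subtract the maximum score before exponentiating, which leaves the result unchanged because the constant cancels in the ratio. A minimal sketch of that variant (the name softmax_stable is ours, chosen for illustration):

def softmax_stable(scores: list[float]) -> list[float]:
    # Subtracting the max score keeps exp() from overflowing for large inputs;
    # mathematically the result is identical to the plain version above.
    max_score = max(scores)
    exp_scores = [math.exp(score - max_score) for score in scores]
    sum_exp_scores = sum(exp_scores)
    return [round(score / sum_exp_scores, 4) for score in exp_scores]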