Solving Multi-Class Classification Problems with PyTorch
The softmax function
The diabetes dataset has only two classes, 0 and 1, whereas the MNIST dataset has 10 classes. For multi-class output we want the network to produce competing results: every class probability should be greater than 0 and all of them should sum to 1. The softmax function gives us both properties.
The softmax function is defined as

$p(y = i) = \frac{e^{z_i}}{\sum_{j=0}^{K-1} e^{z_j}}, \quad i \in \{0, \dots, K-1\}$
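A quick sketch of these two properties, using `torch.softmax` on an arbitrary logit vector (the values here are just an illustration):

```python
import torch

z = torch.tensor([2.0, -1.0, 0.5])   # arbitrary logits
p = torch.softmax(z, dim=0)
print(p)          # every entry is positive
print(p.sum())    # the entries sum to 1
```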
CrossEntropyLoss vs NLLLoss
The NLLLoss loss function

NLLLoss is the negative log-likelihood loss. The NumPy example below computes it by hand: softmax turns the logits z into probabilities, and the loss is the negative log-probability of the true class, encoded here as the one-hot vector y.

```python
# -*- coding: UTF-8 -*-
import numpy as np

y = np.array([1, 0, 0])          # one-hot label: the true class is 0
z = np.array([0.2, 0.1, -0.1])   # raw network outputs (logits)

# softmax: exponentiate and normalize so the outputs sum to 1
y_pred = np.exp(z) / np.exp(z).sum()
print(y_pred)

# negative log-likelihood of the true class
loss = (-y * np.log(y_pred)).sum()
print(loss)
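For comparison, here is a small sketch of the same computation with PyTorch's `torch.nn.NLLLoss`. Note that NLLLoss expects log-probabilities (for example the output of log-softmax) and the target as a class index rather than a one-hot vector; the values below mirror the NumPy example.

```python
import torch

z = torch.tensor([[0.2, 0.1, -0.1]])   # logits for one sample, shape (batch, classes)
target = torch.tensor([0])              # true class index (not one-hot)

log_probs = torch.nn.functional.log_softmax(z, dim=1)
loss = torch.nn.NLLLoss()(log_probs, target)
print(loss.item())   # matches the NumPy result above
```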
The CrossEntropyLoss loss function

CrossEntropyLoss = Softmax + NLLLoss. In PyTorch, `torch.nn.CrossEntropyLoss` applies log-softmax and NLLLoss in a single step, so it takes the raw logits directly, and the targets are given as class indices rather than one-hot vectors.

```python
# -*- coding: UTF-8 -*-
import torch

y = torch.LongTensor([0, 1])                           # class indices for two samples
z = torch.Tensor([[0.3, 0.1, -0.1], [0.1, 1, -0.1]])   # raw logits, shape (batch, classes)

loss_function = torch.nn.CrossEntropyLoss()            # softmax + NLL applied internally
loss = loss_function(z, y)
print(loss.item())
```
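As a quick sanity check of this equivalence, chaining LogSoftmax and NLLLoss explicitly reproduces the CrossEntropyLoss value:

```python
import torch

y = torch.LongTensor([0, 1])
z = torch.Tensor([[0.3, 0.1, -0.1], [0.1, 1, -0.1]])

log_probs = torch.nn.LogSoftmax(dim=1)(z)
nll = torch.nn.NLLLoss()(log_probs, y)
ce = torch.nn.CrossEntropyLoss()(z, y)
print(nll.item(), ce.item())   # the two numbers are identical
```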
The MNIST dataset

MNIST consists of 28×28 grayscale images of handwritten digits in 10 classes, which makes CrossEntropyLoss the natural training objective.
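A minimal sketch of a fully connected MNIST classifier along these lines, assuming torchvision is available for loading the data; the layer sizes and hyperparameters are illustrative choices, not values from the post:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

# Commonly used normalization constants for MNIST (mean, std of pixel values)
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,))
])

train_dataset = datasets.MNIST(root='./data', train=True, download=True, transform=transform)
test_dataset = datasets.MNIST(root='./data', train=False, download=True, transform=transform)
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
test_loader = DataLoader(test_dataset, batch_size=64, shuffle=False)

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 512)
        self.fc2 = nn.Linear(512, 128)
        self.fc3 = nn.Linear(128, 10)   # 10 output logits, one per digit class

    def forward(self, x):
        x = x.view(-1, 784)             # flatten the 28x28 images
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)              # raw logits: CrossEntropyLoss applies softmax itself

model = Net()
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.5)

for epoch in range(3):                  # a few epochs are enough to see the loss drop
    for images, labels in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: loss {loss.item():.4f}')

# accuracy on the test set
correct, total = 0, 0
with torch.no_grad():
    for images, labels in test_loader:
        predictions = model(images).argmax(dim=1)
        correct += (predictions == labels).sum().item()
        total += labels.size(0)
print(f'test accuracy: {correct / total:.4f}')
```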
Kaggle: Otto Group Product Classification Challenge

The Otto challenge is another multi-class problem: each product is described by 93 numerical features and must be assigned to one of 9 categories, so the same CrossEntropyLoss setup applies.
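A minimal sketch of how such a tabular multi-class model could be set up, assuming the competition's train.csv has been downloaded locally; the file path, column layout, and network sizes are assumptions for illustration:

```python
import pandas as pd
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

# Assumed layout of the Kaggle file: an 'id' column, 93 feature columns,
# and a 'target' column with labels such as 'Class_1' ... 'Class_9'.
df = pd.read_csv('train.csv')
features = torch.tensor(df.drop(columns=['id', 'target']).values, dtype=torch.float32)
labels = torch.tensor(pd.Categorical(df['target']).codes, dtype=torch.long)

loader = DataLoader(TensorDataset(features, labels), batch_size=64, shuffle=True)

model = nn.Sequential(            # illustrative layer sizes
    nn.Linear(93, 64),
    nn.ReLU(),
    nn.Linear(64, 32),
    nn.ReLU(),
    nn.Linear(32, 9),             # 9 product categories -> 9 output logits
)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

for epoch in range(5):
    for x, y in loader:
        optimizer.zero_grad()
        loss = criterion(model(x), y)
        loss.backward()
        optimizer.step()
    print(f'epoch {epoch}: loss {loss.item():.4f}')
```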