# DAY 41 Grad-CAM and Hook Functions
Knowledge review:
1. Callback functions
2. lambda functions
3. Hook functions: module hooks and tensor hooks
4. A Grad-CAM example
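The first three items can be sketched together in a few lines. The tiny model below is hypothetical and only for illustration; `Module.register_forward_hook` (a module hook) and `Tensor.register_hook` (a tensor hook) are the actual PyTorch APIs, and both take a callback, here a named function and a lambda respectively.

```python
import torch
import torch.nn as nn

# A tiny model to attach hooks to (hypothetical, for illustration only)
model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))

captured = {}

# Module hook: a callback PyTorch invokes right after this layer's forward pass
def save_output(module, inputs, output):
    captured["relu_out"] = output.detach()

handle = model[1].register_forward_hook(save_output)

# Tensor hook: a callback invoked when this tensor's gradient is computed;
# a lambda serves as the callback and doubles the incoming gradient
x = torch.randn(3, 4, requires_grad=True)
x.register_hook(lambda grad: grad * 2)

model(x).sum().backward()
handle.remove()  # remove hooks once they are no longer needed

print(captured["relu_out"].shape)  # activation captured by the module hook
print(x.grad.shape)
```

The dict-plus-callback pattern is the usual way to pull intermediate activations out of a model without editing its `forward`.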
Pretrained-model knowledge review:
1. The concept of pretraining
2. Common pretrained classification models
3. The history of image pretrained models
4. Pretraining strategies
5. Pretraining code in practice: resnet18
Homework:
- On CIFAR-10, compare other pretrained models and observe the differences; try to pick models different from everyone else's.
- Ctrl+click into the resnet source code and see what the residual actually is.
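What the second homework item leads you to is, in essence, the block below: a shortcut that adds the input back onto the output of the convolutional branch. This is a simplified re-sketch of torchvision's `BasicBlock` (stride 1, no downsample path), not the real class.

```python
import torch
import torch.nn as nn

class BasicBlockSketch(nn.Module):
    """Simplified version of torchvision's resnet BasicBlock."""
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)
        self.relu = nn.ReLU()

    def forward(self, x):
        identity = x                          # save the input for the shortcut
        out = self.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        out = out + identity                  # the "residual": add the input back
        return self.relu(out)

block = BasicBlockSketch(16)
y = block(torch.randn(2, 16, 8, 8))
print(y.shape)  # same shape as the input
```

Because the block only has to learn the difference `F(x) = out - x` rather than a full mapping, gradients flow through the shortcut even when the conv branch contributes little, which is what lets resnets go deep.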
# AlexNet model and training

```python
import torch
import torch.nn as nn
import torchvision.models as models
import torchvision.transforms as transforms
import torchvision.datasets as datasets
from torch.utils.data import DataLoader

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')

# Data preprocessing. AlexNet expects 224x224 inputs, so CIFAR-10's 32x32
# images must be resized; augmentation is applied to the training set only.
train_transform = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomCrop(32, padding=4),
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
])
test_transform = transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
])

# Load the CIFAR-10 dataset
train_dataset = datasets.CIFAR10(root='./data', train=True, download=True, transform=train_transform)
test_dataset = datasets.CIFAR10(root='./data', train=False, download=True, transform=test_transform)
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
test_loader = DataLoader(test_dataset, batch_size=64, shuffle=False)

# Load pretrained AlexNet and replace the classifier head for CIFAR-10's 10 classes
model = models.alexnet(pretrained=True)
num_ftrs = model.classifier[6].in_features
model.classifier[6] = nn.Linear(num_ftrs, 10)
model = model.to(device)

# Loss function and optimizer
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Train the model
for epoch in range(10):
    model.train()
    for inputs, labels in train_loader:
        inputs, labels = inputs.to(device), labels.to(device)
        optimizer.zero_grad()
        outputs = model(inputs)
        loss = criterion(outputs, labels)
        loss.backward()
        optimizer.step()
    print(f'Epoch {epoch+1}, Loss: {loss.item()}')

# Evaluate on the test set
model.eval()
correct = 0
total = 0
with torch.no_grad():
    for inputs, labels in test_loader:
        inputs, labels = inputs.to(device), labels.to(device)
        outputs = model(inputs)
        _, predicted = torch.max(outputs, 1)
        total += labels.size(0)
        correct += (predicted == labels).sum().item()
print(f'Accuracy: {100 * correct / total:.2f}%')
```