2025/9/23 23:07:42
ResNet (Residual Network) is a deep convolutional neural network architecture proposed in 2015 by Kaiming He and colleagues at Microsoft Research. ResNet was designed to address the vanishing- and exploding-gradient problems that arise when training very deep networks, and thereby to further improve network performance. Below is a ResNet implementation in the PyTorch framework, showing how to build the basic pieces: a residual block (ResBlk) and ResNet-18. The code is organized into two files, model.py (the model) and train.py (the training script).
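Before the full model, the core residual idea y = x + F(x) can be illustrated in isolation. The sketch below uses a toy one-layer module (not the article's ResBlk) and zeroes the learned branch to show that the identity path carries the gradient straight through — the property that lets very deep ResNets train:

```python
import torch
from torch import nn

# Minimal sketch of the residual idea y = x + F(x). ToyResidual is a
# hypothetical toy layer, not the article's ResBlk.
class ToyResidual(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.fc = nn.Linear(dim, dim)

    def forward(self, x):
        # skip connection: identity plus the learned residual branch
        return x + self.fc(x)

block = ToyResidual(8)
# Zero the learned branch so only the identity path remains.
with torch.no_grad():
    block.fc.weight.zero_()
    block.fc.bias.zero_()

x = torch.randn(4, 8, requires_grad=True)
block(x).sum().backward()
print(x.grad)  # all ones: the gradient flows untouched through the skip connection
```

Even with nonzero weights, the gradient of the output with respect to x always contains this identity term, so it cannot vanish through the block.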
model.py — first, import the required packages:
import torch
from torch import nn
from torch.nn import functional as F

Next, define the residual block class ResBlk:

class ResBlk(nn.Module):
    def __init__(self, ch_in, ch_out):
        super(ResBlk, self).__init__()
        self.conv1 = nn.Conv2d(ch_in, ch_out, kernel_size=3, stride=1, padding=1)
        self.bn1 = nn.BatchNorm2d(ch_out)
        self.conv2 = nn.Conv2d(ch_out, ch_out, kernel_size=3, stride=1, padding=1)
        self.bn2 = nn.BatchNorm2d(ch_out)
        # shortcut branch: a 1x1 conv matches the channel count when ch_in != ch_out
        self.extra = nn.Sequential()
        if ch_out != ch_in:
            self.extra = nn.Sequential(
                nn.Conv2d(ch_in, ch_out, kernel_size=1, stride=1),
                nn.BatchNorm2d(ch_out)
            )

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = F.relu(self.bn2(self.conv2(out)))
        # element-wise add of the shortcut and the residual branch
        out = self.extra(x) + out
        return out

Finally, stack the residual blocks according to the ResNet-18 structure:

class ResNet18(nn.Module):
    def __init__(self):
        super(ResNet18, self).__init__()
        self.conv1 = nn.Sequential(
            nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1),
            nn.BatchNorm2d(64)
        )
        self.blk1 = ResBlk(64, 128)
        self.blk2 = ResBlk(128, 256)
        self.blk3 = ResBlk(256, 512)
        self.blk4 = ResBlk(512, 1024)
        # in_features must match blk4's output channels (1024)
        self.outlayer = nn.Linear(1024, 10)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        x = self.blk1(x)
        x = self.blk2(x)
        x = self.blk3(x)
        x = self.blk4(x)
        # print('after conv:', x.shape)
        # global average pooling, then flatten for the linear layer
        x = F.adaptive_avg_pool2d(x, [1, 1])
        x = x.view(x.size(0), -1)
        x = self.outlayer(x)
        return x

The intermediate feature-map shapes needed while wiring up the network (for example, the in_features of the final linear layer) can be obtained with the following test routine:
def main():
    blk = ResBlk(3, 64)
    tmp = torch.randn(2, 3, 32, 32)
    out = blk(tmp)
    print('block:', out.shape)

    x = torch.randn(2, 3, 32, 32)
    model = ResNet18()
    out = model(x)
    print('resnet:', out.shape)

if __name__ == '__main__':
    main()
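The pool-and-flatten step at the end of forward() can also be checked on its own. A minimal sketch (the input spatial size 4x4 is illustrative; the channel count must match blk4's output):

```python
import torch
from torch.nn import functional as F

feat = torch.randn(2, 1024, 4, 4)             # pretend output of blk4
pooled = F.adaptive_avg_pool2d(feat, [1, 1])  # -> [2, 1024, 1, 1]
flat = pooled.view(pooled.size(0), -1)        # -> [2, 1024], ready for nn.Linear(1024, 10)
print(flat.shape)
```

Whatever the spatial size coming out of blk4, adaptive average pooling reduces it to 1x1, so only the channel count constrains the linear layer.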
train.py — first, import the required packages:
import torch
from torchvision import datasets
from torchvision import transforms
from torch import nn, optim
from torch.utils.data import DataLoader
from model import ResNet18

Then define the main() function:
def main():
    batchsz = 32
    cifar_train = datasets.CIFAR10('cifar', True, transform=transforms.Compose([
        transforms.Resize((32, 32)),
        transforms.ToTensor()
    ]), download=True)
    cifar_train = DataLoader(cifar_train, batch_size=batchsz, shuffle=True)

    cifar_test = datasets.CIFAR10('cifar', False, transform=transforms.Compose([
        transforms.Resize((32, 32)),
        transforms.ToTensor()
    ]), download=True)
    cifar_test = DataLoader(cifar_test, batch_size=batchsz, shuffle=True)

    x, label = next(iter(cifar_train))
    print('x:', x.shape, 'label:', label.shape)

    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    model = ResNet18().to(device)
    criteon = nn.CrossEntropyLoss()
    optimizer = optim.Adam(model.parameters(), lr=1e-3)
    print(model)

    for epoch in range(100):
        model.train()  # enable BatchNorm's training behavior
        for batchidx, (x, label) in enumerate(cifar_train):
            x, label = x.to(device), label.to(device)
            logits = model(x)
            loss = criteon(logits, label)

            optimizer.zero_grad()
            loss.backward()
            optimizer.step()

        print(epoch, 'loss:', loss.item())

        model.eval()  # switch BatchNorm to inference mode for evaluation
        with torch.no_grad():
            total_correct = 0
            total_num = 0
            for x, label in cifar_test:
                x, label = x.to(device), label.to(device)
                logits = model(x)
                pred = logits.argmax(dim=1)
                total_correct += torch.eq(pred, label).float().sum().item()
                total_num += x.size(0)
            acc = total_correct / total_num
            print(epoch, 'acc:', acc)

if __name__ == '__main__':
    main()
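After training, the learned weights can be persisted with torch.save and restored with load_state_dict. This is a hypothetical follow-up not shown in the original article; a tiny stand-in module is used here so the sketch runs without the full training loop, and the filename is illustrative:

```python
import torch
from torch import nn

# Stand-in for the trained ResNet18 from main(); any nn.Module works the same way.
model = nn.Linear(4, 2)
torch.save(model.state_dict(), 'resnet18_cifar10.pt')  # filename is an assumption

# Restore into a freshly constructed module with the same architecture.
restored = nn.Linear(4, 2)
restored.load_state_dict(torch.load('resnet18_cifar10.pt'))
print(torch.equal(model.weight, restored.weight))  # True
```

Saving the state_dict (rather than the whole model object) is the idiomatic choice, since it stays loadable even if the surrounding code changes.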