Here's the situation:
While loading a pretrained state dict recently, I was surprised to find that the number of missing keys reported by load_state_dict doesn't match the length of model.state_dict(). I wrote a small test script below; could someone take a look?
from models.resnet import pretrained_resnet101 as resnet101
import torch.nn as nn

model, _ = resnet101(16, 112, pretrained_resnet101_path='../CTEN-main/models/resnet-101-kinetics.pth')
model.fc = nn.Linear(model.fc.in_features, 400)
b = model.state_dict()
len(b)
# output:
# Loading pretrained 3D ResNet-101 ../CTEN-main/models/resnet-101-kinetics.pth
# 626
from collections import OrderedDict
import torch

new = OrderedDict()  # intentionally empty
# new['ttt'] = torch.tensor([5, 6, 7])
miss, unexp = model.load_state_dict(new, strict=False)
len(miss)
# output:
# 522
Shouldn't len(miss) also be 626 here?!
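A likely explanation (my reading of PyTorch's BatchNorm loading code, not something confirmed by the post above): every BatchNorm layer has a num_batches_tracked buffer that appears in state_dict(), but _BatchNorm._load_from_state_dict silently backfills it with 0 when it is absent from the incoming dict, so it is never reported in missing_keys. A 3D ResNet-101 has 104 BatchNorm layers, which would account for 626 - 522 = 104. The effect can be reproduced with a tiny model:

```python
import torch.nn as nn

# Minimal sketch: one conv layer plus one BatchNorm layer.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))

# 7 keys: conv weight+bias, bn weight+bias,
# running_mean, running_var, num_batches_tracked
print(len(model.state_dict()))  # 7

# Load an empty dict: everything should be "missing",
# yet num_batches_tracked is backfilled instead of reported.
miss, unexp = model.load_state_dict({}, strict=False)
print(len(miss))  # 6, not 7
print([k for k in miss if 'num_batches_tracked' in k])  # []
```

To confirm this on the real model, you could compare set(b.keys()) - set(miss) and check whether the leftover keys all end in num_batches_tracked.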