Question about testing #41

Open
purple7seven opened this issue Sep 14, 2019 · 1 comment

@purple7seven

Thanks for your project. I trained the model on my own dataset, but when I test the model, I get the following error.

Network structure: [DataParallel - SRFBN], with parameters: [3,631,478]

===> Loading model from [./experiments/SRFBN_in3f32_x4/epochs/best_ckp.pth]...
Traceback (most recent call last):
File "test.py", line 106, in
main()
File "test.py", line 35, in main
solver = create_solver(opt)
File "/home/xuhuali/projects/SRFBN_CVPR19/solvers/init.py", line 5, in create_solver
solver = SRSolver(opt)
File "/home/xuhuali/projects/SRFBN_CVPR19/solvers/SRSolver.py", line 70, in init
self.load()
File "/home/xuhuali/projects/SRFBN_CVPR19/solvers/SRSolver.py", line 309, in load
load_func(checkpoint)
File "/home/xuhuali/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 777, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for DataParallel:
Missing key(s) in state_dict: "module.block.upBlocks.3.0.weight", "module.block.upBlocks.3.0.bias", "module.block.upBlocks.3.1.weight", "module.block.upBlocks.4.0.weight", "module.block.upBlocks.4.0.bias", "module.block.upBlocks.4.1.weight", "module.block.upBlocks.5.0.weight", "module.block.upBlocks.5.0.bias", "module.block.upBlocks.5.1.weight", "module.block.downBlocks.3.0.weight", "module.block.downBlocks.3.0.bias", "module.block.downBlocks.3.1.weight", "module.block.downBlocks.4.0.weight", "module.block.downBlocks.4.0.bias", "module.block.downBlocks.4.1.weight", "module.block.downBlocks.5.0.weight", "module.block.downBlocks.5.0.bias", "module.block.downBlocks.5.1.weight", "module.block.uptranBlocks.2.0.weight", "module.block.uptranBlocks.2.0.bias", "module.block.uptranBlocks.2.1.weight", "module.block.uptranBlocks.3.0.weight", "module.block.uptranBlocks.3.0.bias", "module.block.uptranBlocks.3.1.weight", "module.block.uptranBlocks.4.0.weight", "module.block.uptranBlocks.4.0.bias", "module.block.uptranBlocks.4.1.weight", "module.block.downtranBlocks.2.0.weight", "module.block.downtranBlocks.2.0.bias", "module.block.downtranBlocks.2.1.weight", "module.block.downtranBlocks.3.0.weight", "module.block.downtranBlocks.3.0.bias", "module.block.downtranBlocks.3.1.weight", "module.block.downtranBlocks.4.0.weight", "module.block.downtranBlocks.4.0.bias", "module.block.downtranBlocks.4.1.weight".
size mismatch for module.conv_in.0.weight: copying a param with shape torch.Size([128, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([256, 3, 3, 3]).
size mismatch for module.conv_in.0.bias: copying a param with shape torch.Size([128]) from checkpoint, the shape in current model is torch.Size([256]).
size mismatch for module.feat_in.0.weight: copying a param with shape torch.Size([32, 128, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 256, 1, 1]).
size mismatch for module.feat_in.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.compress_in.0.weight: copying a param with shape torch.Size([32, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 128, 1, 1]).
size mismatch for module.block.compress_in.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.upBlocks.0.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.upBlocks.0.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.upBlocks.1.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.upBlocks.1.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.upBlocks.2.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.upBlocks.2.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.downBlocks.0.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.downBlocks.0.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.downBlocks.1.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.downBlocks.1.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.downBlocks.2.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.block.downBlocks.2.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.uptranBlocks.0.0.weight: copying a param with shape torch.Size([32, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 128, 1, 1]).
size mismatch for module.block.uptranBlocks.0.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.uptranBlocks.1.0.weight: copying a param with shape torch.Size([32, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 192, 1, 1]).
size mismatch for module.block.uptranBlocks.1.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.downtranBlocks.0.0.weight: copying a param with shape torch.Size([32, 64, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 128, 1, 1]).
size mismatch for module.block.downtranBlocks.0.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.downtranBlocks.1.0.weight: copying a param with shape torch.Size([32, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 192, 1, 1]).
size mismatch for module.block.downtranBlocks.1.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.block.compress_out.0.weight: copying a param with shape torch.Size([32, 96, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 384, 1, 1]).
size mismatch for module.block.compress_out.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.out.0.weight: copying a param with shape torch.Size([32, 32, 8, 8]) from checkpoint, the shape in current model is torch.Size([64, 64, 8, 8]).
size mismatch for module.out.0.bias: copying a param with shape torch.Size([32]) from checkpoint, the shape in current model is torch.Size([64]).
size mismatch for module.conv_out.0.weight: copying a param with shape torch.Size([3, 32, 3, 3]) from checkpoint, the shape in current model is torch.Size([3, 64, 3, 3]).

Is my data wrong, or is my PyTorch version wrong?
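The missing upBlocks.3-5 keys and the 32-vs-64 channel mismatches suggest the checkpoint was trained with a smaller network than the one test.py is building. One way to confirm is to print the parameter shapes stored in the checkpoint and compare them with the network that test.py prints. This is a minimal diagnostic sketch, not part of the repo; the "state_dict" wrapper key is an assumption about how the solver saves checkpoints.

```python
# Minimal diagnostic sketch (not from the repo): list the parameter shapes
# stored in the checkpoint so they can be compared against the model that
# test.py builds. The "state_dict" wrapper key is an assumption.
import torch

ckpt = torch.load("./experiments/SRFBN_in3f32_x4/epochs/best_ckp.pth",
                  map_location="cpu")
state = ckpt["state_dict"] if "state_dict" in ckpt else ckpt

for name, tensor in state.items():
    print(name, tuple(tensor.shape))
```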

@Paper99 (Owner) commented Sep 15, 2019

You may need to modify the network options in your test_*.json so that they match your training configuration (i.e., the network options in your train_*.json).
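For example, the shapes above suggest the checkpoint was trained with 32 features and 3 projection groups (only upBlocks 0-2 are present), while the test-time model is built with 64 features and 6 groups. Below is a hedged sketch of reading those two values back from the checkpoint before editing test_*.json; the parameter names come from the error message, while the "state_dict" wrapper and the mapping to the option names num_features and num_groups are assumptions based on the experiment folder name SRFBN_in3f32_x4.

```python
# Hypothetical helper (not part of the repo): infer the training hyper-parameters
# from the checkpoint so the "networks" section of test_*.json can be set to
# the same values used in train_*.json.
import torch

ckpt = torch.load("./experiments/SRFBN_in3f32_x4/epochs/best_ckp.pth",
                  map_location="cpu")
state = ckpt["state_dict"] if "state_dict" in ckpt else ckpt

# feat_in's output channels reflect the feature width (32 in the error above).
num_features = state["module.feat_in.0.weight"].shape[0]

# The number of distinct upBlocks indices reflects the number of groups
# (indices 0-2 appear in the checkpoint, so 3 groups here).
num_groups = len({k.split(".")[3] for k in state
                  if k.startswith("module.block.upBlocks.")})

print(f"Set num_features={num_features} and num_groups={num_groups} in test_*.json")
```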
