warnings.warn("Using a non-full backward hook when the forward contains multiple autograd Nodes "
Traceback (most recent call last):
  File "main_wad.py", line 236, in <module>
    main()
  File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.7/dist-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "main_wad.py", line 206, in demo4
    probs, ids = gcam.forward(images)
  File "/content/grad-cam-pytorch/grad_cam.py", line 31, in forward
    self.logits = self.model(image)
  File "/usr/local/lib/python3.7/dist-packages/torch/nn/modules/module.py", line 893, in _call_impl
    hook_result = hook(self, input, result)
  File "/content/grad-cam-pytorch/grad_cam.py", line 118, in forward_hook
    self.fmap_pool[key] = output.detach()
AttributeError: 'tuple' object has no attribute 'detach'
Thank you so much!
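The traceback shows that `forward_hook` in grad_cam.py calls `output.detach()`, but the hooked module handed it a tuple rather than a tensor, which happens when a module (or the model's own forward) returns more than one value. Below is a minimal sketch of a tuple-tolerant hook; the factory name `make_forward_hook` and the closure layout are assumptions, and only `fmap_pool`, `key`, and the `.detach()` line come from the traceback.

```python
def make_forward_hook(fmap_pool, key):
    # Sketch of a forward hook that tolerates tuple outputs; the factory
    # name and signature are hypothetical, not the repo's exact code.
    def forward_hook(module, input, output):
        # Some modules pass a tuple of tensors to the hook; keep the first
        # element so .detach() is valid.
        if isinstance(output, tuple):
            output = output[0]
        fmap_pool[key] = output.detach()
    return forward_hook
```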
The only difference between my model and ResNet-50 is the last three layers, shown below:
    (avgpool): AdaptiveAvgPool2d(output_size=(1, 1))
    (fc): Linear(in_features=2048, out_features=1000, bias=True)
  )
  (bottleneck): Sequential(
    (0): AdaptiveAvgPool2d(output_size=(1, 1))
    (1): Flatten(start_dim=1, end_dim=-1)
  )
  (head): Linear(in_features=2048, out_features=7, bias=True)
The error I get is the traceback shown above.
Thank you so much!
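If this model's forward returns several values (for example logits together with the bottleneck features), another workaround is to wrap it so the Grad-CAM code only ever sees a single tensor. A minimal sketch, assuming the hypothetical wrapper name `LogitsOnly` and that the first element of the returned tuple is the logits:

```python
import torch.nn as nn

class LogitsOnly(nn.Module):
    """Hypothetical wrapper: expose only the logits of a model whose
    forward may return a tuple such as (logits, features)."""

    def __init__(self, model):
        super().__init__()
        self.model = model

    def forward(self, x):
        out = self.model(x)
        # Assumption: the first element of a tuple output is the logits.
        return out[0] if isinstance(out, tuple) else out

# Hypothetical usage with the GradCAM class from grad_cam.py:
# gcam = GradCAM(model=LogitsOnly(my_model))
```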