Self-built tools for research.
This is a simple package that prints the forward and backward time of each layer of a PyTorch model.
You can use this tool in three steps:
- Install split_layer by running `pip3 install split_layer -U --user`.
- Find the file that defines the structure of your network and add the following code above the class definition:

  ```python
  from split_layer import split_layer_dec

  @split_layer_dec(__file__)
  class Net():
  ```

  Note: make sure that the input parameter of the forward function and every intermediate output are named `x`. For example:

  ```python
  x = F.relu(self.conv1(x))
  return x
  ```

  A complete sketch of a decorated network is shown after these steps.
- Replace `loss.backward()` with something like `net.backward(outputs)`. Then you can run your training code as usual (a training-loop sketch is shown after these steps).
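
Putting step 2 together, a decorated network might look like the minimal sketch below. The layer names, shapes, and the `nn.Module` base class are illustrative assumptions, not requirements of split_layer; only the decorator and the `x` naming convention come from the instructions above.

```python
import torch.nn as nn
import torch.nn.functional as F

from split_layer import split_layer_dec


# Hypothetical toy network: the layers and sizes are placeholders chosen
# for illustration; only the decorator and the `x` naming convention are
# taken from the split_layer instructions.
@split_layer_dec(__file__)
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 16, kernel_size=3)   # 32x32 input -> 30x30 output
        self.fc1 = nn.Linear(16 * 30 * 30, 10)

    def forward(self, x):
        # Keep the input and every intermediate result named `x`.
        x = F.relu(self.conv1(x))
        x = x.view(x.size(0), -1)
        x = self.fc1(x)
        return x
```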
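
For step 3, the training loop could then be adapted as in the following sketch. The dummy data, loss function, and optimizer are placeholders assumed only so the snippet runs end to end; the split_layer-specific change is replacing `loss.backward()` with `net.backward(outputs)`, as described above.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# Placeholder data, loss, and optimizer; the only split_layer-specific
# change is the backward call marked below.
net = Net()                              # the decorated class from step 2
criterion = nn.CrossEntropyLoss()
optimizer = optim.SGD(net.parameters(), lr=0.01)

inputs = torch.randn(4, 3, 32, 32)       # dummy batch matching the toy Net above
labels = torch.randint(0, 10, (4,))

optimizer.zero_grad()
outputs = net(inputs)
loss = criterion(outputs, labels)
# loss.backward()                        # original call
net.backward(outputs)                    # split_layer's replacement (step 3)
optimizer.step()
```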
- It is built according to the accepted answer of this question. For now it is not flexible enough, and it does NOT support DataParallel (DP) or DistributedDataParallel (DDP) models. We will develop it further in the future.
- It works on both CPU and GPU. Make sure `inspect` and `torch` have been installed.