load model failed with custom torch.nn.Module class: 2022-07-18 01:58:04,499 [INFO ] W-9000-model_1-stdout org.pytorch.serve.wlm.WorkerLifeCycle - AttributeError: Can't get attribute 'transformer_Model' on <module '__main__' from '/opt/conda/lib/python3.6/site-packages/ts/model_service_worker.py'> #3246
When I try to deploy a custom neural network, defined like so, I get the error:

2022-07-18 01:58:04,499 [INFO ] W-9000-model_1-stdout org.pytorch.serve.wlm.WorkerLifeCycle - AttributeError: Can't get attribute 'transformer_Model' on <module '__main__' from '/opt/conda/lib/python3.6/site-packages/ts/model_service_worker.py'>
I believe this is because `type(neural_net)` is `__main__.Model`. So when I use SageMaker's `PyTorchModel` class to deploy, the serving worker cannot resolve the model's class and errors out. This shouldn't be the case, though, because the class above is the standard way to create custom PyTorch classifiers.

To further elaborate on how I'm creating my endpoint: in a SageMaker notebook I run
in order to instantiate the model, and in another cell I run

to deploy.
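If it helps, here is a minimal sketch (no torch needed) of what I think is going on: `torch.save` pickles the model, and pickle stores the class by *reference* (module name plus class name), not the class body, so a class that only exists in the notebook's `__main__` cannot be found by the worker process. The `fake_notebook` module below is a hypothetical stand-in for the notebook's `__main__`.

```python
import io
import pickle
import sys
import types

# Hypothetical stand-in for the notebook's __main__ module.
nb = types.ModuleType("fake_notebook")
sys.modules["fake_notebook"] = nb

class Model:
    pass

# Make the class look like it was defined inside that "notebook" module.
Model.__module__ = "fake_notebook"
nb.Model = Model

buf = io.BytesIO()
pickle.dump(Model(), buf)  # succeeds: the class is resolvable right now

# The TorchServe worker is a different process whose importable modules
# do not contain the notebook-defined class; simulate that:
del nb.Model

buf.seek(0)
try:
    pickle.load(buf)
    error_message = None
except AttributeError as exc:
    # Mirrors the "Can't get attribute 'transformer_Model'" failure mode.
    error_message = str(exc)

print(error_message)
```

The same mechanism explains why the error names `model_service_worker.py`: that file is the worker's entry point, so its `__main__` has no `transformer_Model` attribute to look up.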
Just to clarify, I have tested this code and endpoint on various different PyTorch models, and they only fail when I create a PyTorch model using the above

class Model(nn.Module):
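For what it's worth, a common workaround for this class of error (a sketch, not necessarily the exact fix for this setup) is to avoid pickling the class at all: keep the `nn.Module` subclass in an importable file (e.g. a hypothetical `model.py` shipped with the inference entry point) and save/load only the `state_dict`, so the worker never needs to resolve `__main__.Model`. The layer sizes below are made up for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical minimal model; in practice this class would live in an
# importable module (e.g. model.py in the SageMaker source_dir), not
# only in a notebook cell.
class Model(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return self.fc(x)

model = Model()
# Save only the weights; no class object ends up in the pickle.
torch.save(model.state_dict(), "model.pth")

# On the serving side (e.g. inside model_fn), rebuild the model from
# the importable class and load the weights back in:
restored = Model()
restored.load_state_dict(torch.load("model.pth"))
restored.eval()
```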