How does SageMaker know which AMI to use for an instance when using the Estimator class to run jobs?
#4047 · Unanswered
SebastianScherer88 asked this question in Help
Replies: 1 comment
-
The concern here is: if I provide my own custom image (for, say, Neuron), how do I make sure that the instance SageMaker provisions to run that job has the correct AMI and Neuron kernel dependencies?
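A minimal sketch of what this looks like from the SDK side, not an authoritative answer: with a custom image you only control the container, while the host AMI is chosen by the managed training service based on the instance type you request (as I understand it, that includes the matching accelerator driver, e.g. the Neuron driver on trn1/inf2 hosts). The image URI, role ARN, and S3 paths below are placeholders.

```python
from sagemaker.estimator import Estimator

# Custom-image estimator targeting a Trainium instance. SageMaker provisions the
# host (and its AMI/driver stack) for ml.trn1.*; the container you supply only
# needs the user-space Neuron libraries your training code uses.
estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-neuron-training:latest",  # hypothetical custom image
    role="arn:aws:iam::123456789012:role/MySageMakerRole",                               # hypothetical role
    instance_type="ml.trn1.2xlarge",
    instance_count=1,
    output_path="s3://my-bucket/output",                                                 # hypothetical bucket
)

# Launch the job; the "training" channel S3 URI is also a placeholder.
estimator.fit({"training": "s3://my-bucket/train-data"})
```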
-
I am aware that it is possible to specify
but is it possible to also specify the node's setup, either via an AMI specification or an additional node-template-like setup script?
If SageMaker also resolves the AMI based on the Python, PyTorch, Hugging Face etc. parameters, then how does that work if I specify my own Docker image? Many custom hardware configurations (GPU, Neuron) require kernel installation at the node's OS level. How does that work in general, and what does it look like when providing a custom image? See the sketch below for the two ways the image itself gets resolved.
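A minimal sketch of the two ways the training image is determined, assuming the SageMaker Python SDK. Note that both paths resolve the container image, not the host AMI; the AMI of the underlying instance is managed by the training service. The role ARN, script name, region, and framework versions below are placeholders and may need adjusting.

```python
import sagemaker
from sagemaker.estimator import Estimator
from sagemaker.pytorch import PyTorch

# 1) Framework-based resolution: the SDK looks up an AWS Deep Learning Container
#    URI from the framework version, Python version, and instance type.
resolved_uri = sagemaker.image_uris.retrieve(
    framework="pytorch",
    region="us-east-1",
    version="2.1",
    py_version="py310",
    instance_type="ml.p3.2xlarge",
    image_scope="training",
)
print(resolved_uri)

framework_estimator = PyTorch(
    entry_point="train.py",            # hypothetical training script
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    framework_version="2.1",
    py_version="py310",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
)

# 2) Explicit custom image: image_uri is passed directly and no framework-based
#    resolution happens. Only instance_type determines the hardware (and, by
#    extension, the host that SageMaker provisions to run the container).
custom_estimator = Estimator(
    image_uri="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-training-image:latest",  # hypothetical custom image
    role="arn:aws:iam::123456789012:role/MySageMakerRole",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
)
```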