Wouldn’t it be nice if you never lost your trained ML models on Google Colab? Rclone to the rescue.
Image by author.
Do you ever start training a machine learning model on Google Colab, only for the runtime to disconnect before you can save your results? It used to happen to me, and I would get frustrated. I would leave my model training, come back a few hours later, and find it had finished without saving!
Image by author.
So what can you do about this? Use Rclone¹, Google Drive², and Google Colab³ together, following these three major steps:
1) Install Rclone.
2) Configure Rclone.
3) Export with Rclone.
Do this in the very first cell of your Google Colab Jupyter Notebook.
<span id="3c3b" class="hr mn lk gu mo b dn mp mq s mr">! curl https://rclone.org/install.sh | sudo bash</span>
Configure the newly installed Rclone.
<span id="ff6d" class="hr mn lk gu mo b dn mp mq s mr">!rclone config</span>
Go to the link it generates for you.
Sign in with your Google account to grant Rclone access to Google Drive, and copy the verification code it gives you.
Paste that code back into the notebook where it tells you to Enter verification code.
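For reference, the interactive prompts look roughly like the sketch below. The exact wording varies by rclone version, and the answers shown are my suggested defaults rather than part of the original article; naming the remote remote keeps it consistent with the copy command used later.
n/s/q> n             # create a new remote
name> remote         # call it "remote" so it matches the copy command below
Storage> drive       # select Google Drive as the backend
...                  # accept the defaults for client_id, client_secret, scope, etc.
Use auto config? n   # answer "n" on Colab, since there is no local browser
# rclone then prints the sign-in link and asks for the verification code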
Add an export after your model finishes training.
Copy your current “/content/” directory in Google Colab to wherever you want on your Google Drive through the Rclone remote we just configured.
<span id="ebce" class="hr mn lk gu mo b dn mp mq s mr">!rclone copy "/content/" remote:"/YOUR_PATH_TO_GDRIVE_DESIRED_LOCATION/"</span>
So after following those three major steps outlined above, you should be able to back up whatever models you are training with Google Colab (assuming they finish).
Here is an example Jupyter Notebook that implements these steps from start to finish.
For that example, you can see the trained models stored on my Google Drive under content:
Image by author.
You can find more examples in this GitHub repository I made, which contains notebooks for various TensorFlow⁴ machine learning problems.
I hope you found this article helpful! Thank you for reading, and hit the like button if so. Follow me here for more machine learning content coming soon!
You can also follow me on LinkedIn: https://www.linkedin.com/in/stevensmiley1989/
1. Rclone. https://rclone.org/
2. Google Drive. https://g.co/kgs/qu7aAY
3. Google Colab. https://colab.research.google.com/
4. TensorFlow. Martín Abadi, Ashish Agarwal, Paul Barham, Eugene Brevdo, Zhifeng Chen, Craig Citro, Greg S. Corrado, Andy Davis, Jeffrey Dean, Matthieu Devin, Sanjay Ghemawat, Ian Goodfellow, Andrew Harp, Geoffrey Irving, Michael Isard, Rafal Jozefowicz, Yangqing Jia, Lukasz Kaiser, Manjunath Kudlur, Josh Levenberg, Dan Mané, Mike Schuster, Rajat Monga, Sherry Moore, Derek Murray, Chris Olah, Jonathon Shlens, Benoit Steiner, Ilya Sutskever, Kunal Talwar, Paul Tucker, Vincent Vanhoucke, Vijay Vasudevan, Fernanda Viégas, Oriol Vinyals, Pete Warden, Martin Wattenberg, Martin Wicke, Yuan Yu, and Xiaoqiang Zheng. TensorFlow: Large-scale machine learning on heterogeneous systems, 2015. Software available from tensorflow.org.