This is the repository for solving the AI Engineer Party challenge. The pipeline:
- Download the data and upload it to BigQuery.
- Load the data and train with AutoML.
- Store the trained weights in Cloud Storage.
- Deploy the model: load the weights from storage, then run inference.
- Set your GCE region. In my case, us-central1.
$ gcloud config set compute/region us-central1
- Install the Google Cloud SDK: https://cloud.google.com/sdk/docs/quickstarts
- Install the Google Cloud Python client libraries.
$ pip install --upgrade google-cloud-storage
$ pip install --upgrade google-cloud-bigquery
- Download the MNIST dataset.
$ python tools/make_data.py
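For reference, a minimal sketch of the idea behind tools/make_data.py (the repo's script is authoritative; this assumes torchvision is available and that images are serialized as hex strings, which may differ from the actual encoding):

import gzip
from torchvision import datasets

def dump(path, train):
    ds = datasets.MNIST("data", train=train, download=True)
    with gzip.open(path, "wt") as f:
        for key, (img, label) in enumerate(ds):
            # serialize each 28x28 image as a hex string, ':'-delimited
            # to match the bq schema used in the next step
            f.write(f"{key}:{img.tobytes().hex()}:{label}\n")

dump("data/train.txt.gz", train=True)
dump("data/test.txt.gz", train=False)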
- Upload to BigQuery.
$ bq load --source_format=CSV -F":" mnist.train data/train.txt.gz \
  "key:integer,image:string,label:integer"
$ bq load --source_format=CSV -F":" mnist.test data/test.txt.gz \
  "key:integer,image:string,label:integer"
- Check that the BigQuery Python API works.
from google.cloud import bigquery

client = bigquery.Client()
query = "SELECT image, label FROM mnist.train"
query_job = client.query(query)  # API request - starts the query
for row in query_job:  # API request - fetches results
    print(row)
- Save data from BigQuery.
$ python tools/data_from_bq.py
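A minimal sketch of what tools/data_from_bq.py does (assuming the hex encoding above and an .npz output path, both of which are illustrative):

from google.cloud import bigquery
import numpy as np

client = bigquery.Client()
rows = client.query("SELECT image, label FROM mnist.train").result()
images, labels = [], []
for row in rows:
    # decode the hex string back into a 28x28 uint8 array
    images.append(np.frombuffer(bytes.fromhex(row.image), dtype=np.uint8).reshape(28, 28))
    labels.append(row.label)
np.savez("data/train.npz", image=np.array(images), label=np.array(labels))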
- Install Microsoft's NNI AutoML toolkit.
$ pip install --upgrade nni
See https://github.com/microsoft/nni for details.
- Train using AutoML.
$ nnictl create --config config.yml
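The trial code needs only a thin NNI integration: pull the tuner's hyperparameters at startup and report metrics back. A sketch, with a random stand-in for the real MNIST training loop (the parameter name "lr" is an assumption about the search space referenced by config.yml):

import random
import nni

params = nni.get_next_parameter()  # hyperparameters chosen by the tuner
lr = params.get("lr", 0.01)

for epoch in range(10):
    accuracy = random.random()  # stand-in for one real training epoch
    nni.report_intermediate_result(accuracy)  # per-epoch feedback to the tuner

nni.report_final_result(accuracy)  # the final metric the tuner optimizes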
- Create a Google Storage bucket and upload the saved model state_dict.
$ PROJECT_ID=$(gcloud config list project --format "value(core.project)")
$ BUCKET="${PROJECT_ID}-ml"
# create the bucket
$ gsutil mb -c regional -l us-central1 gs://${BUCKET}
# upload the saved model
$ gsutil -m cp -R save_model/model_param.pth gs://${BUCKET}
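At serving time the weights come back down with the Storage Python client. A minimal sketch, assuming the blob sits at the bucket root as uploaded above:

from google.cloud import storage

client = storage.Client()
bucket = client.bucket(f"{client.project}-ml")  # matches the ${BUCKET} naming above
blob = bucket.blob("model_param.pth")
blob.download_to_filename("/tmp/model_param.pth")  # /tmp is the writable path in Cloud Functions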
- Deploy to Google Cloud Functions.
$ cd gdeploy
$ gcloud beta functions deploy [function_name] --runtime python37 --trigger-http
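A minimal sketch of the HTTP entry point in gdeploy/main.py, assuming the function is deployed with the entry point name predict; run_inference is a hypothetical stand-in for the repo's model-loading and PyTorch inference code:

import urllib.request
from flask import jsonify  # the Cloud Functions Python runtime bundles Flask

def run_inference(image_bytes):
    # hypothetical helper: load model_param.pth from the bucket
    # (see the Storage snippet above) and run the model on the image
    raise NotImplementedError

def predict(request):
    # parse the JSON body sent by the curl test below
    payload = request.get_json(silent=True) or {}
    if "url" not in payload:
        return jsonify(error="missing 'url'"), 400
    image_bytes = urllib.request.urlopen(payload["url"]).read()
    return jsonify(label=run_inference(image_bytes))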
- Test the API using curl.
$ curl -X POST -H "Content-Type:application/json" \
  -d '{"url":"[image url]"}' [function trigger URL]
- Juntae Kim, Korea University DAVIAN LAB