What is the best way to authenticate to a Google Cloud Storage bucket from a shell script (to be scheduled to run daily/hourly) using a service account? I have gone through the link below, but I still have some doubts regarding the login process.
How to use Service Accounts with gsutil, for uploading to CS + BigQuery
Are the login steps mentioned below a one-time process? If yes, how does authentication work for subsequent executions?
My understanding is that the commands below write credentials to the .boto file, which is then used in subsequent executions. But according to the link below, the login writes to a separate JSON file inside .config/gcloud instead:
Does gsutil support creating boto files with service account info?
In that case, what is the use of the .boto file, and why/when do we need to pass it via BOTO_PATH/BOTO_CONFIG?
In standalone gsutil, log in using the steps below:
gsutil config -e
Optionally pass -o to output to a file other than ~/.boto.
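If I am reading the docs right, this should leave a [Credentials] section in the .boto file roughly like the following (my understanding only, assuming a JSON key; the path is a placeholder):

    [Credentials]
    # Written by "gsutil config -e" after prompting for the key file path
    gs_service_key_file = /path/key.json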
In gsutil running as part of gcloud, log in using:
gcloud auth activate-service-account [email protected] --key-file=/path/key.json --project=PROJECT_ID
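For context, the kind of script I am planning looks roughly like this (a sketch only; the bucket, file names, and key path are placeholders):

    #!/bin/bash
    # Activate the service account; gcloud caches the credentials
    # under ~/.config/gcloud for later invocations
    gcloud auth activate-service-account [email protected] \
        --key-file=/path/key.json --project=PROJECT_ID
    # Upload the day's export to the bucket
    gsutil cp /data/export_$(date +%F).csv gs://B1/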
What is the best way to prevent interference from other scripts?
For example, suppose shell script S1 connects to project P1 to upload data to bucket B1. If another shell script, say S2, is triggered at exactly the same time, connecting to project P2 and uploading to bucket B2, will that cause an issue? What is the best practice to avoid such conflicts?
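Would isolating each script with its own configuration directory be the right approach? Something like this untested sketch for S1 (directory paths are my own invention):

    #!/bin/bash
    # Point gcloud and gsutil at state that only S1 uses, so a
    # concurrent S2 with its own directories cannot interfere
    export CLOUDSDK_CONFIG=/opt/s1/gcloud-config
    export BOTO_CONFIG=/opt/s1/.boto
    gcloud auth activate-service-account [email protected] \
        --key-file=/opt/s1/key.json --project=P1
    gsutil cp /data/s1/*.csv gs://B1/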
Is it possible to limit the login to only the duration of the script's execution? Say the script is scheduled via cron to run at 10:00 AM UTC and completes by 10:30 AM UTC. Is it possible to prevent any actions between 10:30 and the next run? In other words, is it possible to log out and log back in programmatically, without manual intervention?
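For example, would wrapping each run with an explicit revoke be reasonable? An untested sketch (account name and paths are placeholders):

    #!/bin/bash
    # Log in only for the duration of this run
    gcloud auth activate-service-account [email protected] \
        --key-file=/path/key.json --project=PROJECT_ID
    gsutil cp /data/*.csv gs://B1/
    # Drop the cached credentials so nothing can use them until the next run
    gcloud auth revoke [email protected]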
Environment: CentOS