I want to copy an object from a GCP Compute Engine instance to a Google Cloud Storage bucket using gsutil cp. Both belong to the same owner (me) and to the same project. I want to automate the whole machine, so authenticating manually is not an option.
I have granted the necessary permissions to use a service account on a Compute Engine instance (details below), but when I try to gsutil cp a file to the bucket, I get an AccessDeniedException.
The error message complains about a missing storage.objects.create or storage.objects.list permission, depending on whether my bucket target path ends in a folder (gs://<bucket>/test/) or a file (gs://<bucket>/test.txt).
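For reference, these are roughly the two variants of the failing call, run on the instance (bucket name and file are placeholders, as elsewhere in this question):

```shell
# Copying to an object path fails with storage.objects.create;
# copying to a "folder" path (trailing slash) fails with storage.objects.list.
gsutil cp test.txt gs://<bucket>/test.txt
gsutil cp test.txt gs://<bucket>/test/
```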
What I did to get permissions (I have already tried a lot, including creating redundant custom roles which I also assigned to the service account):
- Start the instance:
gcloud compute instances create <instance> [...] \
--service-account <name>@<project>.iam.gserviceaccount.com \
--scopes cloud-platform,storage-full
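To verify the instance actually received the account and scopes, the attached configuration can be inspected (zone placeholder added here for the required flag):

```shell
# Show which service account and OAuth scopes the instance runs with.
gcloud compute instances describe <instance> --zone <zone> \
  --format="yaml(serviceAccounts)"
```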
- Give the service account permissions on creation.
- Give the service account permissions afterwards as well (just to be safe):
gcloud projects add-iam-policy-binding <project> \
--member serviceAccount:<name>@<project>.iam.gserviceaccount.com \
--role roles/storage.objectAdmin
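The project-level binding can be confirmed afterwards by filtering the project IAM policy for the service account:

```shell
# List all project-level roles bound to the service account.
gcloud projects get-iam-policy <project> \
  --flatten="bindings[].members" \
  --format="table(bindings.role)" \
  --filter="bindings.members:serviceAccount:<name>@<project>.iam.gserviceaccount.com"
```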
- Edit Storage bucket permissions for the service account:
gsutil iam ch \
serviceAccount:<name>@<project>.iam.gserviceaccount.com:roles/storage.objectAdmin \
gs://<bucket>
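Whether that binding took effect on the bucket can be checked by dumping the bucket's IAM policy:

```shell
# Print the bucket-level IAM policy; the serviceAccount member
# should appear under roles/storage.objectAdmin.
gsutil iam get gs://<bucket>
```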
- Edit Storage bucket access control list (owner permission):
gsutil acl ch -u <name>@<project>.iam.gserviceaccount.com:O gs://<bucket>
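The resulting ACL entry can likewise be inspected:

```shell
# Show the bucket ACL to confirm the OWNER entry for the service account.
gsutil acl get gs://<bucket>
```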
- At some point, switched the bucket to uniform bucket-level access (bucket-level IAM policies instead of per-object ACLs), just to be safe.
- On the instance, use
gcloud auth activate-service-account --key-file <key>.json
to authenticate the account.
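Since gsutil shares credentials with gcloud when it is installed as part of the Cloud SDK, which identity is actually in use can be confirmed with:

```shell
# The account marked with * is the one gcloud (and Cloud SDK gsutil) will use.
gcloud auth list
```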
However, no matter what I do, the error does not change, and I am not able to write to the bucket. I can, however, read files from the bucket.
At this point I am just wasting money trying to get this to work.