70

How can I move data directly from one Google Cloud Storage project to another? I understand how to upload and how to download, but I want to transfer directly between projects.

  • 3
    "Projects" are a construct to organise tools... Objects live in buckets regardless. So you'll have to just copy it from bucket to bucket. Commented Jun 25, 2015 at 13:38
  • 1
    @HannahS If there's an answer that worked for you, please mark it as the accepted answer. This rewards the author and helps other people who have the same question. Commented Jan 24, 2019 at 14:30

8 Answers

58

To copy any single object from one GCS location to another, you can use the copy command. This can be done from either of our public APIs, or by using the command-line client, gsutil.

With gsutil, the cp command could be used like this:

gsutil cp gs://bucket1/obj gs://bucket2/obj2

Edit:
Since I wrote this, the Google Cloud Transfer Service has become available, which is good for copying whole buckets between GCS projects, or for copying whole buckets from S3 into GCS. You can find out more here.
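
For completeness: with a recent gcloud CLI, the same kind of transfer job can also be created from the command line. A minimal sketch, assuming the Storage Transfer API is enabled on your project and with placeholder bucket names:

# Create a transfer job that copies one bucket into another
gcloud transfer jobs create gs://my-source-bucket gs://my-destination-bucket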

  • 28
    This seems to copy between two buckets in the same project. Do you have an example of copying from one project to another?
    – HannahS
    Commented Jul 19, 2015 at 7:28
  • 18
    The buckets in this example can be in different projects. There is nothing in this example that mentions a specific project.
    – Eyal Levin
    Commented Oct 28, 2015 at 14:47
  • 9
    The problem is the credentials: you must have credentials with access to both projects.
    – xmedeko
    Commented Oct 10, 2016 at 7:01
  • 1
    @JohnAndrews what worked for me is: try doing the gsutil cp command as detailed elsewhere in this question, and it gives you an error telling you what service account is lacking permissions. Then, you add that one to the bucket's permissions (I used role Storage Object Admin) and try again.
    – Tom Hundt
    Commented Jul 17, 2019 at 16:12
  • 2
    I ended up using: gsutil -m rsync -pPr PROD-15745 gs://eci_staging_transfer/PROD-15745 where PROD-15745 was the folder to copy, and eci_staging_transfer the bucket name. rsync docs. More useful gsutil cmds: gsutil cp dmesg.txt gs://my_bucket/ (just copy a file), gsutil ls -al gs://my_bucket/ (list files there), gsutil rm gs://my_bucket/dmesg.txt (delete a file). The ls output format is different from what you're used to. You can always check the bucket contents via the GCP console GUI, too.
    – Tom Hundt
    Commented Jul 17, 2019 at 16:24
35

Open the web console Storage > Transfer to create a new transfer.

Select the source bucket you want to copy from. Just like cratervale mentioned just above here, bucket identifiers are globally unique (this is key to the solution). So once you get to the destination part of the transfer form, you can write or paste the target bucket right into its text input, even if that bucket belongs to another project. It will show a green icon once the target has been verified as an existing bucket. You can then continue the form to finalise your setup.

Once you have started the transfer from the form, you can follow its progress by hitting the refresh button at the top of the console.
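
If you prefer the shell to the refresh button, recent gcloud versions can also inspect these transfer jobs from the command line (JOB_NAME is a placeholder; the command group may require an up-to-date gcloud):

# List transfer jobs, then watch the operations of a specific job
gcloud transfer jobs list
gcloud transfer operations list --job-names=JOB_NAME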

  • As of 2022, this is the fastest and easiest way to go, thanks a lot. I was able to transfer hundreds of GB in seconds.
    – eexit
    Commented Feb 15, 2022 at 22:16
  • @eexit do you know if this works to move from GCS to Google Drive? Commented Dec 1, 2023 at 19:54
29

This is [one of] the quickest ways to do it:

gsutil -m rsync -r gs://bucket-source/dir gs://bucket-destination/dir

Please note that /dir refers to a directory [or sub-directories e.g. /dir1/dir2] under the main bucket. It does not refer to a file name. If you try to transfer individual files, you will get an error.
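
If you do need a single object, use cp rather than rsync. A quick sketch with the same placeholder names (file.txt stands in for your object):

# rsync operates on directories; copy an individual object with cp
gsutil cp gs://bucket-source/dir/file.txt gs://bucket-destination/dir/file.txt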

See more configuration options in the official docs.

However, there are a few things you should set up properly to prevent issues. Here's a setup list:

  1. Create a service account for your source bucket [from the source project, Google Cloud Console -> IAM -> Service Account]. Use Storage Admin as Role. Make sure you create a JSON key and download this to a safe place on your computer. Take note of the path to this file [e.g. path/to/source-service-account.json] as you will need it later.
  2. Create a service account for your destination bucket [same process as above, but make sure to switch to the destination project]. You may download a JSON key if you need to use it later, but this is optional.
  3. Add the service account of the source bucket [created in 1. above] to the destination bucket [From the destination project, Google Cloud Console -> Storage -> Browser, then click on the main bucket, then click on the Permissions tab, then click on the "Add Members" button. Add the email address of the source bucket service account in the text box provided, then give Storage Admin permissions]
  4. If you're using the gcloud CLI [command-line tools] and are logged in to the source project, you can run the gsutil command now. However, if you're not properly authenticated, you might get access-permission errors. You can authenticate using the service account file [the one you created and downloaded in 1. above] by running the following command: gcloud auth activate-service-account --key-file=/path/to/source-service-account.json. Once you do this, you will be logged in to GCP using the service account. You may now run the gsutil command to transfer your files.
  5. When you're done, check your login status using gcloud auth list. You can switch accounts using gcloud config set account 'ACCOUNT'. [A consolidated sketch of these steps is shown below.]
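
Putting the five steps above together, here is a minimal end-to-end sketch of the same flow done entirely from the shell. It is an illustration, not a verbatim recipe: the project IDs, bucket names, and the service-account name gcs-transfer are placeholders to replace with your own.

# 1. Create the source-project service account, grant it Storage Admin
#    there, and download a JSON key
gcloud iam service-accounts create gcs-transfer --project=source-project-id
gcloud projects add-iam-policy-binding source-project-id \
    --member=serviceAccount:gcs-transfer@source-project-id.iam.gserviceaccount.com \
    --role=roles/storage.admin
gcloud iam service-accounts keys create "$HOME/source-service-account.json" \
    --iam-account=gcs-transfer@source-project-id.iam.gserviceaccount.com

# 3. Grant that same service account Storage Admin on the destination bucket
#    (step 2, a destination-side service account, is optional and omitted)
gsutil iam ch \
    serviceAccount:gcs-transfer@source-project-id.iam.gserviceaccount.com:roles/storage.admin \
    gs://destination-bucket

# 4. Authenticate as the service account and run the transfer
gcloud auth activate-service-account --key-file="$HOME/source-service-account.json"
gsutil -m rsync -r gs://source-bucket/dir gs://destination-bucket/dir

# 5. Check your login status and switch back when done
gcloud auth list
gcloud config set account 'ACCOUNT'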

Cheers.

  • Question. If I copy from bucket->bucket within the same project and region do I pay? What if I copy from bucket->bucket within same region but different project, do I pay? Who can answer this for me?
    – DUDANF
    Commented Mar 11, 2020 at 14:30
17

If you want to use the console, follow @Martin van Dam's answer.

If you want to use the shell:

Step 1. Open Google Cloud Shell

Step 2. Run gcloud init and follow the process to connect to the cloud project that bucket1 belongs to.

Step 3. Run gsutil cp -r gs://[bucket1]/* gs://[bucket2]

You are done!


Now there's a catch! If both buckets belong to the same project, these steps will work flawlessly. But if the two buckets don't belong to the same project or the same Google Cloud account, it won't work. You need to fix the permissions.

If they belong to the same GCP account:

Go to Storage > Browser > Select bucket > Options > Edit bucket permissions > add member > insert the service account email ID for the project that bucket2 belongs to > set role to Storage Admin > Save. Then run the gsutil cp command.

If they belong to separate GCP accounts:

Go to Storage > Browser > Select bucket > Options > Edit bucket permissions > add member > insert the Gmail ID for the account that the bucket2 project belongs to > set role to Storage Admin > Save. Then run the gsutil cp command.
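
The same grants can be scripted with gsutil iam ch instead of clicking through the console. A hedged sketch, with the member emails and bucket name as placeholders:

# Cross-project service account
gsutil iam ch serviceAccount:sa-name@project2-id.iam.gserviceaccount.com:roles/storage.admin gs://[bucket1]

# Personal Google account from a separate GCP account
gsutil iam ch user:someone@gmail.com:roles/storage.admin gs://[bucket1]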

  • I managed to copy between two projects without any problems. Nice one
    – Anytoe
    Commented Mar 15, 2021 at 14:30
9

Bucket names in GCS are globally unique across all projects, not just your own. For example, Project1 and Project2 cannot both have buckets named 'images', although they can each have folders inside those buckets named 'images'.

This can seem misleading because gsutil may ask you to select a project to work with. For the copy command, this selection can be disregarded.

gsutil cp gs://bucket1/obj gs://bucket2/obj

will allow you to copy an object in Project1/bucket1 to Project2/bucket2

9

If you have a key or a service account that gives you access to both projects, using gsutil is super simple and works at light speed.

This is what I did from my local Mac, and it synced terabytes of data in minutes (yes, minutes and not hours):

gsutil -m rsync -r gs://my/source/project/bucket/files/ gs://my/target/project/bucket/directory/

The key here is to use the -m flag.

Check the official docs at https://cloud.google.com/storage/docs/gsutil/commands/rsync for more details.
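
If the transfer is slower than expected, the parallelism behind -m can also be tuned per invocation through boto config overrides. A hedged example; the counts below are illustrative, not recommendations:

# Raise the process/thread counts gsutil -m uses for this run only
gsutil -m \
    -o "GSUtil:parallel_process_count=8" \
    -o "GSUtil:parallel_thread_count=5" \
    rsync -r gs://my/source/project/bucket/files/ gs://my/target/project/bucket/directory/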

  • This works very well, and very quickly since -m makes it run in parallel and it uses rsync so it copies efficiently not moving unchanged files. You do need an account with the right privilege on both projects as expected.
    – keni
    Commented Jan 21, 2020 at 11:26
  • Not sure why this has been voted down, this is the most efficient way to do it quickly. gsutil rsync makes the content of a target folder identical to the content of a source folder by copying, updating or deleting any file in the target folder that has changed in the source folder.
    – nights
    Commented May 21, 2020 at 6:30
  • The link above include "]" so it will be broken if you click it. Here is the right one: cloud.google.com/storage/docs/gsutil/commands/rsync
    – DevNerd
    Commented Jun 4, 2021 at 17:58
  • Yup, it's an exaggeration to say that this method does in minutes that cp will take hours to do.
    – ankush981
    Commented Jun 23, 2022 at 19:55
4

As per the docs on Moving Buckets, you can simply use gsutil:

gsutil cp -r gs://[SOURCE_BUCKET]/* gs://[DESTINATION_BUCKET]

Note: if using zsh, make sure you wrap your source bucket in single quotes, because zsh will attempt to expand the wildcard before gsutil sees it. See here.
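
For example, the quoted form under zsh would look like this:

# Single quotes stop zsh from expanding the wildcard before gsutil sees it
gsutil cp -r 'gs://[SOURCE_BUCKET]/*' gs://[DESTINATION_BUCKET]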

You can find the link for gsutil in your storage browser Overview tab.

3

Using Google Cloud Shell

Go to the first project, which has the bucket you want to copy
gcloud config set project [PROJECT1 ID]

Make a directory you can mount that bucket to
mkdir test

Mount the bucket to the directory
gcsfuse [BUCKET1] test

Switch to the second project, which has the bucket you want to populate
gcloud config set project [PROJECT2 ID]

Copy the contents of the new folder to the second bucket
gsutil cp -r /home/user/test/* gs://[BUCKET2]
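
One follow-up step worth adding: once the copy has finished, unmount the bucket. In Cloud Shell (Linux) the standard FUSE unmount is:

# Detach the gcsfuse mount from the directory
fusermount -u test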
