Timeline for Exporting data from Google Cloud Storage to Amazon S3
Current License: CC BY-SA 4.0
7 events
when | what | by | license | comment
---|---|---|---|---
Oct 24, 2022 at 18:16 | comment added | urchino | | My current estimate is that it will take approximately a week to transfer 250 GB of data using this mechanism. It produces a lot of errors; a random 'Connection reset by peer' is the latest. Note that all the errors are on the gsutil-to-Google side - you get the same behaviour if you try to take a local copy of the files without using S3 at all. The gsutil command-line tool is very flaky and should be avoided if at all possible.
Oct 24, 2022 at 16:38 | comment added | urchino | | This works OK, although I got a load of 'connection refused' and 'broken pipe' errors doing this from OS X, so many retries were necessary with rsync -i to skip files that had already copied OK (OS X single-process mode is more reliable but slow). I also got an MD5 mismatch error that stopped many files from copying over: 'md5 signature for source object doesn't match destination object digest'. This can be resolved by specifying the encryption type in the command: gsutil -h "x-amz-server-side-encryption: AES256" -m rsync -rdi gs://storagename s3://bucketname
Oct 21, 2022 at 2:07 | comment added | Chris | | You can just execute: curl "awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip" && unzip awscliv2.zip && sudo ./aws/install
Feb 21, 2020 at 16:54 | comment added | Andrew Irwin | | Is it possible to install the AWS CLI in Google Cloud Shell? If so, can you tell me how?
Feb 21, 2019 at 11:32 | history suggested | user11095528 | CC BY-SA 4.0 | improved formatting
Feb 21, 2019 at 11:32 | review | | | Suggested edits
Feb 18, 2019 at 11:14 | history answered | Noordeen | CC BY-SA 4.0 |
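The comments above report transient failures ('connection refused', 'broken pipe', 'Connection reset by peer') that forced many manual re-runs of gsutil rsync. A minimal sketch of a retry wrapper for POSIX sh is shown below; the `retry` function name and the retry count are my own, and the bucket names `gs://storagename` / `s3://bucketname` are the placeholders from the comment, not real buckets.

```shell
#!/bin/sh
# retry MAX CMD [ARGS...]: run CMD until it succeeds, giving up after MAX attempts.
# Because gsutil rsync with -i skips files that already copied, re-running it
# after a transient network failure only transfers what is still missing.
retry() {
  max=$1
  shift
  n=1
  until "$@"; do
    if [ "$n" -ge "$max" ]; then
      echo "giving up after $n attempts" >&2
      return 1
    fi
    echo "attempt $n failed; retrying..." >&2
    n=$((n + 1))
    sleep 1
  done
}
```

With the wrapper defined, the command from the comment (including the SSE header that avoids the MD5-mismatch error) could be wrapped as: `retry 5 gsutil -h "x-amz-server-side-encryption: AES256" -m rsync -rdi gs://storagename s3://bucketname`.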