Using the gsutil tool, we can perform a wide range of bucket and object management tasks, including:

  1. Creating and deleting buckets.
  2. Uploading, downloading, and deleting objects.
  3. Listing buckets and objects.
  4. Moving, copying, and renaming objects.
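For example, a few of these everyday commands look like this (the bucket and file names below are placeholders):

gsutil mb gs://<new-bucket>                                 # create a bucket
gsutil ls gs://<new-bucket>                                 # list its objects
gsutil cp local-file.txt gs://<new-bucket>/                 # upload an object
gsutil mv gs://<new-bucket>/a.txt gs://<new-bucket>/b.txt   # rename/move an object
gsutil rm gs://<new-bucket>/b.txt                           # delete an object
gsutil rb gs://<new-bucket>                                 # delete the (now empty) bucket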

We can copy data from a Google Cloud Storage bucket to an Amazon S3 bucket using the gsutil rsync and gsutil cp operations.

gsutil rsync collects all the metadata from the bucket and syncs the data to S3:

gsutil -m rsync -r gs://your-gcs-bucket s3://your-s3-bucket
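If you want to preview the transfer first, gsutil rsync also accepts a dry-run flag (-n), which lists what would be copied without copying anything:

gsutil -m rsync -r -n gs://your-gcs-bucket s3://your-s3-bucket   # dry run only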

gsutil cp copies the files one by one; since the transfer rate is good, it copies approximately 1 GB per minute:

gsutil cp -r gs://<gcs-bucket> s3://<s3-bucket-name>
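As with rsync, gsutil's top-level -m flag enables parallel copies, which helps a lot when the bucket holds many small files:

gsutil -m cp -r gs://<gcs-bucket> s3://<s3-bucket-name>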

If you have a large number of files with a high data volume, use the Bash script below and run it in the background, in multiple parallel sessions, using the screen command on an AWS or GCP instance with AWS credentials configured and GCP auth verified (see the example after the log command at the end).

Before running the script, list all the files, redirect the listing to a file, and have the script read that file as input to copy each object:

gsutil ls gs://<gcs-bucket> > file_list_part.out
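To feed multiple parallel script sessions, one option (an illustrative sketch; the 1000-line chunk size is arbitrary) is to split the listing into chunks with the standard split utility and point each session at a different chunk:

gsutil ls gs://<gcs-bucket> > file_list.out
split -l 1000 file_list.out file_list_part.   # produces file_list_part.aa, file_list_part.ab, ...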

Bash script:

#!/bin/bash
echo "start processing"
input="file_list_part.out"
while IFS= read -r line
do
    now=$(date)                      # timestamp for the log line
    command="gsutil cp ${line} s3://<bucket-name>"
    echo "command :: $command :: start time :: $now"
    eval "$command"
    retVal=$?
    if [ $retVal -ne 0 ]; then
        echo "Error copying file :: $line"
        exit 1
    fi
    echo "Copy completed successfully"
done < "$input"
echo "completed processing"

Execute the Bash script and write its output to a log file to check the progress of completed and failed files:

bash file_copy.sh > /root/logs/file_copy.log 2>&1
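To run it in the background under screen as suggested above, one possibility (the session name gcs2s3 is arbitrary) is:

screen -dmS gcs2s3 bash -c 'bash file_copy.sh > /root/logs/file_copy.log 2>&1'   # start a detached session
grep -c "Copy completed successfully" /root/logs/file_copy.log                   # count objects copied so far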
