
Copy DigitalOcean Spaces to an AWS S3 Bucket

Moving data between cloud providers

Mohammed Tayeh
2 min read · Oct 5, 2022

Sometimes it's good to keep a copy of important data across cloud providers. In my case, we have about 10 TB of important data in DigitalOcean Spaces that we need to copy to AWS S3. In this guide I will show you how we did this using Rclone.

Rclone

Rclone is a command-line program for managing files on cloud storage. It's easy to use and comes with a lot of features.

Install Rclone

curl https://rclone.org/install.sh | bash
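The script installs system-wide, so on a fresh machine you may need to run it with sudo. Once it finishes, a quick sanity check (assuming `rclone` is now on your PATH):

```shell
# Confirm the binary is installed and see which version was fetched
rclone version
```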

Configurations

You can use the command rclone config, or put the settings directly in the config file. Before you start, create an access key and secret key for Spaces (you can find the endpoint on the Space's settings page), and an IAM user for AWS with the permissions required to manage S3.

We will add our config to the config file at ~/.config/rclone/rclone.conf:

[spaces]
type = s3
env_auth = false
access_key_id = SPACES_ACCESS_KEY
secret_access_key = SPACES_SECRET
endpoint = fra1.digitaloceanspaces.com
acl = private

[s3]
type = s3
env_auth = false
access_key_id = AWS_ACCESS_KEY
secret_access_key = AWS_SECRET
region = us-east-1
acl = private

Test your configuration with these commands, which list your buckets and Spaces:

$ rclone lsd s3:
$ rclone lsd spaces:

Sync data with Rclone command

rclone sync spaces:<space_name> s3:<bucket_name>

The Rclone command won't print any output, even when the operation succeeds. Once it returns, all of the data from your Space should be in your AWS S3 bucket.
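For a transfer this size (~10 TB), a few of Rclone's standard flags are worth knowing. This is a sketch; the parallelism values below are assumptions you should tune for your own bandwidth and object counts:

```shell
# Preview what would be transferred without copying anything
rclone sync spaces:<space_name> s3:<bucket_name> --dry-run

# Run the real sync with live progress, more parallel transfers,
# and a log file you can review afterwards
rclone sync spaces:<space_name> s3:<bucket_name> \
  --progress \
  --transfers 16 \
  --checkers 32 \
  --log-file rclone-sync.log --log-level INFO

# Afterwards, verify that source and destination match
rclone check spaces:<space_name> s3:<bucket_name>
```

Note that sync makes the destination match the source, deleting any extra files in the bucket; use rclone copy instead if you only want to add files.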

And that's it! You can now easily transfer data between cloud storage providers.
