
Copy DigitalOcean Spaces to an AWS S3 Bucket

Moving data between cloud providers

Sometimes it’s useful to copy important data between cloud providers. In our case, we have about 10 TB of important data in DigitalOcean Spaces that we need to copy to AWS S3. In this guide, I’ll show how we do this using Rclone.
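To set expectations for a copy of this size, a rough back-of-envelope estimate helps. The sketch below uses the ~10 TB figure above and a hypothetical 1 Gbps (~125 MB/s) of sustained throughput; real-world throughput will vary:

```shell
# Rough transfer-time estimate (assumptions: 10 TB payload,
# 1 Gbps ≈ 125 MB/s sustained throughput; actual speed will vary).
SIZE_MB=$((10 * 1024 * 1024))      # 10 TB expressed in MB
RATE_MB_PER_S=125                  # ~1 Gbps
SECONDS_TOTAL=$((SIZE_MB / RATE_MB_PER_S))
echo "approx $((SECONDS_TOTAL / 3600)) hours"   # prints "approx 23 hours"
```

So even under optimistic assumptions, plan for the transfer to run for about a day.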


Rclone is a command-line program for managing files on cloud storage. It’s easy to use and supports a wide range of providers and features.

Install Rclone

curl https://rclone.org/install.sh | sudo bash


You can use the interactive rclone config command, or put the settings directly in the config file. Before we start, you need to create an access key and secret key for Spaces (and note the endpoint from the Space’s settings page), as well as an AWS IAM user with the permissions required to manage S3.

We will add our configuration to the file at ~/.config/rclone/rclone.conf:

[spaces]
type = s3
env_auth = false
access_key_id = SPACES_ACCESS_KEY
secret_access_key = SPACES_SECRET
endpoint = <region>.digitaloceanspaces.com
acl = private

[s3]
type = s3
env_auth = false
access_key_id = AWS_ACCESS_KEY
secret_access_key = AWS_SECRET
region = us-east-1
acl = private
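If you’d rather not keep the AWS secret in the file, rclone’s S3 backend can also read credentials from the environment. A minimal variant of the AWS remote (the Spaces remote stays as above):

```
[s3]
type = s3
env_auth = true
region = us-east-1
acl = private
```

With env_auth = true, rclone falls back to the standard AWS credential chain: the AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY environment variables, the shared credentials file, or an instance profile.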

Test your configuration with these commands, which list your S3 buckets and your Spaces:

$ rclone lsd s3:
$ rclone lsd spaces:

Sync data with Rclone command

rclone sync spaces:<space_name> s3:<bucket_name>

By default, rclone sync prints no output even when the operation succeeds (add the --progress flag if you want to watch the transfer). Once it returns, all of the data from your Space should be in your AWS S3 bucket.
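For a copy of this size you’ll likely want more parallelism and a safe preview first. The sketch below only assembles a command line with a few commonly used rclone flags so you can inspect it before running; the remote and bucket names ("my-space", "my-bucket") are placeholders, and --dry-run reports what would be copied without copying anything:

```shell
# Build an rclone invocation with preview and throughput flags.
# "my-space" and "my-bucket" are placeholder names — substitute your own.
SRC="spaces:my-space"
DST="s3:my-bucket"
CMD="rclone sync $SRC $DST --progress --transfers 8 --checkers 16 --dry-run"
echo "$CMD"   # inspect, then drop --dry-run to perform the real copy
```

--transfers raises the number of parallel file transfers and --checkers the number of parallel comparisons, both of which help on large buckets with many objects.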

And that’s it! You can now easily transfer data between cloud storage providers.



Mohammed Tayeh
