How To Migrate from Amazon S3 to DigitalOcean Spaces with rclone

Introduction

DigitalOcean Spaces is an object storage service designed to make it easy and affordable to store and serve large amounts of data. If you have previously relied on other object storage services, migrating data to Spaces may be one of your first tasks.

In this guide, we will cover how to migrate data to DigitalOcean Spaces from Amazon's S3 object storage service using the rclone utility. We will demonstrate how to install rclone, the configuration settings to use to access both storage services, and the commands you can use to synchronize your files and verify their integrity within Spaces.

Creating API Keys and Finding Bucket Properties

Before we begin installing and configuring rclone to copy our objects to Spaces, we will need some information about our Amazon S3 and DigitalOcean Spaces accounts. We will need a set of API keys for both services that the tool can use, and we will need to know the region and location constraint values for our buckets.

Generating a DigitalOcean Spaces API Key and Finding the API Endpoint

To create a DigitalOcean Spaces API key, follow the "Creating an Access Key" section of our How To Create a DigitalOcean Space and API Key guide.

Save the access key ID and secret key so that we can configure rclone to access our account.

Next, we need to find the appropriate API endpoint. If you have already created the DigitalOcean Space you wish to transfer your objects to, you can view the Space's endpoint within the DigitalOcean Control Panel by selecting the Space and viewing the Settings tab:

DigitalOcean Spaces endpoint

If you have not created a Space yet, rclone can automatically create the Space you select as part of the copying process. In that case, the endpoint would be the Spaces region you wish to use followed by .digitaloceanspaces.com. You can find the available Spaces regions in the DigitalOcean Control Panel by viewing the selection options on the Spaces creation page. At the time of this writing, only the "nyc3" region is available (with an endpoint of nyc3.digitaloceanspaces.com).
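As a quick sanity check, the endpoint for a given region can be assembled from the region slug with a couple of shell variables. This is a minimal sketch; "nyc3" is the example region from this guide, so substitute your own:

```shell
# Build the Spaces endpoint from a region slug.
# "nyc3" is the only region available at the time of this writing.
REGION="nyc3"
ENDPOINT="${REGION}.digitaloceanspaces.com"
echo "$ENDPOINT"
```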

Generating an Amazon S3 API Key

If you do not already have an Amazon API key with permission to manage S3 assets, you will need to generate those now. In your AWS Management Console, click on your account name and select My Security Credentials from the drop-down menu:

AWS select security credentials

Next, select Users in the left-hand menu and then click the Add user button:

AWS add user button

Type in a User name and select Programmatic access in the Access type section. Click the Next: Permissions button to continue:

AWS user details

On the page that follows, select the Attach existing policies directly option at the top and then type s3read in the Policy type filter. Check the AmazonS3ReadOnlyAccess policy box and then click the Next: Review button to continue:

AWS S3 read access

Review the user details on the next page and then click the Create user button when you are ready:

AWS create user

On the final page, you will see the credentials for your new user. Click the Show link under the Secret access key column to view the credentials:

AWS secret keys

Copy the Access key ID and Secret access key somewhere secure so that you can configure rclone to use those credentials. You can also click the Download .csv button to save the credentials to your computer.

Finding the Amazon S3 Bucket Region and Location Constraints

Now, we need to find the region and location constraint values for our S3 bucket.

Click Services in the top menu and type S3 in the search bar that appears. Select the S3 service to go to the S3 management console.

We need to look for the region name associated with the bucket we wish to transfer. The region will be displayed next to the bucket name:

AWS S3 bucket region

We need to find the region string and matching location constraint associated with our bucket's region. Look for your bucket's region name in the S3 regions chart from Amazon to find the appropriate region and location constraint strings. In our example, our region name is "US East (N. Virginia)", so we would use us-east-1 as the region string and our location constraint would be blank.
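For reference, the values for this guide's example bucket can be captured as shell variables, as a sketch; replace them with the region string and location constraint for your own bucket:

```shell
# Example values from this guide: "US East (N. Virginia)" maps to
# the region string us-east-1 with a blank location constraint.
AWS_REGION="us-east-1"
AWS_LOCATION_CONSTRAINT=""
echo "region=${AWS_REGION} location_constraint=${AWS_LOCATION_CONSTRAINT}"
```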

Now that we have the appropriate information from our Amazon account, we can install and configure rclone using this information.

Install rclone on Your Local Computer

You are now ready to install rclone on your local computer.

Visit the Downloads section of the project's website to find binaries of the utility compiled for different platforms. Download the zipped binary that matches your computer's operating system to your Downloads directory to get started.

Once you have the rclone zip file downloaded to your computer, follow the section below that matches your platform.

Linux

Before we can extract the archive, we will need to make sure that the unzip utility is available.

If you are running Ubuntu or Debian, you can update the local package index and install unzip by typing:

  • sudo apt-get update
  • sudo apt-get install unzip

If you are running CentOS or Fedora, you can install unzip by typing:

  • sudo yum install unzip

With unzip installed, navigate to the directory where you downloaded the rclone zip file:

  • cd ~/Downloads

Next, unzip the archive and move into the new directory:

  • unzip rclone*
  • cd rclone-v*

From here, we can copy the binary to the /usr/local/bin directory so that it is available system-wide:

  • sudo cp rclone /usr/local/bin

Next, we can add the manual page to our system so that we can easily get help on the command syntax and available options. Make sure that the local manual directory we need is available and then copy the rclone.1 file:

  • sudo mkdir -p /usr/local/share/man/man1
  • sudo cp rclone.1 /usr/local/share/man/man1

Update the man database to add the new manual page to the system:

  • sudo mandb

Finally, we can create the configuration directory and open up a configuration file to define our S3 and Spaces credentials:

  • mkdir -p ~/.config/rclone
  • nano ~/.config/rclone/rclone.conf

This will open your text editor with a new blank file. Skip ahead to the section on defining your object storage accounts to continue.

macOS

If you are running macOS, begin by navigating in the terminal to the directory where you downloaded the rclone zip file:

  • cd ~/Downloads

Next, unzip the file and move into the new directory level:

  • unzip -a rclone*
  • cd rclone-v*

Next, make sure the /usr/local/bin directory is available and then move the rclone binary inside:

  • sudo mkdir -p /usr/local/bin
  • sudo cp rclone /usr/local/bin

Finally, we can create the configuration directory and open up a configuration file to define our S3 and Spaces credentials:

  • mkdir -p ~/.config/rclone
  • nano ~/.config/rclone/rclone.conf

This will open your text editor with a new blank file. Skip ahead to the section on defining your object storage accounts to continue.

Windows

If you are running Windows, begin by navigating to the Downloads directory in Windows File Explorer. Select the rclone zip file and right-click. In the context menu that appears, click Extract All...:

Windows extract rclone zip file

Follow the prompts to extract the files from the zip archive.

The rclone.exe utility must be run from the command line. Open a new Command Prompt (the cmd.exe program) window by clicking the Windows button in the lower-left corner, typing cmd, and selecting Command Prompt.

Inside, navigate to the rclone path you extracted by typing:

  • cd "%HOMEPATH%\Downloads\rclone*\rclone*"

List the directory contents to verify that you are in the correct location:

  • dir

Output

10/23/2017  01:02 PM    <DIR>          .
10/23/2017  01:02 PM    <DIR>          ..
10/23/2017  01:02 PM                17 git-log.txt
10/23/2017  01:02 PM           296,086 rclone.1
10/23/2017  01:02 PM        16,840,192 rclone.exe
10/23/2017  01:02 PM           315,539 README.html
10/23/2017  01:02 PM           261,497 README.txt
               5 File(s)     17,713,331 bytes
               2 Dir(s)  183,296,266,240 bytes free

You will need to be in this directory whenever you wish to use the rclone.exe command.

Note: On macOS and Linux, we run the tool by typing rclone, but on Windows, the command is called rclone.exe. Throughout the rest of this guide, we will be providing the commands as rclone, so be sure to substitute rclone.exe each time when running on Windows.

Next, we can create the configuration directory and open up a configuration file to define our S3 and Spaces credentials:

  • mkdir "%HOMEPATH%\.config\rclone"
  • notepad "%HOMEPATH%\.config\rclone\rclone.conf"

This will open your text editor with a new blank file. Continue ahead to learn how to define your object storage accounts in the configuration file.

Configure the S3 and Spaces Accounts

We can define our Amazon S3 and DigitalOcean Spaces configurations in the new file so that rclone can manage content between our two accounts.

Let's begin by defining our S3 account. Paste the following section into the configuration file:

~/.config/rclone/rclone.conf

[s3]
type = s3
env_auth = false
access_key_id = aws_access_key
secret_access_key = aws_secret_key
region = aws_region
location_constraint = aws_location_constraint
acl = private

Here, we define a new rclone "remote" called s3. We set the type to s3 so that rclone knows the appropriate way to interact with and manage the remote storage resource. We will define the S3 credentials in the configuration file itself, so we set env_auth to false.

Next, we set the access_key_id and secret_access_key variables to our S3 access key and secret key, respectively. Be sure to change these values to the S3 credentials associated with your account.

We set the region and location constraint according to the properties of our S3 bucket that we found in the Amazon region chart. Finally, we set the access control policy to "private" so that assets are not public by default.

Now, we can define a similar section for our DigitalOcean Spaces configuration. Paste the following section into the configuration file:

~/.config/rclone/rclone.conf

. . .

[spaces]
type = s3
env_auth = false
access_key_id = spaces_access_key
secret_access_key = spaces_secret_key
endpoint = nyc3.digitaloceanspaces.com
acl = private

In this section, we are defining a new remote called "spaces". Again, we set the type to s3 since Spaces offers an S3-compatible API. We turn off env_auth so that we can define the Spaces credentials within the configuration file.

Next, we set the access_key_id and secret_access_key variables to the values generated for our DigitalOcean account. We set the endpoint to the appropriate Spaces endpoint we determined earlier. Finally, we set the acl to private again to protect our assets until we want to share them.

Save and close the file when you are finished.

On macOS and Linux, be sure to lock down the permissions of the configuration file since our credentials are inside:

  • chmod 600 ~/.config/rclone/rclone.conf

On Windows, permissions are denied to non-administrative users unless explicitly granted, so we shouldn't need to adjust access manually.
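If you prefer, the macOS and Linux configuration steps above can be scripted in one pass. The sketch below writes both sections with the placeholder credentials and example values from this guide (replace them with your real keys and bucket properties) and then locks down the file permissions:

```shell
# Create the configuration directory and write both remotes.
# The keys below are placeholders; substitute your own credentials.
mkdir -p ~/.config/rclone
cat > ~/.config/rclone/rclone.conf <<'EOF'
[s3]
type = s3
env_auth = false
access_key_id = aws_access_key
secret_access_key = aws_secret_key
region = us-east-1
location_constraint =
acl = private

[spaces]
type = s3
env_auth = false
access_key_id = spaces_access_key
secret_access_key = spaces_secret_key
endpoint = nyc3.digitaloceanspaces.com
acl = private
EOF

# Restrict access since credentials are stored inside.
chmod 600 ~/.config/rclone/rclone.conf
```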

Copying Objects from S3 to Spaces

Now that our configuration is complete, we are ready to transfer our files.

Begin by checking the configured rclone remotes:

  • rclone listremotes

Output

s3:
spaces:

Both of the sections we defined are displayed.

We can view the available S3 buckets by asking rclone to list the "directories" associated with the s3 remote (make sure to add the colon to the end of the remote name):

  • rclone lsd s3:

Output

-1 2017-10-20 15:32:28 -1 source-of-files

The above output indicates that one bucket, called source-of-files, was found in our S3 account.

If you have already created a DigitalOcean Space, you can repeat the process to view your Spaces:

  • rclone lsd spaces:

Output

-1 2017-10-25 19:00:35 -1 existing-space

To view the contents of an S3 bucket or DigitalOcean Space, you can use the tree command. Pass in the remote name, followed by a colon and the name of the "directory" you wish to list (the bucket or Space name):

  • rclone tree s3:source-of-files

Output

/
├── README.txt
├── demo_dir
│   ├── demo1
│   └── demo2
└── media
    ├── Social Media Rebrand 032815.ppt
    ├── TechnicLauncher.jar
    ├── nda_template.docx
    ├── textfile.txt
    └── the_mother_of_all_demos.mp4

2 directories, 8 files

When you are ready, you can copy the files from your S3 bucket to your DigitalOcean Space by typing:

  • rclone sync s3:source-of-files spaces:dest-of-files

If you hadn't previously created the selected Space, rclone will attempt to create one with the given name. This will fail if the name provided is already being used by another account or if the name does not meet the naming requirements for DigitalOcean Spaces (lowercase letters, numbers, and dashes only).
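The naming rule above can be checked locally before starting a long transfer. Here is a minimal sketch with a hypothetical helper function that tests a proposed name against the lowercase-letters, numbers, and dashes rule (note that Spaces may impose additional requirements, such as name length):

```shell
# Hypothetical helper: succeed only if the name contains nothing
# but lowercase letters, numbers, and dashes.
is_valid_space_name() {
  printf '%s' "$1" | grep -Eq '^[a-z0-9-]+$'
}

is_valid_space_name "dest-of-files" && echo "dest-of-files: ok"
is_valid_space_name "Dest_Of_Files" || echo "Dest_Of_Files: invalid"
```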

Assuming everything went well, rclone will begin copying objects from S3 to Spaces.

When the transfer is complete, you can visually check that the objects have transferred by viewing them with the tree subcommand:

  • rclone tree spaces:dest-of-files

Output

/
├── README.txt
├── demo_dir
│   ├── demo1
│   └── demo2
└── media
    ├── Social Media Rebrand 032815.ppt
    ├── TechnicLauncher.jar
    ├── nda_template.docx
    ├── textfile.txt
    └── the_mother_of_all_demos.mp4

2 directories, 8 files

For more robust verification, use the check subcommand to compare the objects in both remotes:

  • rclone check s3:source-of-files spaces:dest-of-files

Output

2017/10/25 19:51:36 NOTICE: S3 bucket dest-of-files: 0 differences found
2017/10/25 19:51:36 NOTICE: S3 bucket dest-of-files: 2 hashes could not be checked

This will compare the hash values of each object in both remotes. You may receive a message indicating that some hashes could not be compared. In that case, you can rerun the command with the --size-only flag (which just compares based on file size) or the --download flag (which downloads each object from both remotes to compare locally) to verify the transfer integrity.

Conclusion

In this guide, we have covered how to transfer objects from Amazon S3 to DigitalOcean Spaces. We created API credentials for both services, installed and configured the rclone utility on our local computer, and then copied all objects from an S3 bucket to a DigitalOcean Space.

The rclone client can be used for many other object storage management tasks, including uploading or downloading files, mounting buckets on the local filesystem, and creating or deleting additional buckets. Check out the man page to learn more about the functionality the tool provides.
