Synchronize Files With Cloud Storage Using Rclone In Linux

Last Updated on: July 18, 2020

Rclone is an open-source command-line program to sync files and directories to and from different cloud storage providers. It preserves timestamps, and if you are transferring from local to cloud, it can delete the local files after verification (awesome).

Without further ado, install rclone on Linux/macOS/BSD systems using:


curl https://rclone.org/install.sh | sudo bash

Alternatively, if you are on Ubuntu, rclone is also available in the Snap Store.
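A sketch of the snap route (assuming the package is simply named rclone in the Snap Store; the snap is community-maintained rather than published by the rclone developers):

```
sudo snap install rclone
```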

Setting Up Rclone

To set up rclone, make a note of the cloud storage provider you are backing up to. In this guide, I'll be using the renowned Mega. Once you know where you are backing up to, fire up your terminal and run rclone config for setup:
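The setup command is simply:

```
rclone config
```

It opens an interactive menu like this: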

Name                 Type
====                 ====
onedrive             onedrive

e) Edit existing remote
n) New remote
d) Delete remote
r) Rename remote
c) Copy remote
s) Set configuration password
q) Quit config
e/n/d/r/c/s/q>

Of course it is a new remote, so type 'n'.

Type a name; e.g., I'll name it after what I am backing up: 'classicpress'.

Once you've chosen a name, hit Enter, and you'll see a list of 38 supported cloud storage providers:

1. rclone supported cloud storage

Mega is at number 21, so if you are using Mega, input 21 and hit Enter.

Now, enter the user (your email address if you are using Mega), hit Enter, and press 'y' to type in your password:

2. Enter User and Password

Once you've entered and confirmed your password, press 'n' to skip the advanced config, and finally press 'y':

3. Press Y for Yes this is OK (default)

Once you are done, type 'q' to quit, and run a quick test against the new remote.
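A simple smoke test is listing the top-level directories on the remote (the remote name classicpress here is just the one chosen earlier; substitute your own):

```
rclone lsd classicpress:
```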

Your remote is the name you chose when we started the setup (mine is classicpress). If you receive no output, we are ready to go; if you receive an error, please go over the steps above again.

Let’s get started backing up:

To copy a local directory to a Mega directory called mybackup, do the following (if the directory doesn't already exist in Mega, rclone will create it):


rclone copy /home/path classicpress:mybackup

To check the debug information while transferring, use the following command:


rclone copy /home/path classicpress:mybackup -vv

To copy a local directory into a sub-directory on the remote:


rclone copy /home/path classicpress:mybackup/subdirectory

Sync Local Directory To Remote:

To sync, simply swap copy for sync:


rclone sync /home/path classicpress:mybackup -vv

Be careful with the above command: it syncs the source to the destination, making the destination identical to the source while modifying the destination only. Files in the destination that do not exist in the source are deleted.

Caution: Since this can cause data loss, test first with the --dry-run flag to see exactly what would be copied and deleted.
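For example, to preview what a sync would change without touching anything (using the same illustrative paths as above):

```
rclone sync /home/path classicpress:mybackup --dry-run
```

The same flag works with copy and move.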

Move Files from Source to Destination

To move the contents of the source directory to the destination directory, use the following command:


rclone move /home/path classicpress:mybackup

This copies the source to the destination, then deletes the source files.

Caution: Since this can cause data loss, test first with the --dry-run flag to see exactly what would be moved and deleted.

Automatically Backing up Directory from Local to Remote:

If you want to automatically back up a certain directory to your cloud storage without running the command manually, we can use a bash script. I'll be using one by ajkis, modified to fit our use case.

First, create a script and a log directory in your user home directory:


cd ~ && mkdir scripts logs

Now create a file in the scripts folder, and open it:


cd ~/scripts && touch rclone_automove.sh && nano rclone_automove.sh
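The script below relies on find's -mmin test to pick out files last modified more than N minutes ago. A quick illustration in a scratch directory (using GNU touch's -d to backdate a file):

```shell
mkdir -p demo
touch -d '20 minutes ago' demo/old.txt   # backdate the modification time
touch demo/new.txt                       # modified just now
find demo -type f -mmin +15              # matches only demo/old.txt
```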

Paste the below into the file:


#!/bin/bash
# RCLONE Automove Script

# EXIT IF ANOTHER INSTANCE OF THIS SCRIPT IS ALREADY RUNNING
if pidof -o %PPID -x "$0" > /dev/null; then
   exit 1
fi

LOGFILE="/home/user/logs/rclone_automove.log"
FROM="/storage/path/"
TO="classicpress:mybackup"

# CHECK FOR FILES IN FROM FOLDER THAT ARE OLDER THAN 15 MINUTES
if find "$FROM" -type f -mmin +15 | read -r
  then
  start=$(date +'%s')
  echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD STARTED" | tee -a "$LOGFILE"
  # MOVE FILES OLDER THAN 15 MINUTES
  rclone move "$FROM" "$TO" --transfers=20 --checkers=20 --delete-after --min-age 15m --log-file="$LOGFILE"
  echo "$(date "+%d.%m.%Y %T") RCLONE UPLOAD FINISHED IN $(($(date +'%s') - $start)) SECONDS" | tee -a "$LOGFILE"
fi
exit 0

Save and close the file with CTRL + X, then Y and Enter, and run the following command:


chmod a+x /home/user/scripts/rclone_automove.sh

chmod a+x adds the execute bits for all users without touching the other permission bits, making the script runnable as a program.
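A quick way to confirm the bits were set (shown here on a scratch file standing in for the script):

```shell
touch rclone_automove.sh            # scratch file for illustration
chmod a+x rclone_automove.sh        # add the execute bit for user, group, and other
ls -l rclone_automove.sh            # with the default umask this shows -rwxr-xr-x
```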

Finally, add it to the crontab: type crontab -e and add the following:


* * * * * /home/user/scripts/rclone_automove.sh >/dev/null 2>&1


The above entry runs the script every minute. You can learn more about crontab if you want to schedule it to your taste.
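For reference, the five fields of a crontab entry are, left to right:

```
# ┌───────── minute (0-59)
# │ ┌─────── hour (0-23)
# │ │ ┌───── day of month (1-31)
# │ │ │ ┌─── month (1-12)
# │ │ │ │ ┌─ day of week (0-6, Sunday = 0)
# │ │ │ │ │
# * * * * *  /path/to/command
```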

To confirm the script is running, run htop, press F4 (Filter) and type in the name of the script:

4. rclone running in bash script

Note: The image shows rclone copy because I am backing up a copy of my local directory to cloud storage. If you want to copy instead of move, change rclone move "$FROM" in the bash script to rclone copy "$FROM".

There you go!

Bonus:

To run it every 15 minutes, use:


*/15 * * * * /home/user/scripts/rclone_automove.sh >/dev/null 2>&1

To run it every 5 hours, use:


0 */5 * * * /home/user/scripts/rclone_automove.sh >/dev/null 2>&1


To run it at 7 PM every Friday, use (in cron, Friday is day 5, and pinning the minute field to 0 keeps the job from firing every minute of that hour):

0 19 * * 5 /home/user/scripts/rclone_automove.sh >/dev/null 2>&1

Enjoy
