Synchronise folders between Linux server and client

I sometimes find myself in a situation where I work on one project but use several different computer systems (all of which run Linux) for it. An example would be a university exercise that I work on at home on my PC, then discuss with colleagues at university on my laptop. Keeping all the files up to date on both systems is tedious and I often forget to do it; hence my motivation for this challenge.

I have a small server at home which runs 24/7 and is also accessible from outside via SSH. I want to use it as something of a central storage unit for project folders. For that I need a system in place that automatically synchronizes selected local folders to and from that server. The synchronization should occur without me having to initiate it, as I'd surely forget that. "Synchronization" in this context means that all files present in a folder on one machine should also be present in an identically named folder on the other one. If a file has a more recent modification date on one machine than on the other, the more recent file should replace the older one. Tracking changes or keeping a history of changes is not necessary.

To sum it up:

  • file synchronization between a Linux client and server
  • no graphical user interface on the server side
  • SSH access to the server
  • automated, no user input necessary

Regards and thanks for any input.
1 answer

Synchronizing folders in Linux by using rsync

Linux has a nifty little tool called rsync that should be available on the distribution of your choice (provided it was updated at least once since the stone age). From the man pages:

Rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.

So rsync is perfect for the job of keeping folders on different machines up to date. The proposed solution uses a central-server approach and looks as follows:

  • Rsync is running in daemon mode on the server. For authentication a file with user:password pairs is used. Information on how to set up an rsync daemon can be found in the related man pages.
  • On the client machines rsync is executed by a small script. Authentication again happens through a file, this one containing the password, which is passed to rsync via the --password-file parameter. This allows for automation, which is achieved by execution through a cron job as well as during boot and shutdown. How to achieve this depends on the distribution in use. On Arch Linux, for example, rc.local and /etc/rc.local.shutdown would be good places for executing the script on boot and shutdown respectively. Information on how to set up a cron job may be found in the man pages for crontab.
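The server side of the first bullet could look roughly like this; the module name, path, user and password are example values, not a tested configuration:

```ini
# /etc/rsyncd.conf -- one module exposing the shared project folder
# (module name, path and user below are example values)
[projects]
    path = /srv/projects
    read only = false
    auth users = alice
    secrets file = /etc/rsyncd.secrets

# /etc/rsyncd.secrets -- "user:password" pairs, one per line, e.g.
#   alice:s3cret
# rsync refuses to use the secrets file unless its permissions are
# strict (chmod 600), since it holds plain-text passwords.
```

The daemon is then started with `rsync --daemon` (or via the distribution's service manager).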

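The client-side script from the second bullet might be sketched as follows. The host name, module name, user and paths are assumptions to be adjusted to your own setup, so the actual call is left commented out:

```shell
#!/bin/sh
# sync-projects.sh -- client-side sync sketch.
# Host, module, user and paths are example values; adjust before use.
HOST="myserver.example.org"
MODULE="projects"             # module name from /etc/rsyncd.conf on the server
RUSER="alice"
LOCAL="$HOME/projects/"       # trailing slash: sync the contents, not the dir
PASSFILE="$HOME/.rsync.pass"  # contains only the password; chmod 600
REMOTE="$RUSER@$HOST::$MODULE/"

sync_both_ways() {
    # -a preserves times and permissions, -u skips files that are newer
    # on the receiving side; push then pull so the newer copy wins in
    # both directions.
    rsync -au --password-file="$PASSFILE" "$LOCAL" "$REMOTE"
    rsync -au --password-file="$PASSFILE" "$REMOTE" "$LOCAL"
}

# sync_both_ways   # uncomment once HOST etc. point at a real server
```

A matching crontab entry (added via `crontab -e`), such as `*/15 * * * * /usr/local/bin/sync-projects.sh`, would then run the synchronization every 15 minutes.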
Drawbacks of this approach:
  • The user credentials for authentication are stored in a plain-text file. While this is necessary for automated execution, it may be a no-go for some people. The alternatives are not using a password at all or using SSH as the connection protocol. Both options have their own pitfalls.
  • The setup (i.e. the exact calls and paths) on the client side will vary from distribution to distribution and will most likely need tweaking on each new machine; there is no solution that works out of the box.
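For completeness, the SSH alternative mentioned above drops the rsync daemon entirely and runs rsync over a remote shell; automation then requires an SSH key without a passphrase, which is the pitfall referred to. Since the host and paths here are fictional examples, the command is only printed, not executed:

```shell
#!/bin/sh
# SSH variant: no rsync daemon needed; rsync runs over ssh instead.
# Automation requires a passphrase-less key (the pitfall noted above).
# Host and paths are example values, so the command is only echoed.
CMD="rsync -au -e ssh $HOME/projects/ alice@myserver.example.org:projects/"
echo "$CMD"
```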