linux

Synchronizing folders in Linux by using rsync

Linux has a nifty little tool called rsync that should be available on the distribution of your choice (provided it was updated at least once since the stone age). From the man pages:

Rsync is a fast and extraordinarily versatile file copying tool. It can copy locally, to/from another host over any remote shell, or to/from a remote rsync daemon. It offers a large number of options that control every aspect of its behavior and permit very flexible specification of the set of files to be copied. It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination. Rsync is widely used for backups and mirroring and as an improved copy command for everyday use.

So rsync is perfect for the job of keeping folders on different machines up to date. The proposed solution uses a central-server approach and looks as follows:

  • Rsync is running in daemon mode on the server. For authentication a file with user:password pairs is used. Information on how to set up an rsync daemon can be found in the related man pages.
  • On the client machines rsync is executed by a small script; a minimal example is sketched below. Authentication again happens through a file containing username and password that is passed to rsync via the --password-file parameter. This allows for automation, which is achieved by execution through a cron job as well as during boot and shutdown. How to achieve this depends on the distribution in use: on e.g. Arch Linux, rc.local and /etc/rc.local.shutdown would be good places for executing the script on boot and shutdown respectively. Information on how to set up a cron job may be found in the man pages for crontab.
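
A minimal client-side script could look like the following sketch. The module name "projects", the server name "myserver", the user name and the paths are assumptions for illustration, not part of the original setup:

  #!/bin/sh
  # Two-way sync of a project folder against the rsync daemon on the server.
  # "projects" is the daemon module, "myserver" the server name -- both hypothetical.
  LOCAL=/home/user/projects/
  REMOTE=rsync://syncuser@myserver/projects/
  PASSFILE=/home/user/.rsync.pass   # plain-text password, should be chmod 600

  # Pull: fetch files that are newer on the server (-u leaves files alone
  # that are already newer locally).
  rsync -azu --password-file="$PASSFILE" "$REMOTE" "$LOCAL"

  # Push: send files that are newer locally.
  rsync -azu --password-file="$PASSFILE" "$LOCAL" "$REMOTE"

The same script can then be called from a crontab entry such as */15 * * * * /usr/local/bin/syncprojects.sh as well as from the boot and shutdown hooks mentioned above.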

Caveats:

  • The user credentials for authentication are stored in a plain text file. While this is necessary for automated execution, it may be a no-go for some people. The alternatives are not using a password at all or using SSH as the connection protocol; both options have their own pitfalls.
  • The setup (i.e. the exact calls and paths) on the client side will vary from distribution to distribution and will most likely need tweaking on each new machine; there is no solution that works out of the box.

Synchronise folders between Linux server and client

I sometimes find myself in a situation where I work on one project but use several different computer systems (all of which run Linux) for that. An example would be an exercise for university which I work on at home on my PC but then take my laptop to university to discuss it with colleagues. Always keeping all the files up to date on both systems is tedious and I often forget to do so. Hence my motivation for the challenge.

I have a small server at home which runs 24/7 and is also accessible from outside via SSH. I want to use that as something of a central storage unit for project folders. For that I need a system in place that automatically synchronizes selected local folders to and from that server. The synchronization should occur without me having to initialize it, as I'd surely forget that. "Synchronization" in this context means that all files present in a folder on one machine should also be present in an identically named folder on the other one. If a file has a more recent modification date on one machine than on the other, the more recent file should replace the older one. Tracking the changes or keeping a history of changes is not necessary.

To sum it up:

  • file synchronization between Linux client and server
  • no graphical user interface server side
  • SSH access to server
  • automated, no user input necessary

Regards and thanks for any input.

Setting up a Linux Server to use LDAP for Samba, SSH

The main issue is that Samba and SSH do not use LDAP directly; instead you have to go through NSS and PAM, which then access the LDAP directory. So in order to integrate two services, about five components (LDAP, NSS, PAM, Samba, SSH) have to be configured, which seems difficult, and I have not been able to find proper documentation covering the whole process yet.
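
To give an idea of where the pieces hook into each other, here is a rough sketch of the relevant configuration on a Debian-style system. The module names and the LDAP suffix dc=example,dc=com are assumptions and have to be adapted:

  # /etc/nsswitch.conf -- let NSS resolve accounts via LDAP after local files
  passwd:  files ldap
  group:   files ldap
  shadow:  files ldap

  # /etc/pam.d/common-auth -- let PAM (and therefore SSH) authenticate against LDAP
  auth  sufficient  pam_ldap.so
  auth  required    pam_unix.so use_first_pass

  # /etc/samba/smb.conf -- point Samba's password backend at the LDAP server
  passdb backend = ldapsam:ldap://localhost
  ldap suffix    = dc=example,dc=com
  ldap admin dn  = cn=admin,dc=example,dc=com

With NSS and PAM wired up like this, sshd itself needs no LDAP-specific configuration; it simply sees the LDAP users as ordinary system accounts.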

easy to use backup tool for linux

Create an incremental backup of a Debian Linux server with an easy-to-use tool. The backup server is reachable via FTP and SFTP. The backup should be encrypted and compressed to be space- and bandwidth-efficient. There should be an easy way to restore either parts of the backup or the full system after a failure.
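
One tool that matches these requirements reasonably well is duplicity, which creates GnuPG-encrypted, compressed, incremental archives and can write to FTP and SFTP targets directly. The host name, paths and key ID in the following sketch are made up:

  # Initial full backup of /home to the backup server via SFTP,
  # encrypted with a (hypothetical) GnuPG key
  duplicity full --encrypt-key ABCD1234 /home sftp://backup@backupserver//backups/home

  # Subsequent runs without "full" create incremental backups
  duplicity --encrypt-key ABCD1234 /home sftp://backup@backupserver//backups/home

  # Restore a single file from the backup to /tmp
  duplicity restore --file-to-restore alice/report.tex sftp://backup@backupserver//backups/home /tmp/report.tex

  # Restore everything after a failure
  duplicity restore sftp://backup@backupserver//backups/home /home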

perl script for that solution

This is a simple Perl script using mbox:parser to check for new mails and store the attachments in a given folder.
It can be invoked via crontab (the cron entry below also takes care of emptying the mailbox), e.g.:
*/1 * * * * root sudo -u scans getscans >> /data/daten/000_Scans_Server/00logfile.txt; chmod a+r /data/daten/000_Scans_Server/00logfile.txt; echo -n > /var/mail/scans

Don't Use SBS as the Default Gateway

After having issues with dropped internet connections when a Small Business Server 2003 was used as the (default) gateway, I started to do some quick research. However, it didn't turn up anything too useful.

While Microsoft proposed using the SBS as the default gateway (with two NICs) as a viable solution, I wasn't so sure: if the server were compromised, it would be game over, since there is no second layer of security. On top of that, the whole setup didn't work as expected.

So instead of solving the problem (the security concerns couldn't be resolved anyway), I just mitigated it by using an old server as a Linux firewall. For easier handling I chose http://www.endian.com/en/community/ (no, I'm not affiliated with them, this is no hidden product placement) - one of a bunch of free (GPL) unified threat management [UTM] systems. It offers in- and outbound firewalls, proxies for HTTP, POP3, SMTP,... (with capabilities for spam detection, virus scanning,...), logging, VPN,...
So it definitely adds an additional layer of protection.

After deploying the system, the internet connection has been rock-solid until today.

Lesson learned: Think outside the box. By solving one problem (internet connection), one can probably achieve something else as well (better security).

Final note: Windows SBS 2008 cannot be used as a default gateway any more - probably because Microsoft wants to sell more ISA licenses, but then again the two-NIC setup probably didn't work too well anyway...

How to enable sound in Ubuntu Linux on Asus computers

I have installed Ubuntu 8.04 on my ASUS laptop and it works fine. However, I cannot get the sound to work. The sound doesn't work even though the speakers are not muted. How can one get the sound to work?
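
Without knowing the exact chipset it is hard to give a definitive answer, but on many ASUS laptops of that era the cause is the snd-hda-intel driver picking the wrong codec model. A common first diagnostic step looks like this; the model value "asus" is only an example and has to be matched against the ALSA documentation for the actual codec:

  # Identify the sound card and the codec in use
  aplay -l
  head /proc/asound/card0/codec#0

  # Make sure no channel is muted and the volumes are raised
  alsamixer

  # If the driver chose the wrong model, force one (example value!)
  echo "options snd-hda-intel model=asus" | sudo tee -a /etc/modprobe.d/alsa-base
  sudo /sbin/alsa force-reload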

Install D-Link DWL-650+ WLAN Device under Linux

For some old WLAN cards like the DWL-650+ from D-Link, no official drivers for Linux have been released. Furthermore, the card cannot be set up during the installation process. There exist some third-party device drivers that support those devices, but these are hard to find and often don't work very well. Especially the DWL-650+ causes problems in combination with Fedora 8: sometimes it seems to work and sometimes it doesn't.
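
One workaround that often helps with such cards is ndiswrapper, which loads the card's Windows driver under Linux. The .inf file name below is hypothetical and has to be taken from the Windows driver package for the card:

  # Install the Windows driver under ndiswrapper (hypothetical .inf name)
  ndiswrapper -i dwl650p.inf
  ndiswrapper -l          # verify that driver and hardware were recognized
  modprobe ndiswrapper    # load the kernel module
  ndiswrapper -m          # write a modprobe alias so it is loaded on boot
  iwconfig                # the card should now show up as a wireless interface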

Configure fetchmail and procmail to filter your emails

I have plenty of mail accounts, and I have a laptop and a personal computer, so I am tired of reading every mail twice: one time at home and one time on my laptop. I want a mail system where I have to read my mails only once. IMAP seems promising for this, and since I have a server that is capable of IMAP, I want to configure it to get the mails from my different mail accounts into the right folders. procmail offers a filtering system which can help me automatically sort the mails into the right folders.
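
A minimal configuration could look like the following sketch; the providers, user names and folder names are made up. fetchmail pulls the mails from the external accounts and hands them to procmail, which sorts them into the folders of the local IMAP (Maildir) store:

  # ~/.fetchmailrc (must be chmod 600)
  poll pop.provider-one.example protocol pop3
    user "alice" password "secret1" mda "/usr/bin/procmail -d %T"
  poll mail.provider-two.example protocol imap
    user "alice2" password "secret2" mda "/usr/bin/procmail -d %T"

  # ~/.procmailrc -- sort into Maildir folders served via IMAP
  MAILDIR=$HOME/Maildir
  DEFAULT=$MAILDIR/

  :0
  * ^To:.*university\.example
  .university/

  :0
  * ^From:.*mailinglist@example\.org
  .lists/

Running fetchmail from cron (e.g. */5 * * * * /usr/bin/fetchmail -s) keeps the folders up to date without any user interaction.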

Setting up a Subversion repository and corresponding Trac app on a (Debian/Apache) server

So the basic problem here is how to best work together productively in a programming/development project and which tools provide capabilities to support and improve this. One such tool is a version control system (VCS), which basically helps to keep the code, and especially its progress, under (version) control in a code repository (transparently).

A specific such VCS is Subversion (a.k.a. SVN). It is currently one of the most popular and up-to-date systems of its kind (i.e., centralized VCS). Its basic slogan is "CVS done right" (CVS can consequently be seen as Subversion's indirect predecessor). There are many tools available to work with Subversion from a client/user perspective: stand-alone client apps, integration in editors, e.g. via special plugins, or complete integration in a full-blown IDE (integrated development environment).

Now, the task is to actually set up a Subversion system on a server and to create a code repository for a programming/development project. In this case a Debian-based OS with an Apache web server is chosen as infrastructure. Additionally a Trac app should be installed, which among other things offers nice and convenient web visualization of the code repository (and its progress) to users. When all this is in place, users can reap the benefits of using Subversion for version control of the project's code.
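
The rough sequence of steps on a Debian/Apache machine could look like this. Package names and paths reflect older Debian releases and may differ; the repository and project names are made up:

  # Install Subversion, the Apache DAV module and Trac
  apt-get install subversion libapache2-svn trac

  # Create the repository
  svnadmin create /var/lib/svn/myproject

  # /etc/apache2/mods-available/dav_svn.conf -- serve the repository over HTTP
  <Location /svn/myproject>
    DAV svn
    SVNPath /var/lib/svn/myproject
    AuthType Basic
    AuthName "Subversion repository"
    AuthUserFile /etc/apache2/dav_svn.passwd
    Require valid-user
  </Location>

  # Create the Trac environment and point it at the repository
  trac-admin /var/lib/trac/myproject initenv

trac-admin asks interactively for the project name, the database and the path to the Subversion repository; afterwards the Trac site itself can be served through Apache (e.g. via mod_python) or the standalone tracd.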
