Making Your Network Transparent
By Ben Okopnik
Some months ago, I set up an anti-spam system on my laptop, which I wrote up back in issue #176. Having it based on my local machine, however, produced a significant amount of SMTP traffic. Since I wanted to minimize that load, I copied my .procmailrc, my whitelist, and my blacklist files to my home directory on the mail server. As a result, my mail traffic dropped from about 1k messages per day (plus the round trips to GMail for any emails in the "doubtful" category) to about 25 valid messages per day. This is especially wonderful since I often have a slow or fragile connection to the Net, depending on where I happen to be at the moment.
However, there's still a slight catch: the two list files mentioned above get updated on a regular basis. That is, if I get an email from someone and decide that I'm going to correspond with that person regularly, I whitelist them by hitting 'Ctrl-W' in Mutt (this, of course, requires setting up a keystroke macro in '~/.muttrc'). Conversely, blacklisting someone just takes hitting 'Ctrl-B'. Both of these actions, obviously, update their relevant list file - but they do so locally, and that's not where my (primary) spam filter is anymore. What to do? Logging into the mail server on a regular basis and copying the files would be a hassle and an additional task that I'd have to remember - and that's precisely the kind of load that I don't want to add to my routine. Fortunately, automating it is easy.
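For the curious, here's a rough sketch of what such a binding could look like; the key, the helper script's name and location ('~/bin/whitelist-sender'), and the address extraction are illustrative choices, not necessarily my exact setup. In '~/.muttrc':

    macro index \Cw "<pipe-message>~/bin/whitelist-sender<enter>" "add sender to whitelist"

and the helper script itself might be as simple as:

    #!/bin/sh
    # Read the message on stdin and append the sender's address to the
    # accept list that the procmail recipes consult.
    formail -x From: | sed -e 's/.*<//' -e 's/>.*$//' >> ~/.mail-accept-list

A matching 'Ctrl-B' macro would do the same thing with the deny list.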
SSH Authorization
Needless to say, your network transactions need to be secure. Fortunately, the standard tool for these, 'ssh', is perfectly suited to the task - and it even allows for secure connections without using a password. All you need to do is configure the two machines to perform authorization via public key exchange, essentially by copying your public key from one to the other. Here's the procedure:
- Ensure that you have a .ssh directory in your home directory on your local machine; create it if it doesn't exist.
- Assuming that you don't already have one, generate a 1024-bit SSH key; each system from which you'll want to connect in this way will need one. If the key already exists (i.e., you already have an 'id_dsa.pub' or an 'id_rsa.pub' file in your ~/.ssh directory), then you can skip this step.

    ssh-keygen -t dsa

- Append your local public key to your remote host's '~/.ssh/authorized_keys' file:

    ssh user@remote_host 'cat >> ~/.ssh/authorized_keys' < ~/.ssh/id_dsa.pub
Enter your password when prompted, and take pleasure in knowing that this is the last time you'll need to do so.
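Put together, the whole exchange amounts to something like this (a minimal sketch, assuming 'user@remote_host' is your account on the mail server and that you're using a DSA key):

    mkdir -p ~/.ssh && chmod 700 ~/.ssh               # local .ssh directory, in case it's missing
    [ -f ~/.ssh/id_dsa.pub ] || ssh-keygen -t dsa     # generate a key only if you don't have one
    ssh user@remote_host 'cat >> ~/.ssh/authorized_keys' < ~/.ssh/id_dsa.pub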
You should now be able to simply type 'ssh user@remote_host' and be logged in - no password required. In fact, you can make this exchange even simpler by giving your remote system a short alias; just add an entry in your local ~/.ssh/config file (create it if it doesn't exist) similar to this one:
Host sfs
    Hostname smithfamilyserver.com
    Protocol 2
    User joe
    Compression yes
Once that's done, you'll be able to log into the above server simply by typing 'ssh sfs'. Nice, short, and simple.
Configuring rsync
At this point, I could simply copy the files that I want to the server by issuing an 'scp' command ('secure copy', part of the SSH suite); however, as a matter of good general practice, I like to only update the files if it's necessary - i.e., if they either don't exist or if the local files are different from the remote ones - and skip the update otherwise. The 'rsync' command can do exactly that - and I can even tell it to use SSH as the transport mechanism. All that takes is a couple of simple steps:
- Define the 'rsync' transport by adding the appropriate line to your
login configuration file:
echo "export RSYNC_RSH='/usr/bin/ssh -l remote_username'" >> ~/.bash_profile
Note: depending on your distro and on how you use your system, you may also need to add it to your ~/.xprofile; in fact, you might as well do it just to make sure, since it won't do any harm. If you use a shell other than Bash, then presumably you'll know what to do to set and export that variable in that shell.

- Tell the shell to re-read that file (e.g., 'source ~/.bash_profile'), or simply log out and log back in.
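A quick check that the new transport is being picked up (assuming Bash) might look like this; note that the same thing can also be specified per-invocation with rsync's '-e' (a.k.a. '--rsh') option:

    source ~/.bash_profile
    echo "$RSYNC_RSH"        # should print: /usr/bin/ssh -l remote_username

    # Per-invocation equivalent, without the environment variable:
    rsync -e '/usr/bin/ssh -l remote_username' local_file remote_host: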
Now that both SSH and rsync are configured, updating the remote files is simply a matter of issuing the following command:
rsync ~/.mail-* sfs:
The colon, of course, tells 'rsync' (just as it would for the 'ssh' command) that we're copying the files to a remote host. The default remote location is your home directory on the remote machine; obviously, you can specify any directory you want there - assuming you have the right system permissions for it - by adding it immediately after the colon.
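For example, assuming a (hypothetical) 'Mail/filters' directory already exists under my home directory on the server, this would put the files there instead:

    rsync ~/.mail-* sfs:Mail/filters/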
Automating it is just as easy: create a line in your 'crontab' file that will run the above command on your desired schedule. For example, if I want these two files updated once an hour, I'll set up a cron job by typing 'crontab -e' and editing my crontab to look like this:
# m h  dom mon dow   command
05 * * * *    /usr/bin/rsync /home/ben/.mail-{accept,deny}-list sfs:
Following the comment line, this means "on the 5th minute of every hour of every day of every month on every day of the week, execute this command." That task will now be executed for you, regular as clockwork - and without you having to think about it ever again.
Wrap-up
Obviously, this isn't something that you'd go through just to copy a couple of files; that's easy enough without any special configuration. However, once you've set this up, it can serve you in many different ways - and SSH and rsync are both great tools to have in your toolbox. For me, they come in handy many times a day - and since I have them correctly configured, my network actions are just as simple as the ones involving files on my local machine. Here are a few examples:
ssh lg                                      # Log into the Linux Gazette machine
rsync file.html www:okopnik.com/misc        # Copy or update 'file.html' to the 'misc/' directory of my website
ssh 203k 'tail /var/log/messages'           # See the last 10 entries in the log on a client's server
rsync -a ~/devel rb:backup/`date +%FT%T`/   # Back up my 'devel' dir in a time-stamped subdir on my remote server
Enjoy, and let us know about any interesting uses you find for your newly-transparent network!
Ben is the Editor-in-Chief for Linux Gazette and a member of The Answer Gang.
Ben was born in Moscow, Russia in 1962. He became interested in electricity at the tender age of six, promptly demonstrated it by sticking a fork into a socket and starting a fire, and has been falling down technological mineshafts ever since. He has been working with computers since the Elder Days, when they had to be built by soldering parts onto printed circuit boards and programs had to fit into 4k of memory (the recurring nightmares have almost faded, actually.)
His subsequent experiences include creating software in more than two dozen languages, network and database maintenance during the approach of a hurricane, writing articles for publications ranging from sailing magazines to technological journals, and teaching on a variety of topics ranging from Soviet weaponry and IBM hardware repair to Solaris and Linux administration, engineering, and programming. He also has the distinction of setting up the first Linux-based public access network in St. Georges, Bermuda as well as one of the first large-scale Linux-based mail servers in St. Thomas, USVI.
After a seven-year Atlantic/Caribbean cruise under sail and passages up and down the East coast of the US, he is currently anchored in northern Florida. His consulting business presents him with a variety of challenges, and his second brain, a Palm Pilot, is crammed full of alarms, many of which contain exclamation points.
He has been working with Linux since 1997, and credits it with his complete loss of interest in waging nuclear warfare on parts of the Pacific Northwest.