Some notes on logging and SSH access from cron jobs

October 8th, 2013 at 5:56 am

In the process of making the semi-official CPython mirror on GitHub auto-update, I ventured into cron-land; it’s a land I’ve hardly been to before, so here’s a quick blog post describing some of the interesting things I learned. This was written for Ubuntu 12.04, but should apply with very minimal changes to any Linux.

The basic stuff: crontab -e to edit your crontab, crontab -l to dump it to stdout.
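For reference, each crontab line is five time fields (minute, hour, day of month, month, day of week) followed by the command to run; the sample entry and script path below are made up:

```
# m  h  dom  mon  dow   command
  0  3   *    *    *    /home/foobar/scripts/update-mirror.sh
```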

If you’re wondering which tasks cron ran recently, look in /var/log/syslog.
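Job runs are logged there with a CRON tag, so they’re easy to filter for. The snippet below works on a sample line so it’s self-contained; on a real system you’d grep /var/log/syslog directly, as noted in the comment:

```shell
# A typical cron entry in /var/log/syslog looks like the sample below.
# On a real system you'd run: grep CRON /var/log/syslog | tail
sample='Oct  8 05:56:01 myhost CRON[12345]: (foobar) CMD (env > /tmp/my-cronenv)'
printf '%s\n' "$sample" | grep CRON
```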

A common problem that comes up when writing crontabs is that the environment the cron jobs are executed in is different from your normal environment. Jobs run under your username, but without most of the environment variables that shape your terminal experience. A good way to see what kind of environment cron has when it runs your jobs is to add this rule:

*/1 * * * * env > /tmp/my-cronenv

This tells cron to dump its environment to /tmp/my-cronenv every minute. Once you have a my-cronenv file, you can reproduce running your jobs in cron’s environment by running them as:

$ env - `cat /tmp/my-cronenv` <the script>
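The `env -` part is what makes the reproduction faithful: it wipes the inherited environment and sets only the variables you pass in. A quick self-contained check (the variables here are just for illustration):

```shell
# `env -` starts from an empty environment; only PATH is passed in below,
# so HOME comes out unset inside the child shell.
env - PATH=/usr/bin:/bin sh -c 'echo "HOME is: ${HOME:-unset}"'
# -> HOME is: unset
```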

Another common question that comes up is "how do I do logging from my cron jobs?" The mechanics of logging itself depend on the language the script is written in, of course. For Python there’s the logging package. But where to store those logs? If you want your logs to be where all the cool kids’ logs are, that would be /var/log. But you usually don’t have write permissions in that directory without sudo. So do this, replacing foobar with your username:

$ sudo mkdir /var/log/foobar_logs
$ sudo chown foobar /var/log/foobar_logs/

From now on, you’re free to create new files and edit existing ones in /var/log/foobar_logs.
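If the job is a shell script, or you just want its output captured without touching the script at all, the simplest approach is redirection right in the crontab entry. The entry below is hypothetical; the runnable part demonstrates the same `>> file 2>&1` idiom against a temporary directory instead of /var/log/foobar_logs:

```shell
# Hypothetical crontab entry appending stdout and stderr to the log dir:
#   */5 * * * * /home/foobar/bin/myjob.sh >> /var/log/foobar_logs/myjob.log 2>&1

# The same redirection idiom, demonstrated with a temp dir:
logdir=$(mktemp -d)
echo "job ran" >> "$logdir/myjob.log" 2>&1
cat "$logdir/myjob.log"   # -> job ran
```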

A hairier problem exists with SSH. Suppose that you want your cron job to log into some remote server (whether for Git access, scp, rsync, or remote command execution) for which you’ve diligently set up a public/private key pair. And you even went as far as to run ssh-agent on your local machine to avoid entering that pesky private key passphrase every time (you do use a passphrase for your private key, right?). How do you make sure that your cron jobs have proper access to ssh-agent and don’t need the passphrase?

There’s a number of ways to go about this, but I found this walkthrough using keychain effective.

First, install the keychain program. Second, add this to your ~/.bash_profile (we don’t need this to run for every terminal, just on login):

# Use keychain to keep ssh-agent information available in a file
/usr/bin/keychain $HOME/.ssh/id_rsa
source $HOME/.keychain/${HOSTNAME}-sh

Tweak as needed for the location of your private SSH keys. Also, make sure your .bash_profile is actually invoked at start-up. When logging into Ubuntu graphically, this may not be the case unless it’s sourced in .profile.
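One way to ensure that, sketched here as an assumption about your setup rather than a universal recipe, is to chain to it from ~/.profile, which Ubuntu’s graphical login does read:

```shell
# In ~/.profile: graphical logins on Ubuntu read ~/.profile rather than
# ~/.bash_profile, so source the latter explicitly if it exists.
if [ -f "$HOME/.bash_profile" ]; then
    . "$HOME/.bash_profile"
fi
```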

Third, add this to the cron job script (if your cron job is a Python program, just wrap it in a shell script):

source $HOME/.keychain/${HOSTNAME}-sh
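As a sketch, such a wrapper might look like this; the job path is made up, and an `echo` stands in for the real invocation so the sketch runs anywhere:

```shell
#!/bin/bash
# Hypothetical cron wrapper: load the ssh-agent variables that keychain
# saved at login, then run the actual Python job.
host=${HOSTNAME:-$(hostname)}
keychain_env="$HOME/.keychain/${host}-sh"
if [ -f "$keychain_env" ]; then
    . "$keychain_env"   # sets SSH_AUTH_SOCK / SSH_AGENT_PID
fi
# Stand-in for the real job, e.g.: python /home/foobar/scripts/update_mirror.py
echo "wrapper would now run the Python job"
```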

That’s all.


8 Responses to “Some notes on logging and SSH access from cron jobs”

  1. Elazar Leibovich Says:

    Why do you care about storing your key with a passphrase if you intend to keep an agent with the key in memory all day long?

    If you worry that the root user would steal your key – he can still pry it from your agent with pmap or something.

    If you worry about other users stealing your key – chmod is your friend.

    If you worry about people stealing your physical hard drive then:

    1) I think you worry far too much.
    2) You’d be better off using full disk encryption, at least for a small partition. This is a known and proven way to do what you’re actually looking for.

    I really see zero benefit to using an agent for this particular use case.

  2. niluje Says:
    Host *
        ForwardAgent yes

    baaddddd idea

    Assuming you’re connected to a server that is compromised, the attacker could access any of the servers you can connect to using your private key.

    Forwarding the agent might be a good thing in a few cases, but certainly not in a Host * section of the ~/.ssh/config file.

  3. eliben Says:

    @niluje,

    I agree. Looking deeper into it, this configuration bit is not needed at all for my specific task. The place I picked it up from also wanted to forward agent connections specifically, but even in that case I agree that host restriction is important. I removed it completely from the post. Thanks for the useful tip!

  4. eliben Says:

    @Elazar,

    If you have a number of different private keys and you need to have them on more than one machine, copying is involved. Having the key encrypted reduces the stress of sending/copying these keys around, even when the transfer channel is also encrypted. Less chance of screw-ups, etc.

  5. Elazar Leibovich Says:

    @eliben,

    To reduce the stress, what you really need to do is use SSH certificates with an expiry period to your liking, and recreate them periodically. Never send the root certificate to anyone. Keep it on a machine disconnected from the internet if you’re paranoid, and just generate the certificates from time to time and move them over on a disk-on-key.

    I don’t think encrypted keys are much worse than certificates, operationally and security-wise. (Encrypting the key gives you a smaller chance of a leak, but if it leaks, you’re out of luck for life.)

    @eliben, @niluje,

    Why touch ~/.ssh/config at all? Just add -A to the ssh command line of the particular session.

  6. Elazar Leibovich Says:

    I forgot to mention that with certificates, you never copy a private key, and you have a different certificate for each computer, so in case of an accidental leak you’re relatively OK (blacklist the leaked one, and it’ll expire anyway).

    IMHO, you never ever copy the private key of the node, so there’s really much less chance it’ll leak (following the rule of "never ever touch a private key" is easier than following the rule of "copy keys only to this IP").

    If you’re extra paranoid, you can chmod a-r the key, and have a script that chmod a+r it just long enough for ssh-agent to load it. In this case there’s little chance of copying it anywhere by mistake.

  7. Mike Says:

    This is a very good article on SSH login without a password. Here is another one that worked for me when I first started doing this. It’s very simple, concise and easy to understand. http://www.thegeekstuff.com/2008/11/3-steps-to-perform-ssh-login-without-password-using-ssh-keygen-ssh-copy-id/

  8. Paul Macnab Says:

    Your article provides some quick tips for SSH access. Your answer for “how to do logging from my cron jobs?” also helped me solve this issue. Thanks.
