
Backup Jenkins configuration to S3

📅 Apr 09, 2013
⌛ 2 minutes

Here's a simple Jenkins job that backs up your Jenkins configuration (global settings, keys and secrets, user records, and job definitions) to Amazon S3.

Dependencies

  1. Install the S3 publisher plugin; it provides the S3 profile in the global configuration and the “Publish artifacts to S3 Bucket” post-build action used below.

Global configuration

  1. Set up an S3 profile at jenkinshost/configure.
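
The profile needs an access key and secret with write access to the target bucket. As a quick sanity check before entering them in Jenkins, the same credentials can be tried from a shell with the AWS CLI (the bucket name below is a placeholder):

    # Verify that the backup credentials can write to the bucket
    # (assumes the AWS CLI is configured with the same access key/secret)
    echo "jenkins backup test" > backup-test.txt
    aws s3 cp backup-test.txt s3://your-backup-bucket/backup-test.txt
    aws s3 rm s3://your-backup-bucket/backup-test.txt
    rm backup-test.txt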

Job configuration

  1. Create a new free-style project and give it a name.

  2. (Optional) I recommend checking “Discard Old Builds” and choosing the following settings to save disk space:
    Strategy: Log Rotation
    Days to keep builds: (empty)
    Max # of builds to keep: (empty)
    Days to keep artifacts: 1
    Max # of builds to keep with artifacts: 1

  3. Set a build trigger of your choice; I use this:
    Build periodically - Schedule: H H(0-5) * * *

  4. Add an “Execute shell” build step with the following command:

    # Delete all files in the workspace
    rm -rf *
    # Create a directory for the job definitions
    mkdir -p $BUILD_ID/jobs
    # Copy global configuration files into the workspace
    cp $JENKINS_HOME/*.xml $BUILD_ID/
    # Copy keys and secrets into the workspace
    cp $JENKINS_HOME/identity.key $BUILD_ID/
    cp $JENKINS_HOME/secret.key $BUILD_ID/
    cp $JENKINS_HOME/secret.key.not-so-secret $BUILD_ID/
    cp -r $JENKINS_HOME/secrets $BUILD_ID/
    # Copy user configuration files into the workspace
    cp -r $JENKINS_HOME/users $BUILD_ID/
    # Copy job definitions into the workspace
    rsync -am --include='config.xml' --include='*/' --prune-empty-dirs --exclude='*' $JENKINS_HOME/jobs/ $BUILD_ID/jobs/
    # Create an archive from all copied files (since the S3 plugin cannot copy folders recursively)
    tar czf $BUILD_ID.tar.gz $BUILD_ID/
    # Remove the directory so only the archive gets copied to S3
    rm -rf $BUILD_ID
    
  5. Create a post-build action of type “Publish artifacts to S3 Bucket” and configure as follows:
    S3 profile: Choose the profile from the global configuration
    Source: **
    Destination bucket: Enter the name of the bucket where you want the archive to go
    Note: Even though it says “Destination bucket”, it is possible to enter a bucket name AND a path; the S3 plugin will create the directory or use it if it already exists.
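
Each run now uploads one self-contained archive to the bucket. To restore from such a backup, it should be enough to unpack the archive over $JENKINS_HOME and reload the configuration. A minimal sketch, assuming placeholder bucket, path and archive names and that the Jenkins CLI jar is available:

    # Fetch a backup archive and unpack it (bucket, path and name are placeholders)
    aws s3 cp s3://your-backup-bucket/jenkins-backups/2013-04-09_01-02-03.tar.gz .
    tar xzf 2013-04-09_01-02-03.tar.gz
    # Copy the restored files back into the Jenkins home directory
    # (JENKINS_HOME is assumed to point at the Jenkins home directory)
    cp -r 2013-04-09_01-02-03/* $JENKINS_HOME/
    # Tell Jenkins to reload its configuration from disk
    java -jar jenkins-cli.jar -s http://jenkinshost/ reload-configuration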

S3 configuration

The setup described above creates a new backup each day. I like being able to go back in history, e.g. if the accidental deletion of a job is only discovered after several days. I'd recommend using a lifecycle policy on the bucket to remove old backups after a desired number of days.
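
One way to set such a policy is with the AWS CLI; in the sketch below the bucket name, prefix and 30-day retention are placeholders to adjust:

    # Expire backup objects 30 days after they were uploaded
    aws s3api put-bucket-lifecycle-configuration \
      --bucket your-backup-bucket \
      --lifecycle-configuration '{
        "Rules": [{
          "ID": "expire-old-jenkins-backups",
          "Status": "Enabled",
          "Filter": { "Prefix": "jenkins-backups/" },
          "Expiration": { "Days": 30 }
        }]
      }'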

Alternatively, a similar effect can be achieved by enabling versioning on the bucket and giving the archive a fixed file name, i.e. replacing the tar command in the shell script with:

    tar czf jenkins-configuration.tar.gz $BUILD_ID/
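
Versioning can likewise be enabled from the AWS CLI, after which every upload keeps the previous copies as noncurrent versions (the bucket name is again a placeholder):

    # Turn on versioning for the backup bucket
    aws s3api put-bucket-versioning \
      --bucket your-backup-bucket \
      --versioning-configuration Status=Enabled
    # List all stored versions of the fixed-name archive
    aws s3api list-object-versions \
      --bucket your-backup-bucket \
      --prefix jenkins-configuration.tar.gz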