
s3sync

A module that enables easy S3-based backups

10,426 downloads

10,243 downloads of the latest version

2.1 quality score

Version information

  • 0.1.1 (latest), released May 26th 2014
  • 0.1.0

Start using this module

  • r10k or Code Manager
  • Bolt
  • Manual installation
  • Direct download

Add this module to your Puppetfile:

mod 'juliakreger-s3sync', '0.1.1'
Learn more about managing modules with a Puppetfile

Add this module to your Bolt project:

bolt module add juliakreger-s3sync
Learn more about using this module with an existing project

Manually install this module globally with the Puppet module tool:

puppet module install juliakreger-s3sync --version 0.1.1

Direct download is not typically how you would use a Puppet module to manage your infrastructure, but you may want to download the module in order to inspect the code.

Tags: backup, backups, s3

Documentation

juliakreger/s3sync — version 0.1.1 May 26th 2014

puppet-s3sync

A very simple Puppet module for keeping things in sync with S3.

One quick note: this was developed to run against SmartOS, but it should also work on Ubuntu. I created it to back up some websites and databases to S3, so it is a very simple application.

How to use:

I broke this module into three individual invocations, which you'd likely want to make from your node manifest.

Step 1) Let's make sure s3cmd is installed.

class {'s3sync': }

Step 2) Let's set up a .s3cfg file!

s3sync::user { 'a witty resource title':
  user           => 'your username',
  aws_key_id     => 'your aws key id',
  aws_secret_key => 'your aws secret key',
  gpg_passphrase => 'a passphrase for gpg encryption',
  path           => '/path/to/your/users/home/folder/',
}

Part of the complexity of this step is due to Puppet's insistence that all paths must be absolute. If there is a better way, please fork and send me back a pull request, because I don't quite like it, but it works for now.

Step 3) Go ahead and set up some cron jobs to make sure your stuff gets synced out to S3.

s3sync::cron {'a witty backup job resource title':
  user       => 'your username',
  localpath  => '/path/to/your/files/you/want/synced',
  bucketpath => 'bucketname/bucketfolder/',
  backuptime => [ '12', '00' ],
}

The backup time takes the form ['hour', 'minute'], and the resulting cron job runs daily.
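
Putting the three steps together, a node manifest might look something like the sketch below. The node name, user, local path, bucket, and credential values are placeholders for illustration, not defaults shipped with the module:

node 'backup01.example.com' {

  # Step 1: install s3cmd
  class { 's3sync': }

  # Step 2: write a .s3cfg into the backup user's home directory
  s3sync::user { 'webbackup s3cfg':
    user           => 'webbackup',
    aws_key_id     => 'YOUR_AWS_KEY_ID',
    aws_secret_key => 'YOUR_AWS_SECRET_KEY',
    gpg_passphrase => 'YOUR_GPG_PASSPHRASE',
    path           => '/home/webbackup/',
  }

  # Step 3: sync the web root to S3 every day at 12:00
  s3sync::cron { 'daily website sync':
    user       => 'webbackup',
    localpath  => '/var/www',
    bucketpath => 'example-bucket/websites/',
    backuptime => [ '12', '00' ],
  }
}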