After getting the S3 command line interface working on the Raspberry Pi, I decided I actually needed something a bit more automatic, so I wrote a Python script to fetch S3 bucket sizes and email each of my clients who owns a bucket, pulling the details from a CSV stored on Dropbox. The app first downloads this CSV file and runs through each line, sending out emails to the users I’ve assigned to each bucket; finally it sends a summary total to me and uploads the updated CSV back to Dropbox, along with an error log, if any. I’ve set this to run from my crontab each month. I wanted an offsite solution, as I’m always doing something different with my Pi – at first I tried the OpenShift PaaS, but it wasn’t quite flexible enough, so I took the plunge and got a $5 per month VPS from Digital Ocean.

If you’re starting from scratch, you’ll need (all free services except the VPS):
1. An account set up with AWS
2. An account with Dropbox
3. An account with SendGrid for sending emails
4. Your VPS set up (or your RPi will do!)
5. python-setuptools, python-dev, and python-pip installed
6. A Python virtual environment, to keep your projects separate and tidy
7. Finally, these modules installed through pip: sendgrid, boto and dropbox
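On a Debian/Ubuntu box, steps 5–7 look roughly like this (paths are examples – put the virtual environment wherever suits you):

```
sudo apt-get install python-setuptools python-dev python-pip
sudo pip install virtualenv
virtualenv /home/andy/apps/s3size
cd /home/andy/apps/s3size
source bin/activate
pip install sendgrid boto dropbox
```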

Here’s what I did to get everything working:

1. We need to set up an app with Dropbox – log in, go to the App Console and create a new app: select the Dropbox API, choose files and datastores, limit it to its own folder, then pick a name and you’re good to go. In the app settings, scroll down to the generated access token and save it somewhere for use in the Python file (not the app key or secret – we don’t need those, as it’s a single-user project).
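With just that token you can read and write files in the app folder. A minimal sketch using the v1 Python SDK that was current at the time (the token and path below are placeholders):

```python
import io


def download_csv(token, path="/buckets.csv"):
    """Fetch the CSV from the Dropbox app folder (path is an example)."""
    import dropbox  # lazy import: only needed when actually talking to Dropbox
    client = dropbox.client.DropboxClient(token)
    f, _meta = client.get_file_and_metadata(path)
    return f.read()


def upload_csv(token, data, path="/buckets.csv"):
    """Overwrite the CSV in the app folder with updated contents (bytes)."""
    import dropbox
    client = dropbox.client.DropboxClient(token)
    client.put_file(path, io.BytesIO(data), overwrite=True)
```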

2. For my particular app, the data is stored in a single CSV file in my Dropbox app folder. Each line has the following fields:

s3 bucket name, username, email, space allotted, space used, no of objects
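Parsing those lines is straightforward with the csv module; here’s a small sketch (the field names are just my labels for the columns above):

```python
import csv
import io

# Labels for the six columns in each CSV row.
FIELDS = ["bucket", "username", "email", "allotted", "used", "objects"]


def parse_rows(csv_text):
    """Turn the raw CSV text into a list of dicts keyed by FIELDS."""
    reader = csv.reader(io.StringIO(csv_text))
    return [dict(zip(FIELDS, row)) for row in reader]


rows = parse_rows("mybucket,andy,andy@example.com,5000000000,0,0\n")
print(rows[0]["bucket"])  # -> mybucket
```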

3. Here is the code for the main app. Please note that I’ve removed all my usernames and passwords for the various accounts.
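Since the listing has the credentials stripped out, here’s a rough sketch of its shape – the function names, file path and placeholder credentials are mine, not the original code: download the CSV, measure each bucket with boto, email each owner with SendGrid, then re-upload.

```python
import csv
import io


def bucket_stats(conn, name):
    """Total bytes and object count for one bucket, via boto's key listing."""
    bucket = conn.get_bucket(name)
    size = count = 0
    for key in bucket.list():
        size += key.size
        count += 1
    return size, count


def refresh_rows(rows, stats):
    """Return rows with the 'space used' and 'no of objects' columns updated."""
    out = []
    for bucket, user, email, allotted, _used, _objects in rows:
        size, count = stats[bucket]
        out.append([bucket, user, email, allotted, str(size), str(count)])
    return out


def main():
    # All credentials below are placeholders for the ones I removed.
    import boto
    import dropbox
    import sendgrid

    s3 = boto.connect_s3("AWS_KEY", "AWS_SECRET")
    db = dropbox.client.DropboxClient("DROPBOX_TOKEN")
    sg = sendgrid.SendGridClient("SG_USER", "SG_PASS")

    # Pull the current CSV down from the app folder.
    f, _meta = db.get_file_and_metadata("/buckets.csv")
    rows = list(csv.reader(io.StringIO(f.read().decode("utf-8"))))

    # Measure every bucket once, then mail each owner their figures.
    stats = {row[0]: bucket_stats(s3, row[0]) for row in rows}
    for bucket, user, email, allotted, _u, _o in rows:
        size, count = stats[bucket]
        msg = sendgrid.Mail(
            to=email,
            subject="Monthly S3 usage",
            text="%s: %d bytes in %d objects (allotted %s)"
                 % (bucket, size, count, allotted),
            from_email="me@example.com",
        )
        sg.send(msg)

    # Write the refreshed figures back to Dropbox.
    out = io.StringIO()
    csv.writer(out).writerows(refresh_rows(rows, stats))
    db.put_file("/buckets.csv", io.BytesIO(out.getvalue().encode("utf-8")),
                overwrite=True)
```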

4. Since I’ve created it to run in a virtual environment, I needed this to activate first before running the Python script, so I created this bash script (saved as run.sh; replace s3size.py with whatever you named the main script):

#! /bin/bash
cd /home/andy/apps/s3size
source bin/activate
python s3size.py

and set this line in the crontab to run it at midnight on the first of each month:

0 0 1 * * /home/andy/apps/s3size/run.sh

5. That’s it. Now, for any new user, all I need to do is add a new line to the CSV file in my Dropbox app folder; the Python app automatically fetches this every time it runs, and updates the number of files / size of the bucket before re-uploading to Dropbox.
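Adding a user really is just appending one line to the local copy before it syncs back to Dropbox; a throwaway helper might look like this (the path and zeroed usage columns are my assumptions – the real figures get filled in on the next monthly run):

```python
import csv


def add_user(path, bucket, username, email, allotted):
    """Append a new client row to the CSV; usage starts at zero."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([bucket, username, email, allotted, 0, 0])


# Example: add_user("buckets.csv", "newbucket", "bob", "bob@example.com", "1000000")
```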