Automate Fly Postgres Database Backups & Store Them in an S3 Bucket with GitHub Actions

Data security and reliability are paramount for any application’s success.

Regular backups play a crucial role in ensuring that our data remains safe and accessible.

Fly.io Postgres offers automated snapshots, but for added peace of mind we also rely on pg_dump backups.

GitHub Actions allows us to automate, customize, and execute software development workflows directly in our repositories.

In this blog post, we’ll explore how to automate backups of a Fly.io Postgres database using GitHub Actions and store them in a designated S3 bucket.

  • Fly Deploy Tokens:

    Deploy tokens provide secure, app-scoped authentication and authorization for flyctl; in this workflow, one is used to open a proxy to the database app. Before going further, please read about Fly Deploy Tokens and create a deploy token (see the sample commands after this list).

  • S3 Bucket Creation Steps

    Since the database backup files are stored in an S3 bucket, please follow the S3 Bucket Creation Steps to create a bucket along with an access key and secret key that can write to it (see the sample commands after this list).
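
For reference, here is roughly how these two prerequisites can be created from the command line. This is only a sketch: my-postgres-app is a placeholder for your Fly Postgres app name, and the bucket name and region simply mirror the values used later in the workflow; adjust them to your setup.

  # Create an app-scoped deploy token for the Postgres app (placeholder app name)
  flyctl tokens create deploy -a my-postgres-app

  # Create the S3 bucket the workflow will upload to (assumes the AWS CLI is configured)
  aws s3 mb s3://app-db-backup --region us-east-1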

Within our GitHub repository, head to the .github/workflows directory (create it if it doesn’t exist) and add a new YAML file named backup.yml.

Insert the following YAML content into the new backup.yml file:

  

  name: Back up database
  run-name: Task

  # every sunday
  on:
    schedule:
      - cron: "0 0 * * 0"
    workflow_dispatch:
  jobs:
    backup:
      runs-on: ubuntu-latest
      env:
        FLY_API_TOKEN: ${{ secrets.FLY_PRODUCTION_DB_TOKEN }}
        FLY_DB_APP: ${{ secrets.FLY_DB_APP_NAME }}
        PGUSER: postgres
        PGPASSWORD: ${{ secrets.PGPASSWORD }}
        PGDATABASE: app_production
        PGHOST: localhost
        PGPORT: 5434
        S3_BUCKET: app-db-backup

      steps:
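        # Configure s3cmd with the credentials used to upload the backup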
        - uses: s3-actions/s3cmd@v1.2.0
          with:
            provider: aws
            region: us-east-1
            access_key: ${{ secrets.S3_ACCESS_KEY }}
            secret_key: ${{ secrets.S3_SECRET_KEY }}
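        # Install flyctl so the job can open a proxy to the Fly Postgres app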
        - uses: superfly/flyctl-actions/setup-flyctl@master
        - name: Set filename
          run: echo "filename=db-$(date -u +"%Y-%m-%d-%H%M%S").sql" >> $GITHUB_ENV
        - name: Install PostgreSQL 15 client (provides pg_dump)
          run: |
            sudo sh -c 'echo "deb http://apt.postgresql.org/pub/repos/apt $(lsb_release -cs)-pgdg main" > /etc/apt/sources.list.d/pgdg.list'
            wget --quiet -O - https://www.postgresql.org/media/keys/ACCC4CF8.asc | sudo tee /etc/apt/trusted.gpg.d/pgdg.asc &>/dev/null
            sudo apt-get update
            sudo apt-get install -y postgresql-client-15
            pg_dump --version
        - name: Dump database, gzip, and upload to S3
          run: |
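            # Open a local proxy to the Fly Postgres app so pg_dump can connect on $PGPORT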
            flyctl proxy $PGPORT:5432 -a $FLY_DB_APP &
            sleep 3
            echo Dumping ...
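            # Custom-format dump (-F c) with blobs (-b), skipping ACLs (-x), uncompressed (-Z0) since gzip runs next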
            PGPASSWORD=${PGPASSWORD} pg_dump -h $PGHOST -p $PGPORT -x -U $PGUSER -F c -b -v  -Z0 -f ${{ env.filename }} ${PGDATABASE}
            gzip ${{ env.filename }}
            ls
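            # Upload the compressed dump to the S3 bucket as a private object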
            s3cmd put --acl-private ${{ env.filename }}.gz s3://$S3_BUCKET/${{ env.filename }}.gz
  

Set the values referenced above (FLY_PRODUCTION_DB_TOKEN, FLY_DB_APP_NAME, PGPASSWORD, S3_ACCESS_KEY and S3_SECRET_KEY) as GitHub Actions secrets in the repository settings.
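
If you prefer the command line to the repository settings page, the GitHub CLI can set the same secrets. A minimal sketch, assuming gh is installed and authenticated against the repository (each command prompts for the value):

  gh secret set FLY_PRODUCTION_DB_TOKEN
  gh secret set FLY_DB_APP_NAME
  gh secret set PGPASSWORD
  gh secret set S3_ACCESS_KEY
  gh secret set S3_SECRET_KEY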

The above GitHub Action runs every Sunday, dumps the database, and uploads the compressed backup to the S3 bucket. Thanks to workflow_dispatch, it can also be triggered manually from the Actions tab.
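
To try it out without waiting for Sunday, you can trigger the workflow manually and then check the bucket. A rough sketch, assuming the GitHub CLI and a locally configured s3cmd; the bucket name mirrors the S3_BUCKET value from the workflow:

  # Kick off the backup job on demand (uses the workflow_dispatch trigger)
  gh workflow run backup.yml

  # Once it finishes, confirm the compressed dump landed in the bucket
  s3cmd ls s3://app-db-backup/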
