
Amazon S3

Store backups in AWS S3 with support for storage classes, lifecycle policies, and multi-region durability.

Configuration

Credential Profile required

Amazon S3 requires a Credential Profile of type ACCESS_KEY. Create one in Settings → Vault → Credentials before saving the destination.

| Field | Description | Default | Required |
| --- | --- | --- | --- |
| Name | Friendly name for this destination | - | |
| Region | AWS region (e.g. `us-east-1`, `eu-central-1`) | `us-east-1` | |
| Bucket | S3 bucket name | - | |
| Primary Credential | ACCESS_KEY credential profile (Access Key ID + Secret Access Key) | - | |
| Path Prefix | Folder path within the bucket | - | |
| Storage Class | S3 storage class for uploaded objects | `STANDARD` | |

Storage Classes

| Class | Use Case |
| --- | --- |
| STANDARD | Frequent access (default) |
| STANDARD_IA | Infrequent access, lower cost |
| GLACIER | Long-term archive (retrieval in minutes to hours) |
| DEEP_ARCHIVE | Cheapest storage, retrieval in 12+ hours |
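As an illustration of how these classes map onto backup retention needs, here is a small heuristic (a hypothetical helper for this guide, not part of DBackup):

```python
def suggest_storage_class(retention_days: int, fast_restore: bool = True) -> str:
    """Illustrative heuristic: pick an S3 storage class from retention needs.

    fast_restore=True means restores must begin immediately; the archive
    classes (GLACIER, DEEP_ARCHIVE) trade retrieval delay for lower cost.
    """
    if fast_restore:
        # Immediate retrieval: STANDARD for recent copies, STANDARD_IA for older ones
        return "STANDARD" if retention_days <= 30 else "STANDARD_IA"
    # Retrieval delay acceptable: use an archive tier
    return "GLACIER" if retention_days <= 365 else "DEEP_ARCHIVE"
```

The 30- and 365-day thresholds are arbitrary examples; tune them to your own restore-time requirements and budget.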

Setup Guide

  1. Create an S3 bucket in your preferred region via the AWS Console
  2. Create an IAM user with programmatic access:
    • Go to IAM Console → Users → Create user
    • Attach the AmazonS3FullAccess policy (or a scoped policy - see below)
    • Create an Access Key (use case: "Application outside AWS") and copy both keys
  3. Create an ACCESS_KEY credential profile in Settings → Vault → Credentials with the Access Key ID and Secret Access Key (guide)
  4. Go to Destinations → Add Destination → Amazon S3
  5. Enter your Region and Bucket, then select the credential profile in the Primary Credential picker
  6. (Optional) Set a Path Prefix to organize backups in a subfolder
  7. (Optional) Select a Storage Class for cost optimization
  8. Click Test to verify the connection
Minimal IAM Policy (recommended)

Instead of AmazonS3FullAccess, scope permissions to a single bucket:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject", "s3:DeleteObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::your-bucket-name",
        "arn:aws:s3:::your-bucket-name/*"
      ]
    }
  ]
}
```
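If you provision IAM with scripts, the policy above can be rendered for any bucket name; a minimal sketch in Python:

```python
import json

def minimal_s3_policy(bucket: str) -> str:
    """Render the minimal single-bucket IAM policy shown above as JSON."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:PutObject", "s3:GetObject",
                           "s3:DeleteObject", "s3:ListBucket"],
                # ListBucket applies to the bucket ARN itself;
                # the object actions apply to keys inside it (/*).
                "Resource": [
                    f"arn:aws:s3:::{bucket}",
                    f"arn:aws:s3:::{bucket}/*",
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)
```

Note that both Resource entries are needed: `s3:ListBucket` is checked against the bucket ARN, while the object-level actions are checked against the `/*` ARN.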

How It Works

  • Backups upload via the AWS SDK using multipart upload for large files
  • All credentials are stored AES-256-GCM encrypted in the database
  • Storage class is set per-object at upload time
  • The Path Prefix creates a virtual folder structure within your bucket
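For example, an object key could be derived from the Path Prefix like this (a hypothetical sketch of the key layout, not DBackup's exact scheme):

```python
def backup_key(path_prefix: str, filename: str) -> str:
    """Join the configured Path Prefix and a backup file name into an S3 object key.

    S3 has no real folders: the '/' separators in the key are what the
    AWS Console and other clients render as a virtual folder structure.
    """
    prefix = path_prefix.strip("/")  # tolerate "backups/", "/backups", etc.
    return f"{prefix}/{filename}" if prefix else filename
```

With a Path Prefix of `backups/db`, a file `dump.sql.gz` would land at the key `backups/db/dump.sql.gz` in the bucket.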

Troubleshooting

AccessDenied

Access Denied (403)

Solution: Verify the IAM user has s3:PutObject, s3:GetObject, s3:DeleteObject, and s3:ListBucket permissions on the correct bucket ARN.

NoSuchBucket

The specified bucket does not exist

Solution: Check the bucket name spelling. S3 bucket names are globally unique, and newly created bucket names must be all lowercase.

InvalidAccessKeyId

The AWS Access Key Id you provided does not exist in our records

Solution: Regenerate the access key in IAM Console. Ensure there are no leading/trailing spaces when pasting.

Slow Uploads / Timeout

Solution: Choose a region geographically close to your DBackup server. For large backups, ensure your server has sufficient upload bandwidth.
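Very large backups are also bounded by S3's multipart limits: at most 10,000 parts per object and a 5 MiB minimum part size, so the part size must grow with the archive. A sketch of that calculation (illustrative only, not DBackup's actual tuning):

```python
MiB = 1024 * 1024
S3_MAX_PARTS = 10_000       # S3 limit: parts per multipart upload
S3_MIN_PART_SIZE = 5 * MiB  # S3 limit: minimum part size (except the last part)

def pick_part_size(file_size: int, preferred: int = 8 * MiB) -> int:
    """Smallest part size >= preferred that keeps the upload under 10,000 parts."""
    part_size = max(preferred, S3_MIN_PART_SIZE)
    while file_size > part_size * S3_MAX_PARTS:
        part_size *= 2  # double until the part count fits within the limit
    return part_size
```

With the default 8 MiB parts, any backup over roughly 78 GiB would exceed 10,000 parts, so a larger part size is chosen automatically.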
