A tool to find open S3 buckets and dump their contents :droplet:
```
usage: s3scanner [-h] [-o OUTFILE] [-d] [-l] [--version] buckets

#  s3scanner - Find S3 buckets and dump!
#
#  Author: Dan Salmon - @bltjetpack, github.com/sa7mon

positional arguments:
  buckets               Name of text file containing buckets to check

optional arguments:
  -h, --help            show this help message and exit
  -o OUTFILE, --out-file OUTFILE
                        Name of file to save the successfully checked
                        buckets in (Default: buckets.txt)
  -d, --dump            Dump all found open buckets locally
  -l, --list            Save bucket file listing to local file:
                        ./list-buckets/${bucket}.txt
  --version             Display the current version of this tool
```
The tool takes in a list of bucket names to check. Found S3 buckets are written to an output file, and the tool can also dump or list the contents of 'open' buckets locally.
This tool will attempt to get all available information about a bucket, but it's up to you to interpret the results.
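As a rough illustration (not necessarily S3Scanner's actual internals), a bucket's state can be inferred from the HTTP status of an unauthenticated request to its URL: S3 answers 404 for names that don't exist, 403 for buckets that exist but deny anonymous access, and 200 when anonymous listing is allowed. The function below is a hypothetical sketch of that classification:

```python
def classify_bucket(status_code):
    """Classify a bucket from the HTTP status code returned by an
    unauthenticated GET on http://<bucket>.s3.amazonaws.com/."""
    if status_code == 404:
        return "nonexistent"  # S3 returns NoSuchBucket
    if status_code == 403:
        return "closed"       # bucket exists, anonymous access denied
    if status_code == 200:
        return "open"         # anonymous listing allowed
    return "unknown"          # e.g. 301 redirect for a wrong-region request
```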
Settings available for buckets:
Any or all of these permissions can be set for the two main user groups: all users (anonymous) and authenticated AWS users.
What this means: just because a bucket returns "AccessDenied" for its ACL doesn't mean you can't read from or write to it. Conversely, you may be able to list a bucket's ACL but not read from or write to the bucket.
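To make the ACL side concrete: an S3 ACL is a list of grants, each pairing a grantee with a permission (`READ`, `WRITE`, `READ_ACP`, `WRITE_ACP`, or `FULL_CONTROL`). The sketch below (the helper name is hypothetical; the input mirrors the shape of the `Grants` list that boto3's `get_bucket_acl()` returns) extracts the permissions granted to the anonymous AllUsers group:

```python
# URI S3 uses for the anonymous "everyone" group
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_permissions(grants):
    """Return the set of permissions granted to the AllUsers group.

    `grants` has the shape of the 'Grants' list returned by
    boto3's get_bucket_acl(): [{'Grantee': {...}, 'Permission': ...}, ...]
    """
    return {
        g["Permission"]
        for g in grants
        if g["Grantee"].get("Type") == "Group"
        and g["Grantee"].get("URI") == ALL_USERS
    }

sample = [
    {"Grantee": {"Type": "Group", "URI": ALL_USERS}, "Permission": "READ"},
    {"Grantee": {"Type": "CanonicalUser", "ID": "abc123"},
     "Permission": "FULL_CONTROL"},
]
```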
```
virtualenv venv && source ./venv/bin/activate
pip install -r requirements.txt
python ./s3scanner.py
```
(Compatibility has been tested with Python 2.7 and 3.6)
Build the Docker image:
```
sudo docker build -t s3scanner https://github.com/sa7mon/S3Scanner.git
```
Run the Docker image:
```
sudo docker run -v /input-data-dir/:/data s3scanner --out-file /data/results.txt /data/names.txt
```
This command assumes that names.txt, containing the domains to enumerate, is in /input-data-dir/ on the host machine.
This tool accepts the following bucket formats to check:

- Bucket name: `google-dev`
- Domain name: `uber.com`, `sub.domain.com`
- Full S3 URL: `yahoo-staging.s3-us-west-2.amazonaws.com` (to easily combine with other tools like bucket-stream)
- Bucket name with region: `flaws.cloud:us-west-2`
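Normalizing these input formats down to a bucket name plus an optional region could look like the sketch below (the parse rules are inferred from the example formats above; the function name is illustrative):

```python
import re

def parse_bucket_line(line):
    """Reduce one input line to a (bucket_name, region_or_None) pair."""
    line = line.strip()
    # Full S3 URL: <bucket>.s3-<region>.amazonaws.com
    m = re.match(
        r"^(?P<bucket>.+)\.s3[.-](?P<region>[a-z0-9-]+)\.amazonaws\.com$",
        line,
    )
    if m:
        return m.group("bucket"), m.group("region")
    # bucket:region shorthand
    if ":" in line:
        bucket, region = line.split(":", 1)
        return bucket, region
    # Bare bucket name or domain name, no region given
    return line, None
```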
```
> cat names.txt
flaws.cloud
google-dev
testing.microsoft.com
yelp-production.s3-us-west-1.amazonaws.com
github-dev:us-east-1
```
Dump all open buckets, and log both open and closed buckets to found.txt:

```
> python ./s3scanner.py --include-closed --out-file found.txt --dump names.txt
```
Just log open buckets to the default output file (buckets.txt):

```
> python ./s3scanner.py names.txt
```
Save file listings of all open buckets to file:

```
> python ./s3scanner.py --list names.txt
```
Issues are welcome and Pull Requests are appreciated. All contributions should be compatible with both Python 2.7 and 3.6.
(Build status badges for the `master`, `enhancements`, and `bugs` branches.)
Tests are defined in `test_scanner.py`. Run them in parallel with:

```
pytest -n NUM
```

where NUM is the number of parallel processes (this requires the pytest-xdist plugin). To run a single test:

```
pytest -q -s test_scanner.py::test_namehere
```
License: [MIT](https://opensource.org/licenses/MIT)