
I have multiple AWS accounts and I need to list all S3 buckets per account and then view each bucket's total size.

Currently, I can only view the storage size of a single S3 bucket with:

aws s3 ls s3://mybucket --recursive --human-readable --summarize
Kyle Steenkamp

6 Answers


Resolution 1

So I solved this with the following script. I originally posted the question just in case there was an easier way that I was not aware of.

#!/bin/bash
aws_profile=('profile1' 'profile2' 'profile3')

# loop over AWS profiles
for i in "${aws_profile[@]}"; do
  echo "${i}"
  buckets=($(aws s3 ls s3:// --recursive --profile "${i}" --region your_region | awk '{print $3}'))

  # loop over the S3 buckets in this profile
  for j in "${buckets[@]}"; do
    echo "${j}"
    aws s3 ls s3://"${j}" --recursive --human-readable --summarize --profile "${i}" --region your_region | awk END'{print}'
  done
done

Resolution 2

Using Dashboards in CloudWatch in the AWS console.

You can then simply specify all S3 buckets and add the Number statistic widgets to show the storage size metrics.

This avoids spending a large number of API calls and can be significantly faster depending on the size of the S3 buckets (it takes quite a while to get the size of very large buckets).
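The dashboard reads the daily `BucketSizeBytes` metric from the `AWS/S3` namespace, and the same number can be pulled from the CLI without building a dashboard. A minimal sketch, assuming GNU date and placeholder bucket/region/storage-class values:

```shell
#!/bin/bash
# Sketch: query the daily S3 storage metric that CloudWatch already collects.
# The bucket name argument, your_region and StandardStorage are assumptions;
# adjust the StorageType dimension if the bucket uses other storage classes.
bucket_size_from_cloudwatch() {
  local bucket="$1"
  aws cloudwatch get-metric-statistics \
    --namespace AWS/S3 \
    --metric-name BucketSizeBytes \
    --dimensions Name=BucketName,Value="${bucket}" Name=StorageType,Value=StandardStorage \
    --start-time "$(date -u -d '2 days ago' +%Y-%m-%dT%H:%M:%SZ)" \
    --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
    --period 86400 \
    --statistics Average \
    --region your_region
}
# Usage: bucket_size_from_cloudwatch mybucket
```

This is one API call per bucket rather than one per object, which is where the speed difference against `aws s3 ls --recursive` comes from.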

Verdict

Creating the Dashboard (Resolution 2) on each AWS account was the most efficient option for me, because it is far quicker to log in and grab the metrics manually from each AWS account than to wait for the script's API calls to finish. :(

A. Kendall
Kyle Steenkamp
  • Why | awk END'{print}' ? – Tensibai Oct 04 '17 at 14:57
  • Using the recursive option shows the size of every folder and file and I only need the output of the total size of the bucket. – Kyle Steenkamp Oct 04 '17 at 15:08
  • You need to select a '1 day' or greater period for the dashboard to display anything. – Jeremy Leipzig Jul 16 '18 at 17:38
  • Beware that the solution proposed in Resolution 2 increases your CloudWatch cost, since dashboards have a $3 cost each at the moment. https://aws.amazon.com/cloudwatch/pricing/?nc1=h_ls – Drubio Mar 06 '19 at 17:52
  • What does the [@] in "${buckets[@]}" mean? – Joe Mar 21 '20 at 20:14
  • @Joe: the @ in ${buckets[@]} represents all elements in the array. If you leave off the [@] the loop will only work on the first element. See the section "Looping through arrays" here: https://opensource.com/article/18/5/you-dont-know-bash-intro-bash-arrays#:~:text=looping%20through%20arrays – Russell G Jul 01 '21 at 13:22
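To illustrate the [@] point from the comments, a standalone sketch (the array contents are arbitrary placeholder names):

```shell
#!/bin/bash
# "${arr[@]}" expands to every element of the array, one word per element.
buckets=('bucket-a' 'bucket-b' 'bucket-c')   # placeholder names

for b in "${buckets[@]}"; do
  echo "$b"        # prints bucket-a, bucket-b, bucket-c on separate lines
done

# Without [@], the expansion is only the first element:
echo "${buckets}"  # prints: bucket-a
```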

You will need to write a script that parses and queries this data, because as far as I'm aware there is no tool or CLI function that performs this. Luckily, you can gather all of this information with the CLI.

  1. List and parse all of the accounts in your org.

    aws organizations list-accounts
    
  2. For each account, list and parse all of the buckets.

    aws s3api list-buckets --query "Buckets[].Name"
    
  3. Finally, get the size of each bucket within each account. You can use the same CLI command you were using before, but be warned that it lists the individual size of each item within the bucket. You can also use this CLI command to get the bucket size.

    aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
    
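The steps above can be combined into one loop. A sketch, assuming each member account is already configured as a named CLI profile (profile1/profile2 are placeholders), since list-accounts alone does not give you credentials for the member accounts:

```shell
#!/bin/bash
# Sketch combining steps 2 and 3 above across several accounts.
# One CLI profile per account is assumed; pass the profile names as arguments.
list_bucket_sizes() {
  for profile in "$@"; do
    echo "Account profile: ${profile}"
    for bucket in $(aws s3api list-buckets --query "Buckets[].Name" --output text --profile "${profile}"); do
      echo "  ${bucket}:"
      # prints [total bytes, object count] for this bucket
      aws s3api list-objects --bucket "${bucket}" --output json \
        --query "[sum(Contents[].Size), length(Contents[])]" --profile "${profile}"
    done
  done
}
# Usage: list_bucket_sizes profile1 profile2
```

Note that list-objects pages through the bucket contents, so this still costs one API call per thousand objects; for very large buckets the CloudWatch metric is cheaper.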
Preston Martin
  • Does not list the profile information of the AWS account. I did not have time to regex this from the aws config file where this information is stored, so I just hardcoded the values in the script I posted below. – Kyle Steenkamp Oct 04 '17 at 14:44