
Saturday, October 19, 2024

How to List S3 Buckets and Objects Using AWS CLI

Amazon Simple Storage Service (S3) is a scalable cloud storage solution provided by AWS, widely used for storing data of all kinds. Whether you are managing backups, application files, or large datasets, the AWS CLI (Command Line Interface) is an essential tool for quickly interacting with S3. One of the most frequent tasks is listing buckets and objects in your S3 storage.

In this article, we’ll guide you through various methods of listing your S3 buckets and their contents using AWS CLI. We will explain each command and provide examples to help you get started quickly.

Prerequisites

  • AWS CLI is installed: You can install it from the AWS CLI installation guide.
  • AWS CLI is configured: Run the command aws configure to set up your credentials (Access Key, Secret Access Key, Region, etc.).
  • Necessary permissions: Make sure your IAM user or role has permission to list buckets and their contents. Listing all buckets requires s3:ListAllMyBuckets, and listing the objects inside a bucket requires s3:ListBucket (see the example policy below).
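
As a reference, here is a minimal IAM policy sketch granting both of these permissions; bucket-name is a placeholder for your actual bucket name:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListAllBuckets",
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Sid": "ListObjectsInBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::bucket-name"
    }
  ]
}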

1. Listing All S3 Buckets

To list all the S3 buckets in your AWS account, use the following command:

aws s3 ls

This command will return a list of all S3 buckets with their creation dates.

Example Output:

2023-10-12 12:34:56 bucket-name-1
2023-09-10 08:21:33 bucket-name-2
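
If you only need the bucket names, for example to feed them into a script, a quick alternative is the lower-level s3api command with a JMESPath query (this prints only the names, without creation dates):

aws s3api list-buckets --query "Buckets[].Name" --output text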

2. Listing Contents of a Specific S3 Bucket

If you want to list all the objects in a specific bucket, you can append the bucket name to the command:

aws s3 ls s3://bucket-name

Replace bucket-name with the actual name of your S3 bucket.

Example Output:

2024-01-10 14:20:15    1024 file1.txt
2024-01-10 14:30:25    2048 file2.txt
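
For very large buckets, the lower-level s3api command can cap how many objects are returned at once; a small sketch using the same bucket-name placeholder:

aws s3api list-objects-v2 --bucket bucket-name --max-items 10

This returns the listing as JSON and, if the listing is truncated, includes a NextToken you can pass back via --starting-token to continue where the previous call left off.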

3. Listing Objects in a Specific Folder

S3 uses a flat namespace, but key prefixes act as virtual directories (folders). To list the contents of a specific folder within a bucket, append the folder name, including the trailing slash:

aws s3 ls s3://bucket-name/folder-name/

Example Output:

2024-02-15 15:10:05    512  folder-name/file3.jpg
2024-02-16 10:12:45   1024  folder-name/file4.pdf
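
Under the hood, a "folder" is simply a key prefix. The same listing can be expressed with s3api, which makes the prefix explicit; a sketch using the placeholders from above:

aws s3api list-objects-v2 --bucket bucket-name --prefix folder-name/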

4. Listing Objects Recursively

To list all objects in a bucket, including those stored in subdirectories, use the --recursive option:

aws s3 ls s3://bucket-name --recursive

Example Output:

2024-01-10 14:20:15    1024 folder1/file1.txt
2024-01-10 14:30:25    2048 folder2/file2.txt
2024-01-11 09:15:10    512  folder2/subfolder/file3.jpg
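
Because each object appears on its own line, the recursive listing combines well with standard shell tools. For example, to count the objects in a bucket:

aws s3 ls s3://bucket-name --recursive | wc -l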

5. Listing with Human-Readable File Sizes

To view file sizes in a human-readable format (e.g., KB, MB, GB), use the --human-readable option:

aws s3 ls s3://bucket-name --human-readable

Example Output:

2024-01-10 14:20:15   1.0 KiB folder1/file1.txt
2024-01-10 14:30:25   2.0 KiB folder2/file2.txt
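
The option can be combined with the ones above. For example, a full recursive listing with readable sizes and a closing summary:

aws s3 ls s3://bucket-name --recursive --human-readable --summarize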

6. Summarizing Total Files and Sizes

To get a summary of the total number of objects and their cumulative size in a bucket, use the --summarize option along with --recursive. Note that the total size is reported in bytes unless you also add --human-readable:

aws s3 ls s3://bucket-name --recursive --summarize

Example Output:

2024-01-10 14:20:15    1024 folder1/file1.txt
2024-01-10 14:30:25    2048 folder2/file2.txt

Total Objects: 2
Total Size: 3072
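
Since the size is the third column of each line, you can also pipe the recursive listing through sort to surface the largest objects, for example the five biggest:

aws s3 ls s3://bucket-name --recursive | sort -k3 -n | tail -5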

7. Filtering Results by File Name Pattern

Note that aws s3 ls does not support wildcard filters on its own; the --exclude and --include options belong to commands such as aws s3 cp, aws s3 sync, and aws s3 rm. A simple way to filter a recursive listing by file name pattern is to pipe the output through grep:

aws s3 ls s3://bucket-name --recursive | grep '\.txt$'

This command will only list .txt files, excluding other file types.
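
Alternatively, a sketch using s3api with a JMESPath query expression; note that the filtering happens client-side, after the full listing is returned:

aws s3api list-objects-v2 --bucket bucket-name --query "Contents[?ends_with(Key, '.txt')].Key" --output text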

Common Errors and How to Fix Them

  • Access Denied Error: Ensure that your IAM user has the necessary permissions. You need s3:ListAllMyBuckets to list buckets, s3:ListBucket to list a bucket's contents, and possibly other permissions for more advanced actions.
  • No Such Bucket: Verify that the bucket name is correct and exists in the region you’re working in.
  • CLI Configuration Issues: Ensure the AWS CLI is properly configured using aws configure, and check that you’re using the correct AWS profile if necessary (see the quick checks below).
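
For the configuration issues above, two quick checks show which settings and identity the CLI is actually using:

aws configure list
aws sts get-caller-identity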

Using the AWS CLI to list S3 buckets and objects is a powerful way to interact with your storage without needing to navigate the AWS Management Console. Whether you're listing all buckets, viewing files in a folder, or summarizing the total size of a bucket, these commands provide flexibility and control over your cloud storage operations.

By mastering these CLI commands, you can streamline your cloud management processes and handle S3 tasks more efficiently, saving both time and effort.