In this article, we explore how to configure read-only access and file upload permissions for a specific AWS S3 bucket, ensuring clients can only interact with their designated storage space.
Key Steps for AWS S3 Bucket Access Control
1. Create an IAM User
- Example: `cliente-bradesco-bi`
- Assign a secure password (e.g., `rdfe@$4125`).
2. Define a Custom IAM Policy
- Policy Name: `acesso-s3-bradesco-bi`
- JSON Policy Example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::cliente-bradesco-bi",
        "arn:aws:s3:::cliente-bradesco-bi/*"
      ]
    }
  ]
}
3. Attach the Policy to the IAM User
- Navigate to IAM > Users > cliente-bradesco-bi > Permissions > Add Permissions.
4. Grant AWS Console Access (Optional)
- Enable console login for the user if needed.
5. Test Access Restrictions
- The user should only be able to see and interact with the `cliente-bradesco-bi` bucket.
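Steps 2 and 3 above can also be done programmatically. A minimal sketch with boto3 (the helper names `build_bucket_policy` and `attach_policy` are our own; note that `s3:ListBucket` applies to the bucket ARN while the object actions need the `/*` object ARN, so splitting the statement as below is slightly tighter than the single-statement policy shown above):

```python
import json


def build_bucket_policy(bucket: str) -> dict:
    """Build a least-privilege policy allowing list/get/put on one bucket."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                # Bucket-level action: listing keys targets the bucket ARN
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                "Resource": [f"arn:aws:s3:::{bucket}"],
            },
            {
                # Object-level actions: get/put target the /* object ARN
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/*"],
            },
        ],
    }


def attach_policy(iam_client, user: str, policy_name: str, bucket: str) -> None:
    """Attach the inline policy to an IAM user (caller needs IAM permissions)."""
    iam_client.put_user_policy(
        UserName=user,
        PolicyName=policy_name,
        PolicyDocument=json.dumps(build_bucket_policy(bucket)),
    )


# Print the generated policy document
print(json.dumps(build_bucket_policy("cliente-bradesco-bi"), indent=2))
# To apply it (requires credentials with IAM access):
# attach_policy(boto3.client("iam"), "cliente-bradesco-bi",
#               "acesso-s3-bradesco-bi", "cliente-bradesco-bi")
```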
You Should Know: AWS CLI and Automation
AWS CLI Commands for Bucket Management
- List Objects in the Permitted Bucket (a plain `aws s3 ls` across all buckets will be denied unless the policy also grants `s3:ListAllMyBuckets`)
aws s3 ls s3://cliente-bradesco-bi --profile cliente-bradesco-bi
Upload a File via CLI
aws s3 cp local-file.txt s3://cliente-bradesco-bi/ --profile cliente-bradesco-bi
Download a File
aws s3 cp s3://cliente-bradesco-bi/remote-file.txt . --profile cliente-bradesco-bi
Check User Permissions
aws iam list-user-policies --user-name cliente-bradesco-bi
Automating with Python (Boto3)
import boto3

s3 = boto3.client(
    's3',
    aws_access_key_id='ACCESS_KEY',
    aws_secret_access_key='SECRET_KEY'
)

# List objects in the bucket
response = s3.list_objects_v2(Bucket='cliente-bradesco-bi')
for obj in response.get('Contents', []):  # 'Contents' is absent when the bucket is empty
    print(obj['Key'])

# Upload a file
s3.upload_file('local-file.txt', 'cliente-bradesco-bi', 'remote-file.txt')
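Note that `list_objects_v2` returns at most 1,000 keys per call; for larger buckets you need a paginator. A minimal sketch (the `iter_keys` helper name is our own):

```python
def iter_keys(s3_client, bucket: str):
    """Yield every object key in the bucket, following pagination."""
    paginator = s3_client.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        # 'Contents' is absent on empty pages
        for obj in page.get("Contents", []):
            yield obj["Key"]


# Usage: for key in iter_keys(boto3.client("s3"), "cliente-bradesco-bi"): print(key)
```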
What Undercode Say
Managing AWS S3 permissions is crucial for security and efficiency. Always:
- Use least-privilege access.
- Avoid hardcoding credentials; use IAM roles where possible.
- Monitor bucket activity with AWS CloudTrail.
- Automate deployments using Terraform or AWS CloudFormation.
For Linux admins, integrating AWS CLI with cron jobs can help automate backups:
0 3 * * * /usr/bin/aws s3 sync /backup s3://cliente-bradesco-bi/backups/
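The same sync can be wrapped in a small Python script, which makes it easier to add logging or error handling around the cron job. A minimal sketch (the helper names are our own; assumes the AWS CLI is on the PATH):

```python
import subprocess


def build_sync_cmd(src: str, bucket: str, prefix: str = "backups/") -> list:
    """Build the `aws s3 sync` command line for the given source and bucket."""
    return ["aws", "s3", "sync", src, f"s3://{bucket}/{prefix}"]


def sync_backup(src: str, bucket: str) -> int:
    """Run the sync and return the CLI exit code (non-zero means failure)."""
    return subprocess.run(build_sync_cmd(src, bucket)).returncode


# Usage: sync_backup("/backup", "cliente-bradesco-bi")
```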
Windows users can leverage PowerShell for S3 operations:
Read-S3Object -BucketName cliente-bradesco-bi -Key "file.txt" -File "C:\downloads\file.txt"
Expected Output:
A secure, restricted AWS S3 bucket where clients can only access their designated files, enforced via IAM policies and automation scripts.
References:
Reported By: Wanderson Silva – Hackers Feeds