Thursday, August 18, 2016

IAM policy to grant access to user specific folder

Let's talk about a situation where one needs to give a user access to a specific folder under an S3 bucket. It is somewhat similar to folder-level permissions on a *nix-based system, where a user has access to his/her 'home' directory.


Scenario:

A. Objective: give a specific user access to a specific folder under an S3 bucket;
User name: s3-sub (a least-privilege user)
Bucket: test-bucket-bijit
Sub folder: /test-bucket-bijit/s3-sub-home

B. Criteria: User "s3-sub" should have only read-only access on the bucket "test-bucket-bijit", but full access on "/test-bucket-bijit/s3-sub-home".

C. Resolution:

Let's create a custom inline policy for the user that accomplishes the objective above.
        Make sure you validate it using the IAM policy validator.

The following policy has two statements:

Statement 1: read-only access (s3:Get* and s3:List*) is granted to the user on the bucket "test-bucket-bijit", and

Statement 2: the user is allowed to perform all S3 actions within "/test-bucket-bijit/s3-sub-home/".

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowReadOnlyAccessOnBucket",
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Resource": [
        "arn:aws:s3:::test-bucket-bijit",
        "arn:aws:s3:::test-bucket-bijit/*"
      ]
    },
    {
      "Sid": "AllowAllS3ActionsInUserFolder",
      "Effect": "Allow",
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::test-bucket-bijit/s3-sub-home/*"
      ]
    }
  ]
}
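
If you prefer the CLI over the console for this step, a minimal sketch of attaching the above as an inline policy is shown here (assuming the JSON is saved locally as s3-sub-home-policy.json; the policy name is arbitrary):

$ aws iam put-user-policy --user-name s3-sub \
    --policy-name s3-sub-home-access \
    --policy-document file://s3-sub-home-policy.json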

Wednesday, August 17, 2016

Managing Amazon S3 using AWS-CLI

This small write-up is about how one can use the AWS CLI tool to manage an S3 bucket from your laptop.

Let's assume you have an AWS account with an IAM user (a user who has limited access to AWS resources), say "s3-limited", and this user belongs to a custom group "limited-access-group" which has the "AmazonS3FullAccess" policy attached to it.

Thus, this particular user is capable of carrying out all the operations like read, write, remove etc. on an Amazon S3 bucket.
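
For reference, a rough sketch of how such a user/group setup could be created from the CLI (run with credentials that have IAM permissions; the names simply mirror the ones above):

$ aws iam create-group --group-name limited-access-group
$ aws iam attach-group-policy --group-name limited-access-group \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
$ aws iam create-user --user-name s3-limited
$ aws iam add-user-to-group --group-name limited-access-group --user-name s3-limited
$ aws iam create-access-key --user-name s3-limited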

Let's talk about how this can be achieved using the AWS CLI tool.

You can check how to install the AWS CLI tool here.

Once the installation is done, it is time to configure the tool (please keep your AWS access key and AWS secret key handy):

1. Configure a specific AWS user (eg. s3-limited):

$ aws configure --profile s3-limited
AWS Access Key ID [None]: <provide the access key>
AWS Secret Access Key [None]: <provide secret key>
Default region name [None]: <provide region name>
Default output format [None]:
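
Behind the scenes, this stores the named profile in ~/.aws/credentials and ~/.aws/config, roughly like the following (the key values below are placeholders):

~/.aws/credentials:
[s3-limited]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

~/.aws/config:
[profile s3-limited]
region = us-east-1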

2. Now, let's test the setup:

i. Create a bucket (aws s3 mb):

$ aws s3 mb s3://test-bucket-bijit  --profile s3-limited
make_bucket: s3://test-bucket-bijit/


ii. List buckets (aws s3 ls):
$ aws s3 ls --profile s3-limited
2016-08-11 13:36:05 test-bucket-bijit

iii. Put some content in the bucket:
I have uploaded a test file using the AWS web interface (S3 console).

iv. List the contents of that bucket:
$ aws s3 ls s3://test-bucket-bijit --profile s3-limited
2016-08-11 13:41:30         11 test-file-bijit.txt


v. Let's push a file to the S3 bucket:

$ aws s3 cp xx.txt s3://test-bucket-bijit/ --profile s3-limited
upload: ./xx.txt to s3://test-bucket-bijit/xx.txt

vi. List contents:
$ aws s3 ls s3://test-bucket-bijit/ --profile s3-limited
2016-08-11 13:41:30         11 test-file-bijit.txt
2016-08-11 13:53:39          0 xx.txt

vii. Let's copy (download) a file from S3 to a local directory:
$ aws s3 cp s3://test-bucket-bijit/xx.txt . --profile s3-limited
download: s3://test-bucket-bijit/xx.txt to ./xx.txt
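
To pull down the whole bucket rather than a single file, "aws s3 sync" can be used; a quick sketch ("local-copy" is just an arbitrary local directory name):

$ aws s3 sync s3://test-bucket-bijit ./local-copy --profile s3-limited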


Please note: you can also copy files between two S3 buckets.
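
For example, a bucket-to-bucket copy looks like this ("another-bucket" is a placeholder for a second bucket this profile can write to):

$ aws s3 cp s3://test-bucket-bijit/xx.txt s3://another-bucket/xx.txt --profile s3-limited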