Retrieving subfolder names in an S3 bucket with boto3

The piece of code below returns ONLY the 'subfolders' in a 'folder' of an S3 bucket.

import boto3

bucket = 'my-bucket'
# Make sure you provide / at the end
prefix = 'prefix-name-with-slash/'

client = boto3.client('s3')
result = client.list_objects(Bucket=bucket, Prefix=prefix, Delimiter='/')
for o in result.get('CommonPrefixes'):
    print('sub folder : ', o.get('Prefix'))

For more details, you can refer to https://github.com/boto/boto3/issues/134
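Note that list_objects returns at most 1,000 keys per call. A minimal sketch (not part of the original answer) that uses a boto3 paginator so common prefixes beyond the first page are not missed; the bucket and prefix names are placeholders:

import boto3

client = boto3.client('s3')
paginator = client.get_paginator('list_objects_v2')

# Paginate so 'subfolders' beyond the first 1,000 keys are still returned
pages = paginator.paginate(Bucket='my-bucket', Prefix='prefix-name-with-slash/', Delimiter='/')
for page in pages:
    for cp in page.get('CommonPrefixes', []):
        print('sub folder : ', cp.get('Prefix'))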

Security of REST authentication schemes

A previous answer only mentioned SSL in the context of data transfer and didn’t actually cover authentication. You’re really asking about securely authenticating REST API clients. Unless you’re using TLS client authentication, SSL alone is NOT a viable authentication mechanism for a REST API. SSL without client authc only authenticates the server, which is irrelevant … Read more
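As an illustration of the TLS client authentication mentioned above (my own sketch, not part of the answer), a Python client can present a certificate so the server can authenticate the client as well; the URL and file paths are placeholders:

import requests

# The server authenticates the client by verifying this certificate/key pair,
# while the client validates the server against the given CA bundle.
response = requests.get(
    'https://api.example.com/resource',
    cert=('client.crt', 'client.key'),  # client-side TLS certificate and private key
    verify='ca-bundle.pem',             # CA bundle used to verify the server
)
print(response.status_code)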

How to write a file or data to an S3 object using boto3

In boto 3, the 'Key.set_contents_from_' methods were replaced by

Object.put()
Client.put_object()

For example:

import boto3

some_binary_data = b'Here we have some data'
more_binary_data = b'Here we have some more data'

# Method 1: Object.put()
s3 = boto3.resource('s3')
object = s3.Object('my_bucket_name', 'my/key/including/filename.txt')
object.put(Body=some_binary_data)

# Method 2: Client.put_object()
client = boto3.client('s3')
client.put_object(Body=more_binary_data, Bucket='my_bucket_name', Key='my/key/including/anotherfilename.txt')

Alternatively, the binary … Read more
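If the data lives in a local file rather than a bytes object, boto3 also provides upload_file / upload_fileobj; a quick sketch (my own, not from the excerpt above) with placeholder file, bucket, and key names:

import boto3

client = boto3.client('s3')

# Upload a local file by path (handles multipart transfers for large files)
client.upload_file('local-file.txt', 'my_bucket_name', 'my/key/including/local-file.txt')

# Or upload from any binary file-like object
with open('local-file.txt', 'rb') as fileobj:
    client.upload_fileobj(fileobj, 'my_bucket_name', 'my/key/including/another-local-file.txt')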

Make a bucket public in Amazon S3

You can set a bucket policy as detailed in this blog post: http://ariejan.net/2010/12/24/public-readable-amazon-s3-bucket-policy/

As per @robbyt's suggestion, create a bucket policy with the following JSON:

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": { "AWS": "*" },
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucket/*"]
    }
  ]
}

Important: replace bucket in … Read more
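A minimal sketch (my own, not from the answer) that applies the same policy with boto3; 'my-bucket' is a placeholder and must match the bucket named in the Resource ARN:

import json
import boto3

bucket = 'my-bucket'
policy = {
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "AllowPublicRead",
            "Effect": "Allow",
            "Principal": {"AWS": "*"},
            "Action": ["s3:GetObject"],
            "Resource": [f"arn:aws:s3:::{bucket}/*"],
        }
    ],
}

client = boto3.client('s3')
# Attach the public-read policy to the bucket
client.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))

Note that newer buckets may also require relaxing S3 Block Public Access before a public policy is accepted.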

Correct S3 + Cloudfront CORS Configuration?

On June 26, 2014 AWS released proper Vary: Origin behavior on CloudFront, so now you just:

Set a CORS configuration for your S3 bucket, including <AllowedOrigin>*</AllowedOrigin>.

In CloudFront -> Distribution -> Behaviors for this origin:
Allowed HTTP Methods: +OPTIONS
Cached HTTP Methods: +OPTIONS
Cache Based on Selected Request Headers: whitelist the Origin header.

Wait for … Read more
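The bucket-side CORS configuration can also be set from boto3 instead of the console; a rough sketch under the assumption that GET/HEAD from any origin is acceptable ('my-bucket' is a placeholder):

import boto3

client = boto3.client('s3')

# Equivalent of <AllowedOrigin>*</AllowedOrigin> in the XML configuration
cors_configuration = {
    'CORSRules': [
        {
            'AllowedOrigins': ['*'],
            'AllowedMethods': ['GET', 'HEAD'],
            'AllowedHeaders': ['*'],
            'MaxAgeSeconds': 3000,
        }
    ]
}

client.put_bucket_cors(Bucket='my-bucket', CORSConfiguration=cors_configuration)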

Can I stream a file upload to S3 without a content-length header?

You have to upload your file in 5MiB+ chunks via S3's multipart API. Each of those chunks requires a Content-Length, but you can avoid loading huge amounts of data (100MiB+) into memory.

Initiate the S3 multipart upload.
Gather data into a buffer until that buffer reaches S3's lower chunk-size limit (5MiB).
Generate the MD5 checksum while building … Read more
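A rough sketch of that approach with boto3 (my own, not the original answer's code): stream from a file-like object of unknown length, buffering 5 MiB at a time so only one chunk is ever held in memory. Bucket, key, and the source stream are placeholders, and error handling (aborting the upload on failure, empty input) is omitted:

import sys
import boto3

bucket = 'my-bucket'
key = 'streamed/upload.bin'
chunk_size = 5 * 1024 * 1024  # S3's minimum part size (except for the last part)

def read_chunk(stream, size):
    # Accumulate up to `size` bytes, looping until EOF or the buffer is full
    buf = b''
    while len(buf) < size:
        piece = stream.read(size - len(buf))
        if not piece:
            break
        buf += piece
    return buf

client = boto3.client('s3')
upload = client.create_multipart_upload(Bucket=bucket, Key=key)

parts = []
part_number = 1
stream = sys.stdin.buffer  # any binary file-like object without a known total length

while True:
    chunk = read_chunk(stream, chunk_size)
    if not chunk:
        break
    # Each part carries its own Content-Length; only this chunk is in memory
    response = client.upload_part(
        Bucket=bucket, Key=key, UploadId=upload['UploadId'],
        PartNumber=part_number, Body=chunk,
    )
    parts.append({'PartNumber': part_number, 'ETag': response['ETag']})
    part_number += 1

client.complete_multipart_upload(
    Bucket=bucket, Key=key, UploadId=upload['UploadId'],
    MultipartUpload={'Parts': parts},
)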

How to upload a file to directory in S3 bucket using boto

NOTE: This answer uses boto. See the other answer that uses boto3, which is newer.

Try this…

import boto
import boto.s3
import sys
from boto.s3.key import Key

AWS_ACCESS_KEY_ID = ''
AWS_SECRET_ACCESS_KEY = ''

bucket_name = AWS_ACCESS_KEY_ID.lower() + '-dump'
conn = boto.connect_s3(AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY)
bucket = conn.create_bucket(bucket_name, location=boto.s3.connection.Location.DEFAULT)

testfile = "replace this with an actual filename"
print … Read more
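For comparison, a minimal boto3 sketch (my own, under the assumption that the "directory" is just a key prefix) that uploads a local file into a folder-like path; the names are placeholders and credentials come from the usual AWS configuration:

import boto3

s3 = boto3.client('s3')

# S3 has no real directories; the prefix before the final slash in the key
# acts as the "folder" the object is uploaded into.
s3.upload_file(
    'replace-with-an-actual-filename.txt',                   # local path
    'my-bucket',                                              # bucket name
    'some/directory/replace-with-an-actual-filename.txt',    # key with prefix
)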