How to make all Objects in AWS S3 bucket public by default?

Go to http://awspolicygen.s3.amazonaws.com/policygen.html and fill in the details: under Action select "GetObject", select "Add Statement", then select "Generate Policy". Copy the resulting text, for example:

{
  "Id": "Policy1397632521960",
  "Statement": [
    {
      "Sid": "Stmt1397633323327",
      "Action": ["s3:GetObject"],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::bucketnm/*",
      "Principal": { "AWS": ["*"] }
    }
  ]
}

Now go to your AWS … Read more
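If you would rather apply the generated policy from code than paste it into the console, a minimal boto3 sketch could look like the following (the bucket name is hypothetical; any policy of the same shape works):

```python
import json
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket name, for illustration only.
bucket = "my-public-bucket"

# Same shape of policy the policy generator produces: allow public s3:GetObject.
policy = {
    "Id": "Policy1397632521960",
    "Statement": [
        {
            "Sid": "Stmt1397633323327",
            "Action": ["s3:GetObject"],
            "Effect": "Allow",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            "Principal": {"AWS": ["*"]},
        }
    ],
}

# Attach the policy to the bucket; put_bucket_policy expects a JSON string.
s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
```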

AWS S3: The bucket you are attempting to access must be addressed using the specified endpoint

It seems likely that this bucket was created in a different region, i.e. not us-west-2. That's the only time I've seen "The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint." US Standard is us-east-1.
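As a sketch of how to check, assuming boto3 and a hypothetical bucket name, you can ask S3 where the bucket lives and build a client bound to that region so requests go to the correct endpoint:

```python
import boto3

bucket = "my-bucket"  # hypothetical name, for illustration

# Ask S3 which region the bucket was created in.
location = boto3.client("s3").get_bucket_location(Bucket=bucket)
# get_bucket_location returns None for buckets in us-east-1 ("US Standard").
region = location["LocationConstraint"] or "us-east-1"

# A client pinned to that region addresses the bucket via the right endpoint.
s3 = boto3.client("s3", region_name=region)
s3.head_bucket(Bucket=bucket)
```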

How to call Amazon S3 bucket using postdata preSigned Url to upload a file using Karate

Try this change:

And multipart file file = { read: '../testData/validPdfFile.pdf' }

Read this for a little more explanation: https://github.com/intuit/karate/tree/develop#multipart-file Other than that you seem to be doing everything right, so it is up to your debugging skills now. Or give us a way to replicate: https://github.com/intuit/karate/wiki/How-to-Submit-an-Issue
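For reference, the side that hands out the presigned POST data typically looks something like the sketch below (boto3 plus requests, with hypothetical bucket, key, and file paths); whatever multipart form you send, from Karate or anywhere else, must include the returned fields alongside the file part:

```python
import boto3
import requests

s3 = boto3.client("s3")

# Hypothetical bucket and key, for illustration only.
post = s3.generate_presigned_post(
    Bucket="my-bucket", Key="uploads/validPdfFile.pdf", ExpiresIn=3600
)

# post["fields"] must be submitted as form fields together with the file part.
with open("testData/validPdfFile.pdf", "rb") as f:
    resp = requests.post(post["url"], data=post["fields"], files={"file": f})

print(resp.status_code)  # S3 returns 204 on a successful POST upload by default
```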

How long should I wait after applying an AWS IAM policy before it is valid?

The phrase “almost immediately” is used 5 times in the IAM FAQ, and is, of course, somewhat subjective. Since AWS is a globally-distributed system, your changes have to propagate, and the system as a whole seems to be designed to favor availability and partition tolerance as opposed to immediate consistency. I don’t know whether you’ve … Read more
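Because propagation is eventually consistent, a common workaround is to retry the first call that depends on the new policy rather than sleeping for a fixed interval. A minimal sketch, assuming boto3 and a hypothetical call that the new policy is meant to allow:

```python
import time
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def wait_until_allowed(bucket, retries=10, delay=2):
    """Retry a call gated by the new policy until it stops failing with AccessDenied."""
    for _ in range(retries):
        try:
            s3.head_bucket(Bucket=bucket)  # hypothetical call the policy should permit
            return True
        except ClientError as e:
            if e.response["Error"]["Code"] not in ("403", "AccessDenied"):
                raise
            time.sleep(delay)  # policy not propagated yet; back off and try again
    return False

wait_until_allowed("my-bucket")  # hypothetical bucket name
```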

Read a file line by line from S3 using boto?

Here's a solution which actually streams the data line by line:

from io import TextIOWrapper
from gzip import GzipFile
...

# get StreamingBody from botocore.response
response = s3.get_object(Bucket=bucket, Key=key)
# if gzipped
gzipped = GzipFile(None, 'rb', fileobj=response['Body'])
data = TextIOWrapper(gzipped)

for line in data:
    # process line
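If the object is not gzipped, a sketch of the same streaming idea (assuming a boto3 client and hypothetical bucket/key names) can lean on the StreamingBody itself:

```python
import boto3

s3 = boto3.client("s3")

# Hypothetical bucket and key, for illustration only.
response = s3.get_object(Bucket="my-bucket", Key="logs/app.log")

# StreamingBody.iter_lines() yields raw bytes one line at a time,
# without pulling the whole object into memory.
for raw in response["Body"].iter_lines():
    line = raw.decode("utf-8")
    print(line)
```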

How can I use boto to stream a file out of Amazon S3 to Rackspace Cloudfiles?

Other answers in this thread are related to boto, but S3.Object is not iterable anymore in boto3. So the following DOES NOT WORK; it produces a TypeError: 's3.Object' object is not iterable error message:

s3 = boto3.session.Session(profile_name=my_profile).resource('s3')
s3_obj = s3.Object(bucket_name=my_bucket, key=my_key)
with io.FileIO('sample.txt', 'w') as file:
    for i in s3_obj:
        file.write(i)

In boto3, the contents … Read more
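What does stream in boto3 is the StreamingBody returned by .get(). A sketch of copying an object to a local file in chunks (profile, bucket, and key names are hypothetical):

```python
import boto3

s3 = boto3.session.Session(profile_name="my_profile").resource("s3")
s3_obj = s3.Object(bucket_name="my_bucket", key="my_key")

# .get()["Body"] is a botocore StreamingBody; iter_chunks() yields
# fixed-size pieces so the whole object never sits in memory at once.
with open("sample.txt", "wb") as f:
    for chunk in s3_obj.get()["Body"].iter_chunks(chunk_size=1024 * 1024):
        f.write(chunk)
```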

setting up s3 for logs in airflow

UPDATE: Airflow 1.10 makes logging a lot easier. For S3 logging, set up the connection hook as per the above answer and then simply add the following to airflow.cfg:

[core]
# Airflow can store logs remotely in AWS S3. Users must supply a remote
# location URL (starting with either 's3://...') and an Airflow connection … Read more
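For reference, the relevant keys in Airflow 1.10 sit under [core] and look roughly like the sketch below; the bucket path and connection id are hypothetical, so verify the names against your Airflow version's default config:

```ini
[core]
# Turn on remote logging and point it at an S3 prefix (hypothetical path).
remote_logging = True
remote_base_log_folder = s3://my-log-bucket/airflow/logs
# Name of the Airflow connection holding the AWS credentials (hypothetical id).
remote_log_conn_id = my_s3_conn
encrypt_s3_logs = False
```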