Tag Archive for amazon-web-services, amazon-s3

How to query AWS list-objects-v2 to return a result set of Glacier Deep Archive objects which have been restored

I’m trying to find a good way to list items within a bucket prefix that have been transitioned from Glacier Deep Archive to Standard. I see there is a RestoreStatus->IsRestoreInProgress (boolean). According to the docs the key only exists for items that have had a restore requested, and the value will be false once the restore has completed. So I am trying to limit my results to that with:
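A sketch of that filter in Python (the sample objects below are hand-made placeholders; as I understand the API, RestoreStatus is only included in the response when the request asks for it via OptionalObjectAttributes):

```python
# Sketch: keep only objects whose Glacier restore has completed, i.e. whose
# RestoreStatus key is present with IsRestoreInProgress == False. A real
# 'Contents' list would come from something like:
#   s3.list_objects_v2(Bucket=..., Prefix=...,
#                      OptionalObjectAttributes=["RestoreStatus"])

def restored_objects(contents):
    """Filter ListObjectsV2 'Contents' entries down to completed restores."""
    return [
        obj for obj in contents
        if obj.get("RestoreStatus", {}).get("IsRestoreInProgress") is False
    ]

sample = [
    {"Key": "a.bin"},                                                  # never restored: no RestoreStatus key
    {"Key": "b.bin", "RestoreStatus": {"IsRestoreInProgress": True}},  # restore still running
    {"Key": "c.bin", "RestoreStatus": {"IsRestoreInProgress": False}}, # restore finished
]
print([o["Key"] for o in restored_objects(sample)])  # → ['c.bin']
```

The same predicate can be expressed as a JMESPath `--query` on the CLI, but filtering client-side keeps the logic easy to test.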

Find S3 cost breakdown retrospectively

I noticed that the S3 costs for last month jumped 500%, and I am not sure which bucket is responsible. I have just created cost allocation tags on each bucket and am waiting for them to show up in Cost Explorer so that I can activate them.
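One caveat: cost allocation tags are not retroactive, so they will not explain last month. For a retrospective view, Cost Explorer can group S3 charges by usage type without any tags; a sketch of the request parameters (the dates and the boto3 call are placeholders):

```python
# Sketch: Cost Explorer request parameters that break one month's S3 charges
# down by usage type (e.g. storage vs. requests vs. data transfer), which is
# available historically and needs no tags. Dates below are placeholders.
cost_params = {
    "TimePeriod": {"Start": "2024-05-01", "End": "2024-06-01"},
    "Granularity": "MONTHLY",
    "Metrics": ["UnblendedCost"],
    "Filter": {"Dimensions": {"Key": "SERVICE",
                              "Values": ["Amazon Simple Storage Service"]}},
    "GroupBy": [{"Type": "DIMENSION", "Key": "USAGE_TYPE"}],
}
# With boto3 this would be passed as:
#   boto3.client("ce").get_cost_and_usage(**cost_params)
```

For per-bucket attribution of storage specifically, the CloudWatch BucketSizeBytes metric is reported per bucket and covers past months as well.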

Static site on S3 doesn’t load CSS, images and scripts

I have a simple static web page which should be hosted on S3. Its only purpose is to call a certain API (https://myapi.com), to receive the response ({redirectUrl: string}), and then to redirect the user to the redirectUrl from the response after 5 seconds or to let them manually jump there using a button.
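When the HTML loads but its assets don’t, one common cause (assuming the relative paths themselves are right) is that the assets were uploaded with the wrong Content-Type, so the browser refuses to apply the CSS or run the scripts. A minimal sketch of guessing the type per file before upload; the file names and the boto3 call are placeholders:

```python
# Sketch: derive a Content-Type per object key instead of letting every file
# default to a generic binary type during upload.
import mimetypes

def content_type_for(key):
    """Guess a Content-Type from the object key, falling back to octet-stream."""
    guessed, _ = mimetypes.guess_type(key)
    return guessed or "application/octet-stream"

for key in ["index.html", "styles/site.css", "js/app.js", "img/logo.png"]:
    print(key, "->", content_type_for(key))

# With boto3 the upload would then pass it explicitly, e.g.:
#   s3.upload_file(path, bucket, key,
#                  ExtraArgs={"ContentType": content_type_for(key)})
```

The S3 console usually sets these correctly; it is scripted uploads that tend to leave everything as binary/octet-stream.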

I am able to upload files to my AWS bucket using Boto3 but can’t delete them

I’m reaching out for help with an issue I’m experiencing with deleting objects from my S3 bucket using Boto3.
Here’s a brief summary:
I’m able to upload objects to my bucket without issues
However, when I try to delete objects, I receive an Access Denied error
I’ve verified my credentials and permissions, but I suspect there might be a policy issue
I’ve checked the bucket policy and IAM policies, but I’m not sure what I’m missing. Any guidance or suggestions would be greatly appreciated!
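For what it’s worth, upload-only setups frequently allow s3:PutObject but not s3:DeleteObject, which are separate actions. A sketch of an identity-policy statement that covers both (the bucket name is a placeholder):

```python
# Sketch: IAM identity policy allowing both upload and delete. Note the
# object-level ARN ("my-bucket/*"), not the bucket ARN — both PutObject and
# DeleteObject act on objects, not on the bucket itself.
import json

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject", "s3:DeleteObject"],
        "Resource": "arn:aws:s3:::my-bucket/*",
    }],
}
print(json.dumps(policy, indent=2))
```

Also worth checking for an explicit Deny in the bucket policy, since an explicit Deny overrides any Allow granted elsewhere.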

Cannot delete directory bucket in AWS S3

I’m trying to delete a directory bucket in AWS S3. The bucket is located in the Tokyo region (ap-northeast-1). When I tried to delete the bucket, it forced me to empty the bucket first, but when I tried to empty it, it said “No objects found in bucket”.
When I investigated in Cost Explorer, the charge was ExpressOneZoneStorage from S3. It’s annoying because the bucket still costs me a small amount of money. The bucket has no policy attached, and Block Public Access is on.
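One possibility worth ruling out: incomplete multipart uploads, which ListObjectsV2 (and the console’s Empty action) does not show but which still accrue storage charges. A sketch of collecting the pending uploads to abort; the response dict here is hand-made, and with boto3 it would come from s3.list_multipart_uploads(Bucket=...):

```python
# Sketch: extract the (Key, UploadId) pairs needed to abort each incomplete
# multipart upload left behind in a bucket.

def pending_uploads(response):
    """Return (Key, UploadId) pairs from a ListMultipartUploads response."""
    return [(u["Key"], u["UploadId"]) for u in response.get("Uploads", [])]

sample_response = {"Uploads": [{"Key": "backup.zip", "UploadId": "abc123"}]}
for key, upload_id in pending_uploads(sample_response):
    # With boto3, each abort would be:
    #   s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
    print(key, upload_id)
```

Once every pending upload is aborted, the bucket should report as genuinely empty and allow deletion.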

Custom Metadata in the AWS S3 ObjectCreated:* event

Is it possible to add a custom field to the S3 ObjectCreated:* event? Currently, when the file has metadata attached and the Lambda is triggered, I don’t get the metadata in the event payload and have to use the S3 API to read it. The reason I don’t want to use the API is that I need the metadata to set our application’s context (for login purposes) before doing any processing.
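As far as I know the ObjectCreated notification payload never carries user metadata, so the usual pattern is a single HeadObject call at the top of the handler. A sketch of the event-parsing half; the sample event below is hand-made and trimmed to the fields used:

```python
# Sketch: pull bucket and key out of an S3 notification event so the handler
# can fetch the metadata itself. Note that keys in real events are URL-encoded;
# urllib.parse.unquote_plus(key) decodes them before use.

def extract_bucket_key(event):
    """Return (bucket, key) from the first record of an S3 notification event."""
    record = event["Records"][0]
    return record["s3"]["bucket"]["name"], record["s3"]["object"]["key"]

sample_event = {
    "Records": [{"s3": {"bucket": {"name": "my-bucket"},
                        "object": {"key": "uploads/file.txt"}}}]
}
print(extract_bucket_key(sample_event))

# In the Lambda handler, the user metadata would then come from:
#   meta = s3.head_object(Bucket=bucket, Key=key)["Metadata"]
```

The HeadObject round-trip is small and can run once at the start of the handler, before any application processing.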

Upload a 300+ GB zip file to an AWS S3 bucket

I am using the AWS CLI to upload my enterprise application backup.
I have a roughly 300 GB .zip backup file to upload.
However, after a certain amount of time the command crashes. The command I am using is below:
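One likely culprit, whatever the exact command: the AWS CLI defaults to 8 MB multipart chunks, and S3 caps a multipart upload at 10,000 parts, which tops out around 80 GB per file. A quick check of the part-size math (the 10,000-part limit is S3’s; the 8 MB default is the CLI’s s3.multipart_chunksize setting):

```python
# Sketch: smallest whole-MiB part size that keeps a file under S3's
# 10,000-part multipart upload limit.
import math

MAX_PARTS = 10_000  # S3 hard limit on parts per multipart upload

def min_chunk_mib(file_size_bytes):
    """Smallest whole-MiB part size fitting file_size_bytes in MAX_PARTS parts."""
    return math.ceil(file_size_bytes / MAX_PARTS / (1024 * 1024))

size = 300 * 1024**3  # ~300 GiB backup
print(min_chunk_mib(size))  # → 31 (MiB), so the 8 MiB default cannot work
```

Raising the chunk size before retrying, e.g. `aws configure set default.s3.multipart_chunksize 64MB`, keeps a 300 GB file comfortably under the part limit and also reduces per-part overhead on a long transfer.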

mountpoint-s3 change region

In the AWS project mountpoint-s3 (https://github.com/awslabs/mountpoint-s3#getting-started), how do I set the region?
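A sketch, assuming the mount-s3 binary from the project’s releases; the bucket name and mount point are placeholders. If I read the project docs right, the region can come from a --region flag or from the standard AWS_REGION environment variable:

```shell
# Pass the region explicitly on the command line:
mount-s3 --region ap-northeast-1 my-bucket /mnt/my-bucket

# Or supply it through the environment, as with other AWS tooling:
AWS_REGION=ap-northeast-1 mount-s3 my-bucket /mnt/my-bucket
```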