By Zelia / Last Updated May 31, 2024

Importance of uploading files to S3

Amazon S3 is a highly scalable and secure storage service that offers durable object storage for a wide range of applications and use cases. Uploading files to S3 allows businesses and developers to store and retrieve data easily, leverage the benefits of AWS services, and build robust, scalable applications. Whether it's hosting static websites, storing backups, or serving media files, S3 provides a reliable and cost-effective solution.


Causes of Access Denied error

An Access Denied error while uploading files to S3 can occur for various reasons:

  • Incorrect IAM Permissions: Insufficient or incorrect IAM (Identity and Access Management) permissions associated with the user or role attempting the upload can lead to an Access Denied error.
  • Invalid Bucket or Object Names: Typos, misspellings, or incorrect names when specifying the S3 bucket or object can result in an Access Denied error.
  • Bucket Policy Restrictions: The bucket's access control policy might restrict the upload operation. Review the bucket policy and adjust it to allow the necessary permissions for the user or role.
  • AWS CLI or SDK Configuration Issues: Improper configuration of the AWS CLI (Command Line Interface) or SDKs can lead to authentication failures and an Access Denied error.

How to solve Access Denied when uploading file to S3

Now that you understand the causes of the Access Denied error, follow these steps to resolve it when uploading files to S3:

✧ IAM user permissions for s3:PutObject (Access Denied 403)

If the object's access control list (ACL) needs to be updated during the upload, the IAM user must also have the s3:PutObjectAcl permission in their IAM policy, in addition to s3:PutObject.
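As a sketch, an identity-based IAM policy granting both permissions might look like the following (the bucket name DOC-EXAMPLE-BUCKET is a placeholder; scope the Resource to your actual bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Action": [
      "s3:PutObject",
      "s3:PutObjectAcl"
    ],
    "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
  }]
}
```

Note that the Resource ends in `/*`, since s3:PutObject applies to objects within the bucket, not the bucket itself.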

✧ Conditions in the S3 bucket policy

Review your bucket policy for restrictions on uploads like the following examples. If the bucket policy contains a condition, the IAM user must satisfy that condition for the upload to succeed.

Note: While reviewing conditions, check whether each condition is attached to an Allow statement ("Effect": "Allow") or a Deny statement ("Effect": "Deny"). For the upload to succeed, the user must meet the condition of an Allow statement, or avoid matching the condition of a Deny statement.
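For instance, a Deny statement is often written with a negated condition operator, so the effect reads the opposite way from an Allow. A hedged example, again using the placeholder bucket and IP range, that denies uploads from anywhere outside the permitted range:

```json
{
  "Version": "2012-10-17",
  "Statement": [{
    "Sid": "DenyUploadsFromOutsideRange",
    "Effect": "Deny",
    "Principal": "*",
    "Action": "s3:PutObject",
    "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*",
    "Condition": {
      "NotIpAddress": {
        "aws:SourceIp": "54.240.143.0/24"
      }
    }
  }]
}
```

Because an explicit Deny overrides any Allow, a user matched by this statement gets Access Denied even if their IAM policy grants s3:PutObject.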

1. Check for a condition that only permits uploads from a particular IP address, such as this:

"Condition": {

  "IpAddress": {

    "aws:SourceIp": "54.240.143.0/24"

  }

}

2. If your bucket policy contains this requirement, the IAM user must access your bucket from one of the permitted IP addresses. Next, check for a condition similar to the following that prohibits uploads unless the object uses a particular storage class:

"Condition": {

  "StringEquals": {

    "s3:x-amz-storage-class": [

      "STANDARD_IA"

    ]

  }

3. If your policy has this condition, the user must upload objects with the permitted storage class. For instance, the condition above requires the STANDARD_IA storage class, so the user must upload the object with an AWS Command Line Interface (AWS CLI) command like this:

aws s3api put-object --bucket DOC-EXAMPLE-BUCKET --key examplefile.jpg --body c:\examplefile.jpg --storage-class STANDARD_IA

Note: If you experience issues when executing AWS CLI commands, make sure you're using the most recent AWS CLI version.

4. Check for a condition that prohibits uploads unless the object is given a particular access control list (ACL), similar to the following:

"Condition": {

                "StringEquals": {

                    "s3:x-amz-acl":["public-read"]

                }

            }

5. If your policy has this condition, users must upload objects with the allowed ACL. For instance, since the condition above requires the public-read ACL, the user must upload the object using a command like the following:

aws s3api put-object --bucket DOC-EXAMPLE-BUCKET --key examplefile.jpg --body c:\examplefile.jpg --acl public-read

6. Check for a condition that requires uploads to grant the bucket owner (canonical user ID) full control of the object, such as the following:

"Condition": {

  "StringEquals": {

    "s3:x-amz-grant-full-control": "id=AccountA-CanonicalUserID"

  }

}

7. If your policy includes this requirement, the user must upload objects using a command similar to the following:

aws s3api put-object --bucket DOC-EXAMPLE-BUCKET --key examplefile.jpg --body c:\examplefile.jpg --grant-full-control id=CanonicalUserID

8. Check for a condition that prohibits uploads unless the object is encrypted with a specific AWS Key Management Service (AWS KMS) key. For example:

"Condition": {
"StringEquals": {
"s3:x-amz-server-side-encryption-aws-kms-key-id": "arn:aws:kms:us-east-1:111122223333:key/abcdabcd-abcd-abcd-abcd-abcdabcdabcd"
}
}

9. If your policy includes this requirement, the user must upload objects using a command similar to the following:

aws s3api put-object --bucket DOC-EXAMPLE-BUCKET --key examplefile.jpg --body c:\examplefile.jpg --server-side-encryption aws:kms --ssekms-key-id arn:aws:kms:us-east-1:111122223333:key/abcdabcd-abcd-abcd-abcd-abcdabcdabcd

10. Check for a condition that only permits uploads when objects use a particular type of server-side encryption, such as the following:

"Condition": {

  "StringEquals": {

    "s3:x-amz-server-side-encryption": "AES256"

  }

}

11. If your policy has this condition, the user must upload objects using a command similar to the following:

aws s3api put-object --bucket DOC-EXAMPLE-BUCKET --key examplefile.jpg --body c:\examplefile.jpg --server-side-encryption "AES256"

✧ Access allowed by a VPC endpoint policy

If the IAM user is uploading objects to Amazon S3 from an Amazon Elastic Compute Cloud (Amazon EC2) instance that routes to Amazon S3 through a VPC endpoint, you must also review the VPC endpoint policy. Ensure that the endpoint policy permits uploads to your bucket.

1. For example, the VPC endpoint policy below only permits access to DOC-EXAMPLE-BUCKET. If your bucket is not listed as an authorized resource, users cannot upload files to it from instances in the VPC.

{
  "Statement": [{
    "Sid": "Access-to-specific-bucket-only",
    "Principal": "*",
    "Action": [
      "s3:PutObject"
    ],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
  }]
}

2. Additionally, if users upload objects with an ACL, the VPC endpoint policy must also grant access to the s3:PutObjectAcl action, similar to the following:

{
  "Statement": [{
    "Sid": "Access-to-specific-bucket-only",
    "Principal": "*",
    "Action": [
      "s3:PutObject",
      "s3:PutObjectAcl"
    ],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
  }]
}
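Before applying a revised bucket or endpoint policy, it can help to confirm that the document parses as valid JSON. A minimal sketch (the file name policy.json and the bucket name are placeholders):

```shell
# Write the draft policy to a file, then check that it parses as valid JSON
cat > policy.json <<'EOF'
{
  "Statement": [{
    "Sid": "Access-to-specific-bucket-only",
    "Principal": "*",
    "Action": [
      "s3:PutObject",
      "s3:PutObjectAcl"
    ],
    "Effect": "Allow",
    "Resource": "arn:aws:s3:::DOC-EXAMPLE-BUCKET/*"
  }]
}
EOF
python3 -m json.tool policy.json > /dev/null && echo "policy.json is valid JSON"
```

A stray trailing comma, which is easy to introduce when adding an action such as s3:PutObjectAcl to the list, will make this check fail, and AWS rejects such a policy as malformed.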

Recommend: Batch upload data to Amazon S3 via backup software

Hopefully the methods above help you solve the Access Denied problem so you can upload your files to Amazon S3 easily. But what if you want to back up batch data to Amazon S3 storage? In this case, you can turn to backup software such as AOMEI Cyber Backup. It backs up your data to a safe location while uploading the backup to Amazon S3 storage.

► Easily back up VMware, Hyper-V, Windows PC and Server disks, partitions, files, systems, MS SQL databases, etc.
► Back up data to a local share, external HDD, USB drive, NAS drive, or network share.
► Automatically upload backups to Amazon S3 storage without complex configuration.
► Directly recover data from Amazon S3 storage.


Step 1. Run AOMEI Cyber Backup. On its interface, click Source Device and choose a device type. Here we choose VMware for demonstration. Click Add VMware ESXi, then enter the VMware information to add all virtual machines.

add Vmware device

Step 2. Click Backup Task on the left pane and choose Create New Task. A dialog appears for configuring your backup settings; complete them based on your needs, and make sure the Archiving backup version Amazon S3 option is checked.

check archive option

Step 3. Click Select+, then input the region, username, password, and Bucket to add and select your Amazon S3 storage. Click Confirm.

add-s3

Step 4. Click the Start Backup button to commit and run the backup task. The backups will be uploaded to Amazon S3 automatically.

start-backup

Conclusion

Uploading files to Amazon S3 is a crucial aspect of leveraging cloud storage for various applications. However, encountering an Access Denied error during the upload process can be disruptive. With the right knowledge and troubleshooting steps, you can overcome the error and continue to use Amazon S3 efficiently for your storage needs.

Additionally, securing your data is also crucial; take appropriate measures to ensure your files are protected from unauthorized access.