S3 Upload Failed - boto3.exceptions.S3UploadFailedError due to unsupported AWS-chunked transfer #4409

Open
1 task done
hailatGH opened this issue Jan 28, 2025 · 1 comment

@hailatGH

Describe the bug

When using the boto3 library to upload a file to an S3-compatible storage endpoint (Infomaniak Cloud in this case), I encountered an error indicating that transferring payloads in multiple chunks using aws-chunked is not supported.

Regression Issue

  • Select this option if this issue appears to be a regression.

Expected Behavior

I expected the file to upload successfully to the S3-compatible storage endpoint (https://s3.pub1.infomaniak.cloud) using the boto3 library's upload_file() method, without encountering errors related to unsupported aws-chunked transfer encoding. The operation should complete without issues regardless of the file size or chunking mechanism, as long as the endpoint and credentials are correctly configured.

Current Behavior

The file upload failed with an error indicating that the aws-chunked transfer encoding is not supported by the S3-compatible endpoint. The operation raised a boto3.exceptions.S3UploadFailedError, which internally referenced a botocore.exceptions.ClientError with the message:
An error occurred (NotImplemented) when calling the PutObject operation: Transferring payloads in multiple chunks using aws-chunked is not supported.
As a result, the file could not be uploaded to the S3 bucket.
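
For illustration, a minimal sketch of how this failure surfaces and can be caught (the endpoint URL is the one from this issue; the "<>" values are placeholders, and the full reproduction follows in the next section):

import boto3
from boto3.exceptions import S3UploadFailedError

s3_client = boto3.client(
    "s3",
    endpoint_url="https://s3.pub1.infomaniak.cloud",
    aws_access_key_id="<>",
    aws_secret_access_key="<>",
)

try:
    s3_client.upload_file("<>", "<>", "<>")  # local file path, bucket, object key
except S3UploadFailedError as err:
    # upload_file() wraps the underlying botocore ClientError, so the
    # "NotImplemented" error code appears in the exception message.
    print(f"Upload failed: {err}")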

Reproduction Steps

import boto3

URL = "https://s3.pub1.infomaniak.cloud"
A_KEY = "<>"
S_KEY = "<>"
BUCKET = "<>"
FILE = "<>"
KEY = "<>"

# Client targeting the S3-compatible endpoint rather than AWS S3.
s3_client = boto3.client(
    "s3",
    endpoint_url=URL,
    aws_access_key_id=A_KEY,
    aws_secret_access_key=S_KEY,
)

# Managed transfer upload; this is the call that raises S3UploadFailedError.
s3_client.upload_file(FILE, BUCKET, KEY)

Possible Solution

No response

Additional Information/Context

No response

SDK version used

1.36.7

Environment details (OS name and version, etc.)

macOS (Darwin Kernel Version 23.6.0)

@hailatGH added the bug and needs-triage labels Jan 28, 2025
@github-actions bot added the potential-regression label Jan 28, 2025
@RyanFitzSimmonsAK self-assigned this Jan 28, 2025
@RyanFitzSimmonsAK added the investigating and s3 labels and removed the needs-triage label Jan 28, 2025
@RyanFitzSimmonsAK
Contributor

Hi @hailatGH, thanks for reaching out. The AWS SDKs, including Boto3, are intended for use specifically with AWS. Are you able to reproduce this behavior with AWS S3?

If this behavior started after upgrading to v1.36.0, then it is likely related to the changes to the default checksum behavior described in #4392.

The AWS SDKs and CLI are designed for use with official AWS services. We may introduce and enable new features by default, such as these new default integrity protections, before they are supported or handled by third-party service implementations. You can disable the new behavior by setting the request_checksum_calculation and response_checksum_validation configuration options to when_required, as covered in Data Integrity Protections for Amazon S3.
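
A minimal sketch of that opt-out, assuming boto3/botocore 1.36 or later (the endpoint URL is the one from this issue; credentials, file, bucket, and key are placeholders):

import boto3
from botocore.config import Config

# Only send request checksums and validate response checksums when the
# service requires them, instead of by default.
config = Config(
    request_checksum_calculation="when_required",
    response_checksum_validation="when_required",
)

s3_client = boto3.client(
    "s3",
    endpoint_url="https://s3.pub1.infomaniak.cloud",
    aws_access_key_id="<>",
    aws_secret_access_key="<>",
    config=config,
)
s3_client.upload_file("<>", "<>", "<>")  # local file path, bucket, object key

The same settings can also be supplied outside of code (for example via environment variables or the shared AWS config file), as described in the Data Integrity Protections for Amazon S3 documentation.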

@RyanFitzSimmonsAK added the response-requested and p2 labels and removed the investigating label Jan 28, 2025