boto3 v1.36 is causing issues (S3) #1482

Open · Zerotask opened this issue Jan 17, 2025 · 15 comments

Zerotask commented Jan 17, 2025

After we updated our dependencies, specifically after upgrading boto3 to the latest 1.36 version, we noticed that some file-related tests fail. Manually uploading an image also stopped working.
So there must be some breaking change regarding S3.

boto3 Release Notes


aq1 commented Jan 17, 2025

Same here. I had to pin the versions manually in requirements.txt in my project.

boto3==1.35.54
botocore==1.35.54


oliverhaas commented Jan 18, 2025

We observed what is probably the same error when trying to upload files, e.g. during collectstatic:

botocore.exceptions.ClientError: An error occurred (SignatureDoesNotMatch) when calling the PutObject operation: Invalid argument.

Our setup is a little unusual (Google Cloud Storage via the S3 interface). For us, setting signature_version = "s3" also solved the issue; note that AWS itself mostly no longer supports this signature version, so YMMV.

Here is the related boto3 issue: boto/boto3#4400.
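
For reference, a minimal sketch of that workaround in Django settings; this assumes the django-storages S3 backend with its "signature_version" option, and the bucket name is a placeholder:

    # settings.py -- pin legacy signing as a stopgap (sketch, not a recommendation)
    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3.S3Storage",
            "OPTIONS": {
                "bucket_name": "my-bucket",    # placeholder
                "signature_version": "s3",     # legacy signing; most AWS regions no longer accept it
            },
        },
    }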

@StefanBrand

I get this error when calling File.save: ClientError: An error occurred (AccessDenied) when calling the PutObject operation: None

Resolved by downgrading to boto3==1.35.99


TheDJVG commented Jan 23, 2025

Hitting the same problem here too. Reading the official documentation, I get the impression that setting the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION to WHEN_REQUIRED should fix this; however, so far it does not help for me (using Ceph RadosGW here).

Is it possible that django-storages is not passing these through, or that boto3 is not reading them? We might need to expose these settings, just like other AWS-related variables are exposed through Django settings.
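
To rule out ordering problems with those variables, they can also be set in-process before boto3 builds its first client; a minimal sketch, assuming botocore 1.36+ reads these names as documented:

    # Must run before any boto3/botocore client is created,
    # e.g. at the top of settings.py
    import os

    os.environ["AWS_REQUEST_CHECKSUM_CALCULATION"] = "WHEN_REQUIRED"
    os.environ["AWS_RESPONSE_CHECKSUM_VALIDATION"] = "WHEN_REQUIRED"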


9128305 commented Jan 23, 2025

> Hitting the same problem here too. Reading the official documentation, I get the impression that setting the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION to WHEN_REQUIRED should fix this; however, so far it does not help for me (using Ceph RadosGW here).
>
> Is it possible that django-storages is not passing these through, or that boto3 is not reading them? We might need to expose these settings, just like other AWS-related variables are exposed through Django settings.

Try using a custom botocore Config; note that you also need this PR: boto/s3transfer#329.

I tried it and it works now.
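
Concretely, something like the following in Django settings; a sketch assuming the django-storages "client_config" option and botocore >= 1.36 (which introduced these two Config parameters):

    # settings.py
    import botocore.config

    STORAGES = {
        "default": {
            "BACKEND": "storages.backends.s3.S3Storage",
            "OPTIONS": {
                "client_config": botocore.config.Config(
                    # Restore the pre-1.36 behaviour: only send/validate
                    # checksums when the operation actually requires them.
                    request_checksum_calculation="when_required",
                    response_checksum_validation="when_required",
                ),
            },
        },
    }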


TheDJVG commented Jan 23, 2025

> Hitting the same problem here too. Reading the official documentation, I get the impression that setting the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION to WHEN_REQUIRED should fix this; however, so far it does not help for me (using Ceph RadosGW here). Is it possible that django-storages is not passing these through, or that boto3 is not reading them? We might need to expose these settings, just like other AWS-related variables are exposed through Django settings.
>
> Try using a custom botocore Config; note that you also need this PR: boto/s3transfer#329.
>
> I tried it and it works now.

Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).

@terencehonles (Contributor)

Surprisingly, it seems we're not seeing this issue, but I'm subscribing to this thread nonetheless.

Does anyone want to share a minimal reproducible config?


9128305 commented Jan 23, 2025

It does not work for third-party cloud providers.


9128305 commented Jan 24, 2025

> Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).

Released.


TheDJVG commented Jan 24, 2025

> Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).
>
> Released.

Confirmed: this fixed the problem. With the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION set to WHEN_REQUIRED, it works again, at least with Ceph RadosGW.


gingantic commented Jan 31, 2025

> Same here. I had to pin the versions manually in requirements.txt in my project.
>
> boto3==1.35.54 botocore==1.35.54

In case anyone runs into this error when using Vercel:

Traceback (most recent call last):
  File "/var/task/pastein/views.py", line 285, in test_s3
    saved_file_path = default_storage.save(file_name, ContentFile(file_content))
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/django/core/files/storage/base.py", line 49, in save
    name = self._save(name, content)
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/storages/backends/s3.py", line 564, in _save
    obj.upload_fileobj(content, ExtraArgs=params, Config=self.transfer_config)
  File "/var/task/boto3/s3/inject.py", line 731, in object_upload_fileobj
    return self.meta.client.upload_fileobj(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/boto3/s3/inject.py", line 642, in upload_fileobj
    return future.result()
           ^^^^^^^^^^^^^^^
  File "/var/task/s3transfer/futures.py", line 103, in result
    return self._coordinator.result()
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/s3transfer/futures.py", line 264, in result
    raise self._exception
  File "/var/task/s3transfer/tasks.py", line 135, in __call__
    return self._execute_main(kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/s3transfer/tasks.py", line 158, in _execute_main
    return_value = self._main(**kwargs)
                   ^^^^^^^^^^^^^^^^^^^^
  File "/var/task/s3transfer/upload.py", line 762, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/var/task/botocore/client.py", line 569, in _api_call
    return self._make_api_call(operation_name, kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/var/task/botocore/client.py", line 1023, in _make_api_call
    raise error_class(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Invalid Access Key

Try downgrading to:

boto3==1.35
botocore==1.35

Edit: Turns out it doesn't work for me, but I can use my custom storage backend.
If you're interested: pastein/core/storages.py


db4y commented Feb 3, 2025

> Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).
>
> Released.
>
> Confirmed: this fixed the problem. With the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION set to WHEN_REQUIRED, it works again, at least with Ceph RadosGW.

@TheDJVG where did you set these vars? I tried a custom TransferConfig, a custom Config, and setting them in STORAGES["default"]["OPTIONS"]. None of them worked for me...

Zerotask (Author) commented Feb 3, 2025

If an environment variable is needed to use an S3(-like) storage in the future, then there should be a corresponding setting in django-storages, e.g. AWS_S3_REQUEST_CHECKSUM_CALCULATION.


TheDJVG commented Feb 4, 2025

> Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).
>
> Released.
>
> Confirmed: this fixed the problem. With the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION set to WHEN_REQUIRED, it works again, at least with Ceph RadosGW.
>
> @TheDJVG where did you set these vars? I tried a custom TransferConfig, a custom Config, and setting them in STORAGES["default"]["OPTIONS"]. None of them worked for me...

I've set them as part of the environment the process runs in (since that's what boto3/s3transfer seems to read, based on the SDK documentation).

In my case, simply adding this to my k8s/container config was enough:

            - name: AWS_REQUEST_CHECKSUM_CALCULATION
              value: WHEN_REQUIRED
            - name: AWS_RESPONSE_CHECKSUM_VALIDATION
              value: WHEN_REQUIRED


9128305 commented Feb 4, 2025

> Thanks! I will have to try this release. I only tested right when 1.36 came out (and tests started to fail).
>
> Released.
>
> Confirmed: this fixed the problem. With the environment variables AWS_REQUEST_CHECKSUM_CALCULATION and AWS_RESPONSE_CHECKSUM_VALIDATION set to WHEN_REQUIRED, it works again, at least with Ceph RadosGW.
>
> @TheDJVG where did you set these vars? I tried a custom TransferConfig, a custom Config, and setting them in STORAGES["default"]["OPTIONS"]. None of them worked for me...

'OPTIONS': {
    'client_config': botocore.config.Config(
        request_checksum_calculation='when_required',
        response_checksum_validation='when_required',
    ),
}

ref: https://django-storages.readthedocs.io/en/latest/backends/amazon-S3.html
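
With that in place, a quick smoke test (hypothetical, mirroring the default_storage.save call from the traceback above) can confirm uploads work again:

    # Run in a Django shell: python manage.py shell
    from django.core.files.base import ContentFile
    from django.core.files.storage import default_storage

    # Should return the saved key instead of raising ClientError
    print(default_storage.save("healthcheck.txt", ContentFile(b"ok")))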
