2.4.1 backports #341

Draft: wants to merge 1 commit into v2.4.x

Conversation

hmaarrfk (Contributor) commented Feb 1, 2025

A few fixes backporting small lifts from the 2.5.x branch to 2.4.1, at user request.

Checklist

  • Used a personal fork of the feedstock to propose changes
  • Bumped the build number (if the version is unchanged)
  • Reset the build number to 0 (if the version changed)
  • Re-rendered with the latest conda-smithy (Use the phrase @conda-forge-admin, please rerender in a comment in this PR for automated rerendering)
  • Ensured the license file is being packaged.

hmaarrfk (Contributor, Author) commented Feb 1, 2025

Workflow cancelled to save resources

hmaarrfk changed the title from "Relax pin for pytorch-gpu/pytorch-cpu package" to "2.4.1 backports" on Feb 1, 2025
conda-forge-admin (Contributor) commented

Hi! This is the friendly automated conda-forge-linting service.

I just wanted to let you know that I linted all conda-recipes in your PR (recipe/meta.yaml) and found it was in excellent condition.

I do have some suggestions for making it better though...

For recipe/meta.yaml:

  • ℹ️ The recipe is not parsable by parser conda-souschef (grayskull). This parser is not currently used by conda-forge, but may be in the future. We are collecting information to see which recipes are compatible with grayskull.
  • ℹ️ The recipe is not parsable by parser conda-recipe-manager. The recipe can only be automatically migrated to the new v1 format if it is parseable by conda-recipe-manager.

This message was generated by GitHub Actions workflow run https://github.com/conda-forge/conda-forge-webservices/actions/runs/13092701177. Examine the logs at this URL for more detail.

h-vetinari (Member) left a comment


Thanks! I was considering doing this when the queue has cleared up a bit (after all the fixes, tensorflow and pytorch 2.6), especially since there are viable work-arounds in the meantime.

hmaarrfk (Contributor, Author) commented Feb 1, 2025

No problem, I have a repodata patch (I think...).

Comment on lines +356 to +357
- pytorch {{ version }}=cuda*_{{ blas_impl }}*{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version != "None"]
- pytorch {{ version }}=cpu_{{ blas_impl }}*{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version == "None"]
hmaarrfk (Contributor, Author) commented:

@h-vetinari I feel like this should actually be:

Suggested change (original lines first, then the proposed replacement):
- pytorch {{ version }}=cuda*_{{ blas_impl }}*{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version != "None"]
- pytorch {{ version }}=cpu_{{ blas_impl }}*{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version == "None"]
- pytorch {{ version }}=cuda*_{{ blas_impl }}*_{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version != "None"]
- pytorch {{ version }}=cpu_{{ blas_impl }}*_{{ PKG_BUILDNUM }} # [megabuild and cuda_compiler_version == "None"]

h-vetinari (Member) commented:

It doesn't make much of a difference (except exercising the new glob-matcher); the build number is always separated from the build hash, so there can be no spurious match even without the underscore. 🤷

hmaarrfk (Contributor, Author) commented:

Yeah, I guess cpu_*5 can't match cuda126*305.
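
For illustration, here is a minimal sketch of the matching question discussed above, using Python's fnmatch as a rough stand-in for conda's build-string glob matching (conda's matcher is not literally fnmatch), with hypothetical build strings rather than actual repodata entries:

```python
# Rough illustration only: fnmatch as a stand-in for conda's build-string
# glob matching; the build strings below are hypothetical examples.
from fnmatch import fnmatchcase

cpu_build = "cpu_mkl_py310h0123abc_5"         # hypothetical CPU build, build number 5
cuda_build = "cuda126_mkl_py310h4567def_305"  # hypothetical CUDA build, build number 305

# Pattern without the extra underscore before the build number (PKG_BUILDNUM = 5):
print(fnmatchcase(cpu_build, "cpu_mkl*5"))   # True  - the intended match
print(fnmatchcase(cuda_build, "cpu_mkl*5"))  # False - the cpu_/cuda prefix already disambiguates

# Pattern with the underscore, as in the suggested change:
print(fnmatchcase(cpu_build, "cpu_mkl*_5"))  # True  - still matches; the underscore is redundant here
```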

h-vetinari (Member) commented:
With conda-forge/conda-forge-repodata-patches-feedstock#956 merged, there shouldn't be an immediate need for this anymore AFAICT. We did a bunch of relevant backports in #322 and a pinning update in #325, so I think we can probably let v2.4.x go into well-deserved retirement...

h-vetinari marked this pull request as draft on February 2, 2025, 05:25