Shouldn't this be ... to enforce the use of 2.5? If other required packages in this file change (or their versions change), then pip may end up pulling 2.3.z or 2.4.z when trying to automatically resolve dependencies.
I'm referencing the title:
Update PyTorch to 2.5
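A minimal sketch of the distinction being discussed (the actual specifier in this requirements file isn't shown here, so the bounds below are assumptions):

```shell
# Only raising the upper cap: pip may still resolve an older torch if
# another package in the file constrains it.
pip install "torch>=2.3.0,<2.6.0"   # can still end up with 2.3.z or 2.4.z

# Raising the lower bound is what would actually enforce 2.5:
pip install "torch>=2.5.0,<2.6.0"   # resolves to 2.5.z or fails
```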
Having an older version is fine if that's a requirement for one component. That's unlikely to happen, though, unless a library downgrades its dependencies. This is also the approach we took for the instructlab/instructlab repo: instructlab/instructlab#2865.
Gotcha. So we're allowing 2.5 if other packages support it, but not enforcing the use of 2.5.
I was confused by the PR title because it suggested we were actually updating the version.
Thanks for highlighting this @courtneypacheco ! Looking at the CI logs, even though this PR raises the cap, it looks like CI is actually installing torch 2.4.x and running with that. So, we may have more work to do here to figure out which dependency is forcing our torch to 2.4.x instead of 2.5.x.
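One quick way to confirm which torch CI actually ended up with (a sketch, assuming the job's Python environment is active; these commands are not taken from this repo's CI config):

```shell
# Print the torch version pip resolved in the CI environment
pip freeze | grep -E '^torch=='
python -c "import torch; print(torch.__version__)"
```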
Ahh, I see it in the logs. Our CI tests install instructlab from latest main, so we must have picked up instructlab before the torch bump was merged there. Kicking this off again to make sure it picks up torch 2.5.x now.
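Roughly what installing instructlab from latest main looks like (an assumption about the CI step, not the actual workflow definition):

```shell
# Install instructlab straight from the main branch rather than a release
pip install "git+https://github.com/instructlab/instructlab.git@main"
```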
I suspect that it's because we're still on vLLM 0.6.2, which depends on PyTorch 2.4.
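One way to check what torch a given vllm release declares (a sketch; assumes vllm==0.6.2 is installed in the current environment):

```shell
# List vllm's declared torch requirement from its installed metadata
python -c "from importlib.metadata import requires; print([r for r in requires('vllm') if r.lower().startswith('torch')])"
```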
Ahh, yes. So in other words, neither instructlab/instructlab nor instructlab/sdg can actually bump the torch version in use until we move to a vllm newer than 0.6.2 that depends on torch 2.5.x. Regardless of this version range, our vllm dependency pins torch to exactly 2.4.0.
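For illustration, asking pip for both constraints at once makes the conflict visible (the exact resolver message depends on the pip version):

```shell
# vllm 0.6.2 pins torch==2.4.0, so requesting torch>=2.5 alongside it
# should make pip's resolver report a version conflict.
pip install "vllm==0.6.2" "torch>=2.5.0,<2.6.0"
```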
Yes, that seems to be the case. I have opened a new PR in instructlab to update vLLM to 0.6.6.post1: instructlab/instructlab#2892.