
Upgrade Transformers to 4.48.x #782

Merged: 12 commits merged into adapter-hub:main on Feb 26, 2025

Conversation

TimoImhof (Contributor)

No description provided.

@TimoImhof TimoImhof mentioned this pull request Jan 13, 2025
@TimoImhof TimoImhof changed the title Upgrade Transformers to 4.48.x [WIP] Upgrade Transformers to 4.48.x Jan 13, 2025
TimoImhof and others added 9 commits February 23, 2025 18:30
- Fix attention mixin naming convention
- Add SDPA-with-adapters class
- Remove outdated individual attention mechanism implementations
- Add a generic attention implementation with additional adapter functionalities
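The commit summaries above describe replacing per-model attention classes with a single generic implementation that adapters can hook into. As a minimal sketch of that pattern (not the actual adapters code; the registry, function names, and the doubling "adapter" transform below are all illustrative), a library can key concrete attention implementations by name in a registry and wrap one registered entry with adapter behavior instead of subclassing every model's attention class:

```python
# Illustrative sketch of a registry-based attention dispatch with an
# adapter wrapper. All names here are hypothetical, chosen only to show
# the pattern; the real attention math is replaced by a trivial stand-in.
from typing import Callable, Dict, List

ATTENTION_IMPLEMENTATIONS: Dict[str, Callable] = {}


def register(name: str):
    """Decorator that stores an attention function under a string key."""
    def wrap(fn: Callable) -> Callable:
        ATTENTION_IMPLEMENTATIONS[name] = fn
        return fn
    return wrap


@register("eager")
def eager_attention(q: List[float], k: List[float], v: List[float]) -> List[float]:
    # Stand-in for the real attention computation.
    return [qi + ki + vi for qi, ki, vi in zip(q, k, v)]


def with_adapters(base_name: str) -> Callable:
    """Wrap a registered implementation with a (hypothetical) adapter hook."""
    base = ATTENTION_IMPLEMENTATIONS[base_name]

    def adapted(q: List[float], k: List[float], v: List[float]) -> List[float]:
        out = base(q, k, v)
        # Hypothetical adapter hook: post-process the attention output.
        return [o * 2 for o in out]

    return adapted


# One wrapped entry serves every model that dispatches through the registry.
ATTENTION_IMPLEMENTATIONS["eager_with_adapters"] = with_adapters("eager")

print(ATTENTION_IMPLEMENTATIONS["eager_with_adapters"]([1.0], [2.0], [3.0]))
```

The advantage mirrored here is maintenance cost: adapter logic is attached once at the dispatch point rather than duplicated across one attention subclass per model architecture.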
@TimoImhof TimoImhof marked this pull request as ready for review February 24, 2025 12:02
@calpt calpt changed the title [WIP] Upgrade Transformers to 4.48.x Upgrade Transformers to 4.48.x Feb 24, 2025
@TimoImhof TimoImhof merged commit e5a8689 into adapter-hub:main Feb 26, 2025
4 checks passed
@TimoImhof TimoImhof deleted the sync/v4.48.x branch February 26, 2025 11:38
3 participants