Register adversarially trained backbones in timm #2509

Open · wants to merge 9 commits into main
Conversation

mzweilin
Contributor

📝 Description

This PR registers 48 model weights from adversarial training in timm. Users can specify adversarially trained backbones with names like resnet18.adv_l2_0.1 and wide_resnet50_2.adv_linf_1. We remove support for the __AT__ token in the backbone string to conform to the timm naming format.

Here are all of the available weights:

model_names = ["resnet18", "resnet50", "wide_resnet50_2"]
l2_epsilons = [0, 0.01, 0.03, 0.05, 0.1, 0.25, 0.5, 1, 3, 5]
linf_epsilons = [0, 0.5, 1, 2, 4, 8]
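
Continuing from the lists above, here is a quick sketch that enumerates the resulting 48 timm-style backbone names (the exact epsilon formatting is an assumption based on the two example names in the description):

# Build names such as "resnet18.adv_l2_0.1" and "wide_resnet50_2.adv_linf_1".
backbones = [
    f"{name}.adv_{norm}_{eps}"
    for name in model_names
    for norm, epsilons in (("l2", l2_epsilons), ("linf", linf_epsilons))
    for eps in epsilons
]
assert len(backbones) == 48  # 3 models x (10 L2 + 6 Linf) epsilon settings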

The model weights are derived from https://huggingface.co/madrylab/robust-imagenet-models

Here is a simple test to verify that the model weights are loaded:

import torch
from anomalib.models.components.feature_extractors.timm import TimmFeatureExtractor

# Reference weights published on Hugging Face.
weights = torch.hub.load_state_dict_from_url(
    "https://huggingface.co/mzweilin/robust-imagenet-models/resolve/main/wide_resnet50_2_l2_eps5.pth"
)
backbone = "wide_resnet50_2.adv_l2_5"

# The feature extractor should download and load the same weights via timm.
model = TimmFeatureExtractor(backbone=backbone, layers=[], pre_trained=True)
assert torch.all(weights["conv1.weight"] == model.feature_extractor.conv1.weight)
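
For completeness, a minimal usage sketch showing one of the new backbones plugged into an anomaly model. It assumes the Padim constructor accepts backbone/layers/pre_trained arguments as in current anomalib releases; the backbone and layer names here are illustrative:

from anomalib.models import Padim

# Illustrative only: select an adversarially trained backbone by its timm-style name.
model = Padim(
    backbone="resnet18.adv_l2_0.1",
    layers=["layer1", "layer2", "layer3"],
    pre_trained=True,
)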

✨ Changes

Select what type of change your PR is:

  • 🐞 Bug fix (non-breaking change which fixes an issue)
  • 🔨 Refactor (non-breaking change which refactors the code base)
  • 🚀 New feature (non-breaking change which adds functionality)
  • 💥 Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • 📚 Documentation update
  • 🔒 Security update

✅ Checklist

Before you submit your pull request, please make sure you have completed the following steps:

  • 📋 I have summarized my changes in the CHANGELOG and followed the guidelines for my type of change (skip for minor changes, documentation updates, and test enhancements).
  • 📚 I have made the necessary updates to the documentation (if applicable).
  • 🧪 I have written tests that support my changes and prove that my fix is effective or my feature works (if applicable).

For more information about code review checklists, see the Code Review Checklist.

…ples/notebooks/100_datamodules/101_btech.ipynb,examples/notebooks/100_datamodules/102_mvtec.ipynb,examples/notebooks/100_datamodules/103_folder.ipynb,examples/notebooks/100_datamodules/104_tiling.ipynb,examples/notebooks/200_models/201_fastflow.ipynb,examples/notebooks/400_openvino/401_nncf.ipynb,examples/notebooks/500_use_cases/501_dobot/501b_inference_with_a_robotic_arm.ipynb,examples/notebooks/600_loggers/601_mlflow_logging.ipynb,examples/notebooks/700_metrics/701a_aupimo.ipynb,examples/notebooks/700_metrics/701b_aupimo_advanced_i.ipynb,examples/notebooks/700_metrics/701c_aupimo_advanced_ii.ipynb,examples/notebooks/700_metrics/701d_aupimo_advanced_iii.ipynb,examples/notebooks/700_metrics/701e_aupimo_advanced_iv.ipynb: convert to Git LFS

Signed-off-by: Weilin Xu <[email protected]>
Signed-off-by: Weilin Xu <[email protected]>