How do I get a regular PyTorch model from BoTorchModel? #2251
Replies: 4 comments 3 replies
-
Hi @cesare-montresor. By "regular PyTorch model" are you referring to getting a BoTorch …
-
Thank you very much, I'll try both methods to better understand how it works under the hood.
On Mon, Mar 18, 2024, 21:17 Sait Cakmak ***@***.***> wrote:
Since you're using torch.save/load, the model you have should be a
ModelBridge object that has a trained BoTorch model attached to it, which
you can access with model.model.surrogate.model. I don't know what the
dataloader and inference_model are supposed to do, so I can't comment on
those.
If you want to predict outcomes from the model, I'd actually recommend
working with the Ax model directly. For this, you can call model.predict,
defined here
<https://github.com/facebook/Ax/blob/main/ax/modelbridge/base.py#L611>,
to predict the outcomes for any given parameterization. The underlying
BoTorch model lives in the transformed space, so working with it would
require manually applying the Ax transforms to your parameters before
passing them to the BoTorch model (and untransforming the model
predictions).
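For reference, here is a minimal sketch of the two options described in the quote above. It assumes the saved object is that ModelBridge; the file name, parameter names, and parameter values are placeholders, not code from this project:

```python
import torch

from ax.core.observation import ObservationFeatures

# Assumption: the object saved with torch.save is the Ax ModelBridge described
# above; "model.pt" is a placeholder file name.
model_bridge = torch.load("model.pt")

# Option 1: pull out the underlying BoTorch model. It lives in Ax's transformed
# space, so raw parameter values cannot be fed to it directly.
botorch_model = model_bridge.model.surrogate.model

# Option 2 (recommended above): predict through the ModelBridge, which applies
# the Ax transforms for you. Parameter names and values here are placeholders.
means, covariances = model_bridge.predict(
    [ObservationFeatures(parameters={"lr": 1e-3, "batch_size": 32})]
)
print(means)  # dict: metric name -> list of predicted means
```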
-
@saitcakmak I'm starting to think that I misunderstood. What I'm using Ax for is finding the ideal hyperparameters for my CNN, so what I'm really looking for is probably not the BoTorchModel but the model generated and fitted during the best trial, i.e. the one trained with the best parameters. For context, these are the parameters I'm tuning:

```python
params = [
    {"name": "lr", "value_type": "float", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},  # optimizer
    {"name": "lr_max", "value_type": "float", "type": "range", "bounds": [1e-6, 0.4], "log_scale": True},  # LR scheduler
    {"name": "batch_size", "value_type": "int", "type": "range", "bounds": [4, 128]},  # train loop
    {"name": "num_epoch", "value_type": "int", "type": "range", "bounds": [1, 30]},  # train loop
    {"name": "drop_out", "value_type": "float", "type": "range", "bounds": [0.0, 0.9]},  # model FC
    {"name": "fc_hidden_num", "value_type": "int", "type": "range", "bounds": [0, 10]},  # model FC
    {"name": "fc_hidden_size", "value_type": "int", "type": "range", "bounds": [64, 2048]},  # model FC
    {"name": "resnet_size", "value_type": "int", "type": "choice", "values": [18, 34, 50, 101, 152], "sort_values": True, "is_ordered": True},  # model CNN
]
```
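For completeness, a minimal sketch of how a search space like this can be run with Ax's Loop API; the evaluation function body, objective name, and trial budget below are placeholders, not the project's actual code:

```python
from ax.service.managed_loop import optimize

# Placeholder evaluation function: trains the CNN with the suggested
# hyperparameters and returns the metric Ax should optimize.
def train_eval(parameterization: dict) -> float:
    # ... build the model, train it, evaluate on a validation set ...
    return 0.0  # e.g. validation accuracy

best_parameters, values, experiment, model = optimize(
    parameters=params,              # the list defined above
    evaluation_function=train_eval,
    objective_name="accuracy",      # placeholder objective name
    minimize=False,
    total_trials=20,                # placeholder budget
)

# `model` returned here is the Ax ModelBridge discussed earlier, not the trained
# CNN; the fitted CNN only exists inside train_eval unless it is saved there.
```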
-
Never mind, I realized I can dump any model within the train_eval function; also, given the best parameters, I can run one more training using the best_parameters and save the tuned model.
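A small sketch of that last step, assuming a hypothetical train_final_model helper; the example best_parameters values, layer sizes, and file name are placeholders:

```python
import torch
import torch.nn as nn

# Example values only; in practice best_parameters comes from the Ax search
# (e.g. the optimize sketch above).
best_parameters = {"lr": 1e-3, "drop_out": 0.5, "fc_hidden_size": 512, "num_epoch": 10}

# Hypothetical stand-in for the project's training code: build the CNN from the
# tuned hyperparameters, train it, and return the fitted module.
def train_final_model(parameters: dict) -> nn.Module:
    model = nn.Sequential(
        nn.Dropout(parameters["drop_out"]),
        nn.Linear(2048, parameters["fc_hidden_size"]),  # placeholder layer sizes
    )
    # ... full training loop using parameters["lr"], parameters["num_epoch"], ...
    return model

final_model = train_final_model(best_parameters)

# Save a regular PyTorch checkpoint of the tuned model.
torch.save(final_model.state_dict(), "best_cnn.pt")
```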
-
Hello there,
I'm new to BoTorch and I'm using it via Ax. I managed to run the optimization successfully; however, I ended up saving a BoTorchModel to disk.
How do I get a regular PyTorch model out of a BoTorchModel?
Thanks