Using system generated seed in RandomSampler #1441
base: main
Conversation
🔗 Helpful Links
🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/data/1441
Note: Links to docs will display an error until the docs builds have been completed.
✅ No Failures as of commit 14fa418 with merge base fe6b405.
This comment was automatically generated by Dr. CI and updates every 15 minutes.
seed = 1
torch.manual_seed(seed)
dl3 = StatefulDataLoader(self.dataset, batch_size=1, shuffle=True)
data_dl3 = []
Can we call this results3? And similarly results1 and results2 above.
seed = 1
torch.manual_seed(seed)
dl3 = StatefulDataLoader(self.dataset, batch_size=1, shuffle=True)
We can rename dl3 to dataloader3. Ditto for the other dataloader variables.
)

def test_seed_replicability(self):
    seed = 0
Instead of checking for the specific seeds 0 and 1, we can generalize this to two randomly generated seeds, and also add an assert to ensure the two seeds are not equal.
LGTM!
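The reviewer's suggestion above could be sketched as follows. This is a hypothetical, stdlib-only sketch (random.Random stands in for the torch generator used by the actual test, and the function name and data are illustrative, not from the PR):

```python
import random

def test_seed_replicability():
    # Draw two seeds at random instead of hard-coding 0 and 1,
    # and assert they differ before using them.
    seed_a = random.SystemRandom().getrandbits(32)
    seed_b = random.SystemRandom().getrandbits(32)
    while seed_b == seed_a:
        seed_b = random.SystemRandom().getrandbits(32)
    assert seed_a != seed_b

    data = list(range(100))

    def sample(seed):
        # Stand-in for building a DataLoader with a seeded generator
        # and collecting its output order.
        rng = random.Random(seed)
        shuffled = data[:]
        rng.shuffle(shuffled)
        return shuffled

    # Same seed -> identical order; different seeds -> (almost
    # certainly) different orders.
    assert sample(seed_a) == sample(seed_a)
    assert sample(seed_a) != sample(seed_b)
```

The point of randomizing the seeds is that the test then exercises replicability in general rather than only for the two magic values 0 and 1.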
data_source: Sized,
replacement: bool = False,
num_samples: Optional[int] = None,
generator=None,
):
if generator is None:
    # Ensure that underlying sampler has something repeatable
Let's remove or update this comment.
Update/remove comment and then gogogo
Currently we are fixing the seed for generator in RandomSampler as 1. This leads to the generator not changing even when the torch.manual_seed() seed is changed. For the RandomSampler in torch.utils.data.sampler, they use seed = int(torch.empty((), dtype=torch.int64).random_().item()). This PR uses the same approach here.

Fixes #1440
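A minimal sketch of the fix, in stdlib Python rather than the actual torchdata code (class and attribute names are illustrative): when no generator is supplied, the seed is drawn from the process-level RNG, so seeding that RNG (the stdlib analogue of torch.manual_seed) controls the sampler, instead of the seed being hard-coded to 1.

```python
import random
from typing import Iterator, Optional, Sized

class RandomSampler:
    """Sketch: seed the fallback generator from the global RNG,
    mirroring torch.utils.data.sampler's
    seed = int(torch.empty((), dtype=torch.int64).random_().item())."""

    def __init__(
        self,
        data_source: Sized,
        replacement: bool = False,
        num_samples: Optional[int] = None,
        generator=None,
    ):
        if generator is None:
            # Drawn from the module-level RNG, so random.seed(...)
            # influences it -- unlike a fixed constant seed.
            self.initial_seed = random.getrandbits(63)
            generator = random.Random(self.initial_seed)
        self.data_source = data_source
        self.replacement = replacement
        self.num_samples = num_samples if num_samples is not None else len(data_source)
        self.generator = generator

    def __iter__(self) -> Iterator[int]:
        # Samples indices with replacement, for brevity of the sketch.
        for _ in range(self.num_samples):
            yield self.generator.randrange(len(self.data_source))
```

With this change, two runs under the same global seed produce the same sample order, and changing the global seed changes the order, which is the behavior #1440 asks for.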