
Tiny issues w.r.t data sampling in few-shot iMAML example #417

Closed
zaccharieramzi opened this issue Mar 28, 2023 · 3 comments · Fixed by #425

Comments

@zaccharieramzi
Contributor

zaccharieramzi commented Mar 28, 2023

I was looking through the iMAML few-shot example, and noticed some tiny mistakes:

  • the phase is not set correctly in the sinusoid generation (as written, it acts more like a frequency): the code has y_train = jnp.sin(phase * x_train) * amplitude where it should be y_train = jnp.sin(x_train - phase) * amplitude
  • only a fixed number of pre-generated tasks is used to train the meta-learner, rather than tasks being sampled on-the-fly as in the original implementation (see here and here); a sketch of both fixes follows this list.
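For concreteness, here is a minimal sketch of what the two fixes could look like together. The function and argument names (sample_tasks, n_tasks, n_samples, and the sampling ranges) are illustrative, not taken from the example's actual code:

```python
import jax
import jax.numpy as jnp

def sample_tasks(key, n_tasks, n_samples):
    k_amp, k_phase, k_x = jax.random.split(key, 3)
    amplitude = jax.random.uniform(k_amp, (n_tasks, 1), minval=0.1, maxval=5.0)
    phase = jax.random.uniform(k_phase, (n_tasks, 1), minval=0.0, maxval=jnp.pi)
    x = jax.random.uniform(k_x, (n_tasks, n_samples), minval=-5.0, maxval=5.0)
    # Fix 1: the phase shifts the sinusoid; it does not scale x.
    y = jnp.sin(x - phase) * amplitude
    return x, y

# Fix 2: call the sampler inside the meta-training loop with a fresh PRNG
# key, so tasks are drawn on-the-fly rather than pre-generated once.
key = jax.random.PRNGKey(0)
for _ in range(10):
    key, subkey = jax.random.split(key)
    x_train, y_train = sample_tasks(subkey, n_tasks=16, n_samples=10)
```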

It turns out that I was also working on reimplementing MAML (not implicit), but with jaxopt for the unrolled gradient, and my results on the sinusoid dataset match those of the paper. I'd be happy to implement the tiny changes, and also to add the MAML version. I saw you implemented the iMAML example, wdyt @fabianp?

@fabianp
Collaborator

fabianp commented Mar 30, 2023

thanks @zaccharieramzi! Your fixes would be most welcome. I was actually wondering why the example was so slow and not getting great results. This settles it then!

Can you send a pull request for the changes? Thanks again!

@zaccharieramzi
Contributor Author

Yes, regarding the slowness, I think the problem is the for loop in the loss computation.
I had also implemented it via a for loop, but a vmap is much more efficient; it just means that the parameters should be duplicated at the leaf level (if you consider them a pytree) and not at the tree level, roughly like the sketch below.
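A minimal sketch of what I mean, with an illustrative inner_loss and init_params standing in for the example's actual functions:

```python
import jax
import jax.numpy as jnp

def inner_loss(params, x, y):
    # Toy per-task regression loss; placeholder for the example's real loss.
    pred = x @ params["w"] + params["b"]
    return jnp.mean((pred - y) ** 2)

init_params = {"w": jnp.zeros((1, 1)), "b": jnp.zeros((1,))}
n_tasks = 16

# Duplicate the parameters at the leaf level: each leaf gains a leading
# task axis, instead of building a Python list of n_tasks param pytrees.
batched_params = jax.tree_util.tree_map(
    lambda leaf: jnp.broadcast_to(leaf, (n_tasks,) + leaf.shape), init_params)

x = jnp.ones((n_tasks, 10, 1))
y = jnp.ones((n_tasks, 10, 1))

# A single vmap over the task axis replaces the Python for loop over tasks.
per_task_losses = jax.vmap(inner_loss)(batched_params, x, y)
meta_loss = jnp.mean(per_task_losses)
```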

I have some pressing things on my plate atm; in about 2 weeks I can definitely send a PR.

@fabianp
Collaborator

fabianp commented Mar 31, 2023 via email
