-
I achieved 84% by training all layers of EffNetB0, adding a dropout layer, and using the ReduceLROnPlateau callback. I could probably squeeze out another 1-2%, but it's not worth it. I don't have a Titan X and a 12-core i7 like the guy in the article, and he spent a month on it, so running more large-scale experiments would take too much time for me.
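For anyone wanting to try the same recipe, here's a minimal sketch of that setup in Keras. The input size (224x224), dropout rate, learning rate, and the `train_ds`/`val_ds` datasets are all assumptions, not details from the comment above:

```python
# Sketch: fine-tune ALL layers of EfficientNetB0 on Food101 with dropout
# and ReduceLROnPlateau. train_ds/val_ds are assumed to be prebuilt tf.data
# datasets of (image, label) pairs.
import tensorflow as tf
from tensorflow.keras import layers

base = tf.keras.applications.EfficientNetB0(include_top=False, pooling="avg")
base.trainable = True  # train all layers, not just the new head

model = tf.keras.Sequential([
    layers.Input(shape=(224, 224, 3)),
    base,
    layers.Dropout(0.3),                       # the added dropout layer
    layers.Dense(101, activation="softmax"),   # Food101 has 101 classes
])

model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),  # low LR for full fine-tuning
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Drop the learning rate when validation loss plateaus
reduce_lr = tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss",
                                                 factor=0.2, patience=2)

# model.fit(train_ds, validation_data=val_ds, epochs=10, callbacks=[reduce_lr])
```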
-
With a larger EffNetB4, 5, 6, or 7 you should be able to get close to 90% using transfer learning on Google Colab in a few hours. I'll be doing some food modelling experiments later this month (and next) on my Twitch/YouTube; it will be interesting to see where they go. It won't necessarily be Food101, but a bunch of food datasets. Food is a hard one because of the diversity within a single class: for example, how many different ways could you view "eggs"?
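A sketch of what that transfer-learning approach might look like with a frozen EfficientNetB4 backbone. The input size matches B4's default (380x380); the optimizer, epoch count, and `train_ds`/`val_ds` are assumptions:

```python
# Sketch: feature extraction with a frozen EfficientNetB4, then a new
# 101-class head on top. Fits comfortably in a Colab session.
import tensorflow as tf

base = tf.keras.applications.EfficientNetB4(include_top=False, pooling="avg",
                                            input_shape=(380, 380, 3))
base.trainable = False  # freeze the pretrained backbone

inputs = tf.keras.Input(shape=(380, 380, 3))
x = base(inputs, training=False)  # keep batch-norm in inference mode
outputs = tf.keras.layers.Dense(101, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# model.fit(train_ds, validation_data=val_ds, epochs=5)
# Optionally unfreeze the top of the base afterwards and fine-tune at a low LR.
```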
-
This isn't a specific question for help; I was just curious how high anyone here has gotten on the full 101 classes of data. I was playing with it and got about 55% accuracy.
But check this out: this page shows someone getting accuracy up into the 90s using transfer learning and fine-tuning, albeit they were much more aggressive in the number of layers they allowed to be trainable, among other things.
When you think about it, it's astounding to even get 50% accuracy, let alone 90%, when the expected accuracy of purely random guessing is just 1/101 (about 0.99%).
How well has anyone else done with this?
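For anyone who wants to try the full 101 classes themselves, here's one way to load the dataset and sanity-check that random-guess baseline. This assumes the `food101` dataset from TensorFlow Datasets, which is one of several ways to get the data:

```python
# Load all 101 Food101 classes via TensorFlow Datasets and print the
# expected accuracy of uniform random guessing as a baseline.
import tensorflow_datasets as tfds

(train_ds, test_ds), info = tfds.load("food101",
                                      split=["train", "validation"],
                                      as_supervised=True, with_info=True)
num_classes = info.features["label"].num_classes
print(num_classes)       # 101
print(1 / num_classes)   # ~0.0099, i.e. ~1% accuracy from pure guessing
```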