I am trying to adapt a character-level TextVectorization layer on the whole "PubMed 200k RCT" dataset, but during this process the amount of RAM provided by the Google Colab free version is not enough and the session crashes. Is there any way to solve that, or do I have to try it on a local machine?
Replies: 1 comment
Hey Yash,
If you can't process the full dataset, try resetting Google Colab to get a larger GPU (if it's GPU RAM that's running out, which, from my experience, is most likely).
You can check the size of your GPU using `!nvidia-smi`.

Otherwise, if you're still running into issues, it may be best to use a smaller batch size (e.g. less than 32, which is what we use for most of the course).
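For example, here's a minimal sketch of adapting a character-level vectorizer from a batched `tf.data` pipeline instead of one giant in-memory tensor, so `adapt()` only ever sees one small batch at a time (the layer parameters and the `train_sentences` name are illustrative assumptions, not exact course values):

```python
import tensorflow as tf

# Assumes `train_sentences` is your list of abstract lines from
# PubMed 200k RCT (the variable name is just an illustration).
char_vectorizer = tf.keras.layers.TextVectorization(
    max_tokens=70,                # rough upper bound on distinct characters
    split="character",            # character-level tokenization
    output_sequence_length=290,   # example value, tune to your data
)

# Stream the text in small batches so adapt() builds the vocabulary
# incrementally rather than holding everything in RAM at once.
char_dataset = tf.data.Dataset.from_tensor_slices(train_sentences).batch(16)
char_vectorizer.adapt(char_dataset)
```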
If that still doesn't help, try building a model with less data (see the sketch below) - I have a feeling Colab may have updated recently to give smaller compute resources by default. At least they're still free.
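As a fallback, a sketch of adapting on only a slice of the batched dataset (the batch count here is arbitrary; character vocabularies are tiny, so a sample of the corpus usually covers every character anyway):

```python
# Adapt on a subset of batches to keep peak RAM low; tune the
# number of batches to whatever fits in your Colab session.
char_vectorizer.adapt(char_dataset.take(1000))
```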