Google Drive and Google Colab. One drawback of Google Drive is that it does not show you how big a folder is. This creates a problem when you want to download a subset of your cloud data but are not sure how much space to allocate. You can work around it with Google Colab, Google's free online coding platform, by mounting your Drive in a notebook and measuring the folder from there.

I faced the same issue while training a text-generator machine learning model inside Google Colab. So what are the solutions?

You have probably read the articles and posts about increasing the RAM in Google Colab by saving the notebook to Drive and crashing the session with code like this:

```python
a = []
while True:
    a.append('1')
```

This no longer works.

I am not sure whether Colab uses an SSD, but one way to increase data-loading speed is to copy the data files to the Colab VM instead of reading them from the network-mounted Google Drive. You can do this with the Linux `cp` command. If your dataset is not big, you can even load the entire dataset into memory; in my experience, for textual data this is usually feasible.

Archon332 commented on May 3: after a few minutes, Colab started to cut off the connection to the GPU. There was no illegal-code message and the code could be restarted, but I suppose Google has started turning the UI off by force.

If you want to use all the RAM available, you simply need to use a bigger dataset, as mentioned by @AEM (more data = more RAM usage). If you want to see the effect of more data on RAM, you can try this simple loop, which keeps appending to a list until you hit the memory limit:

```python
data = []
while True:
    data.append('1234')
```

If Colab's limits become a problem, there are alternatives. Deepnote is a special-purpose notebook for collaboration and is Jupyter-compatible; it has a free tier with limits on features, and an enterprise tier. Noteable is a collaborative notebook platform that supports no-code visualization; it also offers a free tier and an enterprise tier.

To access a Google Drive file from Colab, first mount the drive:

```python
from google.colab import drive
drive.mount('/content/drive')
```

Then you can use a shell command to copy a file or directory. First, navigate to the directory where the file you want to copy is located; you can use the `%cd` magic to change directories.

If you run out of RAM, remove the objects that are still held in memory, such as your DataFrames, using `del`, and then call `gc.collect()` (`import gc`; Python's garbage collector). This may not make a big difference, since automatic garbage collection runs anyway, but it does force an immediate collection pass. The sketches below walk through these steps in order.
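To make the folder-size workaround concrete, here is a minimal sketch of measuring a Drive folder from a Colab cell. The folder path `/content/drive/MyDrive/my_dataset` is a hypothetical placeholder; substitute your own.

```python
# Mount Google Drive into the Colab filesystem (prompts for authorization).
from google.colab import drive
drive.mount('/content/drive')

# `du -sh` reports the total size of a folder in human-readable units.
# The folder name below is a placeholder; replace it with your own path.
!du -sh /content/drive/MyDrive/my_dataset
```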
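And here is a sketch of the copy-to-VM step, combining the `%cd` magic with `cp`; again, the paths are placeholders. Copying once to the VM's local disk pays off when the files are read many times, for example across training epochs.

```python
# Change into the Drive directory that holds the data.
%cd /content/drive/MyDrive

# Copy the dataset folder from the network-mounted Drive to local disk,
# so subsequent reads hit the VM's storage instead of the Drive mount.
!cp -r my_dataset /content/my_dataset
```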
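For small textual datasets, loading everything into memory might look like the following. The layout (one UTF-8 `.txt` file per document in the copied folder) is an assumption for illustration.

```python
import pathlib

# Read every text file in the local copy into a list of strings.
# Assumed layout: one UTF-8 .txt file per document.
corpus = [
    p.read_text(encoding='utf-8')
    for p in pathlib.Path('/content/my_dataset').glob('*.txt')
]
print(f"Loaded {len(corpus)} documents into memory")
```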
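Finally, a sketch of freeing RAM with `del` and `gc.collect()`. The DataFrame here is a made-up stand-in for whatever large object you no longer need.

```python
import gc
import pandas as pd

# Hypothetical large object that is no longer needed.
df = pd.DataFrame({'col': range(10_000_000)})

del df        # drop the last reference to the object
gc.collect()  # force an immediate garbage-collection pass
```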