Begin your Deep Learning project for free (free GPU processing, free storage, free easy upload…) by @theamrzaki


In this story I'll walk through how to start working on deep learning without needing a powerful computer with the best GPU, and without renting a virtual machine. I'll cover how to get free processing on a GPU, connect it to free storage, add files directly to your online storage without having to download and then upload them, and unzip files online for free.

I'll show how I begin my deep learning projects using Google Colab, which lets you work directly on a free Tesla K80 GPU using Keras, TensorFlow, and PyTorch, and how I connect it to Google Drive for data hosting. I'll also share some techniques I use to download data directly to Google Drive, skipping the download-then-upload cycle, and how to unzip your files online without writing Python code for it.

So to sum things up, I'll be talking about:

  1. Google Colab
  2. Connecting to Google Drive
  3. Downloading data directly to Google Drive without download-then-upload (from an external link or another Google Drive)
  4. Unzipping files online

So let's start.

Google Colab

Google Colab is a free-to-use Jupyter notebook environment that gives you access to a free Tesla K80 GPU and a total of 12 GB of RAM, and you can use it for up to 12 hours in a row.
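As a quick sanity check of the RAM claim, you can ask the kernel directly from a notebook cell. This is a minimal sketch that assumes a Linux `/proc` filesystem, which Colab instances provide:

```python
# Minimal sketch: report the VM's total RAM. Assumes a Linux /proc
# filesystem, as on Colab instances; returns None elsewhere.
def total_ram_gb(meminfo_path="/proc/meminfo"):
    """Return total RAM in GB as reported by the kernel, or None if unavailable."""
    try:
        with open(meminfo_path) as f:
            for line in f:
                if line.startswith("MemTotal:"):
                    kb = int(line.split()[1])  # kernel reports the value in kB
                    return kb / 1024 / 1024
    except OSError:
        return None
    return None

print(total_ram_gb())
```

On a standard Colab VM this should come out close to the 12 GB mentioned above.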

You can either

  1. start a new Google Colab notebook
  2. or upload a GitHub Jupyter notebook (.ipynb)

So let's begin with how to create a Google Colab notebook.

1- Start a new Google Colab notebook

1-a) go to the Google Colab site

1-b) select the Google Drive tab (to save your new notebook to Google Drive)

1-c) select New Python 3 Notebook (you can also select a Python 2 notebook)

A blank notebook will be created in your Google Drive; it will look like this:

You can change the runtime of your notebook by selecting the Runtime button in the top menu, to

  1. change which Python version you are using
  2. choose a hardware accelerator (GPU or TPU)
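After switching the runtime, you can check from inside the notebook which accelerator you actually got. This is a hedged sketch: the `COLAB_TPU_ADDR` environment variable and the `/proc/driver/nvidia` path are assumptions about how Colab exposes TPU and GPU runtimes, not an official API:

```python
import os

# Hedged sketch: infer the runtime's accelerator from environment hints.
# COLAB_TPU_ADDR is set on Colab TPU runtimes, and /proc/driver/nvidia
# exists when an NVIDIA driver (i.e. a GPU) is present -- both are
# assumptions about Colab's setup rather than a documented interface.
def runtime_accelerator(env=None, nvidia_path="/proc/driver/nvidia"):
    env = os.environ if env is None else env
    if "COLAB_TPU_ADDR" in env:
        return "TPU"
    if os.path.exists(nvidia_path):
        return "GPU"
    return "CPU"

print(runtime_accelerator())
```

Frameworks give the definitive answer (e.g. `torch.cuda.is_available()`), but this runs without importing any of them.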
2- Upload a GitHub Jupyter notebook to Google Colab

You can automatically upload an existing Jupyter notebook from GitHub to Colab, which really helps when testing code on the fly. So let's try it out.

2-a) this is a Jupyter notebook that I created for text summarization

2-b) go to the Google Colab site again, but this time select the GitHub tab

2-c) then just paste the Jupyter notebook link and click upload

You can even read more in this tutorial

As you've seen, it's rather easy to start a Colab project. But any real-world data project requires truly huge datasets, and to store those you'll need the power of Google Drive.

So let's connect our Google Drive to our Google Colab notebook.

Connect to Google Drive

In the newly created notebook, add a new code cell, then paste this code into it:

# install the google-drive-ocamlfuse FUSE client from its PPA
!apt-get install -y -qq software-properties-common python-software-properties module-init-tools
!add-apt-repository -y ppa:alessandro-strada/ppa 2>&1 > /dev/null
!apt-get update -qq 2>&1 > /dev/null
!apt-get -y install -qq google-drive-ocamlfuse fuse
# authenticate this Colab session with your Google account
from google.colab import auth
auth.authenticate_user()
from oauth2client.client import GoogleCredentials
creds = GoogleCredentials.get_application_default()
import getpass
# the first run prints an auth URL; open it and paste the code below
!google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret} < /dev/null 2>&1 | grep URL
vcode = getpass.getpass()
!echo {vcode} | google-drive-ocamlfuse -headless -id={creds.client_id} -secret={creds.client_secret}
# mount your Drive at ./drive
!mkdir -p drive
!google-drive-ocamlfuse drive

This connects to your Drive and creates a folder ("drive") through which your notebook can access your Google Drive.

It will ask you for access to your Drive: just click the link and copy the access code back into the cell. It asks this twice.

After writing this code, run the cell by clicking into it and pressing Shift+Enter, or by clicking the play button at the top of the code cell.

Then you can simply access any file by its path, in the form:

path = "drive/test.txt"

With this you can load truly huge files: Google Drive gives you 15 GB for free (more with paid plans), and Google Colab offers a good 12 GB of RAM, which opens the way for truly great data apps.
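Since the dataset can be far larger than the notebook's RAM, it pays to process mounted files in a streaming fashion. A minimal sketch; the path follows the `drive/test.txt` pattern above and is purely illustrative:

```python
# Minimal sketch: stream a big file from the mounted Drive folder instead of
# loading it into the notebook's 12 GB of RAM at once. Path is illustrative.
def count_lines(path, chunk_size=1 << 20):
    """Count lines by reading 1 MB chunks, so file size is not limited by RAM."""
    count = 0
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            count += chunk.count(b"\n")
    return count

# e.g. count_lines("drive/test.txt")
```

The same chunked pattern works for any per-line preprocessing you'd do before training.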

But a new problem arises: most of the data used in data projects runs to gigabytes, so to use it in Google Colab you would normally have to download it and then upload it to Drive. Since most people have really slow upload speeds, this makes working with multi-gigabyte datasets quite an issue. That is what the next section covers.

I have found some services that let you remote-download links directly to your Google Drive, without downloading the files locally and then uploading them again.

There is also a service that lets you copy a file hosted on another Google Drive directly to your own.

So let's begin.

Downloading data directly to google drive without need to download then upload

You primarily have two scenarios: the file you need is either

  1. hosted on a website, downloadable through a link
  2. hosted on another Google Drive

Let's look at the first scenario.

1-a) Download from an external link

Here we'll use MultCloud, a service with a free plan for managing your cloud accounts. The feature we'll use is its ability to download files directly to your Google Drive.

You just paste your link and wait for it to be downloaded directly to your Google Drive. With the free plan, you can transfer 50 GB per month.

After signing up:

  1. first, add Google Drive to MultCloud
  2. authorize your account
  3. once signed in and your Google Drive is registered, click on your Google Drive
  4. click Upload
  5. click Upload URL
  6. paste your link
  7. you can follow the download progress by clicking the top button, then URL Tasks
  8. a menu appears where you can track your progress or even add more URLs
1-b) Download from another Google Drive

Sometimes your files are actually on another Google Drive, and you just want to copy them to your own.

I tried using MultCloud for this case, with the direct Google link, but without success. Then I came across Copy, URL to Google Drive, which lets you easily copy files between different Google Drives.

Paste your link, name it, then choose save to Google Drive.
After authenticating, you just click save to Google Drive.

But most datasets come in a zip format, and they need to be unzipped before you can use them in your project.

Some people unzip their files inside Google Colab using Python code, but I find there are more productive ways that don't require writing and maintaining code for different zip formats. One of them is CloudConvert.
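For comparison, the in-notebook approach mentioned above takes only a few lines with Python's standard zipfile module (the paths here are illustrative, not from the article):

```python
import zipfile

# For comparison: the in-notebook unzip approach, using Python's standard
# zipfile module. Handles .zip only; other archive formats need other tools.
def unzip_to(zip_path, dest_dir):
    """Extract every member of a .zip archive into dest_dir."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest_dir)

# e.g. unzip_to("drive/dataset.zip", "drive/dataset")
```

This works fine for plain .zip files, but as noted, it's code you have to maintain per archive format, which is what the online approach below avoids.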

Unzip Files online for free

CloudConvert is an app available for Google Drive that lets you unzip your files for free, and it handles a huge range of zip formats. So let's begin.

First, let's connect more apps in your Google Drive:

  1. search for cloudconvert, hit enter, then add the app
  2. open the zip file with CloudConvert
  3. select the format to extract, then hit Start Converting (the save-to-drive checkbox must be checked)

Then your unzipped files are saved to your Drive, which is much more convenient than writing code to unzip them.

In the sections above, we walked through how to build a data project without a powerful device, without renting one, without a huge and expensive online storage plan, and without the download-then-upload cycle, and we presented a way to unzip files online.

I truly hope this was beneficial to you, and I look forward to hearing your comments.

