
Can anyone see a reason why they wouldn’t just allow you to provision (and pay for) a persistent Google Cloud VM instead? (I currently do that manually and need port forwarding to a machine that runs Jupyter.)

It’s hard for me to understand why Colab would build such a vague pro tier instead of the simplest possible solution: let me pay for my compute.

There’s so much more potential, too: they could offer whole clusters on demand, with really simple Python integrations, say using Dask or Ray.
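For a sense of what such an integration could feel like, here's a minimal sketch using dask.distributed (the scheduler address is a placeholder; a managed Colab cluster endpoint like this doesn't actually exist):

    from dask.distributed import Client
    import dask.array as da

    # Connect to a remote Dask scheduler (placeholder address).
    client = Client("tcp://scheduler.example.internal:8786")

    # Work is defined locally but executed on the cluster's workers.
    x = da.random.random((100_000, 100_000), chunks=(10_000, 10_000))
    print(x.mean().compute())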



Yes, this is how I did it when I had easy access to GCP credit: have a GPU VM with one of the ready-to-go images that launches Jupyter on boot (or in a tmux session), then just port-forward 8888 over ssh. It worked very well and I found it far easier to work with than Colab.

This is not cheap, though. A V100 runs you over $1.5k a month if you don't turn it off.
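(Back-of-the-envelope, assuming roughly $2.5/hour on-demand for a V100, which varies by region: about 730 hours in a month × $2.5/hour ≈ $1,800, before disks and network.)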


They do, and that service is called AI Platform Notebooks: https://cloud.google.com/ai-platform-notebooks


The Deep Learning VM (available as a native option when creating a new GCE VM) has a bunch of DL tools, and starts a JupyterLab instance on launch: https://cloud.google.com/deep-learning-vm

There is an easy native command for port forwarding to Jupyter: https://cloud.google.com/ai-platform/deep-learning-vm/docs/j...


Perhaps this is more profitable. I’ve heard that Colab has become very popular for people doing certain kinds of deep learning, but that the wait for GPU instances has been frustrating. It seems a paid tier to get priority is directly targeting those users.


I've recently used Colab to run some pretty cool StyleGAN notebooks. Finally I can replicate so many cool projects without racking my brain over how to set up GCP or VirtualBox, or how to install TensorFlow on a GeForce-equipped MacBook.

I doubt I'll pick up deep learning as a profession from this, but it's a step forward.


I doubt they are making much (if anything) from Colab. I think people have found it very convenient (which it is!) but have increasingly complained about the limitations, so the team probably decided to raise the limits and try to cover the costs.


$10/mo doesn't pay for much compute in Google Cloud. Colab Pro is offering GPUs and high memory instances, presumably under the assumption that interactive users will be mostly idle.


Under the assumption, rather, that many users on a single GPU will eventually need more compute and migrate along the easiest path to GCE and TPUs, which Google earns a lot from.


Because they're trying to introduce a different paradigm for computing. Notebooks are portable and purpose-oriented, and they let Google make decisions about the underlying implementation, which enables a $10 monthly price point and an incredible level of performance, not unlike what they've done with BigQuery.


I haven't used Colab externally, but internally at Google it is used as a scratchpad for quick-and-dirty testing of an idea. By the time you realize the idea may lead to something, you've also hit some specific libraries or resources you need that aren't available in Colab. Then you go ahead and build and deploy a proper solution, which I understand is what your Cloud VM would amount to.

As such, I don't think there would be a market for quick-and-dirty scratchpads backed by rock-solid but expensive dedicated compute.


You could alternatively use Google AI Platform Notebooks, which are basically the same thing and billed per hour. You can now pre-bake a Docker image with all of the libraries you need.



