Does your definition of "coders" include those of us with at most middle school mathematics knowledge? I've tried some machine learning courses before, but couldn't make sense of the vast majority of the material.
Can the exercises in the fast.ai deep learning course be done in a language other than Python? And roughly how many weeks should it take to complete the course? If my laptop doesn't have a GPU, can I still do all the exercises?
The whole course builds on the fastai library, which is written in Python, so there is no point in not using Python. But I feel like you can gain a pretty good understanding of deep learning in general just by watching the videos.
The live course was held over 7 weeks. It should take around 70 hours to finish in total, but it really depends on how deep you want to dive in and how easily the material comes to you.
> If my laptop doesn't have a GPU, can I still do all exercises?
To answer this: while technically you could do the exercises, it will likely take longer (perhaps much longer) to train models and whatnot without a GPU. Where you'll really have a hard time is anything involving deep learning and large neural networks. What might take a few seconds or minutes to train on a GPU might take hours on a CPU with only a few cores.
But not to worry - the course explains that you can use a variety of cloud-based GPU systems that are fairly cheap to access (provided you have a credit card). So if you can't get access to a GPU, there are other ways that won't put you in the poorhouse (just make absolutely sure to completely shut down and/or delete any VM instances you spin up).
Also - while you could use another language, it is best to stick with Python for now, because it has become almost the de facto language for ML and deep learning, at least as far as education goes. Mainly because it has such great libraries for scientific computing available, plus it is easily approachable. It also has a great interface to Tensorflow (and ultimately CUDA and NVidia GPUs - also, get used to the fact that if you do anything DL-related, you will be using NVidia almost exclusively; while AMD and others have their own DL hardware, it isn't nearly as well supported - maybe that will improve in the future).
That said - Tensorflow does have decent support in a number of languages, especially C++ - so if you are more familiar with that language, you'll be set to do some amazing things once you complete the course.
But Python, again, is really where it's at. Tensorflow is a great library that abstracts away a lot of the pain of dealing with CUDA. But even it has its pain points - and there are other libraries layered on top of it that make things even easier to work with (e.g. Keras). Seriously: in the DL training I've had (mainly through Udacity), we had to learn how to build a simplified DL "library" of methods for building simple neural networks - a rough sketch of that kind of exercise is below.
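To give a sense of what that exercise looks like (this is my own toy sketch, not the actual Udacity code), here is a one-hidden-layer network trained with hand-written backprop in plain NumPy:

```python
# Toy example: one hidden layer, sigmoid activations, manual gradient descent.
# All sizes and hyperparameters here are arbitrary choices for illustration.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features
y = (X.sum(axis=1, keepdims=True) > 0)   # toy binary target

W1 = rng.normal(scale=0.1, size=(3, 8))  # input -> hidden weights
W2 = rng.normal(scale=0.1, size=(8, 1))  # hidden -> output weights
lr = 0.5

for epoch in range(500):
    # forward pass
    h = sigmoid(X @ W1)
    out = sigmoid(h @ W2)
    # backward pass (manual chain rule - this is the pain TF/Keras remove)
    d_out = (out - y) * out * (1 - out)
    d_W2 = h.T @ d_out
    d_h = (d_out @ W2.T) * h * (1 - h)
    d_W1 = X.T @ d_h
    W1 -= lr * d_W1 / len(X)
    W2 -= lr * d_W2 / len(X)

print("training accuracy:", ((out > 0.5) == y).mean())
```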
Once you learn about Tensorflow, you will have an epiphany of "OMG, so much easier!". Then, when you learn about Keras, it's yet another epiphany - it makes Tensorflow look "difficult". Of course, it's best to learn in that order if you can, so that you understand how the lower levels work instead of everything being a black box.
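For contrast, here's roughly the same toy network in Keras (tf.keras). The layer sizes and training settings are arbitrary choices of mine, not from any particular course, but it shows how much of the manual work disappears:

```python
# The same kind of tiny classifier, but with backprop, the optimizer loop,
# and the loss all handled by the framework.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)).astype("float32")
y = (X.sum(axis=1, keepdims=True) > 0).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(3,)),
    tf.keras.layers.Dense(8, activation="sigmoid"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=50, verbose=0)
print(model.evaluate(X, y, verbose=0))   # [loss, accuracy]
```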
Of course, at some level there will be "black boxes"; for instance, you may or may not want to learn how Tensorflow interfaces with CUDA, or how CUDA interfaces with the GPU, or how some library wraps other methods and functionality. Honestly, things are much nicer today than they were even 5 years ago. You can decide just how deep you want to explore.
Why not simply run on a cloud solution (like Google Colab), which can connect to GPU (and even TPU!) instances and is incredibly easy to work with out of the box?
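Once you've switched the Colab runtime to a GPU (Runtime -> Change runtime type), a quick sanity check is worth doing. This sketch assumes you're on the PyTorch/fastai stack the course uses; the TensorFlow equivalent would be tf.config.list_physical_devices("GPU"):

```python
# Confirm Colab actually gave you a CUDA GPU before training anything.
import torch

print(torch.cuda.is_available())          # True if a CUDA GPU is attached
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. whichever Tesla card Colab assigned
```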
I took the in-person version of this course in Fall 2018, and it's definitely approachable for someone who doesn't yet know any linear algebra or remember calculus all that well. You just start doing things with deep learning right away. After these quick wins, you'll hopefully get some more intuitive breakthroughs about what's actually happening. Then and only then do they suggest digging into the math more deeply (I'm still working on that last part).