Machine Learning Journey: before the Masters

I’ve been asked a few times how I learnt about / got into Machine Learning from the beginning.

I can’t remember what made me initially think ‘wow this is cool’ but it was probably browsing around on Coursera looking at all the things I could learn – one of my favourite things to do.

I took the Andrew Ng Machine Learning course a few years ago. And it took me a long time. Longer than the recommended time. It was tough, and time intensive. I watched and re-watched lectures. I coded a lot. I went over long-forgotten maths. I drew a LOT of matrices. I took a lot of notes. I took the quizzes, and when I got the answers wrong, I kept the wrong answers along with the right ones so I could learn from them.

And I thought – this is amazing. Actually, I love this.

Also – a huge thing for me was meeting James Weaver, and collaborating with him on the Machine Learning Exposed talks and workshops we gave. Along with the mentorship of a world-class speaker, he brought so many resources to my attention, and asked questions that forced me to dive deeper into individual elements of Machine Learning that I might have skimmed over, never truly understanding.

Things grew from that, to more courses, more reading, and more coding. Now I’m embarking on a Masters at UCD with a focus on AI and Machine Learning.
So, from the start of my journey to a week before the Masters begins, here for you is a list of free resources that helped me:

1. Andrew Ng’s Machine Learning course.
Where? Coursera
Time commitment: Lectures and programming exercises. Expect about 10 hours a week.
Will I need maths: No, but it will help.
Do I need a programming language other than Java/Ruby/Haskell…: Yes, Octave/Matlab, but the course includes resources and guides on the language.
Pro tip: switch sessions and do it at a slower pace to make sure you fully understand things if you want to take the quizzes and programming assignments. Draw LOTS of pictures.

1.1 Help with gradient descent derivation: 
Where? Chris McCormick’s blog
When? Week 2
Time commitment: 15-30 mins, depending on whether you want to work through things yourself.
Pro tip: Great to read at the start of the Andrew Ng course, when the term ‘gradient descent’ pops up.
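If a concrete picture helps alongside the derivation, here is a minimal sketch of batch gradient descent for a one-parameter linear hypothesis h(x) = theta·x, in Andrew Ng’s notation (cost J(theta) = 1/(2m)·Σ(h(x⁽ⁱ⁾) − y⁽ⁱ⁾)²). The data and learning rate are made up for illustration; it’s in Python rather than the course’s Octave, but the update rule is identical.

```python
# Tiny dataset with true relationship y = 2x (illustrative only).
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

theta = 0.0   # initial guess
alpha = 0.05  # learning rate (hand-picked for this toy problem)
m = len(xs)

for _ in range(200):
    # Partial derivative of the cost: dJ/dtheta = (1/m) * sum((theta*x - y) * x)
    grad = sum((theta * x - y) * x for x, y in zip(xs, ys)) / m
    # The gradient descent update: step downhill, scaled by alpha.
    theta -= alpha * grad

print(round(theta, 3))  # converges toward 2.0, the true slope
```

Each iteration nudges theta against the gradient of the cost, which is exactly the update the blog post derives.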

1.2 Why are we talking about bias for Neural Networks?: 
Where? Stack Overflow
When? Week 5
Time commitment: 5 mins.
This just cleared things up for me on the bias-at-every-layer of NNs.
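One way to see why every layer needs a bias: without one, a neuron’s output is pinned whenever its input is zero, because sigmoid(w·0) = 0.5 for any weight. A hypothetical single-input sigmoid neuron makes this concrete (Python, illustrative only):

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(x, w, b=0.0):
    """One sigmoid neuron: activation of (weight * input + bias)."""
    return sigmoid(w * x + b)

# Without a bias, x = 0 always maps to 0.5, no matter the weight:
print(neuron(0.0, w=5.0))     # 0.5
print(neuron(0.0, w=-100.0))  # 0.5

# The bias shifts the activation, so the neuron can still fire
# strongly (or stay off) even when its input is zero:
print(round(neuron(0.0, w=5.0, b=4.0), 3))  # ~0.982
```

Weights control the *slope* of the decision boundary; the bias lets it move away from the origin, and that freedom is needed at every layer.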

1.3 I still don’t get back propagation: 
Where? Michael Nielsen’s online book
When? Week 5
Time commitment: 45-60 mins
Pro tip: We’re departing from Andrew Ng’s notation here (w instead of theta, C instead of J for the cost function, etc.), so make a table of concepts. The theta/w notation, for example, is used interchangeably in lots of texts/blogs, so it’s good to get used to it.

1.4 I want to know more about back propagation: 
Where? Explorations in Parallel Distributed Processing: A Handbook of Models, Programs, and Exercises
When? Roughly Week 5, once you’re comfortable with NNs; or wait until you’re on GH’s course (below). The terminology and symbols change here and there (e.g. it switches from thetas to ws), and this is in line with GH. See how you find it, but also come back to it later.
Time commitment: During GH’s course, shorter. Otherwise a few hours.
Pro tip: It took me quite a few passes (and extra reading) to feel like I conceptually grasped back propagation in NNs. This is interesting, and I recommend trying a few things, like writing your own NN for the XOR problem in Matlab. If you try the GH Coursera course below, you can compare using perceptrons for the XOR problem vs sigmoid neurons.
Warning: do not attempt to write your own gradient descent algorithm until much later. This caused me evenings and evenings of pain, and a diversion into maths I still don’t get yet (Polak-Ribière conjugate gradient and Wolfe conditions, anyone?).
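To see why XOR comes up so often in that perceptron-vs-sigmoid comparison: a single perceptron computes step(w₁x₁ + w₂x₂ + b), one straight-line boundary, and no choice of weights can separate XOR’s classes. Add a hidden layer and it becomes easy. A hand-wired sketch (Python, with weights I picked by hand purely for illustration):

```python
def step(z):
    """Perceptron threshold activation."""
    return 1 if z > 0 else 0

def xor_net(x1, x2):
    """Two-layer network of perceptrons that computes XOR."""
    h1 = step(x1 + x2 - 0.5)    # hidden unit acting like OR
    h2 = step(-x1 - x2 + 1.5)   # hidden unit acting like NAND
    return step(h1 + h2 - 1.5)  # output unit acting like AND

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))  # prints the XOR truth table
```

The hidden layer composes two linear boundaries (OR and NAND) into a region no single boundary can carve out; swapping the step function for a sigmoid gives you the trainable version the GH course discusses.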

2. Geoffrey Hinton et al.’s Neural Networks course
Where? Coursera
Time commitment: Lectures and programming exercises. Expect about 10 hours a week. Longer if, like me, you need to remind yourself about maths.
Will I need maths: Yes. Some of the quiz questions require you to work out partial differentials (not from scratch, but a solid grounding in calculus is needed). You can watch the course without doing the quizzes, but it isn’t as easy viewing as Andrew Ng’s course.
Do I need a programming language other than Java/Ruby/Haskell…: Yes, Octave/Matlab, and it is harder than the Andrew Ng course. However, the course includes resources and guides on the language.
Pro tip: do this straight after the Andrew Ng course and it will make so much more sense.

3. Deep Learning by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
Where? Online version of the book
Time commitment: Huge. I’ll let you know when I finish it.
Pro tip: do this in little chunks as you get into the Geoffrey Hinton course. If you only read a tiny bit, read Ch1. It’s beautifully written.

Finally, interesting things to play with:
http://projector.tensorflow.org/
http://scs.ryerson.ca/~aharley/vis/
http://playground.tensorflow.org/
https://www.kaggle.com/

Ok that’s it for now, I’ll see what else I think of.
