Deep Learning

After working in robotics for a while, it becomes apparent that despite all the recent progress, the underlying machine learning tools we have at our disposal are still quite primitive. Our standard stock of techniques, Support Vector Machines and boosting methods, are both more than ten years old, and while you can do some neat things with them, in practice they are limited in the kinds of things they can learn efficiently. There's been lots of progress since the techniques were first published, particularly through careful design of features, but to get beyond the current plateau it feels like we're going to need something really new.

For a glimmer of what "something new" might look like, I highly recommend this wonderful Google Tech Talk by Geoff Hinton: "The Next Generation of Neural Networks", where he discusses restricted Boltzmann machines. There are some stunning results, and an entertaining history of learning algorithms, during which he amusingly dismisses SVMs as "a very clever type of Perceptron". There's a more technical version of the talk in this NIPS tutorial, along with a workshop on the topic. Clearly the approach scales beyond toy problems – they have an entry sitting high on the Netflix Prize leaderboard.
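For readers curious what training a restricted Boltzmann machine actually looks like, here is a minimal sketch of the standard contrastive-divergence (CD-1) update for a binary RBM. This is a toy illustration, not the implementation behind any of the results above; the layer sizes, learning rate, and dataset are all made up for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class RBM:
    """Tiny binary restricted Boltzmann machine trained with CD-1."""

    def __init__(self, n_visible, n_hidden, seed=0):
        self.rng = np.random.default_rng(seed)
        self.W = self.rng.normal(0.0, 0.1, size=(n_visible, n_hidden))
        self.b_v = np.zeros(n_visible)   # visible biases
        self.b_h = np.zeros(n_hidden)    # hidden biases

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: hidden activations driven by the data.
        h0_prob = sigmoid(v0 @ self.W + self.b_h)
        h0 = (self.rng.random(h0_prob.shape) < h0_prob).astype(float)
        # Negative phase: one Gibbs step -- reconstruct the visibles,
        # then re-infer the hiddens from the reconstruction.
        v1_prob = sigmoid(h0 @ self.W.T + self.b_v)
        h1_prob = sigmoid(v1_prob @ self.W + self.b_h)
        # CD-1 update: data correlations minus reconstruction correlations.
        batch = v0.shape[0]
        self.W += lr * (v0.T @ h0_prob - v1_prob.T @ h1_prob) / batch
        self.b_v += lr * (v0 - v1_prob).mean(axis=0)
        self.b_h += lr * (h0_prob - h1_prob).mean(axis=0)
        # Reconstruction error is a rough progress signal, not the true objective.
        return np.mean((v0 - v1_prob) ** 2)

# Usage: learn a toy dataset of two repeating binary patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]] * 10, dtype=float)
rbm = RBM(n_visible=6, n_hidden=2)
errs = [rbm.cd1_step(data) for _ in range(500)]
```

Reconstruction error should fall substantially over the 500 updates as the two hidden units learn to encode the two patterns. Stacking layers of these, each trained greedily on the previous layer's hidden activations, is the "deep belief net" idea from the talk.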

These results with deep architectures are very exciting. Neural network research has effectively been abandoned by most of the machine learning community for years, partly because SVMs work so well, and partly because there was no good way to train multi-layer networks. SVMs were very pleasant to work with – no parameter tuning or black magic involved, you just throw data at them and press start. However, it seems clear that to make real progress we're going to have to return to multi-layer learning architectures at some point. It's good to see progress in that direction.

Hat tip: Greg Linden

4 Responses to “Deep Learning”


  1. Milan

    It’s good to see a new post. It has been a while, I think.

    How is your research going?

  2. mark

    Yes, I’ve been neglecting this blog for a while – combination of holidays and deadlines. Research is going very well, thanks. Bunch of new results recently, keeps me happy. Must write a post about it soon. I can only aspire to your daily posting schedule though!
    How’s things in Ottawa? It looks rather chilly in your latest photos.

  3. Milan

    Ottawa remains very chilly, though there seems to be some hope of sustained melting in the next month or so.

    A recent visit by a friend from Vancouver was really excellent, and I am delighted that she will be spending the summer in this most bureaucratic of cities.

    In any event, I hope we catch one another online soon.

  1. Deep Learning on 16,000 cores at Educating Silicon