{"id":710,"date":"2015-12-15T11:00:52","date_gmt":"2015-12-15T16:00:52","guid":{"rendered":"http:\/\/minireference.com\/blog\/?p=710"},"modified":"2020-11-20T08:21:48","modified_gmt":"2020-11-20T13:21:48","slug":"impressions-from-nips-2015","status":"publish","type":"post","link":"https:\/\/minireference.com\/blog\/impressions-from-nips-2015\/","title":{"rendered":"Impressions from NIPS 2015"},"content":{"rendered":"<p>Last week I attended the <a href=\"https:\/\/nips.cc\/Conferences\/2015\">NIPS conference<\/a> and it felt like grappa shot: intense but good for brain function. There are so many advances in research, and industry is shipping ML in products, and GPUs make previously-impossible things possible. Definitely an exciting time to be.<\/p>\n<p><!--more--><\/p>\n<h3>Monday<\/h3>\n<p>The opening talk was about deep earning. In fact, a <em>lot<\/em> of the conference was about deep learning. Non-conformist as I am, I tried not to focus too much on that. All the new applications and interest from industry is great, but I don&#8217;t think the research is <em>that<\/em> revolutionary. I read this review paper <a href=\"http:\/\/www.nature.com\/nature\/journal\/v521\/n7553\/full\/nature14539.html\">Deep learning<\/a> (paywall, <a href=\"https:\/\/www.dropbox.com\/s\/fmc3e4ackcf74lo\/2015-lecun.pdf\">door<\/a>) and I&#8217;m going to limit myself to this level of understanding for now. With 4000 people in one place and 500+ posters to look, it&#8217;s hard enough to keep track of topic-modelling topics covered!<\/p>\n<h3>Saturday<\/h3>\n<p>I attended the <a href=\"https:\/\/sites.google.com\/site\/nipsbnp2015\/home\">Bayesian Nonparametrics workshop<\/a> which was the who-is-who of the community. I figured that was my only chance to be in a community where I&#8217;ll understand more than every second word said. The morning started with a very interesting &#8220;theory&#8221; talk by Peter Orbanz. 
I&#8217;m sure he&#8217;ll post the slides at some point, but in the meantime I found a 100-page PDF of lecture notes by him: <a href=\"http:\/\/stat.columbia.edu\/~porbanz\/papers\/porbanz_BNP_draft.pdf\">Notes on Bayesian Nonparametrics<\/a>. There&#8217;s also a <a href=\"https:\/\/www.youtube.com\/watch?v=F0_ih7THV94\">video of a workshop<\/a> from 4 years ago. This guy knows his stuff, and knows how to explain it too.<\/p>\n<p>Another excellent talk was by Mike Hughes on <a href=\"http:\/\/cs.brown.edu\/~sudderth\/papers\/aistats15hdpMemoized.pdf\">Scalable variational inference that adapts the number of clusters<\/a>. It presented good ideas for managing fragmentation (too many topics) and finally starts to show BNP&#8217;s killer app: automatically learning the right number of topics for a given corpus.<\/p>\n<p>During the discussion panel, the question of open-source code for BNP arose and the following projects were mentioned: <a href=\"https:\/\/bitbucket.org\/michaelchughes\/bnpy-dev\/\">bnpy<\/a> and <a href=\"https:\/\/github.com\/trappmartin\/BNP.jl\">BNP.jl<\/a>.<\/p>\n<p>Around lunch time I caught part of the talk by <a href=\"http:\/\/arxiv.org\/find\/stat\/1\/au:+Blei_D\/0\/1\/0\/all\/0\/1\">David Blei<\/a>, who discussed the papers <a href=\"http:\/\/arxiv.org\/abs\/1401.0118\">Black Box Variational Inference<\/a> and <a href=\"http:\/\/arxiv.org\/abs\/1511.02386\">Hierarchical Variational Models<\/a>. Very interesting general-purpose methods. I should look into <a href=\"https:\/\/github.com\/Blei-Lab\">some source code<\/a> to see if I can understand things a bit better.<\/p>\n<p>In the afternoon, Amr Ahmed gave an interesting talk about large-scale LDA and <a href=\"http:\/\/www.sravi.org\/pubs\/fastlda-kdd2014.pdf\">efficient LDA sampling using the alias method<\/a>. 
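To make the alias-sampling trick from that talk concrete, here is my own rough Python sketch of Vose&#8217;s variant of the alias method (hypothetical illustration with names I chose, not code from the talk): an O(n) preprocessing pass builds two tables, after which each draw from the discrete distribution costs O(1).

```python
import random

def build_alias(probs):
    """Preprocess a discrete distribution into prob/alias tables (Vose's method).

    Each slot i keeps probability prob[i] of returning i itself,
    and otherwise returns its "alias" outcome alias[i].
    """
    n = len(probs)
    scaled = [p * n for p in probs]          # rescale so the average mass per slot is 1
    prob = [0.0] * n
    alias = [0] * n
    small = [i for i, p in enumerate(scaled) if p < 1.0]   # under-full slots
    large = [i for i, p in enumerate(scaled) if p >= 1.0]  # over-full slots
    while small and large:
        s = small.pop()
        l = large.pop()
        prob[s] = scaled[s]                  # slot s keeps its own leftover mass...
        alias[s] = l                         # ...and is topped up from outcome l
        scaled[l] = scaled[l] + scaled[s] - 1.0
        (small if scaled[l] < 1.0 else large).append(l)
    for i in large + small:                  # remaining slots are exactly full
        prob[i] = 1.0
    return prob, alias

def alias_sample(prob, alias, rng=random):
    """Draw one sample in O(1): pick a slot uniformly, then flip a biased coin."""
    i = rng.randrange(len(prob))
    return i if rng.random() < prob[i] else alias[i]
```

The preprocessing cost only pays off when many samples are drawn from the same distribution, which is exactly the situation in a Gibbs sampler for LDA, where the same (cached) word-topic multinomial is sampled over and over.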
First, for data-parallelism, the workload can be split across thousands of machines, with each machine keeping a topic-sparse word-in-topic &#8220;counts replica&#8221; that syncs asynchronously with the shared global state. If the global topic model knows about $K$ different topics, a local node $x$ needs to know only about the $k_x$ topics that occur in the documents it will be processing; since $k_x &lt;&lt; K$, this allows $K$ to be pushed much higher. Very neat. Another interesting trick they use is <b>alias sampling<\/b>, which performs some preprocessing on an n-dimensional multinomial distribution so that samples can then be drawn from it efficiently. It doesn&#8217;t make sense if you want just one sample, but if you&#8217;re taking many samples then the upfront cost of creating the &#8220;alias distribution&#8221; is amortized over all of them. It feels like we&#8217;re seeing third-generation parallel-LDA ideas start to come up.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Last week I attended the NIPS conference and it felt like a grappa shot: intense but good for brain function. There are so many advances in research, industry is shipping ML in products, and GPUs make previously-impossible things possible. 
Definitely an exciting time to be in this field.<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4,8],"tags":[],"class_list":["post-710","post","type-post","status-publish","format-standard","hentry","category-computers","category-machine-learning"],"_links":{"self":[{"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/posts\/710","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/comments?post=710"}],"version-history":[{"count":4,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/posts\/710\/revisions"}],"predecessor-version":[{"id":1482,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/posts\/710\/revisions\/1482"}],"wp:attachment":[{"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/media?parent=710"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/categories?post=710"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/minireference.com\/blog\/wp-json\/wp\/v2\/tags?post=710"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}