My poster at NIPS 2014


This year I will again be at the NIPS 2014 international conference, where I will present a poster entitled “Deep Autoencoder Neural Networks for Prediction of Biomolecular Annotations” at two workshops. One is MLCB 2014 – the Workshop on Machine Learning in Computational Biology, and the other is MLCDA 2014 – the Workshop on Machine Learning for Clinical Data Analysis, Healthcare and Genomics.

NIPS 2014 will be held in Montreal (Quebec, Canada) from the 8th to the 13th of December 2014.

See you in Montreal!

Our paper accepted at ACM BCB 2014


I just got notified that our paper entitled “Deep Autoencoder Neural Networks for Gene Ontology Annotation Predictions” has been accepted for ACM BCB 2014, the 5th ACM Conference on Bioinformatics, Computational Biology and Health Informatics.

This article was written by me, Peter J. Sadowski, and Pierre Baldi from the University of California, Irvine, and describes the project I developed during my six-month stay in Orange County.

The ACM BCB 2014 conference will be held in Newport Beach (Southern California, USA) in late September 2014. See you there!

My favourite papers and talks at NIPS 2013

I just came back from NIPS 2013, an illustrious world conference on machine learning in South Lake Tahoe, California.

Here are my favourite papers and talks from the conference:

  • “Dropout training as adaptive regularization” by Stefan Wager, Sida Wang, Percy Liang (Stanford). Interesting research in which the authors show that dropout training acts as an adaptive regularizer, and find common aspects with AdaGrad, an online learning method based on adaptive gradient descent.
  • “Adaptive dropout for training deep neural networks” by Jimmy Ba, Brendan Frey (University of Toronto). Again on the dropout algorithm: the authors investigate alternatives to the fixed 0.5 dropout probability for each unit during dropout training.
  • “Understanding dropout” by Pierre Baldi, Peter J. Sadowski (University of California, Irvine). The authors investigate important mathematical properties of the dropout algorithm.
  • “Training and Analysing Deep Recurrent Neural Networks” by Michiel Hermans, Benjamin Schrauwen (Universiteit Gent). The authors apply deep recurrent neural networks to time-series sequence prediction, and show some interesting applications to character-level sequence prediction on English Wikipedia text.
  • “Deep supervised and convolutional generative stochastic network for protein secondary structure prediction” by Jian Zhou and Olga Troyanskaya (Princeton), from the Deep Learning workshop. An interesting application of generative stochastic networks (GSNs) to the important problem of protein secondary structure prediction.
  • “Tissue-dependent alternative splicing prediction using deep neural network” by Michael K. K. Leung, Hui Yuan Xiong, Leo J. Lee and Brendan J. Frey (University of Toronto), from the Machine Learning in Computational Biology workshop. The authors apply the dropout algorithm in a deep neural network to predict new regions of tissue-dependent alternative splicing.
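
Since several of the papers above revolve around dropout, here is a minimal sketch of the basic idea in NumPy (my own illustration, not code from any of these papers): during training, each unit is zeroed out with probability p, and in the “inverted dropout” variant shown here the surviving units are scaled by 1/(1-p) so the expected activation stays the same.

```python
import numpy as np

def dropout_forward(activations, p=0.5, rng=None):
    """Inverted dropout: zero each unit with probability p, and scale
    the survivors by 1/(1-p) so the expected activation is unchanged."""
    rng = rng or np.random.default_rng(0)
    mask = rng.random(activations.shape) >= p  # keep each unit with prob 1-p
    return activations * mask / (1.0 - p)

# Example: with p=0.5, roughly half the units are dropped (set to 0)
# and the rest are doubled, preserving the expected value of 1.0.
x = np.ones((4, 8))
y = dropout_forward(x, p=0.5)
```

The papers by Wager et al. and Baldi & Sadowski study why this simple random masking behaves like a regularizer; Ba & Frey replace the fixed p with a learned, input-dependent probability.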

A really top-level conference with intriguing workshops… thanks a lot to all the organizers!

[EDIT: check out these blog posts by Paul Mineiro, hundalhh, Yisong Yue, Sebastien Bubeck, Memming]

MetaOptimize, a Q&A website on Machine Learning


Some time ago I mentioned BioStars.org, a nice Q&A website on bioinformatics. Now I’d like to invite you to visit another very interesting questions-and-answers website: MetaOptimize, dedicated to Machine Learning and related topics.

I strongly believe that websites of this kind can be a great help to the scientific community, because they allow scientists and researchers to interact directly in a very constructive way. Another great Q&A website platform that you probably know is StackExchange.com, with its flagship StackOverflow.com.

To become more and more useful, these websites need more and more users. That’s why I invite you all to join MetaOptimize and the other Q&A websites: you’ll get even more out of them than you imagine!

www.MetaOptimize.com/qa/