@inproceedings{Peltonen08,
  author = "Peltonen, Jaakko and Yaslan, Yusuf and Kaski, Samuel",
  responsibleauthor = "Kaski, Samuel",
  title = "Variational {B}ayes Learning from Relevant Tasks Only",
  url = "http://www.cis.hut.fi/projects/mi/papers/nips08_lms_rsl_abstract.pdf",
  booktitle = "Learning from Multiple Sources Workshop, 13 December 2008, Whistler, Canada",
  corerank = "NA",
  note = "Proceedings at \url{http://web.mac.com/davidrh/LMSworkshop08/Schedule.html}",
  flags = "AIRC HIIT public copy",
  year = "2008",
  impactfactor = "D3",
  abstract = {We extend our recent work on relevant subtask learning, a new variant of multi-task learning in which the goal is to learn a good classifier for a task-of-interest that has too few training samples, by exploiting "supplementary data" from several other tasks. It is crucial to model the uncertainty about which of the supplementary data samples are relevant for the task-of-interest, that is, which samples are classified in the same way as in the task-of-interest. We have shown that the problem can be solved by careful mixture modeling: all tasks are modeled as mixtures of relevant and irrelevant samples, and the model for irrelevant samples is flexible enough that the relevant model only needs to explain the relevant data. Previously we used simple maximum likelihood learning; we now extend the method to variational Bayes inference, which is better suited to high-dimensional data. We compare the method experimentally to a recent multi-task learning method and two naive methods.}
}