Probabilistic Active Learning: A Short Proposition

by Georg Krempl, Daniel Kottke, Myra Spiliopoulou.

Active mining of big data requires fast approaches that, for a user-specified performance measure and an arbitrary classifier, ideally select the instance whose labelling most improves classification performance. Existing generic approaches are either slow, like error reduction, or heuristic, like uncertainty sampling. We propose a novel, fast yet versatile approach that directly optimises any user-specified performance measure: Probabilistic Active Learning (PAL).

PAL follows a smoothness assumption and models, for each candidate instance, both the true posterior in its neighbourhood and its label as random variables. By computing each candidate's expected gain in classification performance over both random variables, PAL selects for labelling the candidate that is optimal in expectation. PAL shows comparable or better classification performance than error reduction and uncertainty sampling, has the same asymptotic linear time complexity as uncertainty sampling, and is faster than error reduction.
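To illustrate the idea, here is a minimal sketch of such a probabilistic gain computation, under simplifying assumptions not taken from the paper itself: accuracy is the performance measure, the problem is binary, and a candidate's neighbourhood is summarised by its label statistics (the number of nearby labels `n` and the fraction `p_hat` of positives). The true posterior is modelled as a Beta random variable and the candidate's label as a Bernoulli draw from it; the expectation over both is approximated by grid integration. All function and variable names are hypothetical.

```python
import numpy as np

def pgain(n, p_hat, grid_size=2000):
    """Expected accuracy gain from acquiring one more label at a candidate
    whose neighbourhood holds n labels, a fraction p_hat of them positive.
    Illustrative sketch only; PAL's full formulation also weights candidates
    by the density of their neighbourhood."""
    # open grid over (0, 1) so the log-pdf below stays finite
    p = np.linspace(0.0, 1.0, grid_size + 1)[1:-1]
    dp = p[1] - p[0]
    n_pos, n_neg = n * p_hat, n * (1.0 - p_hat)

    # Beta(n_pos + 1, n_neg + 1) posterior over the true positive rate p
    log_pdf = n_pos * np.log(p) + n_neg * np.log1p(-p)
    pdf = np.exp(log_pdf - log_pdf.max())
    pdf /= pdf.sum() * dp  # normalise numerically

    def acc(a, b):
        # the classifier predicts the majority observed label; its accuracy
        # under the true rate p is p (predicting +) or 1 - p (predicting -)
        return p if a >= b else 1.0 - p

    # joint expectation over p ~ Beta and the new label y ~ Bernoulli(p):
    # performance after observing y, minus current performance
    integrand = (p * acc(n_pos + 1, n_neg)            # y = +
                 + (1.0 - p) * acc(n_pos, n_neg + 1)  # y = -
                 - acc(n_pos, n_neg))
    return float(np.sum(pdf * integrand) * dp)
```

The sketch reproduces the qualitative behaviour described above: `pgain(0, 0.5)` exceeds `pgain(2, 0.5)`, since a sparsely labelled neighbourhood promises more improvement, and the gain drops to zero once additional labels can no longer flip the local decision.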

Accepted at the European Conference on Artificial Intelligence (ECAI) 2014.

Link: http://ebooks.iospress.nl/volumearticle/37114
PDF: Author’s manuscript
BibTeX: here