Attention: A Machine Learning Perspective

Publication: Research – peer-review – Article in proceedings – Annual report year: 2012

We review a statistical machine learning model of top-down, task-driven attention based on the notion of 'gist'. In this framework the task is represented as a classification problem with two sets of features: a gist of coarse-grained global features and a larger set of low-level local features. Attention is modeled as the choice process over the low-level features given the gist. The model takes its departure in a classical information-theoretic framework for experimental design, which requires evaluating marginalized and conditional distributions. By implementing the classifier as a Gaussian-Discrete mixture, marginalization and conditioning are straightforward, and we obtain a relatively simple expression for the feature-dependent information gain: the top-down saliency. Because the top-down attention mechanism is modeled as a simple classification problem, the strategy can be evaluated directly by estimating error rates on a test data set. We illustrate the attention mechanism in a simple simulated visual domain in which the choice is over nine patches in which a binary pattern has to be classified. The classifier equipped with the attention mechanism performs almost as well as one with access to all low-level features, and clearly improves over a simple 'random attention' alternative.
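The core idea of the abstract — scoring each candidate low-level feature by its expected information gain about the class, then attending to the highest-scoring one — can be sketched in a toy discrete setting. This is not the paper's Gaussian-Discrete mixture; it is a minimal illustration assuming binary features that are noisy copies of a binary class label, with the gist taken to leave a uniform class posterior. The noise levels and feature count are made-up values for the example.

```python
import numpy as np

# Hypothetical toy setup (NOT the paper's Gaussian-Discrete mixture):
# class c in {0,1}; each candidate low-level binary feature x_k is a
# noisy copy of c with its own flip probability eps_k.
noise = np.array([0.40, 0.10, 0.25])  # P(x_k != c); feature 1 is most informative

def binary_entropy(p):
    """Shannon entropy (in bits) of a Bernoulli(p) variable."""
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def expected_information_gain(prior_c1, eps):
    """Expected reduction in class entropy from observing feature x_k,
    i.e. the mutual information I(C; X_k) under the toy noise model."""
    # Marginal P(x_k = 1), obtained by summing over the class.
    px1 = prior_c1 * (1 - eps) + (1 - prior_c1) * eps
    # Class posteriors after each possible observation (Bayes' rule).
    post_x1 = prior_c1 * (1 - eps) / px1
    post_x0 = prior_c1 * eps / (1 - px1)
    # Entropy of the class given x_k, averaged over outcomes of x_k.
    cond_ent = px1 * binary_entropy(post_x1) + (1 - px1) * binary_entropy(post_x0)
    return binary_entropy(prior_c1) - cond_ent

prior = 0.5  # class posterior after the gist, assumed uninformative here
gains = np.array([expected_information_gain(prior, e) for e in noise])
attended = int(np.argmax(gains))  # top-down saliency selects the best feature
```

In this sketch the information gain plays the role of the paper's top-down saliency: the mechanism attends to the feature whose observation most reduces class uncertainty, here the one with the lowest noise level.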
Original language: English
Title of host publication: 2012 3rd International Workshop on Cognitive Information Processing (CIP)
Number of pages: 6
Publication date: 2012
ISBN (print): 978-1-4673-1877-8
State: Published - 2012
Event: 3rd International Workshop on Cognitive Information Processing (CIP), Baiona, Spain



ID: 10020568