Combining top-down processes to guide eye movements during real-world scene search

Malcolm, George L. and Henderson, John M. (2010) Combining top-down processes to guide eye movements during real-world scene search. Journal of Vision, 10 (2). ISSN 1534-7362

Eye movements can be guided by various types of information in real-world scenes. Here we investigated how the visual system combines multiple types of top-down information to facilitate search. We independently manipulated the specificity of the search target template and the usefulness of contextual constraint in an object search task. An eye tracker was used to segment search time into three behaviorally defined epochs so that influences on specific search processes could be identified. The results support previous studies indicating that the availability of either a specific target template or scene context facilitates search. The results also show that target template and contextual constraints combine additively in facilitating search. These findings extend recent eye guidance models by suggesting the manner in which the visual system utilizes multiple types of top-down information.

Item Type: Article
Uncontrolled Keywords: visual cognition, search, eye movements, scene recognition, visual attention, time course, perception, guidance, target, allocation, durations, selection, salience, vision
Faculty \ School: Faculty of Social Sciences > School of Psychology
UEA Research Groups: Faculty of Social Sciences > Research Groups > Cognition, Action and Perception
Faculty of Social Sciences > Research Groups > Developmental Science
Depositing User: Pure Connector
Date Deposited: 05 Apr 2016 11:00
Last Modified: 22 Oct 2022 00:58
DOI: 10.1167/10.2.4
