Category-specific learning of color, orientation, and position regularities guides visual search

J Exp Psychol Hum Percept Perform. 2023 Jun;49(6):907-922. doi: 10.1037/xhp0001098.

Abstract

In six experiments, we examined how object categories structure the learning of environmental regularities to guide visual search. Participants searched for pictures of exemplars from a set of real-world categories in a repeated search task modeled on the contextual cuing literature. Each trial began with a category label cue, followed by a search array of natural object photographs, with one target object matching the category label. Participants completed a series of search blocks, each containing one search trial per category. Individual categories were assigned either to the Repeated condition or to the Novel condition. For Repeated categories, a perceptual feature value of the target objects remained constant across all searches for that category: color (Experiments 1 and 3), orientation (Experiment 2), or position (Experiment 4). For Novel categories, the relevant feature value varied randomly across searches for that category. We observed a categorical cuing effect, with faster improvement in reaction time across blocks for Repeated compared with Novel categories. This effect reflected both episodic retrieval of the immediately preceding search episode in that category and cumulative learning across multiple searches within a category. The cuing effect was observed from the very first repetition, a point in the experiment where the learning effect was not plausibly strategic. Finally, participants could reliably retrieve and report the repeated values in memory tests administered either at the end of the experiment or when the effect first emerged (Experiments 5 and 6), demonstrating that nonstrategic guidance of attention can be driven by explicitly available memory.

MeSH terms

  • Attention*
  • Cues
  • Humans
  • Learning*
  • Pattern Recognition, Visual
  • Reaction Time