Abstract
Navigation is instrumental to daily life and is often used to encode and locate objects, such as keys in one's house. Yet little is known about how navigation works in more ecologically valid situations, such as finding objects within a room. In particular, it is unclear how vision and body movements contribute differentially to spatial memory in such small-scale spaces. In the current study, participants encoded object locations either by viewing them while standing (stationary condition) or by viewing them and then being guided around the room by the experimenter while blindfolded (walking condition). They then retrieved the objects from either the same or a different viewpoint, yielding a 2 × 2 within-subjects design. Participants' eye movements were recorded throughout the experiment using mobile eye tracking. The results showed no statistically significant differences among the four conditions (stationary/same viewpoint, stationary/different viewpoint, walking/same viewpoint, walking/different viewpoint), suggesting that in a small real-world space, vision may be sufficient to remember object locations. Eye-tracking analyses revealed that object locations near landmarks were remembered better and that participants encoded items on the same wall together, suggesting the use of local wall coordinates rather than global room coordinates. A multivariate regression analysis revealed that the only significant predictor of object-placement accuracy was average looking time. Together, these results suggest that vision may be sufficient for encoding object locations in a small-scale environment and that such memories may be formed largely in local rather than global coordinates.
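For readers who want to run this style of analysis themselves, the sketch below shows how the 2 × 2 within-subjects comparison and the looking-time regression could be implemented in Python with statsmodels. It is a minimal sketch only: the file name, column names, and the second regression predictor are illustrative assumptions, not the authors' actual materials or variable names.

```python
# Hypothetical analysis sketch; column names are assumptions, not the authors' data.
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import AnovaRM

# Assumed layout: one row per subject x condition, with columns
# subject, movement (stationary/walking), viewpoint (same/different),
# placement_error, and avg_look_time.
df = pd.read_csv("placement_data.csv")

# 2 x 2 repeated-measures ANOVA: movement x viewpoint,
# with object-placement error as the dependent measure.
anova = AnovaRM(
    data=df,
    depvar="placement_error",
    subject="subject",
    within=["movement", "viewpoint"],
).fit()
print(anova.anova_table)

# Regression in the spirit of the reported multivariate analysis:
# avg_look_time is taken from the abstract; num_fixations is a
# placeholder for whatever additional predictors were entered.
ols = smf.ols("placement_error ~ avg_look_time + num_fixations", data=df).fit()
print(ols.summary())
```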
| Original language | English (US) |
| --- | --- |
| Article number | 108565 |
| Journal | Neuropsychologia |
| Volume | 184 |
| DOIs | |
| State | Published - Jun 6 2023 |
Keywords
- Ecological validity
- Mobile eye tracking
- Navigation
- Spatial memory
ASJC Scopus subject areas
- Experimental and Cognitive Psychology
- Cognitive Neuroscience
- Behavioral Neuroscience