Abstract
Walking has been the focus of much of the existing work on multitasking, given its complexity as a cognitive process and its importance in daily life. This complexity is evidenced by the gait variation observed in dual-task contexts. However, an open question concerns how walking affects concurrently performed cognitive tasks. We therefore use virtual reality to investigate how walking and visual distraction modulate language processing in an ecologically valid yet controlled manner. In this novel experimental paradigm, we gradually increase the cognitive burden on participants’ lexical decision responses by adding visual distractors and concurrent walking demands. Accordingly, healthy young participants performed a lexical decision task either (1) as a single-task+, while seated and with randomly appearing visual distractors, or (2) as a multitask, while walking on a self-paced treadmill through a VR cityscape containing visual distractors. Participants generally made faster lexical decisions while walking. However, in the single-task+ condition, participants made more errors when a distractor was present. These effects were somewhat modulated by individual differences in visual processing. Crucially, no clear dual-task cost was observed; rather, behavior adapted to increased demands within a specific domain. Overall, these findings suggest that an interplay of task-related and individual characteristics determines multitask performance.