Newswise — Our hands do more than grasp objects; they also help us process visual input. When you move your hands, your brain first perceives and interprets sensory information, then selects an appropriate plan of action before initiating and executing the movement. How well this succeeds depends on several factors, including the difficulty of the task, the presence of external distractions, and how often the task has been performed before.

Consider, for instance, a baseball outfielder catching a fly ball. Their goal is for the ball to land in their glove (the hand-movement goal). As soon as the batter hits the ball and it soars toward the outfielder, they begin visually tracking it and deciding how best to respond (hand-movement preparation). They then anticipate where their hand and body must be relative to the ball to make the catch (the upcoming hand position).

For years, researchers have wondered how the hand-movement goal affects endogenous attention, often called top-down attention: a self-directed focus that we control. It works like a personal spotlight we can aim at will, whether we are searching for an object, tuning out distractions at work, or following a conversation in a noisy room. Unraveling the mechanisms that link hand movements and attention could help in developing AI systems that support the learning of intricate movements and manipulations.

Now, a team of researchers at Tohoku University has found that attention to the hand-movement goal operates independently of endogenous attention.

Satoshi Shioiri, a researcher from Tohoku University's Research Institute of Electrical Communication (RIEC) and co-author of the study, explained, "Through two experiments, our aim was to investigate whether hand-movement preparation leads to the redirection of endogenous attention towards the hand-movement goal, or if it is an independent process that enhances visual processing."

In the first experiment, the researchers sought to dissociate attention to the hand-movement goal from top-down visual attention. Based on cues, participants were instructed to move their hands either to the same location as a visual target or to a different location. Importantly, participants could not see their own hands. A control condition, in which participants made no hand movements at all, provided a baseline.

The second experiment examined whether the order of the cues for the hand-movement goal and the visual target affected visual performance.

To measure participants' brain activity, Shioiri and his team used an electroencephalogram (EEG), focusing on the steady-state visual evoked potential (SSVEP). When a person views a visual stimulus such as a flashing light or a moving pattern, the brain generates rhythmic electrical activity at the same frequency as the stimulus. The SSVEP is this stimulus-locked component of the EEG signal, and its strength indicates how much the brain is selectively attending to and processing the visual information. In effect, the SSVEP lets researchers map the spatial window of visual attention.
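SSVEP strength is typically estimated by taking the Fourier transform of the EEG signal and reading off the spectral amplitude at the stimulus's flicker frequency. The sketch below illustrates that general idea on synthetic data; the sampling rate, flicker frequency, and noise level are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def ssvep_amplitude(eeg, fs, target_hz):
    """Estimate SSVEP amplitude: single-sided spectral amplitude at target_hz."""
    n = len(eeg)
    spectrum = 2.0 * np.abs(np.fft.rfft(eeg)) / n   # single-sided amplitude scaling
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - target_hz))      # nearest frequency bin
    return spectrum[idx]

# Synthetic EEG: a 12 Hz flicker response buried in noise (illustrative values).
fs, flicker_hz, duration = 250.0, 12.0, 4.0
t = np.arange(0, duration, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = 2.0 * np.sin(2 * np.pi * flicker_hz * t) + rng.normal(0.0, 1.0, t.size)

amp_at_flicker = ssvep_amplitude(eeg, fs, flicker_hz)   # strong: stimulus-locked
amp_elsewhere = ssvep_amplitude(eeg, fs, 17.0)          # weak: noise-only bin
print(amp_at_flicker > amp_elsewhere)
```

A larger amplitude at the flicker frequency than at a control frequency is the basic signature researchers look for; in attention studies, that amplitude grows when the flickering location falls inside the attended region.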

According to Shioiri, the experiments show that even when top-down attention is directed to a location far from the anticipated hand position, visual processing at the future hand location still occurs. Moreover, this processing operates within a much narrower spatial window than top-down attention does. Together, these observations suggest that hand-movement preparation and top-down attention are distinct processes.

The research team is optimistic that their findings could inform the development of attention-management systems capable of maintaining suitable attention states across diverse circumstances.

Details of the research were published in the Journal of Cognitive Neuroscience on May 8, 2023.
