Grasping, Reaching and Spatial Perception: Experimental Research

Our Research

The aim of our research is to understand how we use perceptual processes for action planning and control and how our actions are moderated by visual and cognitive factors. Our studies and experiments are designed to provide insights into the behavioural and neuronal bottlenecks of perception-action processing with the aim of understanding the complex interplay between the perceptual, cognitive, and motor processing systems. Currently, we are pursuing a number of projects that can broadly be summarised in three major research questions:

  • How are our actions moderated by physical (obstacles, bio-mechanics) and cognitive (attention, memory) constraints?
  • What is the interrelationship of action and perception processes and what does this tell us about the organization of the human visual cortex?
  • Can we use human-object interactions as a non-verbal indicator for human perception of material and surface properties?

To answer these questions, we primarily apply experimental behavioural methods (psychophysics) but also occasionally conduct neuropsychological studies. Examining the visuomotor behaviour of patients who suffer from specific perceptual and spatial impairments (such as hemianopia, visual neglect, or visual form agnosia) provides another interesting approach to investigating how perceptual deficits are related to inaccurate visuomotor behaviour.

Our Research Themes


This research combines two of our main areas of interest and expertise: (1) the effects of attention on action planning and control and (2) the relationship between action and perception processing in the human brain.

While goal-directed movements have been extensively studied in distraction-free environments, much less is known about how movement preparation and control are accomplished in complex situations that require the distribution of attentional resources amongst several simultaneous tasks. In everyday life, humans are often required to pay attention to multiple tasks at once, some of them completely unrelated to an ongoing motor action (e.g., turning the car radio down whilst carefully observing the ongoing traffic). In our studies we aim to understand the processing limits of the sensory-motor system and the behavioural and neuronal bottlenecks of perception and action processing. We are also interested in age-related changes in dual-tasking (when cognitive and motor resources are known to decline), as well as in how motor expertise, for example in athletes and musicians, benefits visuomotor performance (enhanced processing of motor and cognitive tasks).

In these studies, we use human hand movements as a tool to learn about the organisation of the human visual cortex. There is an ongoing debate on whether visual information for perception and action is processed fundamentally differently and in anatomically segregated pathways (the perception-action model). In our lab, we critically test some of the classical paradigms these assumptions are based on (e.g., visual illusion studies, letter-posting, and delayed memory-guided actions). We also test and develop alternative explanations for previous findings that do not rely on the assumption of functionally distinct processing mechanisms.

Photographs of stimuli we use to investigate the effect of material properties on reaching and grasping.

This project developed as a recent collaboration with Prof Julie Harris from the University of St Andrews and is funded by the Leverhulme Trust. Up to now, human material perception has exclusively been assessed via verbal rating scales (i.e., judging on a scale from 1 to 7 how rough or hard something looks or feels). We currently explore whether we can derive an ecologically relevant measure of material perception that is not language based. Specifically, we measure how humans adapt their movements when material properties change (e.g., we show avoidance responses to materials and surfaces that we judge as dangerous or unpleasant to touch). We also plan to combine behavioural grasping experiments with computational analyses of surface properties to predict how humans interact with objects. The long-term aim of this project is to develop a metric that links movement kinematics to specific material properties, which could be useful in computer/robot vision (e.g., observations of how humans interact with certain objects can reveal some of their properties and hence allow adaptations of an intelligent agent).
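As a rough illustration of what such a kinematics-to-property metric could look like, the sketch below fits a simple linear mapping from per-trial kinematic features (e.g., peak grip aperture, reach duration) to a rated material property. This is a hypothetical, minimal example with toy data, not the project's actual analysis pipeline; the feature names and the linear model are assumptions for illustration only.

```python
# Hypothetical sketch: relating movement kinematics to a material-property
# rating via ordinary least squares. All names and data are illustrative.
import numpy as np

def fit_kinematic_metric(features: np.ndarray, ratings: np.ndarray) -> np.ndarray:
    """Fit linear weights mapping kinematic features (n_trials x n_features,
    e.g. peak grip aperture, reach duration) to a property rating."""
    X = np.column_stack([features, np.ones(len(features))])  # add intercept
    weights, *_ = np.linalg.lstsq(X, ratings, rcond=None)
    return weights

def predict_property(features: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Predict the property rating for new trials from their kinematics."""
    X = np.column_stack([features, np.ones(len(features))])
    return X @ weights

# Toy data: two kinematic features per trial and a rated "roughness"
# generated from a known linear rule plus a little noise.
rng = np.random.default_rng(0)
feats = rng.normal(size=(40, 2))
ratings = 1.5 * feats[:, 0] - 0.8 * feats[:, 1] + 4.0 \
    + rng.normal(scale=0.1, size=40)

w = fit_kinematic_metric(feats, ratings)
pred = predict_property(feats, w)
```

In practice the mapping is unlikely to be purely linear, but even this kind of simple regression makes the core idea concrete: if movement parameters systematically covary with perceived material properties, the fitted weights constitute a non-verbal, behaviourally grounded measure of material perception.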